
    How the Emmy-Nominated “WandaVision” VFX Team Made Magic

    • 11.08.2021
    • By Hugh Hart
    The Credits

    Laden with special effects, big-name stars, and an audacious high concept, WandaVision represented a big swing for Marvel Studios when it debuted in January on Disney+. The bet paid off. Creator Jac Schaeffer’s series quickly became one of the season’s most talked-about new shows, and it has now validated all that buzz with a whopping 23 Emmy nominations. The hook? Superheroic witch Wanda Maximoff (Elizabeth Olsen) and android Vision (Paul Bettany) disguise themselves as man and wife living sitcom-perfect lives in small-town New Jersey. Juxtaposed against the couple’s seventies-styled retro innocence is a nefarious supernatural scheme that threatens to destroy Wanda and Vision’s safe harbor in the aftermath of 2019’s cataclysmic Avengers: Endgame. Oh, and one more twist—Vision died in Avengers: Infinity War, so his presence in WandaVision was all the more mysterious.

    Helping to jolt crimson-headed Vision from one dimension to the next is Toronto VFX company Monsters Aliens Robots Zombies (MARZ), which earned visual effects nominations for WandaVision as well as Netflix series The Umbrella Academy. Launched in 2018, MARZ uses artificial intelligence to deliver movie-quality effects on TV budgets.

    Visual effects supervisor Ryan Freer and MARZ Chief Operating Officer Matt Panousis checked in with The Credits to talk about Bettany’s chin, Vision’s cape, and other transformational tricks of the computer-generated trade.


    Congratulations on your Emmy nominations for The Umbrella Academy and especially for WandaVision, which marks the first time you worked for Marvel. How did you get the gig?

    Ryan: We did a test for Marvel doing our version of a shot from Avengers: Age of Ultron, where Vision’s basically being born. Marvel gave us the [background] plate and some assets that had been done already by another vendor and asked us: Can you do this? We’d just done a bunch of head replacement stuff on HBO’s Watchmen so we were able to create the shot to their standard, and that got the ball rolling.

    Matt: The big caveat there is not just “can you do it?” but can you do it on a [shorter] TV timeline and [lower] budget. Marvel’s the epitome of premium episodic television, so there was a lot of work that went into getting the shot where it needed to be.

    How did this sitcom-inspired version of Vision differ from the big screen character?

    Ryan: In the movies, he’s very calm and collected but in our show, Vision does funny slap-sticky things. The director [Matt Shakman] and even Paul Bettany didn’t know if Vision being goofy was going to work. Also, we’ve never seen Vision in black and white, we’ve never seen him in the seventies. These are things we worked really hard with Marvel to perfect.

    Paul Bettany and Elizabeth Olsen in ‘WandaVision.’ Photo courtesy of Marvel Studios. ©Marvel Studios 2020. All Rights Reserved/Disney+


    Details are so important in making visual effects seem believable. What are some more subtle aspects of Vision that you guys obsessed over?

    Ryan: One of the little things people don’t notice is that Vision has eyelashes in our show, which he does not have in the movies. Another thing is that Paul Bettany has a very large chin, but Vision has a small chin. We got a lot of notes from Marvel: “Vision’s chin looks too much like Paul’s chin!” When you’re watching the show, you may not see it, but you feel it.

    L-r: Elizabeth Olsen and Paul Bettany in ‘WandaVision.’ Courtesy Marvel Studios/Disney+


    People definitely notice each time Paul Bettany’s human-looking character morphs into his true android self. How did you design those visual effects?

    Ryan: In one of the black and white episodes, we did a transformation of Paul going from a synthezoid to a person and sent it to Marvel. They said it looks great, but we want it to be cheesy and retro, like it’s from the fifties. So we did a couple of versions back and forth and wound up landing on this very glittery I Dream of Jeannie kind of thing. It’s funny because usually, you’re not supposed to notice the CG effect, but here, we threw in a visual effect from the era and blended multiple [styles of] visual effects on top of each other.

    There’s also an old-school vibe when we see voltage flickering across Vision’s face. What inspired that look?

    Ryan: In the [1982] movie Tron, they would actually cut some of the film and expose the light behind it to get the effect. That’s the kind of technology they had back then, so we took a lot of reference from that, which was super fun.

    Just to be clear, Vision’s beet-red head is computer generated?

    Ryan: The only thing we’re pulling from Paul’s acting is his eyes, his nose, and his mouth. That’s it. Everything else is CG whenever you see Paul Bettany as Vision, who has no ears.

    Paul Bettany as Vision in Marvel Studios’ WandaVision. Courtesy Marvel Studios.


    How did you create the digital skin to make the human actor look like the superhero Vision?

    Ryan: We’d receive footage of Paul Bettany wearing a bald cap, ears sticking out, and he’s got tracking markers all over his face and neck. We remove the markers with an in-house removal system driven by AI, because paintwork, especially tracking marker removal, can be very costly. Once we have a solid track of that CG head, we align the shoulders so it lines up properly. Then the animators go and create his jaw, his eyebrows, they knock out the ears, they smooth the skin, they’re adding these very fine panels on top of his cheeks and adding a gem on his chin. Everything has to be rock solid, because if something starts jittering or not moving with his facial expression, then you lose the performance, and that’s the most important part.

    Matt: We’ve made heavy investments in artificial intelligence to get things done faster. AI has ended up saving the client hundreds of thousands of dollars and tons of time, about a day of savings per shot. Multiply that across 400 shots that we did for the show and it adds up to about 400 artist days that are effectively gone.

    Vision likes to levitate. How did you pull that off?

    Ryan: The big episode six Halloween scene, where Vision transforms and flies up into the sky, was probably our most technically difficult shot. The entire ground [showing a nighttime vista of suburban Westview] is a digital matte painting. They put Paul in a rig against a green screen, all done up in his costume and makeup. When he flies up, the camera rotates around him, but we ended up going full digital-double with the body, which gave us a lot more control. And one of the cool things about Vision is that his cape is entirely CG, because a [real] cape has a mind of its own, the way it ripples. You can’t get it to act the way you want.

    Elizabeth Olsen as Wanda and Paul Bettany as Vision in Marvel Studios’ WandaVision. Courtesy Marvel Studios.


    I imagine you had an entire team devoted to Vision’s CG cape?

    Ryan: Within our pipeline, we have a department that brings in the [background] plates, and then we have tracking, layout, animation, effects, which is where the cape would be done, a lighting department, and compositing. Each department has its own lead, so every small detail is looked at closely.

    Ryan, how did you train to become the guy who supervises everybody’s work?

    Ryan: I wanted to do something in the arts, but I was also a computer nerd, so I went into computer animation, took a three-year program at Durham College in Ontario, and loved it. Out of school, I did animation, motion graphics, visual effects — I’ve dabbled in everything enough to develop an eye for making things look good and understanding how to not make things look bad, basically. I call myself more like a glorified generalist.

    Matt: When Ryan looks at something, he can see things that the artists can’t see.

    Ryan: A lot of it has to do with timing, because every shot is based on reality, until it’s not. Many times I’ll tell my team “That’s moving too fast,” or “It’s too slow.” If it doesn’t look right in the shot, you might have to cheat things for the camera, whether it’s based on reality or not.

    This article was originally published in The Credits.

    ABOUT THE AUTHOR

    Hugh Hart

    Hugh Hart has covered movies, television and design for the New York Times, Los Angeles Times, Wired and Fast Company. Formerly a Chicago musician, he now lives in Los Angeles with his dog-rescuing wife Marla and their Afghan Hound.