
Exclusive: Weta Digital's VFX Supervisor Matt Aitken Talks about Visual Effects of Avengers: Endgame



Matt Aitken is one of Weta Digital’s most experienced Visual Effects Supervisors. He was nominated for the Academy Award® for Best Achievement in Visual Effects, and the BAFTA Film Award for Best Special Visual Effects, both for District 9. He has won multiple Visual Effects Society Awards including Outstanding Visual Effects for his work on Avengers: Infinity War.

Matt recently finished supervising Weta Digital's work on Avengers: Infinity War and Avengers: Endgame, and today we're lucky enough to have him discuss how he works with the team at Weta Digital to create the stunning VFX of Avengers: Endgame.

1. How many VFX shots did Weta Digital deliver? How many artists were involved, and how long did it take to complete all the shots?
Weta Digital delivered 494 shots for Avengers: Endgame. Our work on the film centred on the 3rd-act battle in the ruins of the Avengers Compound. Our first shots in the film are of Thanos's H-Ship attacking and destroying the compound, and our work goes through to Tony's snap and his death from the effects of the gauntlet.

The core team on the show at Weta Digital was 570 people.
I worked on Endgame for 8 months, from August 2018 until early April 2019. Most of the crew worked on the show for the last 4-6 months of production, coming on to the show in October or November of 2018. We delivered in early April 2019.

2. Recently we featured Weta's in-house tool, Synapse. How was Synapse used for Endgame?
We used Synapse to simulate the very large-scale explosions when Thanos attacks and destroys the Avengers' compound. Synapse supports distributed simulations where a single complex event can be run on multiple different machines concurrently. This allows for large, highly-detailed volumetric and rigid body simulations to be created within the time constraints of a production schedule. Using Synapse we were able to simulate the dirt, dust, cement, smoke and fire components of these explosions in unison so that all these elements influenced each other, resulting in highly detailed and believable explosions.
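
Synapse itself is proprietary, so the sketch below is purely illustrative: every name and number in it is invented. It shows the one idea Matt describes that generalises well, several coupled components of an event advancing in lockstep, each reading the others' previous state so they can influence one another, rather than being simulated in isolation.

```python
# Purely illustrative sketch, not Synapse itself: all names and numbers
# here are made up. Several coupled "components" of an explosion advance
# in lockstep, each reading the others' previous state, so they can
# influence one another as described above.

import numpy as np

class Component:
    """One element of the event, e.g. smoke, dust or debris."""
    def __init__(self, name, shape):
        self.name = name
        self.values = np.zeros(shape)

    def step(self, dt, others):
        # Toy update: absorb a little influence from every coupled
        # component, then dissipate (a stand-in for real solver physics).
        for other_values in others:
            self.values += 0.1 * dt * other_values
        self.values *= 0.99

def simulate(components, steps, dt=1.0 / 24.0):
    for _ in range(steps):
        # Snapshot first so every component advances from the same
        # previous state, keeping the mutual coupling symmetric.
        snapshot = {c.name: c.values.copy() for c in components}
        for c in components:
            others = [v for n, v in snapshot.items() if n != c.name]
            c.step(dt, others)
    # In a real distributed solve, each machine would own a spatial
    # slab of these grids and exchange boundary data every step.

components = [Component(n, (32, 32, 32)) for n in ("smoke", "dust", "debris")]
components[0].values[16, 16, 16] = 1.0   # seed the smoke field
simulate(components, steps=48)
```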


3. Could you share any other in-house tools that were used? What are the benefits of using proprietary in-house tools?

A lot of our pipeline is made up of proprietary tools. We see several advantages to developing our own software; writing our own tools is something we have always done at Weta Digital, going all the way back to the start of the company in the early 1990s. One of the main benefits is that in-house tools give us access to functionality we simply couldn't get in commercially available software, so we can deliver work at a level of quality that other facilities would find very hard or even impossible to match.

One example of this is our facial animation workflow, which includes proprietary tools for solving facial capture data, sculpting facial target sets and polishing facial performance. On Avengers: Endgame we used some new facial tech, developed recently at Weta Digital, to enhance the Thanos facial performance: a tool called Deep Shapes. Deep Shapes adds another octave of detail to the facial performance without requiring any extra effort from the facial models team or the animator. These extra shapes are derived analytically and applied procedurally. Though not a simulation, Deep Shapes adds complexity to the way one expression migrates into the next; the start and end of the transition aren't affected, so the animator still retains control over the shape of the face and the facial performance can't go off-model.
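
Weta hasn't published how Deep Shapes works internally, so the toy below is a hypothetical blendshape sketch, not the actual tool. It only illustrates the one property Matt describes: extra detail that exists during the transition but is exactly zero at the endpoints, so the animator's key expressions are untouched. A weight of 4t(1 - t) has that property.

```python
# Illustrative sketch only: Deep Shapes is proprietary and its actual
# derivation is not public. This shows transition-only detail that
# vanishes at t = 0 and t = 1, so key poses stay on-model.

import numpy as np

def blend(expr_a, expr_b, t, detail_shape):
    """Linear blend of two facial shapes plus transition-only detail.

    expr_a, expr_b : (n_verts, 3) vertex offsets for the two expressions
    t              : transition parameter in [0, 1]
    detail_shape   : (n_verts, 3) procedurally derived detail offsets
    """
    base = (1.0 - t) * expr_a + t * expr_b
    # 4t(1-t) peaks mid-transition and is exactly 0 at the endpoints,
    # so the start and end expressions are never affected.
    weight = 4.0 * t * (1.0 - t)
    return base + weight * detail_shape

n = 1000
a = np.random.rand(n, 3)          # stand-ins for sculpted target shapes
b = np.random.rand(n, 3)
detail = 0.01 * np.random.rand(n, 3)

mid = blend(a, b, 0.5, detail)    # detail fully present mid-transition
end = blend(a, b, 1.0, detail)    # identical to expr_b
assert np.allclose(end, b)
```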


Another benefit of proprietary in-house tools is that access to the code base allows us to connect different aspects of the pipeline in highly efficient ways. For example, we have developed our own physically-based path-tracing renderer at Weta Digital called Manuka, which is our primary production renderer at the facility. We use it in conjunction with a real-time renderer called Gazebo, also developed in house. Because we have access to both the fx simulation work and the renderer at a code level, we have been able to establish a very tight coupling between these two pipeline steps, opening up new possibilities in the way we render our fx simulations.
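
The specifics of the Manuka/Gazebo coupling aren't public, but the general benefit of code-level access can be sketched: a renderer that shares a code base with the solver can sample live simulation state directly, instead of round-tripping through baked cache files on disk. The toy Python below is illustrative only; every name in it is made up.

```python
# Toy illustration only: none of this is Weta code, and the real
# Manuka/Gazebo interface is not public. The point being sketched is
# a renderer sampling live solver state with no baked intermediate.

class LiveVolume:
    """In-memory density field handed straight from an fx solver."""
    def __init__(self, sampler):
        self._sampler = sampler

    def density(self, x, y, z):
        return self._sampler(x, y, z)

def march_ray(volume, sample_points):
    # Toy ray march: accumulate opacity straight from the live field,
    # with no intermediate file format between solver and renderer.
    transmittance = 1.0
    for x, y, z in sample_points:
        transmittance *= max(0.0, 1.0 - volume.density(x, y, z))
    return 1.0 - transmittance

# The solver exposes a sampler; the renderer consumes it directly.
volume = LiveVolume(lambda x, y, z: 0.05 if y < 0.5 else 0.0)
opacity = march_ray(volume, [(0.0, i / 10.0, 0.0) for i in range(10)])
print(f"accumulated opacity: {opacity:.3f}")
```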

4. Can you name some main software / plugins that your teams use for Modeling, Shading, Simulation, Rendering and Compositing?
We use Autodesk Maya and ZBrush for modelling. Our texture painters use Mari from The Foundry, which was developed in-house at Weta Digital for Avatar before being licensed to The Foundry. Our animators also work in Maya, but as mentioned above they use a lot of in-house tools we have developed as Maya plug-ins. Compositing is done with Nuke from The Foundry, augmented by our own deep compositing workflow. Simulation work happens in Synapse, Houdini or Eddy, a Nuke plugin. Lighting operates within The Foundry's Katana framework, but the shaders we develop are written specifically for our own Manuka renderer.


5. What is your favourite shot and why?

I'm immensely proud of all the work that Weta Digital delivered on Avengers: Endgame, but if I had to single out one sequence as my favourite it would be the portals sequence, when all the heroes arrive on the battlefield through giant Dr Strange portals. The sequence is a combination of complex plate-based shots and very large-scale fully digital shots. It represents a big turning point in the story, and I knew while we were working on it that it had the potential to be a huge crowd-pleaser if we pulled it off successfully.

There were some specific issues we had to contend with on that sequence. We had created Dr Strange's portals for the Titan fight in Avengers: Infinity War, combining Houdini particle simulations with a volumetric smoke element. But the portals for Endgame had to work at a much larger scale while still being recognisably the same effect as the portals in the earlier films. We also had to optimise the simulations to support many portals in a single shot.

Another aspect of our work on the portals was creating the environments inside them. The terrain inside each portal was unique and independent of the other portals' terrain, but it had to integrate with the crater environment common to all the portals, so that the hero armies could walk out smoothly in shots where the camera move revealed the environments both inside and outside the portals in the same world space. The environments inside the portals were all 3D environments rendered through the same camera as the crater environment, so everything lined up correctly with parallax. The first time I saw the film with an audience of fans, they cheered loudly right through the portals sequence; that was very satisfying.
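
The parallax point is worth unpacking: because each interior world was rendered through the exact same camera as the crater, a screen-space composite of the two renders lines up under any camera move. Below is a minimal sketch with stand-in data; an analytic disc plays the role of the rendered portal opening, and flat colours stand in for the two beauty renders.

```python
# Minimal sketch of the shared-camera parallax idea, with made-up data.
# Both "renders" come from the same camera, so compositing the interior
# world over the crater stays consistent as the camera moves.

import numpy as np

H, W = 270, 480
crater_render = np.full((H, W, 3), 0.3)      # outside-the-portal world
interior_render = np.full((H, W, 3), 0.7)    # inside world, same camera

# Screen-space portal mask (in production this would come from the
# rendered portal geometry, not an analytic disc).
yy, xx = np.mgrid[0:H, 0:W]
mask = (((xx - W / 2) ** 2 + (yy - H / 2) ** 2) < 80 ** 2).astype(float)
mask = mask[..., None]   # broadcast over RGB

comp = mask * interior_render + (1.0 - mask) * crater_render
```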


6. Which sequence or shot was the most complicated to create and why? How did your team solve the problem?
Probably the single most complex shot in our work on the film is the moment where the two armies charge one another and clash for the first time. There is a shot at that moment that lasts 45 seconds, a very long and complex shot that is primarily CG and takes place in a fully CG environment but also incorporates 3 separate plate elements at various moments through the shot. The shot starts with the camera wide on the two armies coming together, includes Giant Man punching out a Leviathan, and finishes up with Iron Man and Rescue collaborating to blast away at Thanos's army. The shot also includes many complex FX simulations. We took a 'divide and conquer' approach to organising the shot, splitting it up into 5 separate shots in animation so that 5 different animators could create the necessary details of the action. As these 5 'shots within a shot' were being worked up, we regularly ran a process to combine them all into the one master shot to check it was all working. Once the studio had approved the animation we had one of our senior lighters manage the lighting and rendering, then we took a team-based approach again to compositing the shot.
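
As a purely illustrative sketch of that 'divide and conquer' bookkeeping (the frame ranges, names and cache format are all invented), one long shot can be split into contiguous sub-ranges, animated independently, and regularly stitched back into a single master sequence for review:

```python
# Hedged, invented sketch of splitting one long shot into sub-shots
# and reassembling them into a master sequence for review.

SHOT_START, SHOT_END = 1001, 2081   # roughly 45 s at 24 fps

def split_ranges(start, end, parts):
    """Split the inclusive frame range [start, end] into contiguous parts."""
    length = end - start + 1
    bounds = [start + round(i * length / parts) for i in range(parts + 1)]
    return [(bounds[i], bounds[i + 1] - 1) for i in range(parts)]

def assemble_master(range_caches):
    """Concatenate per-range animation caches into one master sequence."""
    master = {}
    for (start, end), cache in range_caches:
        for frame in range(start, end + 1):
            master[frame] = cache.get(frame)
    return master

sub_shots = split_ranges(SHOT_START, SHOT_END, parts=5)   # one per animator
caches = [(rng, {f: f"pose@{f}" for f in range(rng[0], rng[1] + 1)})
          for rng in sub_shots]
master = assemble_master(caches)
assert sorted(master) == list(range(SHOT_START, SHOT_END + 1))
```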


Another complex sequence was the 'Women of Marvel' beat. There's a shot towards the end of the battle where Spidey hands the gauntlet to Captain Marvel and she is joined by pretty much every woman character from the MCU. That was an amazing day to be on set, just to witness the combined acting talent that was present. The only actor who wasn't able to be there was Tom Holland, playing Peter Parker, so we filmed him separately a few weeks later and integrated him into the shot. In the action that follows, the Women of Marvel team up to help Captain Marvel get the gauntlet to the van and the quantum tunnel. Scarlet Witch and Valkyrie collaborate to take out Leviathans; Rescue, Shuri and Wasp join forces to send Thanos tumbling.

The ideas for the specific action beats in this sequence were still being developed during the block of additional photography when most of the End Battle was shot, so shooting for the Women of Marvel beat was limited to general coverage. Marvel eventually approached us in February and asked us to previsualise the action based on a short text description of how it could play out. Marvel's usual previs vendor, The Third Floor, had wrapped their work on the show and were off on other projects. So our animation team under animation supervisor Sidney Kombo-Kintombo brainstormed the details of the action and worked it up as rough animation, which we edited and sent through to Marvel. It helped that all but a few of the 25 shots in the sequence were full-CG. Marvel's response to our proposal was very positive; their editor Jeff Ford did a pass on it to tighten it up, and that became our blueprint for finishing the sequence. Because our previs also worked as first-pass animation, the process of working the shots through to final was a very direct one.


7. How did you split the workload among your team?
We are very organised here at Weta Digital, with over 25 different departments, so artists become highly specialised in one particular discipline, which allows for the creation of very high-quality work. Animation work was primarily split between two teams under animation sequence supervisors Craig Young and Karl Rapley, with a third splinter team working on the very long clash shot under animation sequence supervisor Simeon Duncombe. Animation supervisor Sidney Kombo-Kintombo took overall responsibility for all the animation work. Similarly, shot work on Endgame was split into two teams of lighters and compositors under sequence visual effects supervisors Sean Noel Walker and Phillip Leonhardt. Each animation team was partnered with one of the shot teams to enable tight communication between animation, lighting and comp. We split the work across these two animation and shot teams sequence by sequence as it was turned over to the facility by Marvel, to keep the work balanced throughout the production schedule.


8. What is Weta’s next film to come out and what is your next project?
We currently have the following projects in production here at Weta Digital: Gemini Man, Ad Astra, Lady and the Tramp, Birds of Prey (And the Fantabulous Emancipation of One Harley Quinn), The King's Man, Godzilla vs. Kong, Mulan, Jungle Cruise, The Umbrella Academy 2, Shadow in the Cloud, and the Avatar sequels. My next project isn't listed here; its title is still under wraps, so I can't tell you what it is, I'm sorry! But it's going to be awesome, and I'm very excited to be working on it.

Thank you very much, Matt, for your time and this great interview. Special thanks also to Vanessa Gray and Amy Minty for their great support!
