
Digital Domain: What is the difference between Thanos in Avengers: Endgame and Thanos in Avengers: Infinity War?


Digital Domain created the CGI Thanos, Nebula and many of the visual effects for AVENGERS: ENDGAME. Today CG Record is privileged to talk with DD's Scott Edelstein, who has more than 20 years of experience in visual effects, about how his team worked on the film. We are also joined by Darren Hendler, Director of DD's Digital Human Group, for this exclusive interview.

Could you share with us the role of Digital Domain in Avengers: Endgame?

Scott Edelstein: Of course! Endgame was actually a collaboration between 11 VFX vendors, including Digital Domain, and the Marvel Visual Effects Supervisor, Dan Deleeuw, oversaw a total of 2,496 VFX shots in the film. Of those shots, Digital Domain completed about 336, which were mostly of Thanos. We also created some amazing environments, like Titan 2, Nebula's interrogation cell, the prison and an alien planet being invaded by Thanos' army.

As big as this movie was for Marvel, it was also a huge internal undertaking at Digital Domain. We rendered a total of 24,968,539 frames, used 66,599,777 core hours, and consumed 967 TB of disk space.
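To put those figures in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above; the per-frame averages are purely illustrative, since disk usage also covers assets, caches and intermediate data, not just final frames.

```python
# Rough averages derived from the figures Scott quotes above.
total_frames = 24_968_539       # frames rendered for the show
total_core_hours = 66_599_777   # CPU core-hours consumed
disk_tb = 967                   # total disk footprint in TB

core_hours_per_frame = total_core_hours / total_frames
mb_per_frame = disk_tb * 1024 * 1024 / total_frames  # TB -> MB

print(f"~{core_hours_per_frame:.2f} core-hours per rendered frame")   # ~2.67
print(f"~{mb_per_frame:.0f} MB of disk per frame, averaged naively")  # ~41 MB
```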

Which office/department handled most of the work on this project? How did they enjoy the show :)?

Scott Edelstein: The split on “Avengers: Endgame” was about 70/30, with our Vancouver studio handling the bulk. Thanos was a very complicated character to create, so bringing him to life took senior artists from every department working together. We had compositing supervisors, Eric Kasanowski in Vancouver and Michael Melchiorre in LA, working very closely with CG Supervisors, Martin Johansson and Mark Rodahl, to ensure everything went smoothly. There was also a large team rollover from “Avengers: Infinity War” that was excited to help finish telling this story. We started with an experienced crew that was ready to go!

What was the most interesting part of the process of creating the CGI characters featured in the movie? Did DD use machine-learning and deep-learning techniques to create the CGI characters?

Scott Edelstein: Ron Miller, the facial modeling supervisor at Digital Domain, says that to him, the most interesting thing is trying to accurately interpret an actor’s performance onto something that is not them, or in this case not even human. Everything from the broad strokes to micro expressions needs to be represented on-screen so that the audience sees the actor's performance come through completely. Digital Domain has been developing and refining our digital human technology under the expert leadership of Darren Hendler ever since “The Curious Case of Benjamin Button.” Since then, we have used machine learning and proprietary tools to combine an actor’s performance with traditional animation techniques. We’ve applied these techniques to a number of characters, like the pixies in “Maleficent” and the Beast in “Beauty and the Beast.”


What is different between the CGI Thanos in Infinity War and the CGI Thanos in Endgame? How did your team enjoy the process of developing this character?

Scott Edelstein: In the little time we had between Infinity War and Endgame, Thanos went through only a few, but significant, updates. Learning from our experience on Infinity War, we implemented changes to Thanos' textures, lookdev and rig that wouldn't have been possible mid-show. These updates made the process more efficient and also allowed a more detailed performance to come through. In addition, we needed to create a version of Thanos that had been damaged by wielding the gauntlet at the end of Infinity War and then again to destroy the Infinity Stones. We called this his “double snap” damage. To further sell his injuries, Facial Rigging Supervisor Rickey Cloudsdale worked closely with Animation Supervisor Phil Cramer to add new controls that increased the fidelity of the face rig and also added areas of paralysis that would mimic muscle damage.

In Infinity War, Thanos's look was developed in V-Ray. How about the Thanos in Endgame?

Scott Edelstein: Thanos was developed in V-Ray for both films; however, there were some updates to the asset between shows that brought an even greater level of realism to the character. Thanos was also developed in Redshift to take advantage of GPU rendering for our animation team and anim-viz pipeline.

Could you name some of the software and plugins that were used to create the CGI characters for this film (modeling, shading, rigging, rendering, compositing)?

Scott Edelstein: We use Maya from Autodesk, with custom tools, nodes and scripts, for modeling, rigging, animation, look development and lighting. Sculpting was done in Mudbox, also from Autodesk, with additional details being added in ZBrush from Pixologic. Character effects were built and rigged in Maya but simulated mostly using the Carbon plugin for Houdini from SideFX, while grooms were done using a proprietary Digital Domain tool called Samson. Textures were painted in Mari from The Foundry, and compositing was done in Nuke, which was developed at Digital Domain before being sold to Foundry. All FX were done in Houdini.


Do you mind sharing with us about DD's in-house tools, like Atomic and Masquerade, which were used for the movie? Were there any other in-house tools featured in the process?

Darren Hendler: At Digital Domain, we have a whole ecosystem of custom tools and software applications that help us create the work we do. For all of our hair grooming, we have an in-house hair system called Samson, which allows us to craft every character's and digital human's hairstyle hair by hair. We even use Samson to create the fine peach fuzz on an actor's face and, in many cases, the fine fuzz that sits on their wardrobe.

Our facial pipeline consists of a whole suite of custom tools and plugins. The backbone of the facial pipeline is Masquerade and Direct Drive. Masquerade is the software that takes the actor's facial markers and translates them into a realistic 3D actor face, and Direct Drive is our software suite that transfers the performance from the actor's 3D face onto our characters. It is these two processes that, together, create Josh Brolin’s animated face and then translate it into a moving 3D Thanos face. This produces a great starting point for our animators to refine further.
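Masquerade and Direct Drive are proprietary, but the two-stage idea Darren describes (solve the actor's sparse markers onto a detailed actor face, then retarget that solved performance onto the creature) can be sketched at a very high level. The blendshape least-squares solve, the function names and the toy data below are assumptions for illustration only, not DD's actual tools, which rely on machine learning and far richer face models.

```python
import numpy as np

# Hypothetical sketch of the two-stage flow described above.
# Stage 1 (Masquerade-like): solve sparse facial markers into weights on
#   a blendshape model of the actor's face.
# Stage 2 (Direct Drive-like): reuse those weights on the character's own
#   blendshapes so the creature's face moves the same way.
# All data here is toy data.

def solve_actor_weights(markers, neutral_markers, marker_deltas):
    """Least-squares fit: which blendshape weights best explain the
    observed marker offsets from the neutral pose?"""
    offsets = (markers - neutral_markers).ravel()                 # (3M,)
    basis = marker_deltas.reshape(marker_deltas.shape[0], -1).T   # (3M, S)
    weights, *_ = np.linalg.lstsq(basis, offsets, rcond=None)
    return np.clip(weights, 0.0, 1.0)

def retarget_to_character(weights, char_neutral, char_deltas):
    """Apply the solved weights to the character's corresponding shapes."""
    return char_neutral + np.tensordot(weights, char_deltas, axes=1)

# Toy setup: 4 markers, 2 blendshapes, a 5-vertex character mesh.
rng = np.random.default_rng(0)
neutral_markers = np.zeros((4, 3))
marker_deltas = rng.normal(size=(2, 4, 3)) * 0.01
observed = neutral_markers + 0.6 * marker_deltas[0] + 0.3 * marker_deltas[1]

weights = solve_actor_weights(observed, neutral_markers, marker_deltas)
char_neutral = np.zeros((5, 3))
char_deltas = rng.normal(size=(2, 5, 3)) * 0.01
posed = retarget_to_character(weights, char_neutral, char_deltas)

print("solved weights:", np.round(weights, 2))   # ~ [0.6, 0.3]
print("posed character vertices:\n", posed)
```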

Lastly, we would love to hear more about DD's vision for digital humans and the digital human projects underway at DD.

Darren Hendler: At Digital Domain we are constantly working towards creating technologies that will allow us to more easily and quickly translate an actor’s performance to their digital version or to a completely different character they are playing. We are working on newer capture technologies to try and capture even more details from their live performances, so we can create even more realistic digital characters. We are constantly looking at how the face moves, and how we can make it look more realistic, even when a character is animated purely by hand.

We are also pushing much of this technology into real-time, so we can see even faster previews (possibly live on set) of what the final result will look like in real-time while the actor is creating the performance. It is truly amazing what is possible in real-time! We have even been animating pore structures and facial blood flow on a character all in real-time. A lot of this technology is now spilling out to a host of new applications in the Games, AR and technology sectors. We are starting to see a massive demand for all this technology we have developed for films in all these new arenas.

Thank you for your time!
