ILM’s Hal Hickel on the symbiotic relationship between actor and animator

Warcraft3_final

At the recent Trojan Horse was a Unicorn event in Malta, I had the opportunity to sit down with ILM animation supervisor Hal Hickel for a THU TV interview.

We talked about the wealth of CG characters Hickel has overseen that began with live-action and motion-captured performances, including Davy Jones from the Pirates of the Caribbean films, the Orcs in Warcraft, and Tarkin and K-2SO in Rogue One (Peter Cushing, the actor who originally played Tarkin, had in fact passed away long before that film).

DSC_4815
Hickel (centre) gears up for the THU TV interview. Photo by John Crowcroft.

With before and after images from those films, here are some of Hickel’s main takes on how he and his team tend to tackle a character where actor and animator need to combine to craft the final result.

DavyJones_plates
When we were gearing up to do Pirates 2, we had a bunch of problems to solve. One of them was, we knew we needed to do body motion capture on location, which is something we at ILM had not done before. We needed to do it in jungles and on ships at sea and on sets, because we didn’t want to capture Bill Nighy’s performance separately on a motion capture stage. We wanted him there with the other actors. And then we had Davy Jones’ beard, which was a massive problem. It’s probably the single most difficult thing we had to do on the show. So, we decided not to tackle facial motion capture, but we opted instead to shoot Bill on set in a motion capture suit – what we called iMocap, our version of on-set motion capture.
DavyJones_final
So, we’d film him and then the animators would study the footage of his face and keyframe animate Davy’s face. The thing is, it wasn’t just a mechanical process of saying, ‘Oh well, you know, the mouth corner moved this much, so we’ll move our mouth corner that much.’ You really had to look at it and try and figure out what his intention was as an actor. Sometimes that’s a bit like tasting a stew and trying to figure out what they put in it. When an actor is doing something really subtle and there’s a lot of subtext, you really have to tease that out and get it right as you transform it. Because that’s the other thing: it wasn’t a one-to-one transfer. I mean, if Bill got angry and flared his nostrils, well, Davy doesn’t have a nose. So we had to find other ways to communicate certain things. So there was a translation that had to happen, but the intent was always to preserve exactly what Bill had done and communicate that faithfully.
Warcraft2_capture
On Warcraft, it was definitely our impression that at least some of the actors – ones who had done shows before where they were creating characters using motion capture – believed that, while all of that was good, ultimately the visual effects crew was going to bulldoze over it with animation, obliterate it and kind of do their own thing. So we did a test pretty quickly, just a few weeks into principal photography, where we took some early face capture of Robert Kazinsky and transferred it onto Orgrim.
Warcraft2_final
Even though our Orgrim asset wasn’t quite finished yet, we got a nice looking render with some nice lighting, and we took that back to set on a laptop and just went around and showed it to the actors to say, ‘Look, what you’re doing on set is gold and we are going to treat it with kid gloves, because the whole idea is to get that from A to B – you will see yourselves in these characters at the end of the process.’ And I think it was a great comfort to them. I think they felt that was great, like, ‘It actually matters what I do on camera.’
RogueOne_Tarkin_capture
With Rogue One and Tarkin, the actor having passed away introduced a very difficult problem that I don’t think we have all the answers for, in terms of our technology and our processes. The very hardest thing, from my point of view, was that we had a terrific actor – Guy Henry – but Guy doesn’t use his face the way Peter Cushing used his. We all use our faces differently. He doesn’t smile like him. He doesn’t form the phonemes the same way. So while we could get a great performance from Guy, and we could apply that to Tarkin and get realistic looking movement, it lacked Tarkin’s likeness. We had high realism, but we had problems with likeness. It looked like Peter Cushing’s cousin or something. So the animation team would then have to adjust the motion on the face – if he did a smile, say – to get it to look like a Tarkin smile, a Peter Cushing smile.
RogueOne_Tarkin_final
The problem was, if you messed with it too much, of course it would start to feel like you’d messed with it. It’s very easy to break capture. Even with body capture, people who’ve worked with it know that it’s an interconnected web of motions. If you just tweak the hips a little or move this a little, you can break stuff pretty quickly and it starts to look weird and Frankenstein’d together. So we had to find a line. We were trying to chase realism, but we were also trying to chase likeness. And sometimes we had to sacrifice likeness a little bit to keep it feeling real – it would be a little less Cushing because we just didn’t want to push the motion around that much.
We didn’t do facial capture with K-2SO on Rogue One, but Alan Tudyk’s performance – his comic timing, every little choice of how he moved his head, the delivery of his lines – we never messed with his timing. We had to fit the body capture to K-2SO and his posture and everything, but, again, the whole job there was to preserve what Alan had done, not to change it, especially his timing. It was perfect comedy gold.
RogueOne_K2SO
Actors are still at the heart of the process. They’re the foundation on which we build everything else. To me that’s kind of exciting. It’s funny, because when motion capture was first coming onto the scene in visual effects, a lot of animators were afraid of it because it took away some of their creative authorship over the work, and I think they assumed that pretty soon everything would just be done with motion capture. But in fact it’s provided us with some really interesting creative tasks in building characters where we’re partnering with an actor.

Solo’s old-school hyperspace jump

Solo4

A new clip promoting Solo’s Blu-ray release is out, and it showcases the old-school techniques used to realise the hyperspace sequences on the Millennium Falcon. It includes a description of the technique by ILM’s Rob Bredow. Check out the clip below.

The CGI tidal wave in Snake Eyes that no one got to see

SnakeEyes_wave2
Source: Behance page of Trevor Tuttle, a model maker at ILM on Snake Eyes.

This week is the 20th anniversary of Brian De Palma’s Snake Eyes, a film perhaps not remembered for any major VFX moments. But, in fact, the movie nearly featured a key CG water sequence in what were still the early days of fluid sims.

This was for the original ending, which involved a hurricane and a tidal wave hitting the Atlantic City boardwalk and killing the film’s villain, played by Gary Sinise. ILM was behind the wave simulation and several miniature elements, but the scene was cut after test screening audiences reacted adversely.

Brian De Palma spoke briefly about this original Snake Eyes ending in the 2015 Noah Baumbach and Jake Paltrow documentary, De Palma, which also showed a large portion of the deleted scene.

SnakeEyes_wave1
Source: Behance page of Trevor Tuttle, a model maker at ILM on Snake Eyes.

“My concept was, when you’re dealing with such corruption, you need God to come down and blow it all away,” said De Palma in the documentary, referring to the murder conspiracy in Snake Eyes led by Gary Sinise’s character. “It’s the only way. It’s the only thing that works. That was the whole idea of the wave.”

“And nobody thought it worked,” De Palma added. “So we came up with something else, which I never particularly thought worked as well as the original idea.”

For the tidal wave, ILM – under visual effects supervisor Eric Brevig and associate visual effects supervisor Ed Hirsh – capitalised on the particle work the studio had pioneered for Twister to conceptualise the breaking wave, its impact on the pier and the immense amount of foam produced.

SnakeEyes_miniature3
Source: Behance page of Trevor Tuttle, a model maker at ILM on Snake Eyes.

Among the Snake Eyes artists at ILM was Habib Zargarpour, who would later go on to be an associate visual effects supervisor for The Perfect Storm where, of course, incredibly elaborate CG fluid sims would be further realised.

Zargarpour told vfxblog that the Snake Eyes tidal wave was modelled and animated to break in a controlled way, and then shaded with a fractal shader. A mix of Softimage and Wavefront’s Dynamation was used to craft the computer graphics. “I’d also learned about fractals from Jimmy Mitchell on The Mask. He had this fractal shader, and he did a little bit of water, and all of a sudden my eyes just popped. I went, ‘Oh, my God, what you can do with this thing!’ And that became the foundation of a lot of stuff I would do afterwards, in terms of particles work.”

“We messed with the fractals to get a particular look,” added Zargarpour, “just to get the semblance of particles that still look a bit like clouds for the foam. Then we’d try to refine the particles that are left behind, add a little spline to the mid-particles on the leading edge. And it would start to get a little more shape out of them.”

SnakeEyes_miniature6
Source: Behance page of Trevor Tuttle, a model maker at ILM on Snake Eyes.

Zargarpour says one thing he particularly remembers discovering on the Snake Eyes tidal wave project was how to make particles look not like dirt and dust, but like water. “It was all in how you light it,” he noted. “The key was in pRender, the particle renderer we had for Twister, where you could cheat the size of the particle from the light POV, from each light. So, the trick for making them look like water was to take the keylight, or backlight, and make the particles look really small from that light’s point of view. That made the light go through and scatter. Otherwise, it’s going to look like chunky ice cream.”

“But if you wanted a rim light from that light’s point of view,” continued Zargarpour, “you could make the particles like giant ice cream cones, and huge tennis balls, and then that would just hit this hard edge and give you a rim.”
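The per-light size cheat Zargarpour describes can be sketched in a few lines. This is a hypothetical reconstruction, not pRender’s actual code: the idea is simply that each light sees the same particle at a different apparent radius, and a smaller apparent radius means less opacity from that light, so its light appears to pass through and scatter.

```python
import math

def per_light_opacity(radius, light_scale, density=4.0):
    """Opacity of one particle as seen from one light.

    The per-light cheat: the apparent radius is radius * light_scale,
    so the same particle can look tiny to the keylight (light passes
    through and scatters) and huge to a rim light (hard bright edge).
    The Beer-Lambert style falloff model here is an assumption.
    """
    apparent = radius * light_scale
    # Thicker apparent particle -> more of that light is blocked.
    return 1.0 - math.exp(-density * apparent * apparent)

# One particle, two lights with different size cheats.
radius = 0.5
key_opacity = per_light_opacity(radius, light_scale=0.1)  # 'really small' from the keylight
rim_opacity = per_light_opacity(radius, light_scale=5.0)  # 'giant' from the rim light
```

With the keylight seeing a tiny particle, almost all of its energy appears to transmit through the foam (watery scatter), while the rim light sees a near-solid ball and catches a hard edge, matching the two behaviours described above.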

Several splash elements were filmed in miniature for the tidal wave sequence, with some ultimately finding their way into the parts of the ending that were preserved. However, the pier and theme park were 3D models. “We did this technique, which was basically to turn the model into a soft body,” explained Zargarpour. “When you make a soft body, you also make springs out of the polygon edges, and then how tight those edges are determines how much things stretch or not. So we usually made pretty tight springs, but then the interconnectivity gets overwhelmed by gravity and turbulence.”
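The edges-to-springs idea is a standard mass-spring setup, and a minimal sketch makes the trade-off Zargarpour mentions concrete: every polygon edge becomes a Hooke spring at its rest length, and the stiffness controls how much the model stretches once gravity and turbulence pile on. This is an illustrative reconstruction, not ILM’s actual Dynamation setup.

```python
import math

def edge_springs(positions, edges):
    """Turn each polygon edge (i, j) into a spring with its rest length."""
    springs = []
    for i, j in edges:
        dx = [a - b for a, b in zip(positions[i], positions[j])]
        springs.append((i, j, math.sqrt(sum(c * c for c in dx))))
    return springs

def step(positions, velocities, springs, stiffness, dt,
         gravity=(0.0, -9.8, 0.0), damping=0.98):
    """One explicit-Euler step: edge-spring forces plus gravity.

    'Tight' (high-stiffness) springs keep edge lengths near rest,
    so the mesh holds its shape as external forces deform it.
    """
    forces = [list(gravity) for _ in positions]
    for i, j, rest in springs:
        dx = [a - b for a, b in zip(positions[i], positions[j])]
        length = math.sqrt(sum(c * c for c in dx)) or 1e-9
        # Hooke's law along the edge direction.
        f = stiffness * (length - rest)
        for k in range(3):
            forces[i][k] -= f * dx[k] / length
            forces[j][k] += f * dx[k] / length
    for p, v, f in zip(positions, velocities, forces):
        for k in range(3):
            v[k] = (v[k] + f[k] * dt) * damping
            p[k] += v[k] * dt

# A single edge falling under gravity: the spring keeps its rest length.
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
vel = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
springs = edge_springs(pos, [(0, 1)])
for _ in range(10):
    step(pos, vel, springs, stiffness=100.0, dt=0.01)
```

A production solver would add turbulence forces and a stabler integrator, but the knob is the same one described above: stiffness (and connectivity) versus the external forces trying to overwhelm it.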

SnakeEyes_miniature2
Source: Behance page of Trevor Tuttle, a model maker at ILM on Snake Eyes.

Although they were not seen, Snake Eyes’ tidal wave shots are part of a long line of ILM’s digital effects sequences involving tsunamis, storms and water sims. Interestingly, a different team worked on the CG water simulations for Deep Impact, released a few months earlier than Snake Eyes (see this vfxblog story with former ILMer Chris Horvath about a particular shot in Deep Impact).

And a final observation: fans of The Abyss might also be familiar with an original tidal wave sequence – produced by ILM with real wave and miniature footage that was both digitally and optically manipulated – that was cut from the 1989 film, but brought back for James Cameron’s special edition version.

Finding ‘Indo’ – how ILM made Fallen Kingdom’s Indoraptor a different dinosaur

dr0180.comp.040945.wroberts_v36.1109

It’s basically psychotic. It doesn’t have a mother. It has no sense of right or wrong and it’s a bit unhinged. – Alex Wuttke, visual effects supervisor, Jurassic World: Fallen Kingdom

The genetically engineered Indoraptor is a new kind of dinosaur introduced in Jurassic World: Fallen Kingdom. For the film’s effects teams, it was an opportunity to explore different creature behaviours, motion and emotion, particularly because ‘Indo’ was, as a result of his creation, something of a neurotic dinosaur.

On set, Neal Scanlan’s team built and puppeteered practical Indo pieces, while in CG, ILM worked on introducing twitches to the mentally unstable dino, which also had to be able to go from biped to quadruped. vfxblog sat down with ILM visual effects supervisors David Vickery and Alex Wuttke and animation supervisor Jance Rubinchik to talk through how they ‘found’ their Indoraptor character. Continue reading Finding ‘Indo’ – how ILM made Fallen Kingdom’s Indoraptor a different dinosaur

Fallen Kingdom: the film before the film

Prologue2

Before things start ramping up in Jurassic World: Fallen Kingdom, audiences get a taste of what’s to come thanks to a dramatic prologue sequence. Here, mercenaries visit Isla Nublar and quickly encounter the infamous T. Rex in the main street, and then later a leaping Mosasaurus. The sequence includes both a submarine arrival and an attempted helicopter exit that does not go so smoothly.

Many of the scenes involved fully digital creatures, but several helicopter shots featured either a real aircraft or a ‘buck’ chopper attached to a crane. Visual effects studio Important Looking Pirates was brought on as a partner to ILM for the prologue, its work overseen by visual effects supervisors David Vickery and Alex Wuttke. vfxblog visited Wuttke at ILM in London where he outlined how the sequence was pulled off. Continue reading Fallen Kingdom: the film before the film