Rogue One’s Random Cam, and other previs fun

[Previs still: ttf_rogue_205_ri1854_previs]

Now that we’ve all seen Rogue One five times and deconstructed the storyline, the characters and the visual effects, it’s fun to also consider how some of the iconic shots came to be. Of course, ILM and Lucasfilm’s art department crafted many of the incredible concepts, but some of the first shot designs were done by the previs team at The Third Floor.

The Third Floor also helped establish ways of visualizing and planning the shoot, and choreographed incredibly detailed scenes such as the third-act battles, thanks in part to a tool called Random Cam. The Third Floor previs/postvis supervisor Barry Howell takes vfxblog through the work.

vfxblog: Previs has become such an important tool for helping filmmakers have a blueprint for what to shoot and for vfx studios to tackle shots. But what were the very first things The Third Floor did on Rogue One to get started on such a huge project, especially when you have a seemingly clean slate?

Barry Howell: We were thrilled to join the project, of course because it was a Star Wars production and also because I, along with The Third Floor’s five other founders, met while working as previs artists on the “third floor” of Skywalker Ranch on Star Wars Episode III: Revenge of the Sith. So there was a bit of legacy there.

We had created the previs for Godzilla and were especially excited to support director Gareth Edwards again in realizing the creative ideas he and the producers at Lucasfilm had for the movie. Our first phase of work began in the art department at the Presidio, working with Production Designer Doug Chiang. The majority of that time was centered on building assets and environments for previs that reflected the concept art Doug and team were creating.

[Previs still: ttf_rogue_301_ds0030_previs]

While this was happening, we kicked off a satellite team from The Third Floor’s London office at Pinewood, led by my co-supervisor Margaux Durand-Rival. As things were being prepped to start previs sequence-building, I had the team create some animation cycles and vignettes of action based on ideas Gareth had passed to me; these would later help us populate some of the bigger scenes very quickly, such as the epic ground battle at the end of the movie.

I joined the Pinewood team once production fully shifted to the UK and we began developing previs for multiple sequences. Asset creation continued for quite some time, longer than most projects, mainly due to the sheer volume of unique creatures, vehicles and environments that needed to be created. In addition, whenever new designs were available we would ingest assets from ILM and the art department so that the previs reflected the latest and greatest that Gareth had approved.

In addition to creating shots, we also had the opportunity to do some look development and test studies for the big Death Star test firing moment. This included creating diagrams showing possible blast radiuses and potential waves of destruction from a variety of locations in space and from the surface.

vfxblog: Can you talk about the 360 environments used on set to help with lighting and what TTF’s role in these was?

Barry Howell: We worked closely with ILM, including Visual Effects Supervisors John Knoll and Mohen Leo, to create content for the LED screens. The LED setup presented a truly unique opportunity to use previs in a whole new way. Gareth wanted to immerse the actors into virtual backdrops at times when the script called for them to be inside any of the numerous cockpits, such as the U-Wing or X-Wings. Rather than looking at blue or green screen, the performers could react to the previs environments that would cast lighting and temperature cues similar to what the final visual backdrops would be. This was useful in lighting the performers more realistically and would aid in integrating them with the final effects.

[Previs still: ttf_rogue_240_tt2000_previs]

We started by adding detail to the texture and geometry of our pre-existing previs environments. We then lit them according to what was needed for each setup, using a plugin to generate large-scale 360-degree spherical images straight from Maya’s Viewport 2.0 display.
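Howell doesn’t name the plugin, so purely as an illustration, here is a minimal Maya Python sketch of the kind of capture step such a tool might perform: playblasting six 90-degree views of the scene from the viewport, which could then be stitched into a lat-long panorama. The camera settings, function name and file paths are hypothetical assumptions, and the final cube-to-lat-long stitch is left out.

```python
# Illustrative only: capture six 90-degree Viewport 2.0 playblasts around a
# point so they can later be stitched into a 360-degree lat-long panorama.
import maya.cmds as cmds

# Pitch/yaw pairs covering the six cube-map directions
# (a default Maya camera looks down -Z).
FACES = {
    "front": (0, 0), "right": (0, -90), "back": (0, 180),
    "left": (0, 90), "up": (90, 0), "down": (-90, 0),
}

def capture_cube_faces(position=(0.0, 0.0, 0.0), out_dir="/tmp/led_sphere", size=2048):
    """Playblast one square frame per cube face from the current viewport."""
    # An 18 mm focal length on a 1.417 in (36 mm) square film aperture gives
    # a 90-degree field of view both horizontally and vertically.
    cam, cam_shape = cmds.camera(name="sphereCaptureCam", focalLength=18,
                                 horizontalFilmAperture=1.417,
                                 verticalFilmAperture=1.417)
    cmds.lookThru(cam_shape)
    paths = []
    for face, (pitch, yaw) in FACES.items():
        cmds.xform(cam, worldSpace=True, translation=position,
                   rotation=(pitch, yaw, 0))
        path = "{}/{}.png".format(out_dir, face)
        cmds.playblast(completeFilename=path, format="image", compression="png",
                       frame=[1], widthHeight=(size, size), percent=100,
                       viewer=False, showOrnaments=False, offScreen=True)
        paths.append(path)
    return paths  # hand these six frames to a cube-to-lat-long stitch step
```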

From there, ILM would bring the footage into TouchDesigner, calibrating the images to optimize for brightness. These were then played back on set and sometimes combined with other elements, such as explosions or incoming laser fire. All of this was triggered via a tablet operated by John Knoll. The first time we walked onto set and saw the big screens playing out one of the previs renders, he came over to show us this setup and with a push of a button sent the entire room into hyperspace. Simply incredible!

vfxblog: What were some of the ways the director could use previs on set, especially in terms of using it to find different POVs? What tools were used for this? What did you deliver to make these possible?

Barry Howell: One of the things Gareth said he liked about previs was that it gives him a solid guide to reference when he gets to set and it allows him to show other departments some of his ideas in motion, getting everyone on the same page. But at the same time he also likes being able to go rogue with the camera and explore other angles on set that may not have been visualized in storyboards or previs animation.

To help with this, we created a new tool called Random Cam. With this, we could take any previs shot, select a few points of interest, prioritize them and set a boundary area for the camera to move around in. From there, the tool could auto-generate hundreds of new and unique viewpoints based on the criteria we had established. Some angles produced by the tool would be complete rubbish, but there were always others that Gareth would like.
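Random Cam is The Third Floor’s own tool and its internals haven’t been published, but the workflow Howell describes, weighted points of interest plus a boundary volume that candidate cameras are scattered through, can be sketched in a few lines. The following is a hypothetical Python illustration only; the names, the uniform sampling and the simple priority weighting are assumptions, not the actual implementation. In practice each generated eye/aim pair would become a previs camera rendered out for review; here the sketch just prints them.

```python
import random
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class PointOfInterest:
    position: Vec3
    priority: float  # higher priority = more likely to be framed

def random_cameras(pois: List[PointOfInterest],
                   bounds_min: Vec3,
                   bounds_max: Vec3,
                   count: int = 200,
                   seed: Optional[int] = None) -> List[Tuple[Vec3, Vec3]]:
    """Return (eye, aim) pairs: camera positions scattered inside the boundary
    box, each aimed at a point of interest chosen by priority weight."""
    rng = random.Random(seed)
    weights = [p.priority for p in pois]
    cameras = []
    for _ in range(count):
        eye = tuple(rng.uniform(lo, hi) for lo, hi in zip(bounds_min, bounds_max))
        aim = rng.choices(pois, weights=weights, k=1)[0].position
        cameras.append((eye, aim))
    return cameras

if __name__ == "__main__":
    # Hypothetical points of interest from a previs shot of the ground battle.
    pois = [PointOfInterest(position=(0, 4, 0), priority=3.0),     # hero ship
            PointOfInterest(position=(60, 0, -25), priority=1.0)]  # background vehicle
    for eye, aim in random_cameras(pois, (-80, 1, -80), (80, 40, 80), count=5, seed=7):
        print("camera at", eye, "aimed at", aim)
```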

[Previs still: ttf_rogue_opb0675_previs]

vfxblog: Can you talk about the previs specifically for Jyn’s ship escaping from Jedha? What were the important beats here, and how were different and complex parts of that scene, such as the mass of debris, realized?

Barry Howell: The destruction of Jedha was one of the most fun scenes to work on, mainly because the director gave us amazing latitude to explore some of our own ideas. He provided some concept pieces he really liked as a guide for how massive the destruction would be, briefed us on the main beats he was interested in seeing, and let us run from there.

The sequence was originally planned for a different type of planet, so we did a couple of variations. By the time they were shooting the LED plates, we had adjusted the previs to match the desert planet setting and rendered a 360 of debris falling around the camera so they could see how it might cast shadows on the actors and the interior of the ship as it left. The previs shots we delivered provided a lot of options to experiment with editorially.

[Previs still: ttf_rogue_sdb_0150_previs]

vfxblog: The X-Wing attack on the Scarif Base required some complex choreography – can you talk about the virtual cinematography principles here? How did you ensure the sequence was watchable and coherent? Can you also talk about how various assets for this sequence were built and whether you used ILM models/iterations?

Barry Howell: We worked on the rush through the Space Gate early on; then, towards the end of the shoot, we worked on developing the “battle” section. Gareth knew that the space battle was going to be epic and fun, but also that it featured substantial digital imagery, and he wanted us to concentrate on scenes that involved the actors while they had them. As they got closer to the end of the shoot, he showed us ideas in storyboard form that he had designed with Matt Allsopp, his lead concept artist. He asked us to animate a series of actions with multiple squadrons of X-Wings and Y-Wings making attack runs on the dry dock. He wanted us to visualize each of these actions from several POVs and provide that footage to editorial so he could begin fleshing out the structure of the sequence with his editors.

I did a rough graphical overview of the entire battle based on Gareth’s ideas using three-dimensional arrows to indicate where each of the squads were located at different moments. After that, we distributed these “events” to different previs artists to work through and create shots from, which were fed to editorial. We then packaged up our scenes and delivered them to ILM, where they imported them to the virtual camera stage for Gareth to direct.

Images copyright Lucasfilm and courtesy of The Third Floor, Inc.