Steven Spielberg’s Hook was released 25 years ago this week (it opened on December 11th, 1991) and at the time was one of ILM’s most intensive visual effects projects. VFX supe Eric Brevig oversaw a raft of flying scenes, matte paintings, models and even Go-Motion animation for Tinkerbell’s wings.
A major innovation on the show was the use of a projected matte painting in the sequence showing Peter (Robin Williams) flying towards Neverland – the first time a dimensional matte painting like that had been seen on film.
While attending FMX over the past few years, I've been checking in on the Dreamspace project. It was a European Commission project that brought together several entities in computer graphics, VFX and filmmaking to look at virtual production techniques.
These entities included The Foundry, ncam, Stargate Studios, CreW, Saarland University, iMinds and Filmakademie Baden-Württemberg – some pretty big names in the industry.
The project has now concluded, and some of the findings – including a demo of a virtual production app – were just presented at SIGGRAPH Asia by Volker Helzle, Head of Research & Development and Curriculum Coordinator of the Technical Director course at Filmakademie.
“Dreamspace has two major strands,” Helzle told me here at SIGGRAPH Asia. “One is typical virtual production for visual effects and filmmaking in general, with a very strong focus on collaboration. How can we creatively collaborate on set and do changes in real-time but maintain established pipelines, so we can do these changes in a real-time environment and then go back into final post-production and apply fixes or do the final 10 per cent.”
“The second strand was immersive experiences – how can we use those technologies to tell stories in a different way and immerse people in them? It’s been a very interesting combination of different skillsets and different partners bringing different skillsets.”
After a couple of years of research, which included areas of real-time rendering, depth matting, VR, immersive experiences and other virtual production techniques, a number of prototypes have been made available for the community to check out.
One is the virtual production editing tools (VPET) developed by Animationsinstitut of Filmakademie Baden-Württemberg – it’s essentially a tablet app that can be used to consider set layout, lighting and animation. The app relies on Unity and includes a KATANA plug-in for lookdev and lighting.
You can see VPET in action in the video below.
For more information on what is available out of the Dreamspace project, check out their downloads page.
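To give a rough sense of the kind of real-time collaboration a tool like VPET enables – a change made on a tablet propagating to the engine and back into the pipeline – here's a minimal sketch. This is purely illustrative: the message format, function names and scene structure are my own invention, not VPET's actual protocol.

```python
import json

# Hypothetical sketch: a tablet-style editor packages a set-layout
# change as a message, and a receiving application applies it to an
# in-memory scene description. Not VPET's real message format.

def make_edit_message(object_name, position, rotation):
    """Package a transform change as a JSON string for broadcast."""
    return json.dumps({
        "type": "transform",
        "object": object_name,
        "position": position,   # [x, y, z]
        "rotation": rotation,   # Euler angles in degrees
    })

def apply_edit(scene, message):
    """Apply a received edit to a simple in-memory scene."""
    edit = json.loads(message)
    if edit["type"] == "transform":
        obj = scene[edit["object"]]
        obj["position"] = edit["position"]
        obj["rotation"] = edit["rotation"]
    return scene

# Move a set light in real time; the same edit record could later be
# replayed against the offline scene for final post-production.
scene = {"lamp_01": {"position": [0, 0, 0], "rotation": [0, 0, 0]}}
msg = make_edit_message("lamp_01", [1.0, 2.0, 0.5], [0, 90, 0])
apply_edit(scene, msg)
print(scene["lamp_01"]["position"])  # [1.0, 2.0, 0.5]
```

The point of the pattern is the one Helzle describes: on-set changes are captured as data that can travel back into established pipelines rather than being lost in a throwaway real-time session.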
After pioneering the development of CG characters at ILM on The Abyss, Terminator 2: Judgment Day and Jurassic Park, Mark Dippé took on the directing duties for Spawn, based on the comic by Todd McFarlane. He was joined on the production by vfx supervisor and ILM ‘partner-in-crime’ Steve ‘Spaz’ Williams. The New Line film was released on August 1st, 1997 and contained over 400 vfx shots – a huge amount at the time – that were completed by 22 companies, with ILM as the lead vendor.
Spawn was a much-anticipated film, made before the explosion in comic book movies. It was a tough shoot for the first-time feature film director, and an ambitious production in terms of its visual effects. Dippé and Williams are speaking this week, with Scott Ross, at SIGGRAPH Asia about their work on Terminator 2. In the spirit of looking back at key visual effects projects, vfxblog spoke to them briefly about the challenges of bringing Spawn to the screen.
I had a chance to chat to the vfx supes, animation supervisor and other vfx artists on Fantastic Beasts and Where to Find Them. Here’s my breakdown for Cartoon Brew. Sorry but I just love the Niffler so much.
Back in the late 90s I was still at university and I stumbled upon the SIGGRAPH Video Review. One piece I watched was Paul Debevec’s The Campanile Movie where he’d used image-based modeling and rendering techniques to do virtual camera moves on a bell tower at the UC Berkeley campus. It was also an example of photogrammetry. I was absolutely fascinated. Then The Matrix came out and used kinda similar techniques during the bullet-time sequences. I remembered all that vividly. At fxguide, Paul’s work on the light stages and then with virtual humans was a regular point of discussion. So I am stoked to present an interview I’ve done with Paul about his range of research that has influenced VFX, CG, VR and animation. He’s also speaking next week at SIGGRAPH Asia (where I’ll be, too). Here is the interview at Cartoon Brew.
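For anyone curious about the math underneath work like The Campanile Movie, here's a toy illustration of the pinhole projection at the heart of photogrammetry: given a camera's focal length and a 3D point in front of it, you can predict where that point lands in the image, and recovering 3D structure from multiple photos amounts to inverting that relationship. The numbers below are made up for the example.

```python
# Toy pinhole projection, the basic building block of photogrammetry
# and image-based modeling. Illustrative values only.

def project(point_cam, focal_px, cx, cy):
    """Project a 3D point (camera frame, z pointing forward) to pixels.

    focal_px: focal length in pixels; (cx, cy): principal point.
    """
    x, y, z = point_cam
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return u, v

# A point 10 m in front of the camera, 1 m right and 0.5 m up,
# seen through a 1000 px focal length with principal point (960, 540).
u, v = project((1.0, -0.5, 10.0), 1000.0, 960.0, 540.0)
print(u, v)  # 1060.0 490.0
```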