The oral history of the Dinosaur Input Device or: how to survive the near death of stop-motion
By Ian Failes
In visual effects lore, it is well-known that the full-motion dinosaurs of Jurassic Park were originally intended as stop-motion puppets animated by Phil Tippett’s Tippett Studio. That is, until a secret ILM test with computer-generated dinosaurs convinced director Steven Spielberg to go with that digital approach, perhaps changing the course of VFX history in the process.
But, determined to stay ‘in the game’ and continue to contribute a rich knowledge of dinosaur movement, Tippett Studio combined with ILM to build the Dinosaur Input Device. This ‘DID’, which would later also be known as the Digital Input Device, was a dino-shaped sensor-covered armature that could translate stop-motion-like input to a CG model, allowing Tippett’s traditionally trained animators to lend their skills to this new wave of digital animation. It would also ultimately be awarded a Technical Achievement Award from the Academy (presented to Craig Hayes, Brian Knep, Rick Sayre and Tom Williams).
In this new oral history as part of Jurassic Park’s 25th anniversary, vfxblog speaks with some of the original developers and users of the DID to find out how it worked, how it sometimes didn’t work, and where it made a major impact on the film. Plus, there’s a bonus section on the DID’s surprising influence in Tippett Studio’s major headway into CGI on Starship Troopers.
‘I feel extinct’
Phil Tippett (dinosaur supervisor): When I was first sent the script to Jurassic Park, they were talking about doing full-scale animatronics and I thought, that’s never going to happen. It cooked for about a year, and then eventually it was decided to do high speed puppets with go-motion.
Brian Knep (computer graphics software developer, ILM): Go-motion was something Phil had pioneered on Dragonslayer where you would pose a creature in a motion-control rig, and then move to the next pose and it would basically hold both poses and move between them as you open and close the shutter, giving you this nice motion blur that you didn’t get in normal stop-motion.
Above: Senior Animator Randal M. Dutra’s stop-motion animation for Jurassic Park’s “Dinosaur Movement Bible.”
Craig Hayes (computer interface engineer, Tippett Studio): Basically all the full-motion dinosaurs were going to be foam, latex, aluminium and steel armature stop-motion puppets, that would then be shot on bluescreen. And they would be composited and also further motion blurred with computer graphics.
Brian Knep: That was actually the first thing I did on Jurassic Park at ILM, these tests to make the go-motion look less ‘jumpy’ by adding motion blur. It was a bit of an experimental project where we took some of the software that was used to do 2D morphing on Willow – MORF – and we created a way where we could take two frames of stop-motion animation and basically create a morph between them. Then we would render that morph at, let’s say, 60 different states midway between the frames and then composite all those to create a fake motion blur.
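Knep’s compositing trick – rendering many in-between states and averaging them together – can be sketched in a few lines. This is only an illustration, not MORF’s actual warping: a plain cross-dissolve stands in for the 2D morph, and frames are represented as flat lists of pixel intensities.

```python
def fake_motion_blur(frame_a, frame_b, steps=60):
    """Composite many intermediate states between two stop-motion
    frames to simulate motion blur. A simple cross-dissolve stands
    in for MORF's true 2D morph (illustrative assumption)."""
    blurred = []
    for pa, pb in zip(frame_a, frame_b):
        total = 0.0
        for i in range(steps):
            t = (i + 0.5) / steps        # sample the midway states
            total += (1.0 - t) * pa + t * pb
        blurred.append(total / steps)    # average = composited blur
    return blurred
```

Averaging the blended states smears each pixel between its two poses, which is exactly the ‘filled-in’ look Hayes describes next.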
Craig Hayes: When I saw the first motion blur tests applied to stop-motion animation, I was kind of flabbergasted. It was just really exciting because it just filled in so many of the blanks in ways that even to a certain extent the go-motion stuff didn’t do.
Brian Knep: We got some good results but it was very slow and a pain in the ass. At the same time I was working on that, [CG animator] Steve ‘Spaz’ Williams and [co-visual effects supervisor] Mark Dippe and some of the other folks were working on some secret 3D tests.
[So much great documentation already exists about the CGI beginnings of Jurassic Park that this oral history won’t be dealing with that side of things.]
Phil Tippett: I went down with [visual effects supervisor] Dennis Muren when he presented the T-Rex test to Steven and Steven went, ‘Wow, that’s what we’re going to do,’ and he asked me how I would feel and I said, ‘I feel extinct’. And he said, ‘Oh that’s quirky, I’m going to put that in the movie!’ He had Dr Grant say that line.
Craig Hayes: All of a sudden all the work that we had been putting into gearing up for the stop-motion kind of production just evaporated literally overnight. But somewhere in this mix I think that everybody realised that there was a big difference between an animated dinosaur skeleton – the CGI test – and getting from that to fully fleshed out characters.
Phil Tippett: We had already spent a long time on storyboards and once they were locked, production wanted animatics of the T-rex paddock and the raptor kitchen scene to be laid out. At the same time, we had made a dinosaur bible where we picked the T-rex and raptor and did a walk cycle or a running cycle or behavioural things. Steven wanted them not to perform like monsters but have a palpable sense of being animals. So we had to cross-over between what was going to be computer graphics and practical dinosaurs by Stan Winston. I actually knew probably more about dinosaur motion than anyone else – I knew a bunch of paleos and would hang with them.
Craig Hayes: The other thing was, yes, they’d chosen to go all-CG for these full-motion shots, but, really, the talent that was available to the production from Tippett in terms of stop-motion animators was huge and it couldn’t be ignored.
Tom Williams (supervisor of software and digital technology, ILM): Phil Tippett had these people who just understood animal motion. They would go into the wilderness and draw animals. One of them showed me this beautiful thing, I think it was a mountain lion. I was like, ‘Wow, did you take a picture?’ He was like, ‘No, I sketched it right there.’ You look at it and you could see all the sinew and everything.
Phil Tippett: At the same time there weren’t a lot of CG animators who were trained to do living characters – people had done TV commercials with flying logos and cartoony things, but it’s a whole different world when you’re trying to match your artificial characters in a photographic background plate that’s shot on earth with a certain kind of lighting and specific gravity. You have to really nail and know what you’re doing to have all that stuff fit in.
Craig Hayes: Now, when I was working on RoboCop I designed and built the big robot, so I took the money I got paid and I actually bought my first contemporary computer, which was an Amiga, I think it was an Amiga 1000. This was a chance to play with something that was a little bit more than just a toe into the world of graphics. This was years before they had inverse kinematics and other techniques that really made computer animation more achievable. So I was playing around with potentiometers and making wooden mock-ups and hooking potentiometers up to the joints. I really couldn’t take it much further than that because, first of all, the analogue electronics components of this thing were really not trivial, and the devices I was using were just pretty cheap and horrible, so there were a lot of roadblocks. It was something that I wanted to do and was hoping to do at some point, but back then I guess I always assumed it would be on a hobby level. But Jurassic was where I knew it could work.
Brian Knep: Initially, this project to try to get the stop-motion animation to work for the Dinosaur Input Device was quite controversial. There were some people that felt like ILM should be able to do everything. Some people felt we shouldn’t. We couldn’t because we didn’t have enough animators but we still wanted to be involved. It was a tense thing between the two companies and it was a bit tricky for me because I loved both of them and I was a little bit in the middle.
Above: Stop-motion animation for Jurassic Park’s animatics by Randal M. Dutra, Tom St. Amand and Kim Blanchette.
Phil Tippett: And then, I actually got sick and got pneumonia and had to stay in bed for two weeks. I thought it was all over. But Dennis and Craig, they developed the feasibility for doing what was essentially a stop-motion armature but making it motion capture – it had little sensors on it that you could plug into the computer and physically manipulate in real space and have that duplicated inside the computer as a wire frame; the Dinosaur Input Device.
Birth of the DID
Randal M. Dutra (senior animator, Tippett Studio): Quite simply: The “Dinosaur Input Device” was created to keep Tippett Shop in the Jurassic Park game. It was announced to the shop in April of 1992 that JP was going CG. As Senior Animator for Tippett on JP, I was already five months into pre-production, having completed the “Dinosaur Movement Bible” (Raptor and T.rex), and was immersed in the Animatic phase of the “Raptor Kitchen” sequence.
Craig Hayes: There was no time to train these stop motion animators how to become computer animators. I mean, I say this sort of jokingly, but at the same time, really these guys were not necessarily computer savvy, and sometimes even the idea of moving a mouse around was a little bit foreign. I remember with a couple of the guys when they were first holding a mouse it’s like they were spending more time watching what their hand was doing to the mouse than watching the cursor on the screen.
Randal M. Dutra: I, Phil, and Tom St. Amand had no prior, or practical, computer background to speak of. Our talents and experience lay within the realms of traditional stop-motion animation and related disciplines, which we had practiced together for years at his shop prior to JP and for various ILM projects. So the advent of CGI was a huge upheaval for all of us. Everything had suddenly been upended.
Craig Hayes: So, clearly, we were not going to be able to educate our stop-motion animators about computer graphics in short order, but we could give them the tools to do their job, which they were very skilled at. So we put this presentation together – Dennis Muren, and Phil Tippett, and our VFX production guys – and we said, ‘Let’s try this DID’. And everybody went for it and ILM said, ‘Well, we need to put together some people.’ So from ILM we had Tom Williams and Brian Knep, and Tom knew this guy Rick Sayre, from Pixar, who was a genius at hardware and software.
Rick Sayre (Sci-Tech winner): The question was, well how do we keep them involved and contributing at a high level? Dennis and Mark Dippe pulled me in for lunch, because Pixar and ILM had a history together, and they talked about what might be possible, what might be quick, and I had strong opinions about using potentiometers, which were all the rage at that point. And the rest is history.
[Pixar’s online library includes a paper written about the DID that appeared in proceedings of SIGCHI 1995]
Brian Knep: At the time, the Tippett folks were really fighting for their lives or worried about their livelihood, and this was a way in for them, into the 3D world. If they could prove that it was going to work, then they would get more work basically, so there was a lot on the line.
Tom Williams: The philosophy was, you take character animators and let them do character animation, you take stop-motion animators and let them do stop-motion animation. You take artists and painters and brilliant people in the traditional animation space and let them use tools that seemed comfortable to them.
Rick Sayre: Aside from being at Pixar, I did a number of other things. I had co-produced a robotic opera, and a number of folks from ILM I think had seen some of the crazy stuff I was cooking up. I’d made these exoskeletons that had synchronised projections somewhat triggered by them. At that point there was the Jurassic Park transition from ‘It’s all going to be stop-motion’ to ‘It’s all going to be computer animated,’ but there was still an awful lot that the stop-motion animators could bring to the table when it came to monsters and dinosaurs in particular.
OK, how exactly do you build a DID?
Randal M. Dutra: To allow continuing contributions of the Tippett Shop to JP, a group of technically talented individuals was assembled who began development of the DID. The DID was basically a modified stop motion armature––expertly designed and machined by Tom St. Amand––with a major difference: motion encoders were affixed to each joint. A guiding factor of this new design was that the joints could consist only of single-axis hinges and swivels––a system I actually preferred, as it gave me very positive control when animating.
Craig Hayes: Tom St. Amand was really at the peak of his career in terms of armature design. He had it worked down to a science. He knew which type of materials to use. He knew that you could mix the aluminium, and the phosphor bronze washers and the steel shoulder screws, et cetera. He had a grease that came from an army surplus store. He had this olive drab can of grease that was his special grease, and it was the perfect grease.
Tom St. Amand (animator, Tippett Studio): Phil Tippett originally hired me on Jurassic Park to build and animate puppets. I was beginning to plan some of the armatures and figure out what type of joints would be required when we got the word that all of the animation was going to be done via computer graphics. So then I was involved with the engineering and construction of the DIDs, along with Craig and Bart Trickel. They were a bit larger than a regular puppet would have been. The models were unique in that they consisted exclusively of hinge and swivel joints rather than the standard ball and socket ones. We built two T-rex models and two raptors. Stuart Ziff and Gary Platek did the electrical hookups. We also used motion control setups with stepper motors to pre-program the main body and foot moves.
Craig Hayes: The trick was, for every joint on that armature – and there were as many axes of movement on a stop-motion armature as there would be on a real animal or person – we had to figure out how to actually encode that. So what I did was search around and find the absolute smallest devices that I could that would do what we needed; they were optical encoders. Optical encoders are basically like little fan blades. There’s a little shaft, like a motor shaft, and around that shaft is a blade with a series of little fingers. And then there’s effectively an LED, and a little receiver. So a light and a receiver, and each time one of those fan blades goes in front of that LED it blocks the light.
Tom Williams: Those optical encoders weren’t very precise at the time. But what we could measure was their position on average. When you fed that to the animation software system via an armature, which they had developed using Rick’s encoders, it just did it all brilliantly.
Craig Hayes: These encoders were about the size of a sugar cube and a half. They weren’t perfect because they had a limited amount of resolution. But they were the best we could do for the size. The next step was to figure out how to hook these small devices up to every single joint. So imagine the elbow of your arm – you need to somehow figure out how to clamp a device on that’s right in line with the axis of pivot of your elbow. And then the same thing had to happen for the joints from your elbow to your wrist. So now you’ve got a different kind of axis, which is more in line with your limb. And in those cases what we did was work out ways to use gears to drive these encoders. And then we had to basically take a stop-motion armature and pack it with one encoder for every single joint. And that’s a lot of joints, it’s a lot of devices. It’s a lot of wire.
We were really lucky at the time in that Phil knew Stuart Ziff, who he had worked with on some of the go-motion stuff. Stuart is kind of an electromechanical artistic genius. And he came in to help out and brought another world of experience in terms of things like the wires that we would use. I would use wires from the local Radio Shack, or a local electronics surplus store. And I thought that stuff was fine. Well, Stuart comes in, and it’s like, ‘Oh, no. This is never gonna work because you’re gonna stress those things out with too many bends and they’re gonna break.’
So one of the things he introduced us to was this wire made by a company called Cooner, which had literally hundreds of strands. Each strand in the wire was so fine that the wire was incredibly supple and incredibly resilient and flexible and it was not going to break. What we were doing in a way was effectively reproducing a nervous system, a very crude nervous system on this mechanical armature.
For every encoder there were four wires. There was a positive, negative, and then A, and B signals coming out of them. So you can imagine if you go from rear wrist, that’s one, to your radius and all of that is two, to your elbow, three, to your upper arm, four to your shoulder, five, six, seven, eight. Pretty soon you’ve got 50 or 60 different devices packed onto this thing, and 50 to 60 times four wires running through this thing. It becomes an interesting sculptural kind of a challenge right there.
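The A and B signals Hayes mentions are the classic quadrature pair: they switch in a staggered sequence, so the order of transitions reveals which way the joint is turning. A minimal decoder sketch – generic quadrature logic, not the DID’s actual electronics, and the tick count per revolution is an assumed figure:

```python
# Each (previous, current) pair of 2-bit A/B states maps to a count
# of -1, 0 or +1; the sign gives the direction of rotation.
QUAD_DELTA = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class QuadratureDecoder:
    """Accumulates encoder ticks from the A/B signal pair."""

    def __init__(self, ticks_per_rev=100):   # assumed resolution
        self.state = 0b00
        self.count = 0
        self.ticks_per_rev = ticks_per_rev

    def step(self, a, b):
        """Feed one sample of the A and B lines (0 or 1 each)."""
        new = (a << 1) | b
        self.count += QUAD_DELTA.get((self.state, new), 0)
        self.state = new

    def angle_degrees(self):
        return 360.0 * self.count / self.ticks_per_rev
```

Because an invalid jump (e.g. both lines flipping at once) maps to 0, a decoder like this silently drops ticks under electrical noise – one reason the limited resolution and noisy environment mattered so much.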
Turning the DID into a living, breathing CGI dinosaur
Craig Hayes: One of the decisions that was made early on for the stop-motion part was that Phil really wanted to use motion control for everything because the motion control rigs allow you to layer and build up your performances in such a way that it is editable. The animator can say, ‘I’m going to programme my blocking for the dinosaur. It’s going to run across the stage from here to here. I’m going to programme that blocking in with the motion control system.’ So the motion control system would go ahead and move the stop-motion armature through space and then the animator is going to layer on more complex animation on top of that. The motion control will be responsible for overall body motion, but not for the nuances of a tail, a head, et cetera.
Now, when we built this thing, in my mind, it was pretty obvious that we would build a device like this to do keyframes. You wouldn’t animate every single frame like you do in stop-motion because you don’t need to. One of the things a computer is great at is interpolating between those frames. So, I sort of figured, ‘Yeah, we don’t really need the motion control at all.’ But it turns out that, psychologically, or just due to human nature, the animator still needed this motion control basis as part of their workflow. It was not realistic to expect them to be able to just completely abandon certain parts of their craft.
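The interpolation Hayes describes is, at its simplest, a blend between saved poses. This hypothetical sketch handles one joint with straight-line interpolation; the actual software would have used smoother spline curves, so this is only the principle, not the implementation:

```python
def interpolate_joint(keyframes, frame):
    """Return a joint angle for any frame, interpolating linearly
    between saved key poses.
    keyframes: sorted list of (frame_number, angle_degrees) pairs."""
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    for (f0, a0), (f1, a1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)   # position within segment
            return a0 + t * (a1 - a0)
    return keyframes[-1][1]                 # hold the final pose
```

With this in place an animator only captures the key poses on the armature and the computer fills in every frame between them – the step that made the DID workflow so much faster than frame-by-frame stop-motion.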
Tom St. Amand: Because working with the DIDs involved motion control programming, we tried to work out our moves beforehand as carefully as we could. We would block in the moves roughly and shoot a quick test to see if we were on the right track. In some ways the process was more user-friendly than traditional stop-motion. We didn’t have to worry about hot lights, accidentally bumping the camera, or inadvertently leaving machinist surface gauges in the shot.
Craig Hayes: But getting the motion control system working with the DID was a real challenge, because it works by using these stepper motors, and stepper motors are not trivial. Some of them are about the size of a can of soup. And they’re really incredible devices, but they’re also really electrically noisy in many respects. So you put a couple of these stepper motors in an area with many, many feet of wire – the input device, which is now acting as a sort of radio antenna – and now you start creating these pulses with these stepper motors, and it creates a very noisy environment.
Brian Knep: And then after they’d actually built the armature, we had to find a way for it to drive a 3D model in Softimage. But, of course, the physical model did not match the 3D model, particularly in the spine. We had different joints and the joints bent in different ways. If you think about it, on a virtual model your joints are mathematical points, so they have no volume and they can move in all directions. In a physical model you have a joint that takes up space, and it has limitations in how far it can move. That was a big challenge.
‘It’s a dinosaur’ – just
Tom St. Amand: Before actual shot production had begun, Randy Dutra had done some motion tests which would act as guides for the other animators. Randy, Kim Blanchette and I also produced stop-motion animatics (previs) which would help guide the live action production.
Randal M. Dutra: An early test was a simple “chain” of hinge joints, about 16 inches long and equipped with encoders, that doubled for a tail. I animated it undulating in left-to-right swings, all single-framed. When viewed at ILM, they were curious whether it had been key-framed to get the resultant smoothness of action. I had key-framed it, but as a stop-motion animator––meaning that every frame was a key-frame. So that was a promising introductory test supporting our DID progression.
Randal M. Dutra: Once the initial “finished” DID was ready to be put through its paces––a completed Raptor––I was the first animator to pilot this jet. Three more DIDs would eventually be built for a total of two Raptors and two T.rex. This allowed for the concurrent animation of shots during production by Tom St. Amand and me utilizing the DID, the Tondreau go-motion system, and the animation software of the time, Softimage. But as the other three DIDs had yet to be finished in a fully-functioning mode, I had the responsibility to start “shooting” the Raptor Kitchen sequence with this shiny new contraption. Our future depended on results.
Tom St. Amand: We could also see the DID on a monitor as we worked, and draw Sharpie tics as guides. Randy and I acted out a lot of the moves ourselves, used stopwatches for timing specific actions, and looked at reference footage of animals walking and moving whenever we could. Phil didn’t want our work to look like ‘animation’ but more like real, living creatures. Like if you’d just happened upon them, walking through the jungle. Sometimes he acted out stuff with us.
Randal M. Dutra: Since I had already choreographed and animated all of my shots (and more) of the Raptors and T.rex––in the pre-vis Animatics and “Movement Bible”––the essential groundwork was laid for my later DID animation. I was wholly familiar with the demands of each shot. Referring to the most current storyboards, I had Cameraman/Computer Tech Steve Reding shoot video of me with my hi-8 camera in a corner of the animation stage area. I pantomimed the dramatic actions and timings of the characters, further building and refining them. Then, with stopwatch in hand, I would break the video down into beats and frame counts. The whole architecture of the performance was worked out, complete with integrated behavioral references and accents.
Some immediate benefits were no camera set-ups, no specific stage lighting needed; general work lighting was fine. However, I still had to animate exactly to the intended camera angle for each shot. I could check dinosaur positioning on the monitors for framing, but I did not have to deal with constantly avoiding set/stage lighting, placed flags, or camera and tripod. Different takes could be saved and logged. Long, slower movements could be judiciously key-framed. Slight pops could be edited in the function curves if necessary. I could use the system to do very rough block-ins and run-throughs, using only four to five key frames; thereby nailing down the dinosaurs’ scale (which actually differed aesthetically with each shot!) and the broader “global” actions within the frame. This provided ILM CG artists quick return data to begin lighting, and was also helpful on other technical fronts.
Brian Knep: When Tippett’s team first started doing the dinosaur input device, you know, they would just move this thing, move the dinosaur, and then hit a key like the space bar or the return key, I can’t remember which, and that would save the pose. But when the animators first started doing it, they would move it and then they would run out of the frame, like they used to with a real camera. They didn’t realise that they wouldn’t be in frame anymore! I loved that.
Randal M. Dutra: A tool I often used as a reference for positioning and tracking my dinosaur animations was monitor drawings––grease pencil or Sharpie drawings on cels from the “quick shade” renders. I produced scores of them. I’d mark frame numbers, make marginal notes, and add reminders directly on my working references. During JP production, after each of my shots, portions of my self-generated references strangely vanished from my set––even the running ostrich research drawings for the Raptor in the “Movement Bible”. I wondered where they went. Fast-forward to 24 years later when they resurfaced in a 2016 auction. There were six separate “lots” of my JP materials, all uncredited save for one––and that was erroneously attributed to Phil.
Randal M. Dutra: I made “secondary animation notes” (arms, fingers, toes, blinks, etc.) to accompany my already established action and choreography. These details by necessity cued off of the foundational performance. Computer Tech Adam Valdez artfully incorporated such notes, and also efficiently addressed various “emergencies” as they arose such as roving “floating feet”, a strange artifact that the DID system would occasionally gift us with. So the feet sometimes had to be re-locked-down/constrained in CG, even though the original DID animation had the feet firmly planted.
Our DID shots also underwent what was called a smoothing pass: a procedure that “remedies” any anomalies of the key frames shaping the animation function curves that do not fall within perfect alignment. But in so doing, I feel we lost portions of the more positive, inherent organic aspects of the original, hand-wrought DID animation. At the time, with CG being such a new visual, production at large wanted to keep things “smooth,” avoiding at all costs any artifacts resembling the “older”, traditional stop-motion. After all, this was the future. The two Raptors entering the kitchen, “RK-3”, remains an effective shot, but my accents, timings and extremes were somewhat dampened in this smoothing process. But again, we were all learning.
Adam Valdez (computer systems, Tippett Studio): The animation derived via the DID method was inherently a bit bumpy just because I think, unlike normal puppets, it wasn’t so straightforward to see the shape of the animal – the guys were animating skeletons. So we went in and did some smoothing of curves. Rigging of the characters was done at ILM and we modified it a little bit in Softimage. I think once Phil and Spielberg started giving further notes, there was little choice but to do some tuning passes in the computer. So in the end, some of the shots were basically the stop-motion animator’s work with a little fixing. But in other cases we did modify the animation quite a bit later on. For example, when Spielberg asked for the T-rex to be 300 to 400% bigger in his debut shot, that required quite a bit of messing around to make it work. When he’s chewing tires on the flipped over jeep, there was a good bit of secondary motion and specific lineups to the jeep that had to be done. Similarly, dialling in the Raptors in the kitchen required a lot of shot-by-shot tuning – because eyelines and marks and connection to plate objects could only really be done with the plate in the computer at the same time. So in the end I think the stop-motion guys really charted the performances and it was a group effort with their overwatch that delivered the finals.
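The ‘smoothing pass’ both Dutra and Valdez describe amounts to filtering the animation function curves. A toy three-tap filter shows why it also dampens deliberate accents: averaging each key value with its neighbours pulls every extreme toward the mean. This is an illustrative filter, not the procedure actually used on the film.

```python
def smooth_curve(values, passes=1):
    """Three-tap weighted smoothing of an animation function curve.
    Removes single-frame 'pops' but also flattens the animator's
    intentional extremes (illustrative sketch only)."""
    for _ in range(passes):
        values = [values[0]] + [
            (a + 2 * b + c) / 4.0          # weighted neighbour average
            for a, b, c in zip(values, values[1:], values[2:])
        ] + [values[-1]]
    return values
```

Run it on a curve with one sharp accent – say `[0, 0, 10, 0, 0]` – and the peak is cut in half, which is exactly the kind of dampening of ‘accents, timings and extremes’ Dutra laments.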
Roadblocks and repairs
Randal M. Dutra: As I began animating the Raptor DID, it became immediately apparent that the size of the encoders cramped the arm joints, hampering free movement. Raptors most likely kept their arms tucked close to their bodies, just like their kin––birds––when not in use. Their arm/wing anatomy is remarkably similar. So a second pass of adding “secondary” details was employed. Initially, Tom and I did that in CG, using Softimage, mouse and keyboard. I still have my Softimage tutorial notes. We were starting to become familiar with this system, but unfortunately, as the schedule soon tightened, our time was better spent moving ahead creating and securing the driving, core performances.
Craig Hayes: I remember very clearly because of these electromagnetic issues there were a number of things that were holding us back. And it was very frustrating in a way because we knew this would work. I mean, I knew it would work, and I knew that it was going to work really well, but there were a lot of these roadblocks.
Randal M. Dutra: The DID was eventually tamed, but it had its share of artifacts and problems. Since I was test-piloting this apparatus, I was dealing with the challenges arising every day on production along with the supporting technical crew. Some of my completed DID animation tests would suddenly vanish into the ethers, leaving not a trace of the hours of work invested. “Joint chains” in the necks of the Raptor and T.rex presented problems with accumulated error, which translated into varying distances of where the CG dinosaur’s nose would actually end––crucial for interactions with props and objects, such as the Ford Explorer vehicle. A compensating length of wooden dowel was often hot-glued or “waxed” to the nose end of the DID skulls to remind us of the digital model’s discrepancy.
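The ‘accumulated error’ Dutra describes in the neck joint chains is a basic property of forward kinematics: each joint’s angle is measured relative to its parent, so a small error at every joint compounds toward the nose tip. A hypothetical 2D sketch (link lengths and angles are made up for illustration):

```python
import math

def chain_tip(joint_angles_deg, link_length=1.0):
    """2D forward kinematics: position of the tip of a joint chain
    (e.g. a DID neck). Each joint rotates relative to its parent,
    so per-joint errors accumulate along the chain."""
    x = y = heading = 0.0
    for a in joint_angles_deg:
        heading += math.radians(a)       # angles compound downstream
        x += link_length * math.cos(heading)
        y += link_length * math.sin(heading)
    return x, y
```

Comparing `chain_tip([10.0] * 6)` with `chain_tip([10.5] * 6)` – only half a degree of error per joint – already moves the tip visibly off target, which is why a dowel had to be glued to the DID’s nose as a reminder of where the digital nose really ended.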
Brian Knep: It was very difficult to create a joint that can move in all three axes around the same point. The Tippett guys were able to do that in some areas but not in all of them. Some of the spine joints move in all dimensions. The spine joints were not in the same places, and I think we even had a different number of spine joints, depending on whether it was the T-Rex or the velociraptor. Basically we got this nice animation and we had to translate it into our 3D models and some of the subtlety was lost, particularly in the spine, and that was quite difficult and it might even have sometimes created weird artifacts that our 3D animators would then have to come and fix up.
So that was probably the hardest thing and I was never quite satisfied with any of the solutions we had. There were a lot of different attempts at approximating splines, basically – reproducing their animation with a set of splines and then mapping our joints onto those splines. Here are some things that would happen: a foot that was supposed to be on a car, let’s say, would not be exactly on the car. So our animators – and when I say ‘our’, I mean the ILM 3D animators – would then have to go and lock that foot to the car using inverse kinematics. Or the spine was moving in a nice way as a creature turned and it would just kind of pop a little bit as it, maybe, hit the extremes, and so the 3D animators would have to go and clean that up again.
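One way to picture the spline mapping Knep describes: treat the physical spine joints as a polyline, then resample the CG skeleton’s joints at even spacing along it, so two rigs with different joint counts can share one backbone curve. This sketch uses straight segments rather than true splines, so it only approximates the idea:

```python
import math

def resample_spine(points, n_out):
    """Resample a polyline of physical spine joint positions into
    n_out evenly spaced CG joint positions. Straight segments stand
    in for the true spline fit (simplifying assumption)."""
    # cumulative arc length along the physical joint chain
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    out = []
    for i in range(n_out):
        target = total * i / (n_out - 1)
        j = 1                              # segment containing target
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = dists[j] - dists[j - 1]
        t = (target - dists[j - 1]) / seg if seg else 0.0
        (x0, y0), (x1, y1) = points[j - 1], points[j]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out
```

Resampling like this preserves the overall curve but averages away some per-joint detail – consistent with Knep’s remark that subtlety was lost in the spine.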
Craig Hayes: The thing about these devices that is interesting is that they’re ‘relative’ devices versus ‘absolute’ devices, which means they don’t really know that you just turned them on from a blank state. None of the devices that we used at each joint really knew, ‘Oh, I’m at 15 degrees.’ They were all relative. So at the beginning of an animation session what you would need to do is put your input device into what we call the neutral position, or like the dead dog pose. So we would have to literally go through and straighten out every single joint and make these things as absolutely squared in sort of a neutral pose as possible.
And then you could start moving, and it would register, ‘Oh, I moved this far from zero.’ But they weren’t absolute. So, that coupled with noisy electronic spikes could really wreak havoc. If one of the legs all of a sudden goes, ‘Oh, hey, now, I think I’m at a completely different angle because I got a little electrical spike…’, you’d look on your computer screen and see the representation of the model and just all of a sudden its legs were shooting out to the side. And you knew that the animator didn’t do that and the model is not doing that, but your computer model is doing that.
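The relative-versus-absolute distinction Hayes draws can be captured in a tiny model: a hypothetical joint class that must be zeroed in the neutral ‘dead dog’ pose before any reading means anything, and whose running total stays corrupted by a single spurious spike until it is re-zeroed.

```python
class RelativeJoint:
    """A relative (incremental) encoder reports only changes, never
    absolute angle, so the armature must first be zeroed in the
    neutral pose. Hypothetical model, not the DID's actual software."""

    def __init__(self):
        self.angle = None                  # unknown at power-on

    def zero(self):
        self.angle = 0.0                   # neutral pose defines zero

    def apply_delta(self, delta_deg):
        if self.angle is None:
            raise RuntimeError("zero the armature before animating")
        self.angle += delta_deg            # every delta accumulates
```

Feed it a spurious +90-degree spike and every subsequent reading is off by 90 until the next zeroing session – the on-screen legs ‘shooting out to the side’ that Hayes remembers.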
Adam Valdez: It was also tricky because ILM was the only place we could render and see a comp of the work. I remember each evening we would beam scene files – literally over microwave data link – to the other side of the Bay Area, and they would download that file and put it into RenderMan. Usually in a day or two we’d see a render and that’s what Phil would review with Spielberg. It was slow overall. Just flipbooking your shot to see how the animation was working took a whole lunch time. We played a lot of darts while we waited.
Craig Hayes: Sometimes one of our animators couldn’t read the shot too well in Softimage because it’d be this grey dinosaur against a grey background. So we’d put a green material on the CG dinosaur, not really thinking anything about it, and then we’d send it off to ILM that night and have some big explosion because their system wasn’t really prepared for it. There were a lot of phone calls, a lot of like, ‘You guys, what’s going on here?!’ And, like, ‘Oh, I don’t know. It looks fine on our side.’ A lot of interesting discussions about, ‘How come this is backwards?’
Craig Hayes: But I do clearly remember one time we were in ILM dailies and Randy Dutra had done an animation test of the T-Rex biting into a car tyre. And the data transfer had worked and this thing shows up on screen with motion blur and lighting and everything else, and it was just great. Just right then and there we were like, ‘OK, here is the spark of life that we had been trying to get across the whole time.’
Randal M. Dutra: It was a rare treat indeed to view our animation fully rendered, lit, and integrated so beautifully and expertly by Dennis Muren and his team after being run through the ILM pipeline. To “see” the effect of those blurs of action, muscle/skin tweaks, added unsteady camera, and environmental interactions…it truly was magic.
‘Do you have a doctor on staff?’
Brian Knep: One time we heard that [producer] Kathy Kennedy was coming over to ILM for a review. Craig and I were downstairs and he was animating the DID. That wasn’t his forte but he was just animating it and I was running the software that we had in this little room. And the way the DID was set up, it wasn’t set up very nicely, so Craig was moving the model and then he was kind of rolling on the floor to the other side of the room and hitting a key on the keyboard to record and then rolling back.
There was some bug or something, I can’t remember, but I went upstairs to where my normal office was and I was working on it, and then my phone rings and I answer it, and it’s Craig. He goes, ‘Hey, Brian.’ I’m like, ‘Hey, Craig, how’s it going?’ He goes, ‘Do you have a doctor on staff?’ I said, ‘No, what are you talking about?’ He said, ‘Do we have a nurse?’ I said, ‘I think we might have a nurse. What’s going on?’ He goes, ‘Well, I think I popped my knee out of my socket.’
Craig Hayes: It’s actually a bit of a long story. On the weekend before, I was mountain bike riding with a friend who I hadn’t really ridden with before. So I’m kind of following him around taking my leads from him, and he goes through this, and I go through that. I got a little bit overconfident. And I go over an edge and realise it’s like two stories down. I was not at all prepared for it. So I go down halfway, and I’m like, ‘I am not going all the way to the bottom.’ It’s a steep drop off, so I kind of bailed, and I ended up crashing and banged myself up a little bit. I didn’t really think much of it, other than the fact that I was horrified that I was about to ride off what was, in my mind, a two-story cliff.
Anyway, so I banged myself up. I didn’t think much about it. The next week I was up at ILM and we were hooking up the miles of wires that had to be hooked up to the DID. And I’m underneath the table and all of a sudden my leg won’t straighten out. My leg is just locked up in a bent position. I’m like, ‘Oh, that’s weird. Holy shit, man, my leg won’t work. Oh, man.’
Brian Knep: I’m like, ‘Holy shit, Craig! What the fuck?’ So I run down there, there was some kind of nurse or something, and I say, ‘We’ve got to go to the hospital now,’ but Craig said, ‘No, I have to finish this.’ But his knee was popped out, and he was still rolling on the floor because he had to finish that piece because Kathleen Kennedy was coming. It was a very intense moment. He was the kind of guy that would do that, you know.
Craig Hayes: We did end up going to the hospital. And then the emergency room doctor is like, ‘Oh, what’s going on here?’ I told him my leg won’t straighten out. He’s like, ‘Oh, well, that’s interesting. Let’s just try and pull it out.’ I’m like, ‘No, I don’t think you should do that because it doesn’t want to go straight.’ But he starts pulling my leg to straighten it out and, man, it was like I was seeing stars. He tries to lever my leg open and basically opens my joint up even more. And to this day, my leg will, on occasion, if I’m not being just right, it will actually kind of lock up again. And I think that ultimately the damage was done by this guy trying to crank my leg open. That day reinforced my healthy disregard for doctors in general.
Legacy of the DID
Brian Knep: When we won a Sci-Tech award in 1997, it was just amazing. From the point of view of the folks who work in the backroom, it was really lovely to get some recognition, because you often don’t get that. When you work at a place like ILM, you just see how much work there is behind all these scenes. And even though I was closely associated with some of these movies, there were still huge parts of it that I didn’t see, you know. Tonnes and tonnes of people working on all kinds of things, so it’s nice to see some of us get that recognition.
[In that same year, Knep also received a Scientific and Engineering Award for his contribution to ILM’s Viewpaint 3D Paint System, which saw significant use on Jurassic Park]
Craig Hayes: The Sci-Techs were total fun. We all got dressed up, and hung out. We just had a good time and great food. It’s its own event and so in a way there’s a lot less pressure. It’s more relaxed than the actual Academy Awards because it’s really for technicians.
Brian Knep: I remember my parents were able to come, which was kind of nice, and my mom ran into Helen Hunt in the bathroom and, well, she just said ‘You’re Helen Hunt.’ Helen said, ‘Yes, I am.’ And my mom said, ‘You’re giving my son an award.’ And Helen said, ‘Yes, I am.’ Then my mom was mortified and talked about it for years afterwards.
Adam Valdez: The DID was not that different from motion capture or mixed reality techniques emerging today, so it was very ahead of its time. The idea of mixing the physical and computer graphics worlds is what defines virtual production, so we can say the DID was definitely pioneering. Building a bridge to a few key animators was its necessity. Combining it with motion-control bases for physical support was quite cumbersome, but it all did its job. We have experimented recently with puppeteering as the basis of motion input, so the fundamental notion is still sound!
Randal M. Dutra: When I look back at my JP cel drawings, photos, sketches and working animation notes – they are a fresh reminder of what a remarkable and historic opportunity the making of JP was. Two and a half years later Spielberg and ILM secured me as CG Animation Director for Jurassic Park: The Lost World; our industry peers nominated us for an Oscar. No DIDs were involved, for they too went the way of the dinosaur, but will forever remain a part of JP lore. Dinosaurs, and our continuing fascination with them, reflect our need, our insatiable appetite, for fantastic creatures – and how cool it is that these creatures actually walked the earth…
BONUS SECTION: The DID goes back to the movies
[Trey Stokes was supervisor, character animation department at Tippett Studio on Starship Troopers where he and his team utilised the DIDs for bug animation. He first met Phil Tippett and Craig Hayes during production on RoboCop 2 while freelancing for deGraf/Wahrman on a real-time animation system to animate the CGI face on Cain, which Tippett incorporated into their stop-motion character. Stokes then also had experience at Boss Film on Species with another real-time performance system. A live demo of the Boss puppetry system was done at SIGGRAPH, where Stokes again met Hayes.]
Trey Stokes (supervisor, character animation department, Tippett Studio on Starship Troopers): Craig told me Tippett Studio was working on something and I should come take a look. So I flew up to Berkeley and they told me about the Troopers project. They had a demo video and bug designs and so on – it all looked amazing. It took a while before the movie got a green light, but eventually I relocated to Berkeley and became head of the Tippett animation department. Which was kinda nuts, because I had no real CG experience – I’d only moved mocap devices around. And even that was just a brief detour, most of my career had been stage puppetry and rubber movie monsters. But Phil said he hired me because I’d come from a practical fx background and he wanted someone who spoke that language.
So it was an amazing opportunity, but it meant I had to learn CG animation while already being the head of the CG animation department on a hundred million dollar movie. Everybody I was supposed to be supervising knew more about CG than I did. So there were a lot of late nights with the Softimage manual, trying to figure out how to do my job.
Craig Hayes: In between Jurassic Park and Starship Troopers, we did some work on Tremors II. We were actually able to take this thing with a T-Rex hip joint and legs and modify it to work for these little guys. So we were able to do some work there and we also were able to repurpose a lot of the parts and do some animatics for a Pinocchio show. This is where, I think, the animators really started to get the idea of keyframing as opposed to stop-motion every single frame, and that’s where things really became interesting. We were able to get rid of the motion control, which at the end of the day was a huge hindrance. I was not happy about it, but, anyway, we were able to get rid of the motion control and let the animators work in a different way.
Trey Stokes: I don’t think they had a working bug input device when I first interviewed, but by the time I started they had one up and running. One of the first things I remember is Phil and I grabbing hold of the armature and trying different moves, just using it like a puppet and experimenting. You might notice there’s a behaviour the bugs sometimes have – they lift one front leg off the ground, sort of like a dog pointing. That gesture came out of those test sessions, it was something we stumbled on that Phil liked.
Overall, the goal was to use real-world animal behaviors – predators have a way of moving that people just automatically recognize. We watched documentaries about lions and wolves and so on, and applied that to how we moved the bugs. Even when it didn’t necessarily make sense – for example that big pincer on the front of a warrior bug isn’t its mouth, it’s a weapon. But it looks like a mouth, so we animated it like one. For some reason, when the bugs scream they raise their front pincer. It’s just what warrior bugs do, okay? That’s our story and we’re sticking to it.
Craig Hayes: By the time we got to Starship Troopers, we were able to say, ‘Well, what if we designed an armature that was, from the get-go, designed to incorporate these things we had learned?’ So Tom was able to bring a lot of different ideas to that. Merrick Cheney came on as well to help out with machining. And we were able to use CAD software to make these things not only much more integrated, but also to create these zeroing plates, using the numerical information we had from the design of the parts to make jigs that we could put the bugs on. What that meant was, we were able to put the bug armatures onto this plate, literally clamp every joint down, hit zero, and pop and bop and get to work. So we were able to really streamline the zeroing process, and also the integration of the encoders – now they were designed into the parts from the get-go, as opposed to bolted on in the manner we did for Jurassic Park.
Trey Stokes: Something that may not be clear to folks now is that we had – I think – eleven animators in all, and the majority were keyframe animators. There were four DID stations built, but our problem was that across town Henry Selick was finishing James and the Giant Peach and ramping up to make Monkeybone. Every stop-motion animator in the Bay Area was booked. After literally months of searching we were able to get Tom Gibbons, who had worked on Giant Peach, and Randy Link, from a stop-motion TV show called Bump In The Night. But that was it. Meanwhile we were able to fill our keyframe seats with good people, but we simply could not find animators for those other two DID stations.
Again, the irony was that I was the supposed head of a stop-motion and CG animation department, but all my experience was in puppetry. I spent my free time messing with the empty DID stations and I realised that the bug model was choppy on screen in realtime, but the actual data was all being captured just fine. So we widened our search to include puppeteers, and pretty soon after that Kirrie Edis came on board. And that was the DID team – Tom and Randy mostly used their armatures frame by frame, and Kirrie worked mostly in realtime. Then they’d polish their captured data with the curve editor – they each developed their own workflow for that. But that fourth station we never were able to fill.
Craig Hayes: The biggest thing was the animators learning to be more free with the DID. And in fact, on Starship Troopers, in some cases what we would do is take the armatures, loosen up the joints to the right tension, and effectively record in real-time. So now they were puppeteering these things as opposed to exclusively stop-motion animating them. Trey Stokes was able to come in and treat these things more like real-time puppets than stop-motion puppets. I think that was very helpful as well.
Trey Stokes: Once we started production, we learned there were certain shots that were especially suited for the DID folks. A couple of examples:
– At one point the troopers enter a valley after it’s been napalmed from the air. We needed to fill the valley with dead bugs in contorted poses. This was a huge pain with keyframing because if you crunched the keyframe model the IK chains would go haywire and rotate the legs in unexpected ways. So the pose would change, you’d get geometry intersections, and so on. But with a DID it was easy – twist the armature into a pose, snap one frame, repeat. As I recall it was Randy who got that shot and he was able to fill the valley with contorted bugs much faster than a keyframe animator could have.
– Early on, we developed a basic bug walk cycle. For the shots of bugs stampeding, the FX animation team would multiply that cycle through a particle system to generate the swarm. The first results looked like, as Phil put it, a merry-go-round. Every bug was walking in exactly the same way and so the swarm had a sort of robotic vibe. Kirrie did a bunch of animations that we could layer on top of the generic walk cycle – bugs shifting their weight, leaning different directions, just doing random things. She cranked out a ton of that animation in realtime, and when that was blended into the swarm it created the messy lifelike feel we needed.
– When the Planet P compound is besieged, we had to do multiple wide shots of the entire landscape covered with bugs. We realised – rather late in the game – that once the bugs reached the compound they couldn’t keep walking. They had to stay in place, all packed together. So again, Kirrie cranked out a huge amount of bug animation, this time of bugs standing in place doing…whatever. There’s a shot where Lt. Rasczak turns around and sees bugs out to the horizon – that shot is basically Kirrie multiplied a million times, and she generated all that animation in a day or two.
By the way, if you look closely at that shot, at the tip of Rasczak’s gun barrel – there’s one bug out there flailing back and forth like he’s having a little seizure. Kirrie improvised that because she figured there’d be at least one bug getting claustrophobic and freaking out. Completely by accident, the particle system randomly put that bug into a prominent position on screen. We were going to replace him, but Verhoeven saw an early version of the shot and LOVED that bug. So he’s in the movie.
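The fix Stokes describes for the merry-go-round problem – every bug playing the identical walk cycle in lockstep – boils down to giving each instance its own phase offset and blending a randomly chosen variation clip on top of the shared base cycle. Here is a minimal sketch of that layering idea; all function names, clip shapes and weights are illustrative assumptions, not Tippett Studio's actual pipeline.

```python
# Hedged sketch: desynchronise a swarm by layering per-instance variation
# onto one shared base walk cycle. Numbers and names are illustrative.
import math
import random

def base_walk(t):
    """Shared base cycle: one joint's angle over a 1-second loop."""
    return 15.0 * math.sin(2 * math.pi * t)

def lean(t):
    """Variation clip: slow lean to one side."""
    return 5.0 * math.sin(2 * math.pi * 0.3 * t)

def shift(t):
    """Variation clip: quicker weight shift."""
    return 3.0 * math.sin(2 * math.pi * 1.7 * t)

def make_bug(variations, rng):
    """One swarm instance: random phase plus a weighted variation layer."""
    phase = rng.random()               # desynchronise the base cycles
    layer = rng.choice(variations)     # pick one variation clip
    weight = 0.3 + 0.4 * rng.random()  # how strongly the layer reads
    def pose(t):
        return base_walk(t + phase) + weight * layer(t + phase)
    return pose

rng = random.Random(42)
swarm = [make_bug([lean, shift], rng) for _ in range(1000)]
```

Even this crude version breaks the lockstep: at any instant the instances are at different points of the cycle and leaning different ways, which is the "messy lifelike feel" the blended swarm needed.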
I stayed on for one more project after Troopers – that was My Favorite Martian for Disney. At the same time the studio also worked on Virus and Komodo and Armageddon, but as far as I remember we didn’t use DIDs for any of them. The DIDs had to be very specifically machined and calibrated for Jurassic Park and Troopers, and even on Troopers we only had warrior bug DIDs – all the other bug types were keyframed. We often talked about developing a more modular system that could adapt quickly to new projects but never had the time or money for it. Also, when Troopers began Tippett Studio was a stop-motion company that dabbled in CG. By the end of Troopers, it was a fully-functioning CG production house – we basically built the facility around ourselves along the way. The DIDs helped make that transition, but to my knowledge Tippett hasn’t used them since.
On that note – when I went in for that first interview in 1997, I geeked out over all the iconic movie props that were on display. The garbage monster eyeball from Star Wars, a Cantina Band mask, stop motion puppets from RoboCop…I was in fanboy heaven. Just a few months ago, I happened to visit the studio again for the first time in a very long time, and there on a table was a warrior DID. Well, hey there, old friend! So I started posing it, and then I noticed an employee – someone I didn’t know and who didn’t know me – giving me a funny look. Oh, right – you’re not supposed to mess with the museum pieces.
Illustration by Maddi Becke.
More from vfxblog’s ‘Jurassic Week’:
Viewpaint: ILM’s secret weapon on Jurassic Park
The surprising game-changing VFX of Jurassic Park III
‘The Lost World’ turns 20: Animation director Randal M. Dutra reflects on those early days of the digital age
Back to vfxblog.com