When CGI just won't suffice
There are places in filmmaking where computer generated images and digital compositing are simply still not good enough.
On one hand, this might seem unfair. Cinema as a discipline is well over a century old, and we've only been using digital techniques to work on it for about 35 years, taking the likes of the original Tron as a starting point. But there have been significant successes in some areas, such as the CGI creature effects pioneered in Jurassic Park, the continued successes in Lord of the Rings and the spectacularly tentacular full-face replacement on Bill Nighy for the Pirates of the Caribbean sequels. These films – even given the limited technology base available in 1993 for Spielberg's dinosaurs – produced results which were believable, fully photographic, and obviated older techniques like stop motion almost overnight.
But none of this is new. It's an old subject. So why are there things CG still can't do?
Well, let's look at a few examples. A particularly prototypical instance is that of fire and pyrotechnic effects. Computer-based attempts have seen enormous advancement with the likes of FumeFX, a piece of software first released in 2006 and now available in its third iteration for both Max and Maya. It is described as a fluid dynamics simulator, although in most cases it's used to render fire and smoke. It's very effective for a certain range of applications – rocket trails, billowing smoke and dust clouds. But in other scenarios, such as the always-popular exploding helicopter, no high-end feature film would think of doing it any other way than with live pyrotechnics, because to date the alternatives have always looked slightly suspect.
Figuring out why this is the case requires us to know a little more about the underlying physics, as well as the computer attempts to simulate them. The very best flame simulations can, under some circumstances, be photorealistic. They tend to operate by approximating the billions of glowing particles in a real fire, generally using smaller, more manageable mathematical models of real-world fluid dynamics and rendering the result based on certain assumptions. To some extent it is therefore an issue of computer horsepower: we just don't have systems – even those available on generous movie budgets – which can simulate a real fire to a point where it is indistinguishable from the real thing. And in the exceptional circumstance that a filmmaker could pull it off, it would still be cheaper to blow things up for real.
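The particle approach described above can be sketched in a few lines. This is a toy illustration only – production tools like FumeFX solve full fluid-dynamics grids with combustion and radiative transfer, which is nothing like this simple – but it shows the basic loop of emitting hot particles, letting buoyancy carry them upward, and cooling them until they wink out. Every constant and field name here is invented for the sketch.

```python
import random

def step_particles(particles, dt=0.04, buoyancy=2.0, cooling=0.6):
    """Advance every particle one timestep; drop any that have cooled out."""
    survivors = []
    for p in particles:
        p["vy"] += buoyancy * dt          # hot gas rises
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
        p["temp"] -= cooling * dt         # crude stand-in for radiative cooling
        if p["temp"] > 0.0:
            survivors.append(p)
    return survivors

def emit(n, rng):
    """Spawn n hot particles near the base of the flame with lateral jitter."""
    return [{"x": rng.uniform(-0.1, 0.1), "y": 0.0,
             "vx": rng.uniform(-0.2, 0.2), "vy": 0.0,
             "temp": rng.uniform(0.8, 1.0)} for _ in range(n)]

rng = random.Random(42)
flames = emit(500, rng)
for _ in range(20):                        # simulate 20 frames
    flames = emit(50, rng) + step_particles(flames)

# Hotter particles cluster near the base; cooler ones drift upward.
avg_height = sum(p["y"] for p in flames) / len(flames)
```

Even this trivial loop over a few hundred particles hints at the horsepower problem: a convincing fire needs orders of magnitude more particles, plus the rendering assumptions mentioned above, every frame.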
It isn't quite as simple, though, as hoping that more megahertz and more gigabytes will allow us to replace practical effects. Not only is a real fire effect – or any other effect, for that matter – a complicated thing in its own right, but the interaction of these things with their environment also makes computer simulation tricky.
Fire moves around objects that it collides with. It absorbs, reflects and emits light. It is complex. And while software like Mocha, Boujou, PFTrack and others have made the combination of real and unreal (or at least separately photographed) elements easier than ever before, they don't yet solve this problem. We remain some way from being able to insert fire into a scene with correct interactive geometry, lighting and shadowing without dedicating a huge amount of human intervention to the problem – and human intervention is expensive.
Comparatively small-scale effects are counterintuitively difficult too. Even quite small movies still use the device widely referred to as a Sweeney gun to produce the effect of bullets striking a hard surface. It's arguably an unrealistic effect, but the white flashes of the bad guys' rounds impacting the cover behind which the hero is sheltering have become part of movie lore.
From a purely analytical perspective, it's difficult to fathom why this should be. The effect is short, often just a few frames in duration. The ‘real world effect’ consists of a few superheated pieces of metal moving rapidly outward from a central point, and it is achieved using something very like a paintball gun, which propels a brittle, breakable capsule filled with small gravel and shavings of the metal zirconium toward the target.
Zirconium is a flammable metal that ignites when struck hard. On paper, it would seem that any competent particle system – such as Red Giant's seminal Particular for After Effects – could reliably simulate this, and to some extent it can. But add a dimly-lit scene, smoke or water in the air, nearby actors and other effects, and it becomes difficult to convincingly integrate the flashes of light produced by the burning metal, the puff of dust produced, and the other apparently trivial aspects of the effect that are essential to realism.
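The raw kinematics of a spark hit really are easy for any particle system, as the paragraph above suggests. A minimal sketch of the radial burst follows – every parameter name and value is invented for illustration and bears no relation to Particular's actual controls. What the sketch pointedly cannot do is the hard part: lighting the surrounding set, casting interactive shadows and kicking up dust.

```python
import math
import random

def spark_burst(n, speed=5.0, rng=None):
    """Emit n sparks radially outward from the impact point."""
    rng = rng or random.Random()
    sparks = []
    for _ in range(n):
        angle = rng.uniform(0, 2 * math.pi)
        s = speed * rng.uniform(0.5, 1.0)
        sparks.append({"x": 0.0, "y": 0.0,
                       "vx": s * math.cos(angle), "vy": s * math.sin(angle),
                       "life": rng.uniform(0.1, 0.3)})  # seconds of burn
    return sparks

def advance(sparks, dt=1 / 24, gravity=-9.8):
    """One film frame of motion; a spark vanishes when its fuel is spent."""
    alive = []
    for p in sparks:
        p["vy"] += gravity * dt
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
        p["life"] -= dt
        if p["life"] > 0:
            alive.append(p)
    return alive

burst = spark_burst(200, rng=random.Random(7))
frames = 0
while burst:
    burst = advance(burst)
    frames += 1
# At 24fps the whole burst is over in a handful of frames,
# just as the article says of the practical effect.
```

Note that the simulation completes in single-digit frame counts: the motion is trivial, which is precisely why the difficulty lies elsewhere, in integration rather than animation.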
In a similar vein, the latest Rambo sequel was occasionally frowned on for using digitally-inserted blood effects to represent bullet hits, a significant criticism for an action movie. Okay, these things can be done digitally, and done well, but the imposition of expensive work in post-production often means that it's better done for real.
Ultimately, there will always be very high-end productions – those directed by real-ale filmmakers such as Tarantino and Nolan – whose producers give the directors the artistic choice to do things a certain way, and so they aren't really models for this sort of discussion. But there are other reasons why, for instance, the attack scene in Pearl Harbor was done not by computer animators, but by a lot of pyrotechnicians and a whole lot of explosives.