“Quantum Leaper”

This week issue 40 of Infinity magazine comes out, featuring a couple of articles I wrote, including one about the cult sci-fi series Quantum Leap. The show saw Dr. Sam Beckett (Scott Bakula) bouncing around time into other people’s bodies and striving to put right what once went wrong, while his holographic friend Al (Dean Stockwell) smoked cigars, letched, and relayed exposition from Ziggy the computer.

I end the article by wondering whether it’s time for someone like Netflix to bring the show back (it definitely is). What I don’t mention in the magazine is that – unbeknownst to almost everyone – Quantum Leap has already been rebooted once.

This, my loyal readers, is the story of Quantum Leaper.

 

Season One (1995)

As teenagers, my friend David Abbott and I were huge Quantum Leap fans, and were bereft when the show was axed in 1993. I was developing an interest in filmmaking, having dabbled in 2D computer animation on my Atari ST and borrowed my grandfather’s Video-8 camcorder on a couple of occasions. When I was given that camcorder for my 15th birthday, David and I decided that we would make our own version of Quantum Leap, which we imaginatively titled Quantum Leaper.

The first episode was called “Just What the Doctor Ordered” and saw my character – named, again with great imagination, Neil – leaping into a doctor just as his patient is flatlining. I don’t remember much about the plot, but I do remember that we climbed the nearby Malvern Hills to film a fight scene.

Dave played Albert, my holographic helper, communicating with Project Quantum Leap’s supercomputer Ziggy by means of a special hand-link, just like Dean Stockwell did. Unlike Dean Stockwell’s, this hand-link was a calculator.

The two of us also played all the supporting characters (often with the judicious addition of a hat or jacket) and operated the camera, unless we were both in shot, in which case it was locked off. Much of the editing was done in camera – rewinding the 8mm videotape, cueing it up to the exact moment the last piece of action ended, then hitting record and calling action simultaneously – and the rest I did tape-to-tape with two VCRs connected together. A cheap four-track disco mixer enabled the addition of music (badly composed by me) and sound effects (many of which were sampled from Quantum Leap itself). As YouTube was still years away, the only viewers for the series were our parents and friends, forced to sit down in front of the TV and watch it off VHS.

Episode two, “Boom!”, saw the fictional Neil as a bomb disposal expert supposedly in Northern Ireland in 1980, though like the first episode it was all shot in and around my house. My sister Kate was drafted in to play a journalist whose life Neil has to save.

“A Leap into the Blue” was the next episode, with Neil in the body of a parachutist. Scenes of characters in free-fall were shot with us standing in front of a white wall; I digitised the footage on my ST with a Videomaster cartridge and composited scrolling clouds into the background. The resolution of the Videomaster was very limited – maybe 320×240 – the frame rate was very low too, and it could only do black and white.

A digitised visual effect using a shot of a plane stolen from some TV programme or other

Next we shot a “pilot” episode explaining how Neil and Albert switched places with Sam and Al. I remember digitising shots of Scott Bakula and Dean Stockwell from Quantum Leap and compositing them atrociously into our own footage. At about 30 minutes long, the pilot was double the length of our other episodes.

Then we continued the series where we’d left off. Dave’s script “One Giant Leap” has Neil on a space shuttle mission, an episode that included NASA footage taped off the TV. We made almost no attempt to create sets; the space shuttle cockpit was a plain wall, a computer keyboard and a piece of card to cover an incongruous bookcase.

The space shuttle cockpit “set”

The next two episodes find Neil meeting (and shooting) an evil future version of himself, then leaping into the crazy future space year of 2017. The latter involves a flying car – my mum’s Citroen AX with the wheels framed out, intercut with an extremely crude CGI model.

Dave’s episodes “Virtual Leaping” and “Bullets Over Leaping” see Neil become a VR programmer (with a headset made of Lego) and then an actor (in a studio suspiciously like Dave’s shed).

The VR headset “prop”

My next episode has Neil leaping into himself and saving his father’s life. (My actual dad provided some splendidly wooden acting.) But doing this causes a paradox, and the season finale sees Neil and Albert swap places (as Sam and Al do in a classic Quantum Leap episode) and Neil having to restore the timeline to prevent the destruction of the universe.

We were ambitious. You can say that much for us.

 

Season Two (1996)

The following year, while doing our GCSEs, we began work on a second season. In between I’d made a bad 40-minute comedy, Bob the Barbarian, and an appalling feature-length sci-fi film, The Dark Side of the Earth, and I’d learnt a few things that would lift the production values of Season Two very slightly. I’d also nagged my parents into buying me a genlock which would let me superimpose CGI over analogue video, meaning I didn’t have to digitise footage and suffer the horrendous image degradation any more.

The holographic Albert enters the Imaging Chamber, an effect enabled by my new genlock.

The actual Quantum Leaping effect from this era of the show is surprisingly decent given the equipment we were working with. We would lock the camera off and jump-cut to a blue filter over the lens, then a white glow would creep over me – an animation I achieved in software called Deluxe Paint – followed by tendrils of electricity. The screen would then fade to white and a similar effect would play out in reverse to show the leap in.

Leaping from life to life, striving to put right what once went wrong…

Another improvement was that we managed to convince a few other friends to act in the series, including fellow Quantum Leap fan Lee Richardson, as well as Chris Jenkins, Conrad Allen, Matt Hodges, Si Timbrell and Jim McKelvie. Recognising my lack of musical talent at last, I abandoned composing and instead used soundtrack CDs from Star Trek: Deep Space Nine (Dennis McCarthy), the John Woo film Broken Arrow (Hans Zimmer), and the Doctor Who story “The Curse of Fenric” (Mark Ayres). Albert’s hand-link prop got an upgrade too, from a calculator to a custom Lego build with flashing lights.

Lee Richardson “acting” in the control room “set”

Season Two opens with Dave’s episodes “Project Hijacked” and “Oh Brother, Where Art Thou?” which focus on events at Project Quantum Leap, supposedly a high-tech facility in the New Mexico desert in 2005. In reality it was a living room with a control console made out of painted cardboard boxes and Christmas lights. In an early manifestation of my cinematography leanings, I snooted the ceiling light with a rolled-up piece of silver card, lending a little bit of mood to the look.

At the time, Dave’s family were training a hearing dog, Louis, so I wrote an episode to feature him; “Silence is Golden” sees Neil leap into a deaf man, and was followed by the morbid “Ashes to Ashes” where he leaps into a corpse.

The next episode, Dave’s “Driven to Distraction”, is probably the best of the lot. For once there were few enough characters that no-one needed to confusingly play dual roles, and there is plenty of action to boot. (I uploaded this episode to YouTube so long ago that the ten-minute time limit still applied.)

The X-Files-inspired “Close Encounters of the Leaping Kind” comes next, with Neil as a ufologist bothered by a shadowy government agent. Then Neil becomes a teenager who must prevent a drugs overdose, then a one-armed man who must overcome prejudice to hold down a job. Cringingly entitled “Not So Armless”, this latter was shot in a newsagent’s owned by a friend’s parents, one of the series’ few non-domestic locations.

Like Quantum Leap we had a mirror shot in every episode where Neil would see the leapee’s reflection looking back at him. Sometimes Dave would track the camera behind my back and we’d hide a cut in the darkness to swap me with whoever was playing the reflection. Another time we pretended the serving hatch in Dave’s house was a mirror and the two of us synchronised our movements. For a fight scene in “Not So Armless” Chris hid one arm inside his t-shirt so that Neil’s mirror image could appear to punch the antagonist with an invisible fist!

Facing mirror images that were not his own…

The penultimate episode of the season features several brief leaps, ending with one to Hiroshima in 1945, where the A-bomb detonation (more footage off the TV) causes both Neil and Albert to leap simultaneously. In the finale, Albert becomes a mountaineer caught in an avalanche, while Neil is a member of the rescue team – a premise thieved from the Quantum Leap novel “Search and Rescue”. We started shooting it during snowy weather, but the snow thawed and the episode was never completed. The friends who had been appearing as supporting characters now had part-time jobs and couldn’t spare the time for filming.

 

Legacy

We wrote all six episodes of a third season which would have explained how Neil became the evil future version of himself seen in an earlier episode, but nothing was ever filmed.

In 1997 we began a remake of the pilot using the experience we had gained since shooting the original, but again it was never completed. One part we did film was an action sequence with me on the roof rack of a car while the driver swerves around trying to throw me off. We shot this on Malvern’s Castlemorton Common and used a dummy of me for some of the wider and more dangerous shots. Its acting was probably better than mine. We remade the scene four years later as part of my Mini-DV feature The Beacon.

Today only five of the 20 Quantum Leaper episodes that we made survive, the rest having been callously taped over at some point in my late teens. That’s probably for the best, as most of it was hilariously bad, but making it taught me a hell of a lot about filmmaking. Without it, I doubt I’d have a career in cinematography today.

His only guide on these journeys is Al, an observer from his own time…

“Mission: Impossible” and the Dawn of Virtual Sets

The seventh instalment in the Mission: Impossible franchise was originally scheduled for release this July. It’s since been pushed back to next September, which is a minor shame because it means there will be no release in 2021 to mark the quarter of a century since Tom Cruise first chose to accept the mission of bringing super-spy Ethan Hunt to the big screen.

Today, 1996’s Mission: Impossible is best remembered for two stand-out sequences. The first, fairly simple but incredibly tense, sees Cruise descend on a cable into a high-security vault where even a single bead of sweat will trigger pressure sensors in the floor.

The second, developing from the unlikely to the downright ludicrous, finds Cruise battling Jon Voight atop a speeding Channel Tunnel train, a fight which continues on the skids of a helicopter dragged along behind the Eurostar, ending in an explosion which propels Cruise (somehow unscathed) onto the rear of the train.

It is the second of those sequences which is a landmark in visual effects, described by Cinefex magazine at the time as “the dawn of virtual sets”.

“In Mission: Impossible, we took blue-screen elements of actors and put them into believable CG backgrounds,” said VFX supervisor John Knoll of Industrial Light and Magic. Building on his work on The Abyss and Terminator 2, Knoll’s virtual tunnel sets would one day lead to the likes of The Mandalorian – films and TV shows shot against LED screens displaying CG environments.

Which is ironic, given that if Tom Cruise were remaking that first film today, he would probably insist on less trickery, not more, and demand to be strapped to the top of a genuine speeding Eurostar.

The Channel Tunnel had only been open for two years when Mission: Impossible came out, and the filmmakers clearly felt that audiences – or at least American audiences – were so unfamiliar with the service that they could take a number of liberties in portraying it. The film’s tunnel has only a single bore for both directions of travel, and the approaching railway line was shot near Glasgow.

That Scottish countryside is one of the few real elements in the sequence. Another is the 100ft of full-size train that was constructed against a blue-screen to capture the lead actors on the roof. To portray extreme speed, the crew buffeted the stars with 140mph wind from a parachute-training fan.

Many of the Glasgow plates were shot at 12fps to double the apparent speed of the camera helicopter, which generally flew at 80mph. But when the plate crew tried to incorporate the picture helicopter with which Jean Reno’s character chases the train, the under-cranking just looked fake, so the decision was taken to computer-generate the aircraft in the vast majority of the shots.
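The under-cranking arithmetic is straightforward: footage captured below the standard 24fps playback rate is sped up on projection by the ratio of the two frame rates. A quick sketch of the figures quoted above (illustrative only):

```python
# Under-cranking: footage captured below the playback frame rate
# plays back sped up, multiplying the apparent speed of the subject.

def apparent_speed(real_speed_mph: float, capture_fps: float,
                   playback_fps: float = 24.0) -> float:
    """Speed the subject appears to move when under-cranked footage
    is projected at the standard rate."""
    return real_speed_mph * (playback_fps / capture_fps)

# The camera helicopter flew at 80mph; plates shot at 12fps and
# projected at 24fps appear twice as fast.
print(apparent_speed(80, 12))  # → 160.0
```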

The train is also CGI, as are the tunnel entrance and some of its surroundings, and of course the English Channel is composited into the Glaswegian landscape. Once the action moves inside the tunnel, nothing is real except the actors and the set-pieces they’re clinging to.

“We cheated the scale to keep it tight and claustrophobic,” said VFX artist George Hull, admitting that the helicopter could not have fitted in such a tunnel in reality. “The size still didn’t feel right, so we went back and added recognisable, human-scale things such as service utility sheds and ladders.”

Overhead lights spaced at regular intervals were simulated for the blue-screen work. “When compositing the scenes into the CG tunnel months later, we could marry the environment by timing those interactive lights to the live-action plates,” explained Hull.

Employing Alias for modelling, Softimage for animation, RenderMan for rendering, plus custom software like ishade and icomp, ILM produced a sequence which, although it wasn’t completely convincing even in 1996, is still exciting.

Perhaps the best-looking part is the climactic explosion, which was achieved with a 1/8th scale miniature propelled at 55mph through a 120ft tunnel model. (The runaway CGI which followed Jurassic Park’s 1993 success wisely stayed away from explosions for many years, as their dynamics and randomness made them extremely hard to simulate on computers of the time.)

Knoll went on to supervise the Star Wars prequels’ virtual sets (actually miniatures populated with CG aliens), and later Avatar and The Mandalorian. Meanwhile, Cruise pushed for more and more reality in his stunt sequences as the franchise went on, climbing the Burj Khalifa for Ghost Protocol, hanging off the side of a plane for Rogue Nation, skydiving and flying a helicopter for Fallout, and yelling at the crew for Mission: Impossible 7.

At least, I think that last one was real.


5 Ingenious Visual Effects With No CGI

How were visual effects achieved before the advent of computer generated imagery (CGI)? Most of us know that spaceships used to be miniatures, and monsters used to be puppets or people in suits, but what about the less tangible effects? How did you create something as exotic as an energy beam or a dimensional portal without the benefit of digital particle simulations? The answer was often a combination of chemistry, physics, artistry and ingenuity. Here are five examples.

 

1. “Star Trek” transporters

The original series of Star Trek, which premiered in 1966, had to get creative to achieve its futuristic effects with the budget and technology available. The Howard Anderson Company was tasked with realising the iconic transporter effect which enables Kirk’s intrepid crew to beam down to alien planets. Darrell Anderson created the characteristic sparkles of the dematerialisation by filming backlit aluminium powder being sprinkled in front of a black background in slow motion. Hand-drawn mattes were then used to ensure that the sparkling powder only appeared over the characters.

 

2. “Ghostbusters” proton packs

The much-loved 1984 comedy Ghostbusters features all kinds of traditional effects, including the never-to-be-crossed particle streams with which the heroes battle their spectral foes. The streams consist of five layers of traditional cel animation – the same technique used to create, say, a Disney classic like Sleeping Beauty – which were composited and enhanced on an optical printer. (An optical printer is essentially two or more film projectors connected to a camera so that multiple separate elements can be combined into a single shot.) Composited onto the tips of the Ghostbusters’ guns were small explosions and other pyrotechnic effects shot on a darkened stage.

 

3. “Lifeforce” energy beams

This cult 1985 sci-fi horror film, most notable for an early screen appearance by Patrick Stewart, features alien vampires which drain the titular lifeforce from their victims. To visualise this lifeforce, VFX supervisor John Dykstra settled on a process whereby a blue argon laser was aimed at a rotating tube made of highly reflective mylar. This threw flowing lines of light onto a screen, where they would be captured by the camera for later compositing with the live-action plates. The tube could be deliberately distorted or dented to vary the effects, and to add more energy to certain shots multiple brief elements of a flashing xenon bulb were added to the mix.

 

4. “Big Trouble in Little China” portal

A mixture of chemical and optical effects were employed for certain shots in the 1986 action-comedy Big Trouble in Little China. Director John Carpenter wanted an effervescent effect like “an Alka-Seltzer tablet in water” to herald the appearance of a trio of warriors known as the Three Storms. After many tests, the VFX team determined that a combination of green paint, metallic powder and acetone, heated in a Pyrex jar on a hotplate, produced an interesting and suitable effect. The concoction was filmed with a fisheye lens, then that footage was projected onto a dome to make it look like a ball of energy, and re-photographed through layers of distorted glass to give it a rippling quality.

 

5. “Independence Day” cloud tank

By 1996, CGI was replacing many traditional effects, but the summer blockbuster Independence Day used a healthy mix of both. To generate the ominous clouds in which the invading spacecraft first appear, the crew built what they called the “Phenomenon Rig”. This was a semi-circle of halogen lights and metal piping which was photographed in a water tank. Paint was injected into the water through the pipes, giving the appearance of boiling clouds when lit up by the lamps within. This was digitally composited with a live-action background plate and a model shot of the emerging ship.

See also: “Top Five Low-tech Effects” and “5 Simple but Effective Camera Tricks”


The History of Forced Perspective

A miniature ship with a real camel, people and helicopters in “Close Encounters of the Third Kind”

“These are small,” Father Ted once tried to explain to Father Dougal, holding up toy cows, “but the ones out there are far away.” We may laugh at the gormless sitcom priest, but the chances are that we’ve all confounded size and distance, on screen at least.

The ship marooned in the desert in Close Encounters of the Third Kind, the cliff at the end of Tremors, the runways and planes visible through the windows of Die Hard 2’s control tower, the helicopter on the boat in The Wolf of Wall Street, even the beached whale in Mega Shark Versus Giant Octopus – all are small, not far away.

The most familiar forced perspective effect is the holiday snap of a friend or family member picking up the Eiffel Tower between thumb and forefinger, or trying to right the Leaning Tower of Pisa. By composing the image so that a close subject (the person) appears to be in physical contact with a distant subject (the landmark), the latter appears to be as close as the former, and therefore much smaller than it really is.
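The trick rests on simple geometry: a camera records only angular size, and two objects subtend the same angle whenever the ratio of size to distance is the same. A rough sketch of that equivalence (the souvenir-tower figures are invented for illustration):

```python
import math

def angular_size_deg(size_m: float, distance_m: float) -> float:
    """Angle subtended at the camera by an object of a given size."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A 30cm souvenir tower 60cm from the lens subtends the same angle
# as the 300m Eiffel Tower 600m away, because size/distance matches.
print(round(angular_size_deg(0.3, 0.6), 2))   # → 28.07
print(round(angular_size_deg(300, 600), 2))   # → 28.07
```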

Building the forced perspective corridor for “Moon”

Architects have been playing tricks with perspective for centuries. Italy’s Palazzo Spada, for example, uses diminishing columns and a ramped floor to make a 26ft corridor look 100ft long. Many film sets – such as the basement of clones in Moon – have used the exact same technique to squeeze extra depth out of limited studio space or construction resources.

Even a set that is entirely miniature can benefit from forced perspective, with a larger scale being used in the foreground and a smaller one in the background, increasing the perceived depth. For example, The Terminator’s “Future War” scenes employ skulls of varying size, with background ruins on an even smaller scale.

“Princess Nicotine”

An early cinematic display of forced perspective was the 1908 short Princess Nicotine, in which a fairy who appears to be cavorting on a man’s tabletop is actually a reflection in a distant mirror. “The little fairy moves so realistically that she cannot be explained away by assuming that she is a doll,” remarked a Scientific American article of the time, “and yet it is impossible to understand how she can be a living being, because of her small stature.”

During the 1950s, B movies featuring fantastically shrunk or enlarged characters made full use of forced perspective, as did the Disney musical Darby O’Gill and the Little People. VFX supervisor Peter Ellenshaw, interviewed for a 1994 episode of Movie Magic, remembered the challenges of creating sufficient depth of field to sell the illusion: “You had to focus both on the background and the foreground [simultaneously]. It was very difficult. We had to use so much light on set that eventually we blew the circuit-breakers in the Burbank power station.”

One of many ingenious forced perspective shots in “The Gate”
This behind-the-scenes angle reveals how the above shot was done.

Randall William Cook was inspired years later by Ellenshaw’s work when he was called upon to realise quarter-scale demonic minions for the 1987 horror movie The Gate. Faced with a tiny budget, Cook devised in-camera solutions with human characters on raised foreground platforms, and costumed minions on giant set-pieces further back, all carefully designed so that the join was undetectable. As the contemporary coverage in Cinefex magazine noted, “One of the advantages of a well-executed forced perspective shot is that the final product requires no optical work and can therefore be viewed along with the next day’s rushes.”

A subgroup of forced perspective effects is the hanging miniature – a small-scale model suspended in front of camera, typically as a set extension. The 1925 version of Ben Hur used this technique for wide shots of the iconic chariot race. The arena of the Circus Maximus was full size, but in front of and above it was hung a miniature spectators’ gallery containing 10,000 tiny puppets which could stand and wave as required.

Setting up a foreground miniature for the 1970 “Doctor Who” story “Inferno”

Doctor Who used foreground miniatures throughout its classic run, often more successfully than it used the yellow-fringed chromakey of the time. Earthly miniatures like radar dishes, missile launchers and big tops were captured on location, in camera, with real skies and landscapes behind them. The heroes convincingly disembark from an alien spaceship in the Tom Baker classic “Terror of the Zygons” by means of a foreground miniature and the actors jumping off the back of a van in the distance. A third-scale Tardis was employed in a similar way when the production wanted to save shipping costs on a 1984 location shoot on Lanzarote.

Even 60 years on from Ben Hur, Aliens employed the same technique to show the xenomorph-encrusted roof in the power plant nest scene. The shot – which fooled studio executives so utterly that they complained about extravagant spending on huge sets – required small lights to be moved across the miniature in sync with the actors’ head-torches.

The red line shows the division between hanging miniature and full-scale set in “Aliens”.

The Aliens shot also featured a tilt-down, something only possible with forced perspective if the camera pivots around its nodal point – the point within the lens where the incoming light paths cross. Any other type of camera movement gives the game away due to parallax, the optical phenomenon which makes closer objects move through the field of view more quickly than distant ones.
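A toy pinhole-camera model makes the distinction concrete: translating the camera shifts near objects across the frame more than far ones, while rotating about the no-parallax point keeps everything on a shared ray aligned. This is only an illustrative sketch, not any production tool:

```python
import math

def project(x: float, z: float, focal: float = 1.0) -> float:
    """Pinhole projection of a point (x, z) onto the image plane."""
    return focal * x / z

# A foreground miniature 1m away and a set piece 10m away,
# carefully aligned on camera.
near, far = (0.1, 1.0), (1.0, 10.0)
assert abs(project(*near) - project(*far)) < 1e-9  # aligned

# Translate the camera 5cm sideways: the two points shift by
# different amounts, and the seam between them becomes visible.
t = 0.05
shift_near = project(near[0] - t, near[1]) - project(*near)
shift_far = project(far[0] - t, far[1]) - project(*far)
print(shift_near, shift_far)  # roughly -0.05 vs -0.005: parallax

# Rotate about the pinhole instead: both points lie on the same
# ray from the pivot, so any rotation keeps them aligned --
# the basis of the nodal pan.
assert abs(math.atan2(near[0], near[1]) - math.atan2(far[0], far[1])) < 1e-9
```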

The 1993 remake of Attack of the 50ft Woman made use of a nodal pan to follow Daniel Baldwin to the edge of an outdoor swimming pool which a giant Daryl Hannah is using as a bath. A 1/8th-scale pool with Hannah in it was mounted on a raised platform to align perfectly on camera with the real poolside beyond, where Baldwin stood.

The immediacy of forced perspective, allowing actors of different scales to riff off each other in real time, made it the perfect choice for the seasonal comedy Elf. The technique is not without its disadvantages, however. “The first day of trying, the production lost a whole day setting up one shot and never captured it,” recalls VFX supervisor Joe Bauer in the recent documentary Holiday Movies That Made Us.

This shot from “Elf” was accomplished with an extended tricycle allowing Papa Elf to sit much further behind young Buddy than he appears. Tiny puppet hands on Buddy’s shoulders complete the illusion.

Elf’s studio, New Line, was reportedly concerned that the forced perspective shots would never work, but given what a certain Peter Jackson was doing for that same studio at the same time, they probably shouldn’t have worried.

The Lord of the Rings employed a variety of techniques to sell the hobbits and dwarves as smaller than their human friends, but it was in the field of forced perspective that the trilogy was truly groundbreaking. One example was an extended cart built to accommodate Ian McKellen’s Gandalf and Elijah Wood’s supposedly diminutive Frodo. “You could get Gandalf and Frodo sitting side by side apparently, although in fact Elijah Wood was sitting much further back from the camera than Gandalf,” explains producer Barrie Osborne in the trilogy’s extensive DVD extras.

Jackson insisted on the freedom to move his camera, so his team developed a computer-controlled system that would correct the tell-tale parallax. “You have the camera on a motion-controlled dolly, making it move in and out or side to side,” reveals VFX DP Brian Van’t Hul, “but you have another, smaller dolly [with one of the actors on] that’s electronically hooked to it and does the exact same motion but sort of in a counter movement.”

Forced perspective is still alive and kicking today. For Star Wars Episode IX: The Rise of Skywalker, production designer Kevin Jenkins built a 5ft sand-crawler for shooting in the Jordan Desert. “It was placed on a dressed table at height,” he explained on Twitter, “and the Jawa extras were shot at the same time a calculated distance back from the mini. A very fine powdery sand was dressed around for scale. We even made a roller to make mini track prints! Love miniatures :)”

Filming the Jawa sand-crawler for “The Rise of Skywalker”

“Terminator 2: Judgment Day” Retrospective

Next month, Terminator 2: Judgment Day turns 30. Made by a director and star at the peaks of their powers, T2 was the most expensive film ever at the time, and remains both the highest-grossing movie of Arnold Schwarzenegger’s career and the sequel which furthest out-performed its progenitor. It is also one of a handful of films that changed the world of visual effects forever, signalling as it did – to borrow the subtitle from its woeful follow-up – the rise of the machines.

No fate but what we make: Linda Hamilton as Sarah Connor

The original Terminator, a low-budget surprise hit in 1984, launched director James Cameron’s career and cemented Schwarzenegger’s stardom, but it wasn’t until 1990 that the sequel was green-lit, mainly due to rights issues. At the Cannes Film Festival that year, Cameron handed executive producer Mario Kassar his script.

Today it’s easy to forget how risky it was to turn the Terminator, an iconic villain, an unstoppable, merciless death machine from an apocalyptic future, into a good guy who doesn’t kill anyone, stands on one leg when ordered, and looks like a horse when he attempts to smile. But Kassar didn’t balk, granting Cameron a budget ten times what he had had for the original, while stipulating that the film had to be in cinemas just 14 months later.

Even with some expensive sequences cut – including John Connor sending Kyle Reese back through time in the heart of Skynet HQ, a scene that would ultimately materialise in Terminator Genisys – the script was lengthy and extremely ambitious. Shooting began on October 8th, 1990, with the schedule front-loaded with effects shots to give CGI pioneers Industrial Light and Magic the maximum time to realise the liquid metal T-1000 (Robert Patrick).

Rather than CGI, the T-1000’s head in this shot is a chrome model lifted into frame by a crew member.

To further ease ILM’s burden, every trick in the book was employed to get T-1000 shots in camera wherever possible: quick shots of the villain’s fight with the T-800 (Schwarzenegger) in the steel mill finale were done with a stuntman in a foil suit; a chrome bust of Patrick was hand-raised into frame for a helicopter pilot’s reaction shot; the reforming of the shattered T-1000 was achieved by blowing mercury around with a hair dryer; bullet hits on the character’s torso were represented by spring-loaded silver “flowers” that burst out of a pre-scored shirt on cue.

One of the chilling full-size T-800 endoskeleton puppets created by Stan Winston Studio for the Future War sequence

Stan Winston Studio also constructed a number of cable-controlled puppets to show more extensive damage to the morphing menace. These included “Splash Head”, a bust of Patrick with the head split in two by a shotgun blast, and “Pretzel Man”, the nightmarish result of a grenade hit moments before the T-1000 falls to its doom in the molten steel.

Traditional models and rear projection are used throughout the film. A few instances are all too obvious to a modern audience, but most still look great and some are virtually undetectable. Did you know that the roll-over and crash of the cryo-tanker were shot with miniatures? Or that the T-800 plucking John off his bike in the drainage channel was filmed against a rear projection screen?

Plenty of the action was accomplished without such trickery. The production added a third storey to a disused office building near Silicon Valley, then blew it up with 100 gallons of petrol, to show the demise of Cyberdyne Systems. DP Adam Greenberg lit 5.5 miles of freeway for the car chase, and pilot Chuck Tamburro really did fly the T-1000’s police helicopter under a 20ft underpass.

Chaotic, confusing action scenes are the norm today, but it is notable that T2’s action is thrilling yet never unclear. The film sends somewhat mixed messages though, with its horrific images of nuclear annihilation and the T-800’s morality lessons from John juxtaposed with indulgent violence and a reverence for firearms. “I think of T2 as a violent movie about world peace,” Cameron paradoxically stated. “It’s an action movie about the value of human life.”

More Stan Winston puppets were used to depict Sarah’s death by nuclear blast in her nightmare.

Meanwhile, ILM was devoting some 25 person-years of work to the T-1000’s metallic morphing abilities. Assistant VFX supervisor Mark Dippé noted: “We were pushing the limits of everything – the amount of disc space we had, the amount of memory we had in the computers, the amount of CPUs we had. Each shot, even though it only lasted about five seconds on the screen, typically would take about eight weeks to complete.”

Robert Patrick shooting reference footage for ILM’s animators

The team began by painting a 2×2” grid on a near-naked Patrick and shooting reference footage of him walking, before laser-scanning his head at the appropriately-named Cyberware Laboratory. Four separate computer models of the T-1000 were built on Silicon Graphics Iris 4Ds, from an amorphous blob to a fully-detailed chrome replica of Patrick, each with corresponding points in 3D space so that the custom software Model Interp could morph between them.
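
The kind of blend Model Interp performed between those corresponding models can be sketched very simply. This is an illustrative simplification (the function and data below are mine, not ILM’s code): given two meshes with matching vertices, morphing is just a linear interpolation between each pair of points.

```python
# Illustrative sketch of morphing between corresponding models.
# Assumes both stages of the T-1000 share the same vertex count
# and ordering, so each point can be blended with its counterpart.

def morph(model_a, model_b, t):
    """Blend two vertex lists; t=0 gives model_a, t=1 gives model_b."""
    return [
        tuple(a + t * (b - a) for a, b in zip(va, vb))
        for va, vb in zip(model_a, model_b)
    ]

# An amorphous "blob" stage and a more defined stage, as matching 3D points
blob   = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
chrome = [(0.0, 0.5, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 1.0)]

halfway = morph(blob, chrome, 0.5)  # each vertex midway between the two stages
```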

Other custom applications included Body Sock, a solution to gaps that initially appeared when the models flexed their joints, Polyalloy Shader, which gave the T-1000 its chrome appearance, and Make Sticky, with which images of Patrick were texture-mapped onto the distorting 3D model, as when he melts through a barred gate at the mental hospital.

The film’s legacy in visual effects – for which it won the 1992 Oscar – cannot be overstated. A straight line can be drawn from the water tendril in Cameron’s The Abyss, through T2 to Jurassic Park and all the way on to Avatar, with which Cameron again broke the record for the highest-grossing film of all time. The Avatar sequels will undoubtedly push the technology even further, but for many Cameron fans his greatest achievement will always be Terminator 2: Judgment Day, with its perfect blend of huge stunts, traditional effects and groundbreaking CGI.

“Jurassic Park” Retrospective

With the temporary closure of Cineworlds around the UK, the future of theatrical exhibition once more hangs in the balance. But just a couple of months ago cinemas were reopening and people were positive that the industry would recover. One of the classic blockbusters that was re-released to plug the gaps in the release schedule ahead of Christopher Nolan’s Tenet was a certain quite popular film about dinosaurs. I described my trip to see it recently, but let’s put that hideous experience behind us and concentrate on the film itself.

Thanks in no small part to the excellent “making of” book by Don Shay and Jody Duncan, Jurassic Park was a formative experience for the 13-year-old Neil Oseman, setting me irrevocably on the path to filmmaking as a career. So let me take you back in time and behind the scenes of an iconic piece of popcorn fodder.

 

Man creates dinosaurs

Even before author Michael Crichton delivered the manuscript of his new novel in May 1990, Steven Spielberg had expressed an interest in adapting it. A brief bidding war between studios saw Joe Dante (Gremlins), Tim Burton (Batman) and Richard Donner (Superman) in the frame to direct, but Spielberg and Universal Pictures were the victors.

Storyboards by David Lowery. Lots of the film’s storyboards are reproduced in “The Making of Jurassic Park” by Don Shay and Jody Duncan.

The screenplay went through several drafts, first by Crichton himself, then by Malia Scotch Marmo and finally by David Koepp, who would go on to script Mission: Impossible, Spider-Man and Panic Room. Pre-production began long before Koepp finished writing, with Spielberg generating storyboards based directly on scenes from the book so that his team could figure out how they were going to bring the dinosaurs to life.

Inspired by a life-size theme park animatronic of King Kong, Spielberg initially wanted all the dinosaurs to be full-scale physical creatures throughout. This was quickly recognised as impractical, and instead Stan Winston Studio, creators of the Terminator endoskeleton, the Predator make-up and the fifteen-foot-tall Alien queen, focused on building full-scale hydraulically-actuated dinosaurs that would serve primarily for close-ups and mids.

Stan Winston’s crew with their hydraulic behemoth

Meanwhile, to accomplish the wider shots, Spielberg hired veteran stop-motion animator Phil Tippett, whose prior work included ED-209 in RoboCop, the tauntaun and AT-AT walkers in The Empire Strikes Back, and perhaps most relevantly, the titular creature from Dragonslayer. After producing some beautiful animatics – to give the crew a clearer previsualisation of the action than storyboards could provide – Tippett shot test footage of the “go-motion” process he intended to employ for the real scenes. Whilst this footage greatly improved on traditional stop-motion by incorporating motion blur, it failed to convince Spielberg.

https://youtu.be/_7tUlXz9MrA

At this point, Dennis Muren of Industrial Light and Magic stepped in. Muren was the visual effects supervisor behind the most significant milestones in computer-generated imagery up to that point: the stained-glass knight in Young Sherlock Holmes (1985), the water tendril in The Abyss (1989) and the liquid metal T-1000 in Terminator 2: Judgment Day (1991). When Spielberg saw his test footage – initially just skeletons running in a black void – the fluidity of the movement immediately grabbed the director’s attention. Further tests, culminating in a fully-skinned tyrannosaur stalking a herd of gallimimuses, had Spielberg completely convinced. On seeing the tests himself, Tippett famously quipped: “I think I’m extinct.”

The first CGI test

Tippett continued to work on Jurassic Park, however, ultimately earning a credit as dinosaur supervisor. Manipulating a custom-built armature named the Dinosaur Input Device, Tippett and his team were able to have their hands-on techniques recorded by computer and used to drive the CG models.

Building on his experiences working with the E.T. puppet, Spielberg pushed for realistic animal behaviours, visible breathing, and bird-like movements reflecting the latest paleontological theories, all of which would lend credibility to the dinosaurs. Effects co-supervisor Mark Dippé stated: “We used to go outdoors and run around and pretend we were gallimimuses or T-Rexes hunting each other, and shoot [reference] film.”

 

Dinosaurs eat man

Stan Winston’s triceratops was the first dinosaur to go before the cameras, and the only one to be filmed on location.

Production began in August 1992 with three weeks on the Hawaiian island of Kauai. Filming progressed smoothly until the final day on location, which had to be scrubbed due to Hurricane Iniki (although shots of the storm made it into the finished film). After a brief stint in the Mojave Desert, the crew settled into the stages at Universal Studios and Warner Brothers to record the bulk of the picture.

The most challenging sequence to film would also prove to be the movie’s most memorable: the T-Rex attack on the jeeps containing Sam Neill’s Dr. Grant, Jeff Goldblum’s Ian Malcolm, lawyer Gennaro and the children, Lex and Tim. It was the ultimate test for Stan Winston’s full-scale dinosaurs.

The T-Rex mounted on its motion simulator base on Stage 16 at Warner Brothers

The main T-Rex puppet weighed over six tonnes and was mounted on a flight simulator-style platform that had to be anchored into the bedrock under the soundstage. Although its actions were occasionally pre-programmed, the animal was mostly puppeteered live using something similar to the Dinosaur Input Device.

But the torrential rain in which the scene takes place was anathema to the finely tuned mechanics and electronics of the tyrannosaur. “As [the T-Rex] would get rained on,” Winston explained, “his skin would soak up water, his weight would change, and in the middle of the day he would start having the shakes and we would have to dry him down.”

Although hints of this shaking can be detected by an eagle-eyed viewer, the thrilling impact of the overall sequence was clear to Spielberg, who recognised that the T-Rex was the star of his picture. He hastily rewrote the ending to bring the mighty creature back, relying entirely on CGI for the new climax in which it battles raptors in the visitor centre’s rotunda.

The CGI T-Rex in the rewritten finale

 

Woman inherits the earth

After wrapping 12 days ahead of schedule, Jurassic Park hit US cinemas on June 11th, 1993. It became the highest-grossing film of all time, a title which it would hold until Titanic’s release four years later. 1994’s Oscar ceremony saw the prehistoric blockbuster awarded not only Best Visual Effects but also Best Sound Editing and Best Sound Mixing. Indeed, Gary Rydstrom’s contribution to the film – using everything from a dolphin/walrus combination for the raptors’ calls, to the sound of his own dog playing with a rope toy for the T-Rex – cannot be overstated.

Jurassic Park has spawned four sequels to date (with a fifth on the way), and its impact on visual effects was enormous. For many years afterwards, blockbusters were filled with CGI that was unable to equal, let alone surpass, the quality of Jurassic Park’s. Watching it today, the CGI is still impressive if a little plasticky in texture, but I believe that the full-size animatronics which form the lion’s share of the dinosaurs’ screen time are what truly give the creatures their memorable verisimilitude. The film may be 27 years old, but it’s still every bit as entertaining as it was in 1993.

This article first appeared on RedShark News.

Director of photography Dean Cundey, ASC with the brachiosaur head puppet

Will “The Mandalorian” Revolutionise Filmmaking?

Last week, Greig Fraser, ASC, ACS and Baz Idoine were awarded the Emmy for Outstanding Cinematography for a Single-camera Series (Half-hour) for The Mandalorian. I haven’t yet seen this Star Wars TV series, but I’ve heard and read plenty about it, and to call it a revolution in filmmaking is not hyperbole.

Half of the series was not shot on location or on sets, but on something called a volume: a stage with walls and ceiling made of LED screens, 20ft tall, 75ft across and encompassing 270° of the space. I’ve written before about using large video screens to provide backgrounds in limited ways, outside of train windows for example, and using them as sources of interactive light, but the volume takes things to a whole new level.

In the past, the drawback of the technology has been one of perspective; it’s a flat, two-dimensional screen. Any camera movement revealed this immediately, because of the lack of parallax. So these screens tended to be kept to the deep background, with limited camera movement, or with plenty of real people and objects in the foreground to draw the eye. The footage shown on the screens was pre-filmed or pre-rendered, just video files being played back.

The Mandalorian‘s system, run by multiple computers simultaneously, is much cleverer. Rather than a video clip, everything is rendered in real time from a pre-built 3D environment known as a load, running on software developed for the gaming industry called Unreal Engine. Around the stage are a number of witness cameras which use infra-red to monitor the movements of the cinema camera in the same way that an actor is performance-captured for a film like Avatar. The data is fed into Unreal Engine, which generates the correct shifts in perspective and sends them to the video walls in real time. The result is that the flat screen appears, from the cinema camera’s point of view, to have all the depth and distance required for the scene.
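
As a toy illustration of why the camera has to be tracked (nothing to do with the actual Unreal Engine code; the numbers and names here are mine), a simple pinhole projection shows how an object’s on-screen position shifts as the camera moves – the parallax a flat, pre-rendered clip can never provide:

```python
# Pinhole projection sketch: where a 3D point lands on a 2D image
# plane depends on the camera's position, which is why the wall
# must be re-rendered for every tracked camera move.

def project(point, camera, focal=1.0):
    """Project a 3D point to 2D image coordinates for a camera at
    `camera` looking down the +z axis (no rotation, for simplicity)."""
    x, y, z = (p - c for p, c in zip(point, camera))
    return (focal * x / z, focal * y / z)

tree = (2.0, 0.0, 10.0)                 # a virtual object 10m into the scene

print(project(tree, (0.0, 0.0, 0.0)))   # camera at origin → (0.2, 0.0)
print(project(tree, (1.0, 0.0, 0.0)))   # camera dollies 1m right → (0.1, 0.0)
```

The real system solves this in reverse at 24fps: the witness cameras report where the cinema camera is, and the engine renders the view of the load that a window in that position ought to reveal.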

The loads are created by CG artists working to the production designer’s instructions, and textured with photographs taken at real locations around the world. In at least one case, a miniature set was built by the art department and then digitised. The scene is lit with virtual lights by the DP – all this still during preproduction.

The volume’s 270° of screens, plus two supplementary, moveable screens in the 90° gap behind camera, are big enough and bright enough that they provide most or all of the illumination required to film under. The advantages are obvious. “We can create a perfect environment where you have two minutes to sunset frozen in time for an entire ten-hour day,” Idoine explains. “If we need to do a turnaround, we merely rotate the sky and background, and we’re ready to shoot!”

Traditional lighting fixtures are used minimally on the volume, usually for hard light, which the omni-directional pixels of an LED screen can never reproduce. If the DPs require soft sources beyond what is built into the load, the technicians can turn any off-camera part of the video screens into an area of whatever colour and brightness are required – a virtual white poly-board or black solid, for example.

A key reason for choosing the volume technology was the reflective nature of the eponymous Mandalorian’s armour. Had the series been shot on a green-screen, reflections in his shiny helmet would have been a nightmare for the compositing team. The volume is also much more actor- and filmmaker-friendly; it’s better for everyone when you can capture things in-camera, rather than trying to imagine what they will look like after postproduction. “It gives the control of cinematography back to the cinematographer,” Idoine remarks. VR headsets mean that he and the director can even do a virtual recce.

The Mandalorian shoots on the Arri Alexa LF (large format), giving a shallow depth of field which helps to avoid moiré problems with the video wall. To ensure accurate chromatic reproduction, the wall was calibrated to the Alexa LF’s colour filter array.

Although the whole system was expensive to set up, once up and running it’s easy to imagine how quickly and cheaply the filmmakers can shoot on any given “set”. The volume has limitations, of course. If the cast need to get within a few feet of a wall, for example, or walk through a door, then that set-piece has to be real. If a scene calls for a lot of direct sunlight, then the crew move outside to the studio backlot. But undoubtedly this technology will improve rapidly, so that it won’t be long before we see films and TV episodes shot entirely on volumes. Perhaps one day it could overtake traditional production methods?

For much more detailed information on shooting The Mandalorian, see this American Cinematographer article.

10 Clever Camera Tricks in “Aliens”

In 1983, up-and-coming director James Cameron was hired to script a sequel to Ridley Scott’s 1979 hit Alien. He had to pause halfway through to shoot The Terminator, but the subsequent success of that movie, along with the eventually completed Aliens screenplay, so impressed the powers that be at Fox that they greenlit the film with the relatively inexperienced 31-year-old at the helm.

Although the sequel was awarded a budget of $18.5 million – $7.5 million more than Scott’s original – that was still tight given the much more expansive and ambitious nature of Cameron’s script. Consequently, the director and his team had to come up with some clever tricks to put their vision on celluloid.

 

1. Mirror Image

When contact is lost with the Hadley’s Hope colony on LV-426, Ripley (Sigourney Weaver) is hired as a sort of alien-consultant to a team of crack marines. The hypersleep capsules from which the team emerge on reaching the planet were expensive to build. Production designer Peter Lamont’s solution was to make just half of them, and place a mirror at the end of the set to double them up.

 

2. Small Screens

Wide shots of Hadley’s Hope were accomplished with fifth-scale miniatures by Robert and Dennis Skotak of 4-Ward Productions. Although impressive, sprawling across two Pinewood stages, the models didn’t always convince. To help, the crew often downgraded the images by showing them on TV monitors, complete with analogue glitching, or by shooting through practical smoke and rain.

 

3. Big Screens

The filmmakers opted for rear projection to show views out of cockpit windscreens and colony windows. This worked out cheaper than blue-screen composites, and allowed for dirt and condensation on the glass, which would have been impossible to key optically. Rear projection was also employed for the crash of the dropship – the marines’ getaway vehicle – permitting camera dynamics that again were not possible with compositing technology of the time.

 

4. Back to Front

A highlight of Aliens is the terrifying scene in which Ripley and her young charge Newt (Carrie Henn) are trapped in a room with two facehuggers, deliberately set loose by sinister Company man Carter Burke (Paul Reiser). These nightmarish spider-hands were primarily puppets trailing cables to their operators. To portray them leaping onto a chair and then towards camera, a floppy facehugger was placed in its final position and then tugged to the floor with a fishing wire. The film was reversed to create the illusion of a jump.

 

5. Upside Down

Like Scott before him, Cameron was careful to obfuscate the man-in-a-suit nature of the alien drones wherever possible. One technique he used was to film the creatures crawling on the floor, with the camera upside-down so that they appeared to be hanging from the ceiling. This is seen when Michael Biehn’s Hicks peeks through the false ceiling to find out how the motion-tracked aliens can be “inside the room”.

 

6. Flash Frames

All hell (represented by stark red emergency lighting) breaks loose when the aliens drop through the false ceiling. To punch up the visual impact of the movie’s futuristic weapons, strobe lights were aimed at the trigger-happy marines. Taking this effect even further, editor Ray Lovejoy spliced individual frames of white leader film into the shots. As a result, the negative cutter remarked that Aliens‘ 12th reel had more cuts than any complete movie he’d ever worked on.

 

7. Cotton Cloud

With most of the marines slaughtered, Ripley heads to the atmospheric processing plant to rescue Newt from the alien nest. Aided by the android Bishop (Lance Henriksen), they escape just before the plant’s nuclear reactor explodes. The ensuing mushroom cloud is a miniature sculpture made of cotton wool and fibreglass, illuminated by an internal lightbulb!

 

8. Hole in the floor

Returning to the orbiting Sulaco, Ripley and friends are ambushed by the stowaway queen, who rips Bishop in half. A pre-split, spring-loaded dummy of Henriksen was constructed for that moment, combined with the simple trick of concealing the actor’s legs beneath a hole in the floor. As in the first movie, android blood was represented by milk. This gradually soured as the filming progressed, much to Henriksen’s chagrin as the script required him to be coated in the stuff and even to spit it out of his mouth.

 

9. Big Battle

The alien queen was constructed and operated by Stan Winston Studio as a full-scale puppet. Two puppeteers were concealed inside, while others moved the legs with rods or controlled the crane from which the body hung. The iconic power loader was similar, with a body builder concealed inside and a counter-weighted support rig. This being before the advent of digital wire removal, all the cables and rods had to be concealed with smoke and shifting shadows, though they can still be seen on frame grabs like this one. (The queen is one of my Ten Greatest Movie Puppets of All Time.)

 

10. Little Battle

For wide shots of the final fight, both the queen and the power loader were duplicated as quarter scale puppets. Controlled from beneath the miniature set via rods and cables, the puppets could perform big movements, like falling into the airlock, which would have been very difficult with the full-size props. (When the airlock door opens, the starfield beyond is a black sheet with Christmas lights on it!) The two scales cut seamlessly together and produce a thrilling finale to this classic film.

For more on the visual effects of James Cameron movies, see my rundown of the top five low-tech effects in Hollywood films (featuring Titanic) and a breakdown of the submarine chase in The Abyss.

The Long Lenses of the 90s

Lately, having run out of interesting series, I’ve found myself watching a lot of nineties blockbusters: Outbreak, Twister, Dante’s Peak, Backdraft, Daylight. Whilst eighties movies were the background to my childhood, and will always have a place in my heart, it was the cinema of the nineties that I was immersed in as I began my own amateur filmmaking. So, looking back on those movies now, while certain clichés stand out like sore thumbs, they still feel to me like solid examples of how to make a summer crowd-pleaser.

Let’s get those clichés out of the way first. The lead character always has a failed marriage. There’s usually an opening scene in which they witness the death of a spouse or close relative, before the legend “X years later” fades up. The dog will be saved, but the crotchety elderly character will die nobly. Buildings instantly explode towards camera when touched by lava, hurricanes, floods or fires. A stubborn senior authority figure will refuse to listen to the disgraced lead character who will ultimately be proven correct, to no-one’s surprise.

Practical effects in action on “Twister”

There’s an intensity to nineties action scenes, born of the largely practical approach to creating them. The decade was punctuated by historic advances in digital effects: the liquid metal T-1000 in Terminator 2 (1991), digital dinosaurs in Jurassic Park (1993), motion-captured passengers aboard the miniature Titanic (1997), Bullet Time in The Matrix (1999). Yet these techniques remained expensive and time-consuming, and could not match traditional methods of creating explosions, floods, fire or debris. The result was that the characters in jeopardy were generally surrounded by real set-pieces and practical effects, a far more nerve-wracking experience for the viewer than today, when we can tell that our heroes are merely imagining their peril on a green-screen stage.

One thing I was looking out for during these movie meanders down memory lane was lens selection. A few weeks back, a director friend had asked me to suggest examples of films that preferred long lenses. He had mentioned that such lenses were more in vogue in the nineties, which I’d never thought about before.

As soon as I started to consider it, I realised how right my friend was. And how much that long-lens look had influenced me. When I started out making films, I was working with the tiny sensors of Mini-DV cameras. I would often try to make my shots look more cinematic by shooting on the long end of the zoom. This was partly to reduce the depth of field, but also because I instinctively felt that the compressed perspective was more in keeping with what I saw at the cinema.

I remember being surprised by something that James Cameron said in his commentary on the Aliens DVD:

I went to school on Ridley [Scott]’s style of photography, which was actually quite a bit different from mine, because he used a lot of long lenses, much more so than I was used to working with.

I had assumed that Cameron used long lenses too, because I felt his films looked incredibly cinematic, and because I was so sure that cinematic meant telephoto. I’ve discussed in the past what I think people tend to mean by the term “cinematic”, and there’s hardly a definitive answer, but I’m now sure that lens length has little to do with it.

“Above the Clouds” (dir. Leon Chambers)

And yet… are those nineties films influencing me still? I have to confess, I struggle with short lenses to this day. I find it hard to make wide-angle shots look as good. On Above the Clouds, to take just one example, I frequently found that I preferred the wide shots on a 32mm rather than a 24mm. Director Leon Chambers agreed; perhaps those same films influenced him?

A deleted scene from Ren: The Girl with the Mark ends with some great close-ups shot on my old Sigma 105mm still lens, complete with the slight wobble of wind buffeting the camera, which to my mind only adds to the cinematic look! On a more recent project, War of the Worlds: The Attack, I definitely got a kick from scenes where we shot the heroes walking towards us down the middle of the street on a 135mm.

Apart from the nice bokeh, what does a long lens do for an image? I’ve already mentioned that it compresses perspective, and because this is such a different look to human vision, it arguably provides a pleasing unreality. You could describe it as doing for the image spatially what the flicker of 24fps (versus high frame rates) does for it temporally. Perhaps I shy away from short lenses because they look too much like real life, they’re too unforgiving, like many people find 48fps to be.

The compression applies to people’s faces too. Dustin Hoffman is not known for his small nose, yet it appears positively petite in the close-up below from Outbreak. While this look flatters many actors, others benefit from the rounding of their features caused by a shorter lens.
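
The effect is simple inverse-distance arithmetic: apparent size falls off as one over distance from the lens, so features at different depths are magnified unequally when the camera is close. A rough worked example (my own illustrative figures, assuming a nose protruding 10cm in front of the ears):

```python
# Perspective "compression" in numbers: with the head framed the same
# size, a long lens puts the camera further away, so the nose-to-ear
# magnification ratio approaches 1 and the face flattens out.

def relative_magnification(camera_to_ears, nose_protrusion=0.1):
    """How much larger the nose appears than the ears (a ratio),
    for a camera `camera_to_ears` metres from the ear plane."""
    return camera_to_ears / (camera_to_ears - nose_protrusion)

close = relative_magnification(0.5)   # wide lens, tight close-up
far   = relative_magnification(2.5)   # long lens, same framing
print(round(close, 3), round(far, 3))  # → 1.25 1.042
```

At half a metre the nose is rendered 25% bigger than the ears; at two and a half metres, only about 4% bigger – hence the petite-looking Hoffman nose.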

Perhaps the chief reason to be cautious of long lenses is that they necessitate placing the camera further from the action, and the viewer will sense this, if only on a subconscious level. A long lens, if misused, can rob a scene of intimacy, and if overused could even cause the viewer to disengage with the characters and story.

I’ll leave you with some examples of long-lens shots from the nineties classics I mentioned at the start of this post. Make no mistake, these films employed shorter lenses too, but it certainly looks to me like they used longer lenses on average than contemporary movies.

 

Outbreak

DP: Michael Ballhaus, ASC

 

Twister

DP: Jack N. Green, ASC

 

Daylight

DP: David Eggby, ACS

 

Dante’s Peak

DP: Andrzej Bartkowiak, ASC

 

Backdraft

DP: Mikael Salomon, ASC

For more on this topic, see my article about “The Normal Lens”.

Why You Can’t Re-light Footage in Post

The concept of “re-lighting in post” is one that has enjoyed popularity amongst some no-budget filmmakers, and which sometimes gets bandied around on much bigger sets as well. If there isn’t the time, the money or perhaps simply the will to light a scene well on the day, the flexibility of RAW recording and the power of modern grading software mean that the lighting can be completely changed in postproduction, so the idea goes.

I can understand why it’s attractive. Lighting equipment can be expensive, and setting it up and finessing it is one of the biggest consumers of time on any set. The time of a single wizard colourist can seem appealingly cost-effective – especially on an unpaid, no-budget production! – compared with the money pit that is a crew, cast, location, catering, etc, etc. Delaying the pain until a little further down the line can seem like a no-brainer.

There’s just one problem: re-lighting footage is fundamentally impossible. To even talk about “re-lighting” footage demonstrates a complete misunderstanding of what photographing a film actually is.

This video, captured at a trillion frames per second, shows the transmission and reflection of light.

The word “photography” comes from Greek, meaning “drawing with light”. This is not just an excuse for pompous DPs to compare themselves with the great artists of the past as they “paint with light”; it is a concise explanation of what a camera does.

A camera can’t record a face. It can’t record a room, or a landscape, or an animal, or objects of any kind. The only thing a camera can record is light. All photographs and videos are patterns of light which the viewer’s brain reverse-engineers into a three-dimensional scene, just as our brains reverse-engineer the patterns of light on the retinae every moment of every day, to make sense of our surroundings.

The light from this object gets gradually brighter then gradually darker again – therefore it is a curved surface. There is light on the top of that nose but not on the underneath, so it must be sticking out. These oval surfaces are absorbing all the red and blue light and reflecting only green, so it must be plant life. Such are the deductions made continuously by the brain’s visual centre.

A compound lens for a prototype light-field camera by Adobe

To suggest that footage can be re-lit is to suggest that recorded light can somehow be separated from the underlying physical objects off which that light reflected. Now of course that is within the realms of today’s technology; you could analyse a filmed scene and build a virtual 3D model of it to match the footage. Then you could “re-light” this recreated scene, but it would be a hell of a lot of work and would, at best, occupy the Uncanny Valley.

Some day, perhaps some day quite soon, artificial intelligence will be clever enough to do this for us. Feed in a 2D video and the computer will analyse the parallax and light shading to build a moving 3D model to match it, allowing a complete change of lighting and indeed composition.

Volumetric capture is already a functioning technology, currently using a mix of infrared and visible-light cameras in an environment lit as flatly as possible for maximum information – like log footage pushed to its inevitable conclusion. By surrounding the subject with cameras, a moving 3D image results.

Sir David Attenborough getting his volume captured by Microsoft

Such rigs are a type of light-field imaging, a technology that reared its head a few years ago in the form of Lytro, with viral videos showing how depth of field and even camera angle (to a limited extent) could be altered with this seemingly magical system. But even Lytro was capturing light, albeit in a way that allowed for much more digital manipulation.

Perhaps movies will eventually be captured with some kind of Radar-type technology, bouncing electromagnetic waves outside the visible spectrum off the sets and actors to build a moving 3D model. At that point the need for light will have been completely eliminated from the production process, and the job of the director of photography will be purely a postproduction one.

While I suspect most DPs would prefer to be on a physical set than hunched over a computer, we would certainly make the transition if that was the only way to retain meaningful authorship of the image. After all, most of us are already keen to attend grading sessions to ensure our vision survives postproduction.

The Lytro Illum 2015 CP+ by Morio – own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=38422894

But for the moment at least, lighting must be done on set; re-lighting after the fact is just not possible in any practical way. This is not to take away from the amazing things that a skilled colourist can do, but the vignettes, the split-toning, the power windows, the masking and the tracking – these are adjustments of emphasis.

A soft shadow can be added, but without 3D modelling it can never fall and move as a real shadow would. A face can be brightened, but the quality of light falling on it can’t be changed from soft to hard. The angle of that light can’t be altered. Cinematographers refer to a key-light as the “modelling” light for a reason: because it defines the 3D model which your brain reverse-engineers when it sees the image.
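
The information loss is easy to demonstrate with the simplest shading model there is. In Lambertian (diffuse) shading, a pixel records only the dot product of surface normal and light direction – and many different surface/light combinations produce exactly the same value, which is why the light cannot be separated back out of the footage. (This toy example is mine, not anything from a grading tool.)

```python
# Lambertian shading sketch: the camera records N·L, not N and L
# separately, so the lighting is irreversibly baked into the pixel.

def lambert(normal, light):
    """Brightness of a diffuse surface: max(0, N·L) for unit vectors."""
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

# Two different surface/light combinations, one identical pixel value:
a = lambert((0.0, 0.0, 1.0), (0.0, 0.6, 0.8))  # flat surface, light high above
b = lambert((0.0, 0.6, 0.8), (0.0, 0.0, 1.0))  # tilted surface, frontal light
print(a, b)  # both 0.8 – indistinguishable in the recorded image
```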

So if you’re ever tempted to leave the job of lighting to postproduction, remember that your footage is literally made of light. If you don’t take the time to get your lighting right, you might as well not have any footage at all.
