5 Things You Didn’t Know About the Iris in Your Lens

Inside a lens, amongst the various glass elements, is an ingenious mechanism which we call the iris. Just like your biological iris, it controls the amount of light passing through the pupil to form an image. I’ve written about the iris’s use to control exposure before, and its well-known side effect of controlling depth of field. But here are five things that aren’t so commonly known about irises.

 

1. f-stops and the entrance pupil

This image shows the exit pupil because it’s seen through the rear element of the lens. A view through the front element would show the entrance pupil.

The f-number of a lens is the ratio of the focal length to the diameter of the aperture, but did you know that it isn’t the actual diameter of the aperture that’s used in this calculation? It’s the apparent diameter as viewed through the front of the lens. A lens might have a magnifying front element, causing the aperture to appear larger than its physical size, or a reducing one, causing it to appear smaller. Either way, it’s this apparent aperture – known as the entrance pupil – which is used to find the f-number.
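
As a quick illustration, here's that calculation as a Python sketch; the 50mm and 25mm figures are invented example values, not measurements from any particular lens:

```python
def f_number(focal_length_mm: float, entrance_pupil_mm: float) -> float:
    """N = focal length / entrance pupil diameter - the pupil being the
    aperture's apparent size seen through the front of the lens, not
    the physical hole."""
    return focal_length_mm / entrance_pupil_mm

# A 50mm lens whose aperture appears 25mm across is an f/2 lens, even if
# the physical iris opening is smaller or larger than that.
print(f_number(50, 25))  # 2.0
```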

 

2. No-parallax point

The no-parallax point of a lens is located at its entrance pupil. Sometimes called the nodal point, although that’s technically something different, this is the point around which the camera must pan and tilt if you want to eliminate all parallax. This is important for forced perspective work, for panoramas stitched together from multiple shots, and other types of VFX.

 

3. Focus

If you need to check your focal distance with a tape measure, many cameras have a handy Φ (phi) symbol on the side indicating where the sensor plane is located so that you can measure from that point. But technically you should be measuring to the entrance pupil. The sensor plane marker is just a convenient shortcut because the entrance pupil is in a different place for every lens and changes when the lens is refocused or zoomed. In most cases the depth of field is large enough for the shortcut to give perfectly acceptable results, however.

 

4. Bokeh shape

The bokeh of a 32mm Cooke S4 wide open at T2 (left) and stopped down to T2.8 (right). Note also the diffraction spikes visible in the right-hand image.

The shape of the entrance pupil determines the shape of the image’s bokeh (out-of-focus areas), most noticeable in small highlights such as background fairy lights. The pupil’s shape is determined both by the number of iris blades and the shape of their edges. The edges are often curved to approximate a circle when the iris is wide open, but form more of a polygon when stopped down. For example, a Cooke S4 produces octagonal bokeh at most aperture settings, indicating eight iris blades. Incidentally, an anamorphic lens has a roughly circular aperture like any other lens, but the entrance pupil (and hence the bokeh) is typically oval because of the anamorphosing effect of the front elements.

 

5. Diffraction spikes

When the edge of an iris blade is straight or roughly straight, it spreads out the light in a perpendicular direction, creating a diffraction spike. The result is a star pattern around bright lights, typically most visible at high f-stops. Every blade produces a pair of spikes in opposite directions, so the number of points in the star is equal to twice the number of iris blades – as long as that number is odd. If the number of blades is even, diffraction spikes from opposite sides of the iris overlap, so the number of apparent spikes is the same as the number of blades, as in the eight-pointed Cooke diffraction pictured above right.
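
Here's that odd/even rule as a minimal Python sketch (illustrative only):

```python
def visible_spikes(blades: int) -> int:
    """Each straight blade edge diffracts light into a pair of opposed
    spikes. With an even blade count, opposite edges are parallel and
    their spike pairs overlap; with an odd count they don't."""
    return blades if blades % 2 == 0 else 2 * blades

print(visible_spikes(8))  # 8  - e.g. the Cooke S4's octagonal iris
print(visible_spikes(9))  # 18 - an odd count doubles the points
```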


“Who Framed Roger Rabbit” Retrospective

With the recent releases of Tom and Jerry and Space Jam: A New Legacy, it’s clear that there’s an appetite for traditional cartoon characters in live-action movies. While this mash-up of techniques goes back at least as far as 1964’s Mary Poppins, perhaps no film has done it quite as well as Who Framed Roger Rabbit.

The 1988 movie was loosely based on a Gary K. Wolf novel published seven years earlier, Who Censored Roger Rabbit? However, most of the plot was jettisoned, keeping only the central characters: Eddie Valiant, a private detective; his client, the titular Roger Rabbit; Roger’s wife and femme fatale Jessica; and Roger’s colleague, the libidinous, cigar-smoking Baby Herman. The original villain, a genie of the lamp, was replaced in early script drafts by the hunter who killed Bambi’s mother in the 1942 Disney classic, and finally by Christopher Lloyd’s pop-eyed Judge Doom.

Ditching the contemporary setting of its source material, Who Framed Roger Rabbit takes place in Hollywood, 1947, where cartoon characters (“toons”) co-exist with humans. Bob Hoskins plays the toon-hating Valiant, who reluctantly teams up with Roger after the latter is implicated in the murder of Marvin Acme. The unlikely pair’s investigations lead them to Toontown, where they uncover a conspiracy to demolish this animated region and build a freeway in its place. Screenwriters Jeffrey Price and Peter S. Seaman found inspiration for this plot in Roman Polanski’s 1974 thriller Chinatown. Several film noirs of the 1940s were also referenced, with Hoskins modelling his character on Humphrey Bogart.

Numerous famous cartoon characters make cameos, including Mickey Mouse, Daffy Duck, Donald Duck, Tweetie Pie and Betty Boop, with executive producer Steven Spielberg pulling strings behind the scenes to accomplish the historic meeting of competing studios’ properties.

Robert Zemeckis pitched to direct Roger Rabbit in 1982, but his films’ poor box office up to that point put him out of the running. Terry Gilliam was in the frame for a time, while the likes of Harrison Ford, Chevy Chase and Bill Murray were considered for the lead. Spielberg’s Amblin Entertainment joined the project in 1985, but the projected budget of $50 million was deemed too big to green-light. Meanwhile, Zemeckis’s Back to the Future made him far more bankable, with the result that he signed on to direct Roger Rabbit that same year, albeit with a reduced budget of $30 million. Ironically, the film would go over schedule and wind up costing just over its original $50 million price tag.

The animation was directed by Richard Williams, otherwise best known for his title sequences for the Pink Panther films. Williams refused to work in LA, forcing the production to shoot primarily in England. While Williams and his 326-strong team set up in Camden Town, Zemeckis and company filmed the interiors at Elstree, with warehouses and bus depots in Shepherd’s Bush standing in for exteriors of Hollywood studios and backlots.

Some of the sets, including the Ink & Paint Club where Jessica is memorably introduced, were raised 10ft off the floor to accommodate puppeteers. Although no puppets are seen in the finished film, whenever a toon had to hold a real object it was either mounted on a rod coming up through the floor, marionetted on wires from above, or manipulated by a robotic arm.

Rehearsals were conducted using a dummy of Roger, or with voice artist Charles Fleischer – bedecked in a rabbit suit – standing in. Hoskins even studied his three-year-old daughter’s antics with an imaginary friend to prepare for the challenge of acting to nothing.

Creating the film’s 55 minutes of animation took two years. The live-action footage was printed as a series of enlarged black-and-white frames over which a cel (sheet of transparent acetate) could be placed for the animator to draw on. 82,080 frames were generated in this way, every single one by hand.

To better blend the animated characters with the live backgrounds, Industrial Light and Magic composited layers of shading and shadows. The sparkling sequins on Jessica’s dress were achieved by shining a light through a plastic bag which had holes scratched in it.

The finished film attracted a degree of controversy, not least from the top brass at Disney. It’s easy to see why the family-friendly company would object to the over-sexualisation of Jessica, or to Valiant’s constant drinking and even bumming a cigarette off children at one point. But Zemeckis’s deal gave him final cut, so the compromise was to release the unaltered film under Disney’s Touchstone label.

The film became the second-highest-grossing release of 1988 and a critical favourite, with an impressive 97% on Rotten Tomatoes and four Academy Awards.

Like many articles on my blog, this one first appeared on RedShark News.


6 Tips for Virtual Production

Part of the volume at ARRI Rental in Uxbridge, with the ceiling panel temporarily lowered

Virtual production technically covers a number of things, but what people normally mean by it is shooting on an LED volume: a stage whose walls are giant LED screens displaying real-time backgrounds, in front of which the talent are photographed. The background may be a simple 2D plate shot from a moving vehicle, for a scene inside a car, or a more elaborate set of plates shot with a 360° rig.

The most advanced set-ups do not use filmed backgrounds at all, but instead use 3D virtual environments rendered in real time by a gaming engine like Unreal. A motion-tracking system monitors the position of the camera within the volume and ensures that the proper perspective and parallax is displayed on the screens. Furthermore, the screens are bright enough that they provide most or all of the illumination needed on the talent in a very realistic way.

I have never done any virtual production myself, but earlier this year I was fortunate enough to interview some DPs who have, for a British Cinematographer article. Here are some tips about VP shooting which I learnt from these pioneers.

 

1. Shoot large format

An ARRI Alexa Mini LF rigged with Mo-Sys for tracking its position within the volume

To prevent a moiré effect from the LED pixels, the screens need to be out of focus. Large-format cameras, with their shallower depth of field, make this easier to accomplish. The Alexa Mini LF seems to be a popular choice, but the Sony Venice evidently works well too.

 

2. Keep your distance

To maintain the illusion, neither the talent nor the camera should get too close to the screens. A rule of thumb is that the minimum distance in metres should be no less than the pixel pitch of the screens in millimetres. (The pixel pitch is the distance in millimetres between the centre of one pixel and the centre of the next.) So for a screen of 2.3mm pixel pitch, keep everything at least 2.3m away.
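
Expressed as a trivial Python check – a sketch of the rule of thumb, not a hard physical limit:

```python
def min_distance_m(pixel_pitch_mm: float) -> float:
    """Rule of thumb: minimum working distance in metres equals the
    screen's pixel pitch in millimetres."""
    return pixel_pitch_mm

def far_enough(pixel_pitch_mm: float, distance_m: float) -> bool:
    return distance_m >= min_distance_m(pixel_pitch_mm)

print(far_enough(2.3, 2.0))  # False - too close to a 2.3mm-pitch screen
print(far_enough(2.3, 3.0))  # True
```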

 

3. Tie it all together

Several DPs have found that the real foreground and the virtual background fit together more seamlessly if haze or a diffusion filter is used. This makes sense because both soften the image, blending light from nearby elements of the frame together. Other in-camera effects like rain (if the screens are rated weatherproof) and lens flares would also help.

 

4. Surround yourself

The back of ARRI’s main screen, composed of ROE LED panels

The most convincing LED volumes have screens surrounding the talent, perhaps 270° worth, and an overhead screen as well. Although typically only one of these screens will be of a high enough resolution to shoot towards, the others are important because they shed interactive light on the talent, making them really seem like they’re in the correct environment.

 

5. Match the lighting

If you need to supplement the light, use a colour meter to measure the ambient light coming from the screens, then dial that colour temperature into an LED fixture. If you don’t have a colour meter you should conduct tests beforehand, as what matches to the eye may not necessarily match on camera.

 

6. Avoid fast camera moves

Behind the scenes at the ARRI volume, built in partnership with Creative Technology

It takes a huge amount of processing power to render a virtual background in real time, so there will always be a lag. The Mandalorian works around this by shooting in a very classical style (which fits the Star Wars universe perfectly), with dolly moves and jibs rather than a lot of handheld shots. The faster the camera moves, the more noticeable the delay in the background becomes. For the same reason, high frame rates are not recommended, but as processing power increases, these restrictions will undoubtedly fall away.


“Quantum Leaper”

This week issue 40 of Infinity magazine comes out, featuring a couple of articles I wrote, including one about the cult sci-fi series Quantum Leap. The show saw Dr. Sam Beckett (Scott Bakula) bouncing around time into other people’s bodies and striving to put right what once went wrong, while his holographic friend Al (Dean Stockwell) smoked cigars, letched, and relayed exposition from Ziggy the computer.

I end the article by wondering whether it’s time for someone like Netflix to bring the show back (it definitely is). What I don’t mention in the magazine is that – unbeknownst to almost everyone – Quantum Leap has already been rebooted once.

This, my loyal readers, is the story of Quantum Leaper.

 

Season One (1995)

As teenagers, my friend David Abbott and I were huge Quantum Leap fans, and were bereft when the show was axed in 1993. I was developing an interest in filmmaking, having dabbled in 2D computer animation on my Atari ST and borrowed my grandfather’s Video-8 camcorder on a couple of occasions. When I was given that camcorder for my 15th birthday, David and I decided that we would make our own version of Quantum Leap, which we imaginatively titled Quantum Leaper.

The first episode was called “Just What the Doctor Ordered” and saw my character – named, again with great imagination, Neil – leaping into a doctor just as his patient is flatlining. I don’t remember much about the plot, but I do remember that we climbed the nearby Malvern Hills to film a fight scene.

Dave played Albert, my holographic helper, communicating with Project Quantum Leap’s supercomputer Ziggy by means of a special hand-link, just like Dean Stockwell did. Unlike Dean Stockwell’s, this hand-link was a calculator.

The two of us also played all the supporting characters (often with the judicious addition of a hat or jacket) and operated the camera, unless we were both in shot, in which case it was locked off. Much of the editing was done in camera – rewinding the 8mm videotape, cueing it up to the exact moment the last piece of action ended, then hitting record and calling action simultaneously – and the rest I did tape-to-tape with two VCRs connected together. A cheap four-track disco mixer enabled the addition of music (badly composed by me) and sound effects (many of which were sampled from Quantum Leap itself). As YouTube was still years away, the only viewers for the series were our parents and friends, forced to sit down in front of the TV and watch it off VHS.

Episode two, “Boom!”, saw the fictional Neil as a bomb disposal expert supposedly in Northern Ireland in 1980, though like the first episode it was all shot in and around my house. My sister Kate was drafted in to play a journalist whose life Neil has to save.

“A Leap into the Blue” was the next episode, with Neil in the body of a parachutist. Scenes of characters in free-fall were shot with us standing in front of a white wall; I digitised the footage on my ST with a Videomaster cartridge and composited scrolling clouds into the background. The Videomaster’s resolution was very limited – maybe 320×240 – its frame rate was very low too, and it could only do black and white.

A digitised visual effect using a shot of a plane stolen from some TV programme or other

Next we shot a “pilot” episode explaining how Neil and Albert switched places with Sam and Al. I remember digitising shots of Scott Bakula and Dean Stockwell from Quantum Leap and compositing them atrociously into our own footage. At about 30 minutes long, the pilot was double the length of our other episodes.

Then we continued the series where we’d left off. Dave’s script “One Giant Leap” has Neil on a space shuttle mission, an episode that included NASA footage taped off the TV. We made almost no attempt to create sets; the space shuttle cockpit was a plain wall, a computer keyboard and a piece of card to cover an incongruous bookcase.

The space shuttle cockpit “set”

The next two episodes find Neil meeting (and shooting) an evil future version of himself, then leaping into the crazy future space year of 2017. The latter involves a flying car – my mum’s Citroen AX with the wheels framed out, intercut with an extremely crude CGI model.

Dave’s episodes “Virtual Leaping” and “Bullets Over Leaping” see Neil become a VR programmer (with a headset made of Lego) and then an actor (in a studio suspiciously like Dave’s shed).

The VR headset “prop”

My next episode has Neil leaping into himself and saving his father’s life. (My actual dad provided some splendidly wooden acting.) But doing this causes a paradox, and the season finale sees Neil and Albert swap places (as Sam and Al do in a classic Quantum Leap episode) and Neil having to restore the timeline to prevent the destruction of the universe.

We were ambitious. You can say that much for us.

 

Season Two (1996)

The following year, while doing our GCSEs, we began work on a second season. In between I’d made a bad 40-minute comedy, Bob the Barbarian, and an appalling feature-length sci-fi film, The Dark Side of the Earth, and I’d learnt a few things that would lift the production values of Season Two very slightly. I’d also nagged my parents into buying me a genlock which would let me superimpose CGI over analogue video, meaning I didn’t have to digitise footage and suffer the horrendous image degradation any more.

The holographic Albert enters the Imaging Chamber, an effect enabled by my new genlock.

The actual Quantum Leaping effect from this era of the show is surprisingly decent given the equipment we were working with. We would lock the camera off and jump-cut to a blue filter over the lens, then a white glow would creep over me – an animation I achieved in software called Deluxe Paint – followed by tendrils of electricity. The screen would then fade to white and a similar effect would play out in reverse to show the leap in.

Leaping from life to life, striving to put right what once went wrong…

Another improvement was that we managed to convince a few other friends to act in the series, including fellow Quantum Leap fan Lee Richardson, as well as Chris Jenkins, Conrad Allen, Matt Hodges, Si Timbrell and Jim McKelvie. Recognising my lack of musical talent at last, I abandoned composing and instead used soundtrack CDs from Star Trek: Deep Space Nine (Dennis McCarthy), the John Woo film Broken Arrow (Hans Zimmer), and the Doctor Who story “The Curse of Fenric” (Mark Ayres). Albert’s hand-link prop got an upgrade too, from a calculator to a custom Lego build with flashing lights.

Lee Richardson “acting” in the control room “set”

Season Two opens with Dave’s episodes “Project Hijacked” and “Oh Brother, Where Art Thou?” which focus on events at Project Quantum Leap, supposedly a high-tech facility in the New Mexico desert in 2005. In reality it was a living room with a control console made out of painted cardboard boxes and Christmas lights. In an early manifestation of my cinematography leanings, I snooted the ceiling light with a rolled-up piece of silver card, lending a little bit of mood to the look.

At the time, Dave’s family were training a hearing dog, Louis, so I wrote an episode to feature him; “Silence is Golden” sees Neil leap into a deaf man, and was followed by the morbid “Ashes to Ashes” where he leaps into a corpse.

The next episode, Dave’s “Driven to Distraction”, is probably the best of the lot. For once there were few enough characters that no-one needed to confusingly play dual roles, and there is plenty of action to boot. (I uploaded this episode to YouTube so long ago that the ten-minute time limit still applied.)

The X-Files-inspired “Close Encounters of the Leaping Kind” comes next, with Neil as a ufologist bothered by a shadowy government agent. Then Neil becomes a teenager who must prevent a drugs overdose, then a one-armed man who must overcome prejudice to hold down a job. Cringingly entitled “Not So Armless”, the latter was shot in a newsagent’s owned by a friend’s parents, one of the series’ few non-domestic locations.

Like Quantum Leap we had a mirror shot in every episode where Neil would see the leapee’s reflection looking back at him. Sometimes Dave would track the camera behind my back and we’d hide a cut in the darkness to swap me with whoever was playing the reflection. Another time we pretended the serving hatch in Dave’s house was a mirror and the two of us synchronised our movements. For a fight scene in “Not So Armless” Chris hid one arm inside his t-shirt so that Neil’s mirror image could appear to punch the antagonist with an invisible fist!

Facing mirror images that were not his own…

The penultimate episode of the season features several brief leaps, ending with one to Hiroshima in 1945, where the A-bomb detonation (more footage off the TV) causes both Neil and Albert to leap simultaneously. In the finale, Albert becomes a mountaineer caught in an avalanche, while Neil is a member of the rescue team – a premise thieved from the Quantum Leap novel “Search and Rescue”. We started shooting it during snowy weather, but the snow thawed and the episode was never completed. The friends who had been appearing as supporting characters now had part-time jobs and couldn’t spare the time for filming.

 

Legacy

We wrote all six episodes of a third season which would have explained how Neil became the evil future version of himself seen in an earlier episode, but nothing was ever filmed.

In 1997 we began a remake of the pilot using the experience we had gained since shooting the original, but again it was never completed. One part we did film was an action sequence with me on the roof rack of a car while the driver swerves around trying to throw me off. We shot this on Malvern’s Castlemorton Common and used a dummy of me for some of the wider and more dangerous shots. Its acting was probably better than mine. We remade the scene four years later as part of my Mini-DV feature The Beacon.

Today only five of the 20 Quantum Leaper episodes that we made survive, the rest having been callously taped over at some point in my late teens. That’s probably for the best, as most of it was hilariously bad, but making it taught me a hell of a lot about filmmaking. Without it, I doubt I’d have a career in cinematography today.

His only guide on these journeys is Al, an observer from his own time…

“Mission: Impossible” and the Dawn of Virtual Sets

The seventh instalment in the Mission: Impossible franchise was originally scheduled for release this July. It’s since been pushed back to next September, which is a minor shame because it means there will be no release in 2021 to mark the quarter of a century since Tom Cruise first chose to accept the mission of bringing super-spy Ethan Hunt to the big screen.

Today, 1996’s Mission: Impossible is best remembered for two stand-out sequences. The first, fairly simple but incredibly tense, sees Cruise descend on a cable into a high-security vault where even a single bead of sweat will trigger pressure sensors in the floor.

The second, developing from the unlikely to the downright ludicrous, finds Cruise battling Jon Voight atop a speeding Channel Tunnel train, a fight which continues on the skids of a helicopter dragged along behind the Eurostar, ending in an explosion which propels Cruise (somehow unscathed) onto the rear of the train.

It is the second of those sequences which is a landmark in visual effects, described by Cinefex magazine at the time as “the dawn of virtual sets”.

“In Mission: Impossible, we took blue-screen elements of actors and put them into believable CG backgrounds,” said VFX supervisor John Knoll of Industrial Light and Magic. Building on his work on The Abyss and Terminator 2, Knoll’s virtual tunnel sets would one day lead to the likes of The Mandalorian – films and TV shows shot against LED screens displaying CG environments.

Which is ironic, given that if Tom Cruise were remaking that first film today, he would probably insist on less trickery, not more, and demand to be strapped to the top of a genuine speeding Eurostar.

The Channel Tunnel had only been open for two years when Mission: Impossible came out, and the filmmakers clearly felt that audiences – or at least American audiences – were so unfamiliar with the service that they could take a number of liberties in portraying it. The film’s tunnel has only a single bore for both directions of travel, and the approaching railway line was shot near Glasgow.

That Scottish countryside is one of the few real elements in the sequence. Another is the 100ft of full-size train that was constructed against a blue-screen to capture the lead actors on the roof. To portray extreme speed, the crew buffeted the stars with 140mph wind from a parachute-training fan.

Many of the Glasgow plates were shot at 12fps to double the apparent speed of the camera helicopter, which generally flew at 80mph. But when the plate crew tried to incorporate the picture helicopter with which Jean Reno’s character chases the train, the under-cranking just looked fake, so the decision was taken to computer-generate the aircraft in the vast majority of the shots.
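
The under-cranking arithmetic is simple: apparent speed scales by the ratio of playback to capture frame rate. A minimal sketch using the figures above:

```python
def apparent_speed_mph(true_speed_mph: float, capture_fps: float,
                       playback_fps: float = 24.0) -> float:
    """Footage captured at capture_fps but projected at playback_fps
    speeds the action up by playback_fps / capture_fps."""
    return true_speed_mph * playback_fps / capture_fps

# The 80mph camera helicopter, shot at 12fps, reads as 160mph at 24fps:
print(apparent_speed_mph(80, 12))  # 160.0
```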

The train is also CGI, as are the tunnel entrance and some of its surroundings, and of course the English Channel is composited into the Glaswegian landscape. Once the action moves inside the tunnel, nothing is real except the actors and the set-pieces they’re clinging to.

“We cheated the scale to keep it tight and claustrophobic,” said VFX artist George Hull, admitting that the helicopter could not have fitted in such a tunnel in reality. “The size still didn’t feel right, so we went back and added recognisable, human-scale things such as service utility sheds and ladders.”

Overhead lights spaced at regular intervals were simulated for the blue-screen work. “When compositing the scenes into the CG tunnel months later, we could marry the environment by timing those interactive lights to the live-action plates,” explained Hull.

Employing Alias for modelling, Softimage for animation, RenderMan for rendering, plus custom software like ishade and icomp, ILM produced a sequence which, although it wasn’t completely convincing even in 1996, is still exciting.

Perhaps the best-looking part is the climactic explosion, which was achieved with a 1/8th scale miniature propelled at 55mph through a 120ft tunnel model. (The runaway CGI which followed Jurassic Park’s 1993 success wisely stayed away from explosions for many years, as their dynamics and randomness made them extremely hard to simulate on computers of the time.)

Knoll went on to supervise the Star Wars prequels’ virtual sets (actually miniatures populated with CG aliens), and later Avatar and The Mandalorian. Meanwhile, Cruise pushed for more and more reality in his stunt sequences as the franchise went on, climbing the Burj Khalifa for Ghost Protocol, hanging off the side of a plane for Rogue Nation, skydiving and flying a helicopter for Fallout, and yelling at the crew for Mission: Impossible 7.

At least, I think that last one was real.


5 Ingenious Visual Effects With No CGI

How were visual effects achieved before the advent of computer generated imagery (CGI)? Most of us know that spaceships used to be miniatures, and monsters used to be puppets or people in suits, but what about the less tangible effects? How did you create something as exotic as an energy beam or a dimensional portal without the benefit of digital particle simulations? The answer was often a combination of chemistry, physics, artistry and ingenuity. Here are five examples.

 

1. “Star Trek” transporters

The original series of Star Trek, which premiered in 1966, had to get creative to achieve its futuristic effects with the budget and technology available. The Howard Anderson Company was tasked with realising the iconic transporter effect which enables Kirk’s intrepid crew to beam down to alien planets. Darrell Anderson created the characteristic sparkles of the dematerialisation by filming backlit aluminium powder being sprinkled in front of a black background in slow motion. Hand-drawn mattes were then used to ensure that the sparkling powder only appeared over the characters.

 

2. “Ghostbusters” proton packs

The much-loved 1984 comedy Ghostbusters features all kinds of traditional effects, including the never-to-be-crossed particle streams with which the heroes battle their spectral foes. The streams consist of five layers of traditional cel animation – the same technique used to create, say, a Disney classic like Sleeping Beauty – which were composited and enhanced on an optical printer. (An optical printer is essentially two or more film projectors connected to a camera so that multiple separate elements can be combined into a single shot.) Composited onto the tips of the Ghostbusters’ guns were small explosions and other pyrotechnic effects shot on a darkened stage.

 

3. “Lifeforce” energy beams

This cult 1985 sci-fi horror film, most notable for an early screen appearance by Patrick Stewart, features alien vampires which drain the titular lifeforce from their victims. To visualise this lifeforce, VFX supervisor John Dykstra settled on a process whereby a blue argon laser was aimed at a rotating tube made of highly reflective mylar. This threw flowing lines of light onto a screen where they would be captured by the camera for later compositing with the live-action plates. The tube could be deliberately distorted or dented to vary the effects, and to add more energy to certain shots, brief elements of a flashing xenon bulb were layered into the mix.

 

4. “Big Trouble in Little China” portal

A mixture of chemical and optical effects was employed for certain shots in the 1986 action-comedy Big Trouble in Little China. Director John Carpenter wanted an effervescent effect like “an Alka-Seltzer tablet in water” to herald the appearance of a trio of warriors known as the Three Storms. After many tests, the VFX team determined that a combination of green paint, metallic powder and acetone, heated in a Pyrex jar on a hotplate, produced an interesting and suitable effect. The concoction was filmed with a fisheye lens, then that footage was projected onto a dome to make it look like a ball of energy, and re-photographed through layers of distorted glass to give it a rippling quality.

 

5. “Independence Day” cloud tank

By 1996, CGI was replacing many traditional effects, but the summer blockbuster Independence Day used a healthy mix of both. To generate the ominous clouds in which the invading spacecraft first appear, the crew built what they called the “Phenomenon Rig”. This was a semi-circle of halogen lights and metal piping which was photographed in a water tank. Paint was injected into the water through the pipes, giving the appearance of boiling clouds when lit up by the lamps within. This was digitally composited with a live-action background plate and a model shot of the emerging ship.

See also: “Top Five Low-tech Effects” and “5 Simple but Effective Camera Tricks”


The History of Forced Perspective

A miniature ship with a real camel, people and helicopters in “Close Encounters of the Third Kind”

“These are small,” Father Ted once tried to explain to Father Dougal, holding up toy cows, “but the ones out there are far away.” We may laugh at the gormless sitcom priest, but the chances are that we’ve all confounded size and distance, on screen at least.

The ship marooned in the desert in Close Encounters of the Third Kind, the cliff at the end of Tremors, the runways and planes visible through the windows of Die Hard 2’s control tower, the helicopter on the boat in The Wolf of Wall Street, even the beached whale in Mega Shark Versus Giant Octopus – all are small, not far away.

The most familiar forced perspective effect is the holiday snap of a friend or family member picking up the Eiffel Tower between thumb and forefinger, or trying to right the Leaning Tower of Pisa. By composing the image so that a close subject (the person) appears to be in physical contact with a distant subject (the landmark), the latter appears to be as close as the former, and therefore much smaller than it really is.
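
Underlying all of these tricks is the fact that a camera records only angular size, so scale and distance can be traded off freely. A rough Python illustration, with made-up tower and toy dimensions:

```python
import math

def angular_size_deg(size_m: float, distance_m: float) -> float:
    """Angle an object of a given size subtends at the camera."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A 300m tower 6km away subtends the same angle as a 5cm toy at 1m,
# which is why the two can appear to touch in a holiday snap.
print(angular_size_deg(300, 6000))  # ~2.86 degrees
print(angular_size_deg(0.05, 1))    # ~2.86 degrees
```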

Building the forced perspective corridor for “Moon”

Architects have been playing tricks with perspective for centuries. Italy’s Palazzo Spada, for example, uses diminishing columns and a ramped floor to make a 26ft corridor look 100ft long. Many film sets – such as the basement of clones in Moon – have used the exact same technique to squeeze extra depth out of limited studio space or construction resources.

Even a set that is entirely miniature can benefit from forced perspective, with a larger scale being used in the foreground and a smaller one in the background, increasing the perceived depth. For example, The Terminator’s “Future War” scenes employ skulls of varying size, with background ruins on an even smaller scale.

“Princess Nicotine”

An early cinematic display of forced perspective was the 1908 short Princess Nicotine, in which a fairy who appears to be cavorting on a man’s tabletop is actually a reflection in a distant mirror. “The little fairy moves so realistically that she cannot be explained away by assuming that she is a doll,” remarked a Scientific American article of the time, “and yet it is impossible to understand how she can be a living being, because of her small stature.”

During the 1950s, B movies featuring fantastically shrunk or enlarged characters made full use of forced perspective, as did the Disney musical Darby O’Gill and the Little People. VFX supervisor Peter Ellenshaw, interviewed for a 1994 episode of Movie Magic, remembered the challenges of creating sufficient depth of field to sell the illusion: “You had to focus both on the background and the foreground [simultaneously]. It was very difficult. We had to use so much light on set that eventually we blew the circuit-breakers in the Burbank power station.”

One of many ingenious forced perspective shots in “The Gate”
This behind-the-scenes angle reveals how the above shot was done.

Randall William Cook was inspired years later by Ellenshaw’s work when he was called upon to realise quarter-scale demonic minions for the 1987 horror movie The Gate. Faced with a tiny budget, Cook devised in-camera solutions with human characters on raised foreground platforms, and costumed minions on giant set-pieces further back, all carefully designed so that the join was undetectable. As the contemporary coverage in Cinefex magazine noted, “One of the advantages of a well-executed forced perspective shot is that the final product requires no optical work and can therefore be viewed along with the next day’s rushes.”

A subgroup of forced perspective effects is the hanging miniature – a small-scale model suspended in front of camera, typically as a set extension. The 1925 version of Ben Hur used this technique for wide shots of the iconic chariot race. The arena of the Circus Maximus was full size, but in front of and above it was hung a miniature spectators’ gallery containing 10,000 tiny puppets which could stand and wave as required.

Setting up a foreground miniature for the 1970 “Doctor Who” story “Inferno”

Doctor Who used foreground miniatures throughout its classic run, often more successfully than it used the yellow-fringed chromakey of the time. Earthly miniatures like radar dishes, missile launchers and big tops were captured on location, in camera, with real skies and landscapes behind them. The heroes convincingly disembark from an alien spaceship in the Tom Baker classic “Terror of the Zygons” by means of a foreground miniature and the actors jumping off the back of a van in the distance. A third-scale Tardis was employed in a similar way when the production wanted to save shipping costs on a 1984 location shoot on Lanzarote.

Even 60 years on from Ben Hur, Aliens employed the same technique to show the xenomorph-encrusted roof in the power plant nest scene. The shot – which fooled studio executives so utterly that they complained about extravagant spending on huge sets – required small lights to be moved across the miniature in sync with the actors’ head-torches.

The red line shows the division between hanging miniature and full-scale set in “Aliens”.

The Aliens shot also featured a tilt-down, something only possible with forced perspective if the camera pivots around its no-parallax point, located at the lens’s entrance pupil (often loosely called the nodal point). Any other type of camera movement gives the game away due to parallax, the optical phenomenon which makes closer objects move through a field of view more quickly than distant ones.
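
In a simple pinhole-camera model, the image shift caused by a sideways camera move is inversely proportional to subject distance, which is why any movement other than a nodal pan betrays the mismatched scales. A sketch with example figures:

```python
def image_shift_mm(focal_length_mm: float, camera_move_m: float,
                   subject_distance_m: float) -> float:
    """Pinhole model: a lateral camera move shifts a subject's image on
    the sensor by focal_length * move / distance. Near objects shift a
    lot, far ones barely at all - and forced perspective depends on
    that difference never being seen."""
    return focal_length_mm * camera_move_m / subject_distance_m

# A 0.1m sideways camera move on a 35mm lens (hypothetical figures):
print(image_shift_mm(35, 0.1, 2))    # 1.75mm   - foreground prop at 2m
print(image_shift_mm(35, 0.1, 200))  # 0.0175mm - "distant" element at 200m
```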

The 1993 remake of Attack of the 50ft Woman made use of a nodal pan to follow Daniel Baldwin to the edge of an outdoor swimming pool which a giant Daryl Hannah is using as a bath. A 1/8th-scale pool with Hannah in it was mounted on a raised platform to perfectly align on camera with the real poolside beyond, where Baldwin stood.

The immediacy of forced perspective, allowing actors of different scales to riff off each other in real time, made it the perfect choice for the seasonal comedy Elf. The technique is not without its disadvantages, however. “The first day of trying, the production lost a whole day setting up one shot and never captured it,” recalls VFX supervisor Joe Bauer in the recent documentary Holiday Movies That Made Us.

This shot from “Elf” was accomplished with an extended tricycle allowing Papa Elf to sit much further behind young Buddy than he appears. Tiny puppet hands on Buddy’s shoulders complete the illusion.

Elf’s studio, New Line, was reportedly concerned that the forced perspective shots would never work, but given what a certain Peter Jackson was doing for that same studio at the same time, they probably shouldn’t have worried.

The Lord of the Rings employed a variety of techniques to sell the hobbits and dwarves as smaller than their human friends, but it was in the field of forced perspective that the trilogy was truly groundbreaking. One example was an extended cart built to accommodate Ian McKellen’s Gandalf and Elijah Wood’s supposedly-diminutive Frodo. “You could get Gandalf and Frodo sitting side by side apparently, although in fact Elijah Wood was sitting much further back from the camera than Gandalf,” explains producer Barrie Osborne in the trilogy’s extensive DVD extras.

Jackson insisted on the freedom to move his camera, so his team developed a computer-controlled system that would correct the tell-tale parallax. “You have the camera on a motion-controlled dolly, making it move in and out or side to side,” reveals VFX DP Brian Van’t Hul, “but you have another, smaller dolly [with one of the actors on] that’s electronically hooked to it and does the exact same motion but sort of in a counter movement.”

Forced perspective is still alive and kicking today. For Star Wars Episode IX: The Rise of Skywalker, production designer Kevin Jenkins built a 5ft sand-crawler for shooting in the Jordanian desert. “It was placed on a dressed table at height,” he explained on Twitter, “and the Jawa extras were shot at the same time a calculated distance back from the mini. A very fine powdery sand was dressed around for scale. We even made a roller to make mini track prints! Love miniatures :)”

Filming the Jawa sand-crawler for “The Rise of Skywalker”

“Terminator 2: Judgment Day” Retrospective

Next month, Terminator 2: Judgment Day turns 30. Made by a director and star at the peaks of their powers, T2 was the most expensive film ever at the time, and remains both the highest-grossing movie of Arnold Schwarzenegger’s career and the sequel which furthest out-performed its progenitor. It is also one of a handful of films that changed the world of visual effects forever, signalling as it did – to borrow the subtitle from its woeful follow-up – the rise of the machines.

No fate but what we make: Linda Hamilton as Sarah Connor

The original Terminator, a low-budget surprise hit in 1984, launched director James Cameron’s career and cemented Schwarzenegger’s stardom, but it wasn’t until 1990 that the sequel was green-lit, mainly due to rights issues. At the Cannes Film Festival that year, Cameron handed executive producer Mario Kassar his script.

Today it’s easy to forget how risky it was to turn the Terminator, an iconic villain, an unstoppable, merciless death machine from an apocalyptic future, into a good guy who doesn’t kill anyone, stands on one leg when ordered, and looks like a horse when he attempts to smile. But Kassar didn’t balk, granting Cameron a budget ten times what he had had for the original, while stipulating that the film had to be in cinemas just 14 months later.

Even with some expensive sequences cut – including John Connor sending Kyle Reese back through time in the heart of Skynet HQ, a scene that would ultimately materialise in Terminator Genisys – the script was lengthy and extremely ambitious. Beginning on October 8th, 1990, the shooting schedule was front-loaded with effects shots to give the maximum time for CGI pioneers Industrial Light and Magic to realise the liquid metal T-1000 (Robert Patrick).

Rather than CGI, the T-1000’s head in this shot is a chrome model lifted into frame by a crew member.

To further ease ILM’s burden, every trick in the book was employed to get T-1000 shots in camera wherever possible: quick shots of the villain’s fight with the T-800 (Schwarzenegger) in the steel mill finale were done with a stuntman in a foil suit; a chrome bust of Patrick was hand-raised into frame for a helicopter pilot’s reaction shot; the reforming of the shattered T-1000 was achieved by blowing mercury around with a hair dryer; bullet hits on the character’s torso were represented by spring-loaded silver “flowers” that burst out of a pre-scored shirt on cue.

One of the chilling full-size T-800 endoskeleton puppets created by Stan Winston Studio for the Future War sequence

Stan Winston Studio also constructed a number of cable-controlled puppets to show more extensive damage to the morphing menace. These included “Splash Head”, a bust of Patrick with the head split in two by a shotgun blast, and “Pretzel Man”, the nightmarish result of a grenade hit moments before the T-1000 falls to its doom in the molten steel.

Traditional models and rear projection are used throughout the film. A few instances are all too obvious to a modern audience, but most still look great and some are virtually undetectable. Did you know that the roll-over and crash of the cryo-tanker were shot with miniatures? Or that the T-800 plucking John off his bike in the drainage channel was filmed against a rear projection screen?

Plenty of the action was accomplished without such trickery. The production added a third storey to a disused office building near Silicon Valley, then blew it up with 100 gallons of petrol, to show the demise of Cyberdyne Systems. DP Adam Greenberg lit 5.5 miles of freeway for the car chase, and pilot Chuck Tamburro really did fly the T-1000’s police helicopter under a 20ft underpass.

Chaotic, confusing action scenes are the norm today, but it is notable that T2’s action is thrilling yet never unclear. The film sends somewhat mixed messages though, with its horrific images of nuclear annihilation and the T-800’s morality lessons from John juxtaposed with indulgent violence and a reverence for firearms. “I think of T2 as a violent movie about world peace,” Cameron paradoxically stated. “It’s an action movie about the value of human life.”

More Stan Winston puppets were used to depict Sarah’s death by nuclear blast in her nightmare.

Meanwhile, 25 person-years of human life were being devoted by ILM to the T-1000’s metallic morphing abilities. Assistant VFX supervisor Mark Dippé noted: “We were pushing the limits of everything – the amount of disc space we had, the amount of memory we had in the computers, the amount of CPUs we had. Each shot, even though it only lasted about five seconds on the screen, typically would take about eight weeks to complete.”

Robert Patrick shooting reference footage for ILM’s animators

The team began by painting a 2×2” grid on a near-naked Patrick and shooting reference footage of him walking, before laser-scanning his head at the appropriately-named Cyberware Laboratory. Four separate computer models of the T-1000 were built on Silicon Graphics Iris 4Ds, from an amorphous blob to a fully-detailed chrome replica of Patrick, each with corresponding points in 3D space so that the custom software Model Interp could morph between them.

Other custom applications included Body Sock, a solution to gaps that initially appeared when the models flexed their joints, Polyalloy Shader, which gave the T-1000 its chrome appearance, and Make Sticky, with which images of Patrick were texture-mapped onto the distorting 3D model, as when he melts through a barred gate at the mental hospital.

The film’s legacy in visual effects – for which it won the 1992 Oscar – cannot be overstated. A straight line can be drawn from the water tendril in Cameron’s The Abyss, through T2 to Jurassic Park and all the way on to Avatar, with which Cameron again broke the record for the highest-grossing film of all time. The Avatar sequels will undoubtedly push the technology even further, but for many Cameron fans his greatest achievement will always be Terminator 2: Judgment Day, with its perfect blend of huge stunts, traditional effects and groundbreaking CGI.


“Jurassic Park” Retrospective

With the temporary closure of Cineworlds around the UK, the future of theatrical exhibition once more hangs in the balance. But just a couple of months ago cinemas were reopening and people were positive that the industry would recover. One of the classic blockbusters that was re-released to plug the gaps in the release schedule ahead of Christopher Nolan’s Tenet was a certain quite popular film about dinosaurs. I described my trip to see it recently, but let’s put that hideous experience behind us and concentrate on the film itself.

Thanks in no small part to the excellent “making of” book by Don Shay and Jody Duncan, Jurassic Park was a formative experience for the 13-year-old Neil Oseman, setting me irrevocably on the path to filmmaking as a career. So let me take you back in time and behind the scenes of an iconic piece of popcorn fodder.

 

Man creates dinosaurs

Even before author Michael Crichton delivered the manuscript of his new novel in May 1990, Steven Spielberg had expressed an interest in adapting it. A brief bidding war between studios saw Joe Dante (Gremlins), Tim Burton (Batman) and Richard Donner (Superman) in the frame to direct, but Spielberg and Universal Pictures were the victors.

Storyboards by David Lowery. Lots of the film’s storyboards are reproduced in “The Making of Jurassic Park” by Don Shay and Jody Duncan.

The screenplay went through several drafts, first by Crichton himself, then by Malia Scotch Marmo and finally by David Koepp, who would go on to script Mission: Impossible, Spider-Man and Panic Room. Pre-production began long before Koepp finished writing, with Spielberg generating storyboards based directly on scenes from the book so that his team could figure out how they were going to bring the dinosaurs to life.

Inspired by a life-size theme park animatronic of King Kong, Spielberg initially wanted all the dinosaurs to be full-scale physical creatures throughout. This was quickly recognised as impractical, and instead Stan Winston Studio, creators of the Terminator endoskeleton, the Predator make-up and the fifteen-foot-tall Alien queen, focused on building full-scale hydraulically-actuated dinosaurs that would serve primarily for close-ups and mids.

Stan Winston’s crew with their hydraulic behemoth

Meanwhile, to accomplish the wider shots, Spielberg hired veteran stop-motion animator Phil Tippett, whose prior work included ED-209 in RoboCop, the tauntaun and AT-AT walkers in The Empire Strikes Back, and perhaps most relevantly, the titular creature from Dragonslayer. After producing some beautiful animatics – to give the crew a clearer previsualisation of the action than storyboards could provide – Tippett shot test footage of the “go-motion” process he intended to employ for the real scenes. Whilst this footage greatly improved on traditional stop-motion by incorporating motion blur, it failed to convince Spielberg.

At this point, Dennis Muren of Industrial Light and Magic stepped in. Muren was the visual effects supervisor behind the most significant milestones in computer-generated imagery up to that point: the stained-glass knight in Young Sherlock Holmes (1985), the water tendril in The Abyss (1989) and the liquid metal T-1000 in Terminator 2: Judgment Day (1991). When Spielberg saw his test footage – initially just skeletons running in a black void – the fluidity of the movement immediately grabbed the director’s attention. Further tests, culminating in a fully-skinned tyrannosaur stalking a herd of gallimimuses, had Spielberg completely convinced. On seeing the tests himself, Tippett famously quipped: “I think I’m extinct.”

The first CGI test

Tippett continued to work on Jurassic Park, however, ultimately earning a credit as dinosaur supervisor. Manipulating a custom-built armature named the Dinosaur Input Device, Tippett and his team were able to have their hands-on techniques recorded by computer and used to drive the CG models.

Building on his experiences working with the E.T. puppet, Spielberg pushed for realistic animal behaviours, visible breathing, and bird-like movements reflecting the latest palaeontological theories, all of which would lend credibility to the dinosaurs. Effects co-supervisor Mark Dippé stated: “We used to go outdoors and run around and pretend we were gallimimuses or T-Rexes hunting each other, and shoot [reference] film.”

 

Dinosaurs eat man

Stan Winston’s triceratops was the first dinosaur to go before the cameras, and the only one to be filmed on location.

Production began in August 1992 with three weeks on the Hawaiian island of Kauai. Filming progressed smoothly until the final day on location, which had to be scrubbed due to Hurricane Iniki (although shots of the storm made it into the finished film). After a brief stint in the Mojave Desert, the crew settled into the stages at Universal Studios and Warner Brothers to record the bulk of the picture.

The most challenging sequence to film would also prove to be the movie’s most memorable: the T-Rex attack on the jeeps containing Sam Neill’s Dr. Grant, Jeff Goldblum’s Ian Malcolm, lawyer Gennaro and the children, Lex and Tim. It was the ultimate test for Stan Winston’s full-scale dinosaurs.

The T-Rex mounted on its motion simulator base on Stage 16 at Warner Brothers

The main T-Rex puppet weighed over six tonnes and was mounted on a flight simulator-style platform that had to be anchored into the bedrock under the soundstage. Although its actions were occasionally pre-programmed, the animal was mostly puppeteered live using something similar to the Dinosaur Input Device.

But the torrential rain in which the scene takes place was anathema to the finely tuned mechanics and electronics of the tyrannosaur. “As [the T-Rex] would get rained on,” Winston explained, “his skin would soak up water, his weight would change, and in the middle of the day he would start having the shakes and we would have to dry him down.”

Although hints of this shaking can be detected by an eagle-eyed viewer, the thrilling impact of the overall sequence was clear to Spielberg, who recognised that the T-Rex was the star of his picture. He hastily rewrote the ending to bring the mighty creature back, relying entirely on CGI for the new climax in which it battles raptors in the visitor centre’s rotunda.

The CGI T-Rex in the rewritten finale

 

Woman inherits the earth

After wrapping 12 days ahead of schedule, Jurassic Park hit US cinemas on June 11th, 1993. It became the highest-grossing film of all time, a title which it would hold until Titanic’s release four years later. 1994’s Oscar ceremony saw the prehistoric blockbuster awarded not only Best Visual Effects but also Best Sound Editing and Best Sound Mixing. Indeed, Gary Rydstrom’s contribution to the film – using everything from a dolphin/walrus combination for the raptors’ calls, to the sound of his own dog playing with a rope toy for the T-Rex – cannot be overstated.

Jurassic Park has spawned four sequels to date (with a fifth on the way), and its impact on visual effects was enormous. For many years afterwards, blockbusters were filled with CGI that was unable to equal, let alone surpass, the quality of Jurassic Park’s. Watching it today, the CGI is still impressive if a little plasticky in texture, but I believe that the full-size animatronics which form the lion’s share of the dinosaurs’ screen time are what truly give the creatures their memorable verisimilitude. The film may be 27 years old, but it’s still every bit as entertaining as it was in 1993.

This article first appeared on RedShark News.

Director of photography Dean Cundey, ASC with the brachiosaur head puppet

Will “The Mandalorian” Revolutionise Filmmaking?

Last week, Greig Fraser, ASC, ACS and Baz Idoine were awarded the Emmy for Outstanding Cinematography for a Single-camera Series (Half-hour) for The Mandalorian. I haven’t yet seen this Star Wars TV series, but I’ve heard and read plenty about it, and to call it a revolution in filmmaking is not hyperbole.

Half of the series was not shot on location or on sets, but on something called a volume: a stage with walls and ceiling made of LED screens, 20ft tall, 75ft across and encompassing 270° of the space. I’ve written before about using large video screens to provide backgrounds in limited ways, outside of train windows for example, and using them as sources of interactive light, but the volume takes things to a whole new level.

In the past, the drawback of the technology has been one of perspective; it’s a flat, two-dimensional screen. Any camera movement revealed this immediately, because of the lack of parallax. So these screens tended to be kept to the deep background, with limited camera movement, or with plenty of real people and objects in the foreground to draw the eye. The footage shown on the screens was pre-filmed or pre-rendered, just video files being played back.

The Mandalorian‘s system, run by multiple computers simultaneously, is much cleverer. Rather than a video clip, everything is rendered in real time from a pre-built 3D environment known as a load, running on software developed for the gaming industry called Unreal Engine. Around the stage are a number of witness cameras which use infra-red to monitor the movements of the cinema camera in the same way that an actor is performance-captured for a film like Avatar. The data is fed into Unreal Engine, which generates the correct shifts in perspective and sends them to the video walls in real time. The result is that the flat screen appears, from the cinema camera’s point of view, to have all the depth and distance required for the scene.
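
The geometry behind this is straightforward even if the engineering isn't. Here's a toy Python sketch of the principle – not how Unreal Engine actually implements it (a real system renders the whole frustum with off-axis projection) – showing where a virtual point must be drawn on a flat wall so that it looks correct from the tracked camera's position:

```python
def project_to_wall(cam, pt, wall_z):
    """Where the line from the tracked camera position 'cam' through a
    virtual 3D point 'pt' crosses a flat LED wall at depth wall_z.
    Drawing the point there makes it look correct from the camera -
    the essence of camera-tracked, real-time parallax. (Toy example;
    hypothetical coordinates in metres, z pointing into the screen.)"""
    t = (wall_z - cam[2]) / (pt[2] - cam[2])
    return tuple(c + t * (p - c) for c, p in zip(cam, pt))

# When the camera dollies 1m sideways, a virtual mountain 100m away must
# be redrawn almost a full 1m across the wall (so it appears fixed at
# infinity), while a nearby virtual rock's image shifts much less:
print(project_to_wall((0, 0, 0), (0, 0, 100), 5))  # (0.0, 0.0, 5.0)
print(project_to_wall((1, 0, 0), (0, 0, 100), 5))  # (0.95, 0.0, 5.0)
print(project_to_wall((1, 0, 0), (0, 0, 8), 5))    # (0.375, 0.0, 5.0)
```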

The loads are created by CG artists working to the production designer’s instructions, and textured with photographs taken at real locations around the world. In at least one case, a miniature set was built by the art department and then digitised. The scene is lit with virtual lights by the DP – all this still during preproduction.

The volume’s 270° of screens, plus two supplementary, moveable screens in the 90° gap behind camera, are big enough and bright enough that they provide most or all of the illumination required to film under. The advantages are obvious. “We can create a perfect environment where you have two minutes to sunset frozen in time for an entire ten-hour day,” Idoine explains. “If we need to do a turnaround, we merely rotate the sky and background, and we’re ready to shoot!”

Traditional lighting fixtures are used minimally on the volume, usually for hard light, which the omni-directional pixels of an LED screen can never reproduce. If the DPs require soft sources beyond what is built into the load, the technicians can turn any off-camera part of the video screens into an area of whatever colour and brightness are required – a virtual white poly-board or black solid, for example.

A key reason for choosing the volume technology was the reflective nature of the eponymous Mandalorian’s armour. Had the series been shot on a green-screen, reflections in his shiny helmet would have been a nightmare for the compositing team. The volume is also much more actor- and filmmaker-friendly; it’s better for everyone when you can capture things in-camera, rather than trying to imagine what they will look like after postproduction. “It gives the control of cinematography back to the cinematographer,” Idoine remarks. VR headsets mean that he and the director can even do a virtual recce.

The Mandalorian shoots on the Arri Alexa LF (large format), giving a shallow depth of field which helps to avoid moiré problems with the video wall. To ensure accurate chromatic reproduction, the wall was calibrated to the Alexa LF’s colour filter array.

Although the whole system was expensive to set up, once up and running it’s easy to imagine how quickly and cheaply the filmmakers can shoot on any given “set”. The volume has limitations, of course. If the cast need to get within a few feet of a wall, for example, or walk through a door, then that set-piece has to be real. If a scene calls for a lot of direct sunlight, then the crew move outside to the studio backlot. But undoubtedly this technology will improve rapidly, so that it won’t be long before we see films and TV episodes shot entirely on volumes. Perhaps one day it could overtake traditional production methods?

For much more detailed information on shooting The Mandalorian, see this American Cinematographer article.
