How to do Scenes on a Moving Train

Behind the scenes of “Last Passenger”

The publicity machine is ramping up for Kenneth Branagh’s Murder on the Orient Express remake, and it’s got me thinking about the challenges of a script set largely on a moving train. There are a number of ways of realising such scenes, and today I’m going to look at five movies that demonstrate different techniques. All of these methods are equally applicable to scenes in cars or any other moving vehicle.

1. For Real: “The Darjeeling Limited”

Wes Anderson’s 2007 film The Darjeeling Limited sees three brothers embarking on a spiritual railway journey across India. Many of the usual Anderson tropes are present and correct – linear tracking shots, comical headgear, Jason Schwartzman – but surprisingly the moving train wasn’t done with some kind of cutesy stop-motion. Production designer Mark Friedberg explains:

The big creative decision Wes made was that we were going to shoot this movie on a moving train. And all that does is complicate life. It makes it more expensive, it makes the logistics impossible. It made it incredibly difficult to figure out how many crew, what crew, what gear… but what it did do is it made it real.

Kenneth Branagh has stated that at least some of Murder on the Orient Express was shot on a real moving train too:

They painstakingly built a fully functioning period authentic locomotive and carriages from the Orient Express during the golden, glamorous age of travel. It was a train that moved… All of our actors were passengers on the train down the leafy lanes of Surrey, pretending to be the former Yugoslavia.

 

2. Poor Man’s Process: “The Double”

Director Richard Ayoade

Although best known as The IT Crowd‘s Moss and the new host of the Crystal Maze, Richard Ayoade is also an accomplished director. His last feature was a darkly beautiful adaptation of Dostoyevsky’s classic identity-crisis novella The Double. 

Unlike the other movies on this list, The Double only has short sequences on a train, and that’s a key point. So named because it’s a cheap alternative to rear projection (a.k.a. process photography), Poor Man’s Process is a big cheat. In order to hide the lack of motion, you keep the view outside your vehicle’s windows blank and featureless – typically a night sky, but a black subway tunnel or a grey daytime sky can also work. Then you create the illusion of motion with dynamic lighting, a shaky camera, and grips rocking the carriage on its suspension. Used judiciously, this technique can be very convincing, but you would never get away with it for a whole movie.

Poor Man’s works particularly well in The Double, the black void outside the subway car playing into the oppressive and nightmarish tone of the whole film. In an interview with Pushing Pixels, production designer David Crank explains how the subway carriage set was built out of an old bus. He goes on to describe how the appearance of movement was created:

We put the forks of a forklift under the front of the bus, and shook it… For the effect of moving lights outside the train, it was a combination of some spinning lights on stands, as well as lights on small rolling platforms which tracked back and forth down the outside of the bus.

Part 2 of the Darjeeling Limited featurette above reveals that Poor Man’s Process was also used occasionally on that film, when the train was stuck in a siding due to heavy rail traffic. I used Poor Man’s myself for night-time train sequences in two no-budget features that I made in the early noughties – see the BTS clip below – and I’ve also written a couple of blog posts in the past about my use of the same technique on a promotional video and in a fantasy web series.

 

3. Green screen: “Source Code”

Duncan “Zowie Bowie” Jones followed up his low-budget masterpiece Moon with Hollywood sci-fi thriller Source Code, a sort of mash-up of Quantum Leap and Groundhog Day with a chilling twist. It takes place predominantly on a Chicago-bound commuter train, in reality a set surrounded by green screen. In the featurette above, Jones mentions that shooting on a real moving train was considered, but ultimately rejected in favour of the flexibility of working on stage:

Because we revisit an event multiple times, it was absolutely integral to making it work, and for the audience not to get bored, that we were able to vary the visuals. And in order to do that we had to be able to build platforms outside of the train and be able to really vary the camera angles.

In the DVD commentary, Jones also notes that the background plates were shot in post from a real train “loaded up with cameras”.

Director Duncan Jones on the set of “Source Code”

Cinematographer Don Burgess, ASC discusses lighting the fake train in a Panavision article:

It’s difficult to make it feel like natural light is coming in and still get the sense of movement on a train… We worked with computer programs where we actually move the light itself, and brighten and dim the lights so it feels as if you are travelling… The lights are never 100% constant.
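
Burgess doesn’t go into specifics about those programs, but to give a rough idea of what programmed brightness variation looks like, here’s a toy Python sketch – purely illustrative, not the system used on Source Code – that drifts a base level slowly and dips it briefly as imaginary trackside objects pass:

```python
import math, random

def travelling_dimmer_levels(frames=250, fps=25, base=0.7, passes=6):
    """Per-frame lamp levels (0-1): a slow drift plus brief dips as
    imaginary trackside objects 'pass' the window."""
    dips = sorted(random.uniform(0, frames) for _ in range(passes))
    levels = []
    for f in range(frames):
        level = base + 0.1 * math.sin(f / fps * 0.5)          # slow daylight drift
        for d in dips:
            level -= 0.35 * math.exp(-((f - d) / 3.0) ** 2)   # passing shadow
        levels.append(min(1.0, max(0.0, level)))
    return levels
```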

When I shot The Little Mermaid last year we did some train material against green screen. To make the lighting dynamic, the grips built “branch-a-loris” rigs: windmills of tree branches which they would spin in front of the lamps to create passing shadows.

 

4. Rear projection: “Last Passenger”

Perhaps the most low-budget film on this list, Last Passenger is a 2013 independent thriller set aboard a runaway train. Director Omid Nooshin and DP Angus Hudson wanted a vintage look, choosing Cooke Xtal anamorphic lenses and a visual effects technique that had long since fallen out of favour: rear projection.

Before the advent of optical – and later digital – compositing, rear projection was commonly used to provide moving backgrounds for scenes in vehicles. The principle is simple: pre-recorded backgrounds are projected onto a screen behind the vehicle set, and the camera photographs the actors and the projected image together in a single pass, like this…

Rear projection in use on “River of No Return” (1954)

Hudson goes into further detail on the technique as used on Last Passenger:

To capture [the backgrounds] within our limited means, we ended up shooting from a real train using six Canon 5D cameras, rigged in such a way that we got forward, sideways and rear-facing views out of the train at the same time. We captured a huge amount of footage, hours and hours of footage. That allowed us to essentially have 270 degrees of travelling shots, all of which were interlinked.

Because rear projection is an in-camera technique, Nooshin and Hudson were able to have dirt and water droplets on the windows without worrying about creating a compositing nightmare in postproduction. Hudson also notes that the cast loved being able to see the backgrounds and react to them in real time.

 

5. LED Panels: “Train to Busan”

Enabling the actors to see the background plates was also a concern for Yeon Sang-ho, director of the hit Korean zombie movie Train to Busan. He felt that green screen would make it “difficult to portray the reality”, so he turned to the latest technology: LED screens. This must have made life easier not just for the cast, but for the cinematographer as well.

You see, when you travel by train in the daytime, most of the light inside the carriage comes from outside. Some of it is toplight from the big, flat sky, and some of it is hard light from the sun – both of these can be faked, as we’ve seen – but a lot of the light is reflected, bouncing off trees, houses, fields and all the other things that are zipping by. This is very difficult to simulate with traditional means, but with big, bright LED screens you get this interactive lighting for free. Because of this, and the lack of postproduction work required, this technique is becoming very popular for car and train scenes throughout the film and TV industry.

This brings us back to Murder on the Orient Express, for which 2,000 LED screens were reportedly employed. In a Digital Spy article, Branagh notes that this simulated motion had an unintended side effect:

It was curious that on the first day we used our gimballed train sets and our LED screens with footage that we’d gone to great trouble to shoot for the various environments – the lowlands and then the Alps, etc… people really did feel quite sick.

I’ll leave you with one final point of interest: some of the above films designed custom camera tracks into their train carriage sets. On Last Passenger, for example, the camera hung from a dolly which straddled the overhead luggage racks, while The Darjeeling Limited had an I-beam track designed into the centre of the ceiling. Non-train movies like Speed have used the same technique to capture dolly shots in the confines of a moving vehicle.

The luggage rack dolly on “Last Passenger”



Book Review: “Green Screen Made Easy”

Micro-filmmaker Magazine’s Jeremy Hanke recently got in touch and asked if I would review his book, “Green Screen Made Easy”. I used to make a lot of micro- and no-budget movies packed full of VFX, but I usually avoided green-screen because I could never make it look good. Although those kinds of projects are behind me, I agreed to the review because I figured that this book might help others succeed where I’d failed – and also I was interested to find out why I had failed!

What Jeremy and his co-author Michele Terpstra set out to do is to cover the entire process from start to finish: defining chromakeying, buying or building a green screen, lighting and shooting it, sourcing or shooting background plates, choosing keying software, and all aspects of the keying itself.

The book is aimed at no-budget filmmakers, hobbyists or aspiring professionals making self-funded or crowd-funded productions, those digital auteurs who are often their own producers, writers, DPs, editors, colourists and VFX artists. Perhaps you’ve tried green-screening before and been disappointed with the results. Perhaps you’ve always seen it as a bit too “techie” for you. Perhaps the unpaid VFX artist you had lined up for your sci-fi feature just pulled out. Or perhaps you’ve already reached a certain level of competency with keying and now you want to step up a level for your next production. If any of these scenarios ring true with you, I believe you’ll find this book very useful.

“Green Screen Made Easy” is divided into two halves, the first half (by Jeremy) on prepping and executing your green screen shoot, and the second half (by Michele) on the postproduction process. Both authors clearly write from extensive first-hand experience; throughout the text are the kind of tips and work-arounds that only come from long practice. By necessity there is a fair amount of technical content, but everything is lucidly explained and there’s a handy glossary if any of the terms are unfamiliar to you.

The section on lighting and shooting green screen material contained few surprises for me as a cinematographer – see my post on green screen for my own tips on this subject – but will be very useful to those newer to the field. The chapters on equipment are very thorough, considering everything from which camera and settings to choose to ensure the best key later on, to buying or building a mobile green screen, or even kitting out your own green screen studio – all with various alternatives to suit any budget.

The postproduction chapters revealed clearly why I struggled with keying in the past. Michele explains how the process is much more than simply pulling a single key, and can involve footage clean-up, garbage matting, a core key and a separate edge key, spill suppression, hold-out matting and light wrapping. The book guides you through all these steps, and outlines the pros and cons of the software and plug-in options for each step.
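
For the curious, the arithmetic at the heart of a basic key is simpler than the long list of steps suggests. This isn’t the book’s software-specific workflow, just an illustrative numpy sketch of a green-difference matte, spill suppression and a straight composite:

```python
import numpy as np

def green_difference_matte(rgb, gain=2.0):
    """rgb: float array in 0-1, shape (H, W, 3). Returns matte: 1 = foreground."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    green_excess = g - np.maximum(r, b)          # how much greener than anything else
    return np.clip(1.0 - green_excess * gain, 0.0, 1.0)

def suppress_spill(rgb):
    """Clamp green so it never exceeds the brighter of red and blue."""
    out = rgb.copy()
    out[..., 1] = np.minimum(out[..., 1], np.maximum(out[..., 0], out[..., 2]))
    return out

def composite(fg, bg, matte):
    return fg * matte[..., None] + bg * (1.0 - matte[..., None])
```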

Once you’ve read this book, I’d say the only other thing you’ll need before you can start successfully green-screening is to watch some YouTube tutorial videos specific to your software. While the instructions in the book look pretty good (as far as I can tell without attempting to follow them), the medium of text seems a little restrictive in teaching what is inherently a visual process. There are explanatory images throughout “Green Screen Made Easy”, but in the ebook version at least I found it difficult to discern the subtle differences in some of the before-and-after comparisons.

Ultimately what will make you the best “green-screener” is practice, practice, practice, but by reading this book first you’ll give yourself a rock-solid foundation, an appreciation of the entire process from start to finish, and the insider knowledge to avoid a lot of time-sucking pitfalls. And keep it handy, because you’ll be sure to thumb through it and re-read those handy tips throughout your prep, production and post.

“Green Screen Made Easy” is available in paperback and ebook editions from Amazon.


5 Tips for Lighting a Green Screen

Green screen work is almost unavoidable for a modern cinematographer. In an age when even the most basic of corporates might use the technique, and big blockbusters might never leave the green screen stage, knowing how to light for it is essential. The following tips apply equally to blue screen work…

1. Light the screen at key.

Or to put it another way, your screen should not be over- or under-exposed. If you use a light meter, you can hold it at various spots on the screen (taking care not to block any light with your body) and check that the reading always matches what the iris of your lens is set to. If your camera or monitor has a false colours option, you can use this to check the level and consistency of the exposure across the screen.
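
If you’d rather sanity-check a frame grab than walk the screen with a meter, something like the following rough numpy sketch (purely illustrative – the region coordinates and tolerance are up to you) reports the mean level and the spread in stops across the screen area:

```python
import numpy as np

def screen_evenness(frame, region, tolerance_stops=1/3):
    """frame: float RGB array in 0-1; region: (top, bottom, left, right)
    bounding the green screen. Returns mean luma, spread in stops, pass/fail."""
    t, b, l, r = region
    patch = frame[t:b, l:r]
    # Rec.709 luma as a stand-in for the meter reading
    luma = 0.2126 * patch[..., 0] + 0.7152 * patch[..., 1] + 0.0722 * patch[..., 2]
    spread = float(np.log2(luma.max() / max(luma.min(), 1e-6)))
    return float(luma.mean()), spread, spread <= tolerance_stops
```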

2. Use soft sources.

Bouncing tungsten lamps off polyboard is a cheap and effective way to spread soft light across a green screen. Typically you will want two sources, one to each side of the screen. They will need to be well flagged so that their light does not spill onto the subject.

On larger budgets, Kinoflo Image 85s or 87s are often used to illuminate green screens. They are 4ft 8-bank units which put out a large amount of soft light. Ask your hire company to supply them with spiked green tubes; designed especially for green screen work, these tubes help to increase the colour saturation of the screen. (Spiked blue tubes are also available.)

3. Control spill.

As far as possible, reflected green light from the screen should not fall on the subject. The main way to ensure this is to put as much distance as possible between the screen and the subject.

I learnt a great tip recently which also helps reduce spill: once the exact camera position is known, bring in 4×4 floppy flags slightly behind the subject, one either side, just out of frame.

4. Avoid dark shadows.

Green spill will bleed most easily into the dark areas on your subject, especially if you’re shooting with a wide aperture. Clipped (or ‘crushed’) blacks are particularly undesirable. The solution is to use more fill light, even if this goes against the mood and contrast levels you’re using in non-VFX shots. If you use LUTs, you should consider creating a custom one for green screen work which pushes the contrast further to compensate for this flatter starting point. If not, you will have to work with the colourist in post to ensure that the shadows are restored to their usual levels once the VFX are complete.
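
As a very rough illustration of that idea – a real monitoring LUT would be built on top of your camera’s particular log curve, so treat this as a sketch only – here are a few lines of Python that write a simple 1D .cube LUT adding linear contrast around middle grey:

```python
def write_contrast_lut(path="greenscreen_contrast.cube", size=1024, contrast=1.3):
    """Write a simple 1D .cube LUT that pushes contrast around middle grey."""
    with open(path, "w") as f:
        f.write('TITLE "Green screen monitoring contrast"\n')
        f.write(f"LUT_1D_SIZE {size}\n")
        for i in range(size):
            x = i / (size - 1)
            y = min(max(0.5 + (x - 0.5) * contrast, 0.0), 1.0)
            f.write(f"{y:.6f} {y:.6f} {y:.6f}\n")
```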

5. Add tracking markers.

Camera movement against green screen isn’t the no-no that it used to be, with any VFX team worth their salt being able to deal with handheld shots, pans, tilts and push-ins. If there isn’t a VFX supervisor on set, you can help them out by taping crosses to a few points on the screen. There should always be at least one marker in shot throughout the camera move (more if it’s a multi-axis move), and they shouldn’t stay put behind any tricky edges (e.g. long hair) for long.


Black-screen & White-screen: The Best Kept Secrets in Compositing

Accessing the compositing modes in Final Cut Pro 7

When it comes to shooting elements for VFX, green-screen gets all the press. But certain kinds of elements can be tricky to key well, and sometimes it’s not the right look. In the last few days Kate Madison and I have needed to shoot last-minute elements for some shots in Ren: The Girl with the Mark, and we turned to monochromatic backgrounds.

Why? How does it work? Well certainly you can key out black or white just like you’d key out green, but the most powerful way to use these backgrounds is not with keying at all, but by a bit of basic maths. And don’t worry, the computer does the maths for you.

If you’ve ever used Photoshop, you’ll have noticed some layer modes called Screen and Multiply. Final Cut Pro has the same modes (it also has Add, which to most intents and purposes is the same as Screen) and so do all the major editing and FX packages.

Screen adds the brightness of each pixel of the layer to the layer underneath. Since black has a brightness of zero, your black screen disappears, and the element in front of it is blended seamlessly into the background image, with its apparent solidity determined by its brightness.

Multiply, as the name suggests, multiplies the brightness of each pixel with the layer underneath. Since white has a brightness of one, and any number multiplied by one is that same number, your white screen vanishes. Whatever element is in front of your screen is blended into the background image, with darker parts of the element showing up more than lighter parts.
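
Strictly speaking, Screen isn’t a straight addition – the formula is 1 − (1 − a)(1 − b), which behaves like Add but can never clip past white – but the upshot is the same: black contributes nothing, white contributes everything. Here’s a minimal sketch of the three modes:

```python
import numpy as np

# a = the element layer, b = the layer underneath; floats (or arrays) in 0-1
def add_mode(a, b):
    return np.clip(a + b, 0.0, 1.0)

def screen_mode(a, b):
    return 1.0 - (1.0 - a) * (1.0 - b)   # like Add, but can never exceed 1.0

def multiply_mode(a, b):
    return a * b

# A black (0.0) element pixel leaves the background untouched in Add/Screen;
# a white (1.0) element pixel leaves it untouched in Multiply.
print(screen_mode(0.0, 0.6), multiply_mode(1.0, 0.6))   # both roughly 0.6
```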

One of the elements Kate and I needed to shoot was a flame, to be comped onto a torch. We lit a torch and clamped it to a stand, shooting at night with the pitch black garden in the background. It was the work of moments to comp this element into the shot using Screen mode.

The flame element, shot at night in the garden to ensure a seamless black background
I adjusted the flame’s size and used Screen mode to composite it over the background.

Fire is the perfect partner for black-screen shooting, because it generates its own light and it’s not solid. Solid objects composited using Screen/Add or Multiply take on a ghostly appearance – perfect for, er, ghost effects – but not ideal in other situations; because of the way Screen mode works, anything that’s not peak white will be transparent to some degree.

We shot some fast-moving leaves and debris against black, but only the high level of motion blur allowed us to get away with it. In fact, if you know you’re going to have a lot of motion blur, black-screen might be the ideal method, because it will be tricky to get a clean key off a green-screen.

A smoke element shot against a black drape and backlit so that the smoke is visible but the drape is not
Shooting dirt in a vase of water against white

Other things that work well against black-screen are sparks, smoke and water/rain, again because they’re not solid. If you want to add rain or snow to a shot, black-screen is the way to go – check out my post about that here.

Yesterday Kate and I needed to shoot a whirlwind element. One of the VFX team suggested swirling sand in a vase of water. After a few experiments in the kitchen, we ended up using dirt from the garden. We used fluorescent softboxes for the background, ensuring we got a bright white background, and made weird arrangements of white paper to eliminate as many of the dark reflections in the vase as we could.

One of the tornado elements shot with the set-up pictured above. We let the dirt settle in the bottom of the water, then swirled the water with a spoon (which had to be kept out of frame).

A few weeks back we shot hosepipe water against black, inverted it and used Multiply to superimpose it as blowing dirt.

With a little thinking outside the box, you can shoot all kinds of elements against white or black to meet your VFX needs. I’ll leave you with this featurette I made in 2006, breaking down the various low-tech FX – many of them black-screen – that I employed on my feature film Soul Searcher.


4 Cunning Substitution Effects in Labyrinth

After countless viewings on VHS and DVD over my lifetime, I finally got to see Labyrinth on the big screen today. The imagination and detail in this film are just astonishing. Every scene has little puppet creatures wandering or flying about in the background to bring the sets to life. In today’s screening I noticed, for the first time, that there are two bottles of milk – presumably delivered by the Goblin Milkman – outside the door of Jareth’s castle. How brilliant is that?

"Where did she learn that rubbish? It doesn't even start with 'I wish'!"
“Where did she learn that rubbish? It doesn’t even start with ‘I wish’!”

Anyway, while there are many awesome things about Labyrinth, one of the techniques that I think is put to particularly good effect in the film is in-camera substitution. Typically this involves one type of puppet leaving frame briefly, and a second puppet – of the same character – reappearing in its place. Puppets are often limited in the actions that they can perform, and while scenes will commonly use different versions of the puppet in different shots to cover the full range of actions, Henson sometimes uses different versions of the puppet in the same shot to sell the illusion of a single, living creature. And though many of these effects are fairly obvious to a modern audience, you can still admire their ingenious design and perfect timing.

Skip through the movie to the timecodes listed below to see some of the best substitution effects.

1. Goblin Under the Bedclothes – 11:40

In the film’s first puppet scene, Sarah’s parents’ bedroom becomes infested with goblins, building up to David Bowie’s big oh-so-eighties entrance. One goblin crawls along the bed, under the sheets, before emerging. It looks like the initial crawling is achieved by pulling a rough goblin shape along on a wire under the sheets. The shape then drops out of the end of the bedclothes, behind a chest, and a moment later a puppet pops up from behind the same chest. This substitution effect obviates the need for a custom-built or chopped-up bed, which would have been necessary to permit the passage of the proper puppet and its puppeteer under the bedclothes.

Sir Didymus

2. Sir Didymus’ Acrobatics – 58:35

This shot appears to employ three different models of Sir Didymus, the honourable but fighting-crazed guardian of the bridge over the Bog of Eternal Stench. The first is a floppy version which is thrown behind some rocks by Ludo. After a practical puff of dust, a second Sir Didymus – this one in a more rigid, leaping position – is launched from some kind of catapult hidden behind the rocks. He flies out of frame, to be replaced a moment later by the Muppet-style hand- and rod-puppet which is used for the majority of Sir Didymus’ shots.

3. Cowardly Ambrosius – 1:16:25

To his infinite chagrin, Sir Didymus’ bravery is not matched by that of his canine steed, Ambrosius. During the battle with Humongous, the petrified pooch rears up, throwing off his valiant rider, and retires shamelessly into hiding. The rearing up is accomplished with a rather unconvincing puppet dog. After he drops back down out of frame (aided by a slight zoom in to help lose him), a real dog enters in the background, running into hiding.

"I can't live within you." Not at all creepy, Dave.
“I can’t live within you.” Not at all creepy, Dave.

4. Double David – 1:27:53

In the film’s finale number, “Within You”, David Bowie’s Goblin King messes with our sense of direction as he jumps and flips around the disorientating Escher artwork brought to life. Early in the sequence he jumps off a ledge, only to reappear simultaneously in a background doorway, now seemingly obeying a pull of gravity at 90° to that which acted on his leap. A shot of Bowie jumping off the ledge cuts to another of him coming through the doorway. The doorway is filmed with the camera on its side, and to finish the action of the first Bowie’s leap, a body double is pulled across frame on a dolly. This can be seen at 25:36 in the behind-the-scenes documentary:

This kind of low-tech but ingenious filmmaking is in danger of dying, as CGI is perceived as the only tool to create illusions. But with a little thought, a little planning, cunning framing, and a knowledge of how to use editing (or lack thereof) to your advantage, very effective illusions can still be created in camera.

If you enjoyed this post, you may also like:

The 10 Greatest Movie Puppets of All Time – including the aforementioned Humongous

Double Vision – five ways of having one actor play two characters in the same scene

Top Five Low Tech Effects – tipping my hat to the cheekiest in-camera effects used in big Hollywood movies

Five Simple But Effective Camera Tricks – revealing some simple camera tricks I’ve used in my own films


The Visual Effects of The Abyss

It’s time for one of my occasional asides celebrating the world of traditional visual effects – miniatures, matte paintings, rear projection, stop motion and the like. For a film using all of those techniques, look no further than The Abyss (1989). Arguably James Cameron’s most underrated film, it can also be considered his most ambitious. Whereas Terminator 2 had bigger action scenes, Titanic had a bigger set and Avatar had more cutting edge technology, these concerns all pale in comparison to the sheer difficulty of shooting so much material underwater.

The hour-long documentary Under Pressure makes the risks and challenges faced by Cameron and his crew very clear.

The Abyss won an Oscar for Best Visual Effects, and is remembered chiefly for the then-cutting-edge CG water tentacle. But it also ran the gamut of traditional effects techniques.

The film follows the crew of an experimental underwater drilling platform, led by Bud (Ed Harris), as they are roped into helping a team of navy divers, led by Lt. Coffey (Michael Biehn), investigate the sinking of a submarine. Underwater-dwelling aliens and cold war tensions become involved, and soon an unhinged Coffey is setting off in a submersible to dispatch a nuke to the bottom of the Cayman Trench and blow up the extra-terrestrials.

When Bud and his wife Lindsey (Mary Elizabeth Mastrantonio) give chase in a second submersible, a visual effects tour de force ensues. The following methods were used to build the sequence:


  • Medium-wide shots of the actors in real submersibles shot in an abandoned power station that had been converted by the production into the world’s largest fresh-water filtered tank, equal in capacity to about eleven Olympic swimming pools.


  • Close-ups of the actors in a submersible mock-up on stage.


  • Over-the-shoulder shots of the actors in the submersible mock-up, with a rear projection screen outside the craft’s dome, showing miniature footage accomplished with…


  • Quarter-scale radio-controlled submarines, shot in a smaller tank. These miniatures were remarkably powerful and, due to the lights and batteries on board, weighed around 450lb (204kg). In order to see what they were doing, the operators were underwater as well, using sealed waterproof joysticks to direct the craft. The RC miniatures were used when the craft needed to collide with each other, or with the underwater landscape, and whenever the audience was not going to get a good look at the domes on the front of the submersibles and notice the lack of actors within.


One of the custom film projectors inserted into the miniature subs

  • Where a more controlled camera move was required, or the actors needed to be visible inside the subs, but it was not practical to shoot full-scale, motion control was used. This is the same technique used to shoot spaceships in, for example, the original Star Wars trilogy. A computer-controlled camera moves around a static model (or vice versa), exposing film very slowly in order to maintain a large depth of field. The move is repeated several times for each different vehicle under different lighting conditions, before compositing all of the “passes” together on the optical printer in the desired ratios, to achieve the final look. For The Abyss’s motion control work, the illusion of being underwater was created with smoke. In shots featuring the submersibles’ robot arms, stop motion was employed to animate these appendages. But perhaps the neatest trick was in making the miniature subs appear to be inhabited; the models were fitted with tiny projectors which would throw pre-filmed footage of the actors onto a circular screen behind the dome.
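
Digitally speaking, combining the passes “in the desired ratios” boils down – roughly, since film response isn’t linear – to a weighted sum of the pass images. A minimal sketch, with the pass names invented for illustration:

```python
import numpy as np

def combine_passes(passes, weights):
    """passes: list of float images in 0-1; weights: exposure ratio for each pass."""
    out = sum(w * p for w, p in zip(weights, passes))
    return np.clip(out, 0.0, 1.0)

# e.g. combine_passes([beauty_pass, light_pass, smoke_pass], [1.0, 0.8, 0.4])
```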

The sub chase demonstrates perfectly how visual effects should work: mixing a range of techniques so that the audience never has time to figure out how each one is done, and using an appropriate technique for each individual shot so that you’re making things no more and no less complicated than necessary to tell that little piece of the story.

My favourite effect in the sequence is near the end, when the dome of Coffey’s sub cracks under the water pressure. This was filmed over-the-shoulder using rear projection for the view outside of the dome. But the dome was taken from a real submersible, and as such was too thick and too valuable to be genuinely cracked. So someone – and whoever he or she is, that person is an absolute genius – came up with the idea of using an arrangement of backlit sellotape on the dome to create the appearance of a crack. A flag was then set in front of the backlight, rendering the sellotape invisible. On cue, the flag was slid aside, gradually illuminating the “crack”.


Now that, my friends, is thinking outside the box.


Forced Perspective

The Ark

The other day I watched a 1966 Doctor Who story called The Ark. It’s easy to look at a TV show that old and laugh at the stilted acting, rubber monsters and crude effects. But given the archaic and draconian conditions the series was made under back then, I can only admire the creativity displayed by the director and his team in visualising a script which was scarcely less demanding than a contemporary Who story.

Studio floor plan from the very first episode of Doctor Who, showing camera positions (coloured circles)

In the sixties, each Doctor Who episode was recorded virtually as live on a Friday evening, following a week of rehearsals. BBC rules strictly limited the number of times the crew could stop taping during the 90-minute recording session, which was to produce a 22-minute episode. Five cameras would glide around the tightly-packed sets in a carefully choreographed dance, with the vision mixer cutting between them in real-time as per the director’s shooting script. (Interesting side note: some of Terminator 2 was shot in a very similar fashion to maximise the number of angles captured in a day.) It’s no wonder that fluffed lines and camera wobbles occasionally marred the show, as there was rarely time for re-takes.

But what’s really hard for anyone with a basic knowledge of visual effects to get their head around today is that, until the Jon Pertwee era began in 1970, there was no chromakey (a.k.a. blue- or green-screening) in Doctor Who. Just think about that for a moment: you have to make a science fiction programme without any electronic means of merging two images together, simple dissolves excepted.

Setting up a foreground miniature for a later Who story, Inferno (1970)

So the pioneers behind those early years of Doctor Who had to be particularly creative when they wanted to combine miniatures with live action. One of the ways they did this in The Ark was through forced perspective.

Forced perspective is an optical illusion, a trick of scale. We’ve all seen holiday photos where a friend or relative appears to be holding up the Eiffel Tower or the Leaning Tower of Pisa. The exact same technique can be used to put miniature spaceships into a full-scale live action scene.
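
The maths behind the trick is simply that apparent size scales with real size divided by distance from the lens, so a miniature built at scale s reads as full-size when it’s placed s times closer. A quick sketch, with the numbers invented for illustration:

```python
# Apparent size is proportional to real size divided by distance from the lens.
def matching_distance(full_size_distance_m, miniature_scale):
    """Distance at which a miniature reads the same size as the full-scale
    object would at full_size_distance_m."""
    return full_size_distance_m * miniature_scale

# A 1/12-scale landing craft 2m from the lens appears the same size as the
# real thing would at 24m:
print(matching_distance(24.0, 1 / 12))   # -> 2.0
```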

In these frames from The Ark, two miniature landing craft are lowered into the background before the camera pans to a full-size craft in the foreground:

The camera pans from a miniature descending in the background to a full-scale craft in the foreground.

And in these later frames, another miniature craft is placed much closer to the camera than the Monoid (a.k.a. a man in a rubber suit). The miniature craft takes off, pulled up on a wire I presume – a feat which time, money and safety would have rendered impossible with the full-size prop:

The camera pulls focus from a foreground miniature taking off to an actor in the background. A greater depth of field would have made the shot more convincing, but the principle is sound.

Of course, Doctor Who was not by any means the first show to use forced perspective, nor was it the last. This nineties documentary provides a fascinating look at the forced perspective work in the Christopher Guest remake of Attack of the 50 Ft. Woman, and other films…

And Peter Jackson famously re-invented forced perspective cinematography for the Lord of the Rings trilogy, when his VFX team figured out a way to maintain the illusion during camera moves, by sliding one of the actors around on a motion control platform…

So remember to consider all your options, even the oldest tricks in the book, when you’re planning the VFX for your next movie.


Soul Searcher: Low Tech FX

In this 2005 featurette I break down many of the visual effects in my feature film Soul Searcher, revealing how they were created using old school techniques, like pouring milk into a fishtank for apocalyptic clouds. Watch the shots being built up layer by layer, starting with mundane elements like the water from a kitchen tap or drinking straws stuck to a piece of cardboard.


Managing Visual Effects Without a Budget

Stages of Stop/Eject’s basement shelves replication effect by Mary Lapena

Stop/Eject is my fourth major project to include visual effects, and also the fourth where it’s been a struggle to get all the visual effects done. As any micro-budget filmmaker knows, it’s par for the course for some cast and crew to pull out, sometimes without warning or explanation, and VFX artists are no exception. On Soul Searcher, for example, I needed a CG artist for the 80+ shots featuring “spectral umbilical cords”. Four artists started the work and then quit, citing various excuses from exploding PCs to miscarriages, before the fifth delivered the goods.

Stop/Eject has a surprising 31 VFX shots (most of which you’d never know were VFX shots), of which the twelve simplest were handled by me and Miguel, the editor. With the remaining nineteen needing to be outsourced, how did I apply what I’d learnt from my previous projects?

  1. I advertised for multiple artists, knowing from Soul Searcher (and before that The Beacon) that relying on a single person was not a good idea. More than half the people who agreed to work on Stop/Eject never completed a single shot.
  2. I created and uploaded zip files to my webspace for each shot. Each zip contained all the footage and information needed for that shot (see the sketch after this list). This way if an artist dropped out, it was quick and easy for me to point another artist to that zip file to take over the shot.
  3. I re-advertised regularly. Beware that the law of diminishing returns applies here: each ad will reap fewer responses than the last.
  4. I assigned the most difficult shots first. That way the shots that are left at the end when the reliable artists are all burnt out and your adverts are getting no responses are – in theory – the easy ones which you can just about do yourself.
  5. I regularly checked in on the artists’ progress. If I didn’t get a reply within a couple of days, I’d assume that the artist had dropped out and I’d re-assign their shot to someone else. Harsh, but necessary.
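
For anyone who wants to automate step 2, a minimal packaging script might look something like this – the folder layout and names are hypothetical, not what I actually used:

```python
import zipfile
from pathlib import Path

def package_shot(shot_dir):
    """Zip up one shot's folder (footage, notes, reference stills) for upload."""
    shot_dir = Path(shot_dir)
    zip_path = shot_dir.with_suffix(".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(shot_dir.rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(shot_dir))
    return zip_path

# e.g. package_shot("vfx/shot_017")   # hypothetical folder name
```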

Another of Stop/Eject’s FX shots, this one by Dominic Stephenson

I want to say a huge thanks to those artists who came through for the project: David Robinson, Mary Lapena, Matt Collett, Eranga Mudiyanselage, Dominic Stephenson and Naveed Aftab. You all worked incredibly hard and produced fantastic results – you should be proud of yourselves.

Finally, a few technical points about our workflow, for anyone interested in such things. We shot on a DSLR, so the source footage was in H.264, a format that due to its structure cannot be trimmed without losing a generation. So I supplied the VFX artists with the entire take (along with details of the in and out timecodes of the piece used in the edit) and asked them to deliver their finished shots as 16-bit TIFF sequences. This ensured that we would lose zero quality. The downside to this workflow is that there is a danger of errors being made with the timecodes, leading to a shot not being long enough when you go to conform the edit… Yes, that happened. There must be a better way. What’s your workflow for DSLR projects with VFX?
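
One partial safeguard would be a quick script that converts the in and out timecodes to a frame count (plus handles) and checks the delivered TIFF sequence against it. This is just a sketch – it assumes non-drop-frame 25fps timecode, and the shot name and handle length are invented:

```python
import glob

FPS = 25          # assumption: non-drop-frame 25fps timecode
HANDLES = 12      # extra frames requested either side of the cut

def tc_to_frames(tc, fps=FPS):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_needed(tc_in, tc_out, fps=FPS, handles=HANDLES):
    return tc_to_frames(tc_out, fps) - tc_to_frames(tc_in, fps) + 1 + 2 * handles

# Hypothetical shot and filenames:
needed = frames_needed("00:04:13:06", "00:04:16:18")
delivered = len(glob.glob("shot_042_v001.*.tif"))
print("OK" if delivered >= needed else f"Short by {needed - delivered} frames")
```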
