“Quantum Leaper”

This week issue 40 of Infinity magazine comes out, featuring a couple of articles I wrote, including one about the cult sci-fi series Quantum Leap. The show saw Dr. Sam Beckett (Scott Bakula) bouncing around time into other people’s bodies and striving to put right what once went wrong, while his holographic friend Al (Dean Stockwell) smoked cigars, letched, and relayed exposition from Ziggy the computer.

I end the article by wondering whether it’s time for someone like Netflix to bring the show back (it definitely is). What I don’t mention in the magazine is that – unbeknownst to almost everyone – Quantum Leap has already been rebooted once.

This, my loyal readers, is the story of Quantum Leaper.

 

Season One (1995)

As teenagers, my friend David Abbott and I were huge Quantum Leap fans, and were bereft when the show was axed in 1993. I was developing an interest in filmmaking, having dabbled in 2D computer animation on my Atari ST and borrowed my grandfather’s Video-8 camcorder on a couple of occasions. When I was given that camcorder for my 15th birthday, David and I decided that we would make our own version of Quantum Leap, which we imaginatively titled Quantum Leaper.

The first episode was called “Just What the Doctor Ordered” and saw my character – named, again with great imagination, Neil – leaping into a doctor just as his patient is flatlining. I don’t remember much about the plot, but I do remember that we climbed the nearby Malvern Hills to film a fight scene.

Dave played Albert, my holographic helper, communicating with Project Quantum Leap’s supercomputer Ziggy by means of a special hand-link, just like Dean Stockwell did. Unlike Dean Stockwell’s, this hand-link was a calculator.

The two of us also played all the supporting characters (often with the judicious addition of a hat or jacket) and operated the camera, unless we were both in shot, in which case it was locked off. Much of the editing was done in camera – rewinding the 8mm videotape, cueing it up to the exact moment the last piece of action ended, then hitting record and calling action simultaneously – and the rest I did tape-to-tape with two VCRs connected together. A cheap four-track disco mixer enabled the addition of music (badly composed by me) and sound effects (many of which were sampled from Quantum Leap itself). As YouTube was still years away, the only viewers for the series were our parents and friends, forced to sit down in front of the TV and watch it off VHS.

Episode two, “Boom!”, saw the fictional Neil as a bomb disposal expert supposedly in Northern Ireland in 1980, though like the first episode it was all shot in and around my house. My sister Kate was drafted in to play a journalist whose life Neil has to save.

“A Leap into the Blue” was the next episode, with Neil in the body of a parachutist. Scenes of characters in free-fall were shot with us standing in front of a white wall; I digitised the footage on my ST with a Videomaster cartridge and composited scrolling clouds into the background. The resolution of the Videomaster was very limited – maybe 320×240 – its frame rate was very low, and it could only capture black and white.

A digitised visual effect using a shot of a plane stolen from some TV programme or other

Next we shot a “pilot” episode explaining how Neil and Albert switched places with Sam and Al. I remember digitising shots of Scott Bakula and Dean Stockwell from Quantum Leap and compositing them atrociously into our own footage. At about 30 minutes long, the pilot was double the length of our other episodes.

Then we continued the series where we’d left off. Dave’s script “One Giant Leap” has Neil on a space shuttle mission, an episode that included NASA footage taped off the TV. We made almost no attempt to create sets; the space shuttle cockpit was a plain wall, a computer keyboard and a piece of card to cover an incongruous bookcase.

The space shuttle cockpit “set”

The next two episodes find Neil meeting (and shooting) an evil future version of himself, then leaping into the crazy future space year of 2017. The latter involves a flying car – my mum’s Citroen AX with the wheels framed out, intercut with an extremely crude CGI model.

Dave’s episodes “Virtual Leaping” and “Bullets Over Leaping” see Neil become a VR programmer (with a headset made of Lego) and then an actor (in a studio suspiciously like Dave’s shed).

The VR headset “prop”

My next episode has Neil leaping into himself and saving his father’s life. (My actual dad provided some splendidly wooden acting.) But doing this causes a paradox, and the season finale sees Neil and Albert swap places (as Sam and Al do in a classic Quantum Leap episode) and Neil having to restore the timeline to prevent the destruction of the universe.

We were ambitious. You can say that much for us.

 

Season Two (1996)

The following year, while doing our GCSEs, we began work on a second season. In between I’d made a bad 40-minute comedy, Bob the Barbarian, and an appalling feature-length sci-fi film, The Dark Side of the Earth, and I’d learnt a few things that would lift the production values of Season Two very slightly. I’d also nagged my parents into buying me a genlock which would let me superimpose CGI over analogue video, meaning I didn’t have to digitise footage and suffer the horrendous image degradation any more.

The holographic Albert enters the Imaging Chamber, an effect enabled by my new genlock.

The actual Quantum Leaping effect from this era of the show is surprisingly decent given the equipment we were working with. We would lock the camera off and jump-cut to a blue filter over the lens, then a white glow would creep over me – an animation I created in a package called Deluxe Paint – followed by tendrils of electricity. The screen would then fade to white, and a similar effect would play out in reverse to show the leap in.

Leaping from life to life, striving to put right what once went wrong…

Another improvement was that we managed to convince a few other friends to act in the series, including fellow Quantum Leap fan Lee Richardson, as well as Chris Jenkins, Conrad Allen, Matt Hodges, Si Timbrell and Jim McKelvie. Recognising my lack of musical talent at last, I abandoned composing and instead used soundtrack CDs from Star Trek: Deep Space Nine (Dennis McCarthy), the John Woo film Broken Arrow (Hans Zimmer), and the Doctor Who story “The Curse of Fenric” (Mark Ayres). Albert’s hand-link prop got an upgrade too, from a calculator to a custom Lego build with flashing lights.

Lee Richardson “acting” in the control room “set”

Season Two opens with Dave’s episodes “Project Hijacked” and “Oh Brother, Where Art Thou?” which focus on events at Project Quantum Leap, supposedly a high-tech facility in the New Mexico desert in 2005. In reality it was a living room with a control console made out of painted cardboard boxes and Christmas lights. In an early manifestation of my cinematography leanings, I snooted the ceiling light with a rolled-up piece of silver card, lending a little bit of mood to the look.

At the time, Dave’s family were training a hearing dog, Louis, so I wrote an episode to feature him; “Silence is Golden” sees Neil leap into a deaf man, and was followed by the morbid “Ashes to Ashes” where he leaps into a corpse.

The next episode, Dave’s “Driven to Distraction”, is probably the best of the lot. For once there were few enough characters that no-one needed to confusingly play dual roles, and there is plenty of action to boot. (I uploaded this episode to YouTube so long ago that the ten-minute time limit still applied.)

The X-Files-inspired “Close Encounters of the Leaping Kind” comes next, with Neil as a ufologist bothered by a shadowy government agent. Then Neil becomes a teenager who must prevent a drugs overdose, then a one-armed man who must overcome prejudice to hold down a job. Cringingly entitled “Not So Armless”, this latter was shot in a newsagent’s owned by a friend’s parents, one of the series’ few non-domestic locations.

Like Quantum Leap we had a mirror shot in every episode where Neil would see the leapee’s reflection looking back at him. Sometimes Dave would track the camera behind my back and we’d hide a cut in the darkness to swap me with whoever was playing the reflection. Another time we pretended the serving hatch in Dave’s house was a mirror and the two of us synchronised our movements. For a fight scene in “Not So Armless” Chris hid one arm inside his t-shirt so that Neil’s mirror image could appear to punch the antagonist with an invisible fist!

Facing mirror images that were not his own…

The penultimate episode of the season features several brief leaps, ending with one to Hiroshima in 1945, where the A-bomb detonation (more footage off the TV) causes both Neil and Albert to leap simultaneously. In the finale, Albert becomes a mountaineer caught in an avalanche, while Neil is a member of the rescue team – a premise thieved from the Quantum Leap novel “Search and Rescue”. We started shooting it during snowy weather, but the snow thawed and the episode was never completed. The friends who had been appearing as supporting characters now had part-time jobs and couldn’t spare the time for filming.

 

Legacy

We wrote all six episodes of a third season which would have explained how Neil became the evil future version of himself seen in an earlier episode, but nothing was ever filmed.

In 1997 we began a remake of the pilot using the experience we had gained since shooting the original, but again it was never completed. One part we did film was an action sequence with me on the roof rack of a car while the driver swerves around trying to throw me off. We shot this on Malvern’s Castlemorton Common and used a dummy of me for some of the wider and more dangerous shots. Its acting was probably better than mine. We remade the scene four years later as part of my Mini-DV feature The Beacon.

Today only five of the 20 Quantum Leaper episodes that we made survive, the rest having been callously taped over at some point in my late teens. That’s probably for the best, as most of it was hilariously bad, but making it taught me a hell of a lot about filmmaking. Without it, I doubt I’d have a career in cinematography today.

His only guide on these journeys is Al, an observer from his own time…

The Cinematography of “Doctor Who”

Just a quick post to say that the latest special edition of Doctor Who Magazine, out now, features an article I wrote about the history of the venerable series’ cinematography. From the cathode-ray tube multi-camera studio shoots of 1963 to the latest ARRI Alexa/Cooke Anamorphic photography, the technology and techniques of lighting and lensing Doctor Who encapsulate the history of TV making over the last six decades. I had a great time combining two subjects I know quite a bit about and was very excited to see the article in print. Look out for it in your local newsagent!


Mechanical TV: A Forgotten Format

Cathode ray tube televisions, those bulky, curve-screened devices we all used to have before the rise of LCD flat-screens, already seem like a distant memory. But did you know that they were not the first form of television, that John Logie Baird and his contemporaries first invented a mechanical TV system more akin to Victorian optical toys than the electronic screens that held sway for the greater part of the 20th century?

Mechanical television took several forms, but the most common type revolved, quite literally, around a German invention of 1884 called the Nipkow disc. This had a number of small holes around it, evenly spaced in a spiral pattern. In the Baird standard, developed by the Scottish inventor in the late 1920s, there were 30 holes corresponding to 30 lines of resolution in the resulting image, and the disc would revolve 12.5 times per second, which was the frame rate.

In a darkened studio, an arc light would be shone through the top portion of a spinning Nipkow disc onto the subject. The disc would create a flying spot – a spot of light that travelled horizontally across the scene (as one of the holes passed in front of the arc lamp) and then travelled horizontally across it again but now slightly lower down (as the next hole in the spiral pattern passed the lamp) and so on. For each revolution of the 30-hole disc, 30 horizontal lines of light would be scanned across the subject, one below the other.

A number of photocells would be positioned around the subject, continually converting the overall brightness of the light to a voltage. As the flying spot passed over light-coloured surfaces, more light would reflect off them and into the photocells, so a greater voltage would be produced. As the spot passed over darker objects, less light would reflect into the photocells and a smaller voltage would result. The voltage of the photocells, after amplification, would modulate a radio signal for transmission.
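
For readers who like to see a mechanism spelled out in code, here is a rough sketch of the capture side of the system. It is a simulation only: the 30 lines and 12.5 revolutions per second are the Baird figures mentioned above, but the number of samples per line and the test scene are entirely made up.

LINES = 30              # holes in the disc = lines per frame (Baird standard)
FPS = 12.5              # disc revolutions per second = frame rate
SAMPLES_PER_LINE = 40   # arbitrary: how finely we sample along each line

def scan_frame(scene):
    # One revolution of the disc: each hole sweeps one horizontal line,
    # slightly lower than the last, and at every instant the photocells
    # turn the total reflected light into a single brightness value.
    signal = []
    for line in range(LINES):
        y = (line + 0.5) / LINES
        for sample in range(SAMPLES_PER_LINE):
            x = (sample + 0.5) / SAMPLES_PER_LINE
            signal.append(scene(x, y))   # photocell voltage ~ reflected light
    return signal

def toy_scene(x, y):
    # A bright square on a dark background; brightness from 0.0 to 1.0.
    return 1.0 if 0.3 < x < 0.7 and 0.3 < y < 0.7 else 0.1

frame_signal = scan_frame(toy_scene)
print(len(frame_signal), "samples per frame,",
      int(len(frame_signal) * FPS), "samples per second to transmit")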

This picture from “Science and Invention”, November 1928, shows the radio receiver on the left and the Nipkow disc with its conical viewing shade on the right.

A viewer’s mechanical television set would consist of a radio receiver, a neon lamp and an upright Nipkow disc of a foot or two in diameter. The lamp – positioned behind the spinning disc – would fluctuate in brightness according to the radio signal.

The viewer would look through a rectangular mask fitted over the top portion of the disc. Each hole that passed in front of the neon lamp would draw a streak of horizontal (albeit slightly arcing) light across the frame, a streak varying in brightness along its length according to the continually varying brightness of the lamp. The next hole would draw a similar line just beneath it, and so on. Thanks to persistence of vision, all the lines would appear at once to the viewer as a single frame, and that frame would be followed by 11.5 more sets of lines each second: a moving image.
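
The receiving end can be sketched the same way. Again this is purely illustrative: in a real set the reassembly happens optically and in the viewer's eye, not in any electronics.

LINES = 30
SAMPLES_PER_LINE = 40

def display_frame(signal):
    # Slice the incoming brightness stream back into 30 lines. In the real
    # set each hole paints one streak of light from the neon lamp, and
    # persistence of vision fuses the streaks into a single picture; here
    # we just print them as rows of characters.
    for line in range(LINES):
        row = signal[line * SAMPLES_PER_LINE:(line + 1) * SAMPLES_PER_LINE]
        print("".join("#" if value > 0.5 else "." for value in row))

# A stand-in for a received signal: the same bright square as before.
signal = [1.0 if 0.3 < (s + 0.5) / SAMPLES_PER_LINE < 0.7
              and 0.3 < (line + 0.5) / LINES < 0.7 else 0.1
          for line in range(LINES) for s in range(SAMPLES_PER_LINE)]
display_frame(signal)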

A number of people were experimenting with this crude but magical technology at the same time, with Baird, the American Charles Francis Jenkins and the Japanese Kenjiro Takayanagi all giving historic public demonstrations in 1925.

The image quality was not great. For comparison, standard definition electronic TV has 576 lines and 25 frames per second in the UK, twice the temporal resolution and almost 20 times the spatial resolution of the Baird mechanical standard. The image was very dim, it was only an inch or two across, and it could only be viewed by a single person through a hood or shade extending from the rectangular mask.

The BBC began transmitting a regular mechanical TV service in 1929, by which time several stations were up and running in the USA. An early viewer, Ohio-based Murry Mercier Jr., who like many radio enthusiasts built his own mechanical TV from a kit, described one of the programmes he watched as “about 15 minutes long, consisting of block letters, from the upper left to the lower right of the screen. This was followed by a man’s head turning from left to right.” Hardly Breaking Bad.

John Logie Baird working on a mechanical TV set

Higher resolutions and larger images required larger Nipkow discs. A brighter image necessitated lenses in each of the disc’s holes to magnify the light. Baird once experimented with a disc of a staggering 8ft in diameter, fitted with lenses the size of bowling balls. One of the lenses came loose, unbalancing the whole disc and sending pieces flying across the workshop at lethal speeds.
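
Just how lethal? A quick back-of-the-envelope sum, assuming (and this is an assumption – I don't know how fast the big disc actually span) the 12.5 revolutions per second of the 30-line standard:

import math

diameter_m = 8 * 0.3048        # an 8ft disc in metres
revs_per_second = 12.5         # assumed: the 30-line standard's spin rate

rim_speed = math.pi * diameter_m * revs_per_second   # metres per second
print(f"Rim speed: {rim_speed:.0f} m/s, about {rim_speed * 3.6:.0f} km/h")

That's roughly 96 metres per second at the rim – around 345km/h – for anything that lets go.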

Other methods of reproducing the image were developed, including the mirror screw, consisting of a stack of thin mirrors arranged like a spiral staircase, one “step” for each line of the image. The mirror screw produced much larger, brighter images than the Nipkow disc, but the writing was already on the wall for mechanical television.

By 1935, cathode ray tubes – still scanning their images line by line, but by magnetically deflecting an electron beam rather than with moving parts – had surpassed their mechanical counterparts in picture quality. The BBC shut down its mechanical service, pioneers like Baird focused their efforts on electronic imaging, and mechanical TV quietly disappeared.


A Post-lockdown Trip to the Cinema

This article first appeared on RedShark News last month.

What’s wrong with this picture? Apparently nothing, if you work for the Light.

As I write this, I’ve just got back from my first trip to the cinema in six months. Although they have been allowed to reopen in England since July 4th, the higher operating costs in the pandemic kept many cinemas dark well into August. On Friday the 21st, my local branch of the Light here in Cambridge finally opened its doors, and I went along to experience post-Covid cinema.

Studios have been shifting their release dates throughout the lockdown, with some films giving up on theatrical exhibition altogether, so the Light, like its competitors, has filled its screens with classics for now. I selected Jurassic Park, which I haven’t seen on the big screen since its original release in 1993.

When I arrived, the lobby was dark and almost empty. Like most public spaces, it had sprouted new signage and a one-way system since March, and it took me a couple of attempts to find the right lane. Once inside the main corridor though, little had changed except the odd hand sanitiser dispenser on the wall.

I found my screen and took a seat. As with everything from trains to swimming pools, pre-booking is now strongly recommended, due to the diminished capacity caused by social distancing. When you pick your seat, the website makes you leave two empties between your party and the next. You can even pre-purchase your popcorn and bucket of cola.

I needn’t have booked, however. In a screen of about 100 seats, exactly ten were occupied. It will take the general public a while to cotton on that cinema-going is an option again, even before they decide whether they feel comfortable doing so.

As I sat masked and expectant, my hands sticky from sanitiser that refused to evaporate, I was treated to a rare sight: a cinema employee inside the auditorium. He announced that they didn’t have any ads or trailers yet, so they would delay starting the film to give everyone a chance to arrive.

A few minutes later, the man reappeared and asked us all to decamp to the corridor. Apparently they had installed a new sound system, and they needed to test it, which could be very loud. Why they couldn’t have checked the system for eardrum bursting at some point in the last six months is beyond me.

The ten of us duly waited in the corridor. A snatch of the Imperial March from an adjacent screen betokened another classic being wheeled out. A woman with a spray bottle and a cloth, masked like all of her colleagues, worked her way down the corridor, cleaning the door handles. A group next to me (but, I hasten to add, appropriately distant) cracked jokes about the sex appeal of Jeff Goldblum’s Ian Malcolm. Another group, evidently missing the trailers, watched one on a phone. (If that doesn’t sum up the existential crisis facing cinema, I don’t know what does.)

At last we were readmitted. The lights dimmed, the sounds of a jungle faded up on the brand new sound system, and the Universal logo appeared. But the trademark globe looked like a deflated football. The film was being projected in the wrong aspect ratio. And not just slightly: it was almost unwatchably stretched, as if the flat 1.85:1 images were being shown through a 2:1 anamorphic lens.

By the time the first scene was dissolving away to Bob Peck’s cries of “Shoot her!” the problem hadn’t been corrected, so I stepped out to find a member of staff. The senior person on duty claimed that the problem lay with the file supplied by the distributor, not with the projection. “There’s nothing I can do,” he insisted, while I goggled over my mask in disbelief.

At this point, had I not had this article to write, I would have gone home and watched the film on Netflix, or even on DVD. (There’s that existential crisis again.) But I persevered, trying not to imagine Dean Cundey weeping tears of frustration into his beard.

Fortunately, Jurassic Park is such a great film that it could be appreciated even in the face of such technical incompetence. A larger audience would have been nice, to enjoy the scares and humour with, though since screaming and laughing project dangerous droplets further, perhaps that’s less than ideal these days.

Overall, I must say that I found the experience of going to the cinema less altered than many other aspects of life. I’ve got used to wearing a mask, so much so that I was halfway home before I remembered to take it off, and I normally avoid peak times so the emptiness didn’t feel too unusual.

But with the rise in streaming subscriptions during lockdown, and the understandable caution that many feel about going out, cinemas will need to work much harder to get bums back on flip-up seats. The kind of technical troubles that the Light suffered tonight will only strengthen the case for staying at home, mask-free and pyjama-clad, where you can control both the virus and the aspect ratio.

A week after writing this, I went to a Showcase to see Tenet. The member of staff who took our tickets unequivocally told us that the printed screen number was wrong, and that we should go to another one. We did so. The ads and trailers finally started, fifteen minutes late. We were just wondering why they were trailing such kid-friendly movies when another member of staff came in and told us that Tenet was showing in the original screen after all, and by the way, you’ve missed the first couple of minutes. 

Hopefully it is now clear why I wrote “10 Reasons Why Cinemas Don’t Deserve to Survive the Pandemic”.


What Does “Cinematic” Mean?

Earlier this year I undertook a personal photography project called Stasis. I deliberately set out to do something different to my cinematography work, shooting in portrait, taking the paintings of Dutch seventeenth century masters as my inspiration, and eschewing traditional lighting fixtures in favour of practical sources. I was therefore a little disappointed when I began showing the images to people and they described them as “cinematic”.

An image from “Stasis”

This experience made me wonder just what people mean by that word, “cinematic”. It’s a term I’ve heard – and used myself – many times during my career. We all seem to have some vague idea of what it means, but few of us are able to define it. 

Dictionaries are not much help either, with the Oxford English Dictionary defining it simply as “relating to the cinema” or “having qualities characteristic of films”. But what exactly are those qualities?

Shallow depth of field is certainly a quality that has been widely described as cinematic. Until the late noughties, shallow focus was the preserve of “proper” movies. The size of a 35mm frame (or of the digital cinema sensors which were then emerging) meant that backgrounds could be thrown way out of focus while the subject remained crisp and sharp. The formats which lower-budget productions had hitherto been shot on – 2/3” CCDs and Super-16 film – could not achieve such an effect.

Then the DSLR revolution happened, putting sensors as big as – or bigger than – those of Hollywood movies into the hands of anyone with a few hundred pounds to spare. Suddenly everyone could get that “cinematic” depth of field. 

My first time utilising the shallow depth of field of a DSLR, on a never-completed feature back in 2011.

Before long, of course, ultra-shallow depth of field became more indicative of a low-budget production trying desperately to look bigger than of something truly cinematic. Gradually young cinematographers started to realise that their idols chose depth of field for storytelling reasons, rather than simply using it because they could. Douglas Slocombe, OBE, BSC, ASC, cinematographer of the original Indiana Jones trilogy, was renowned for his deep depth of field, typically shooting at around T5.6, while Janusz Kaminski, ASC, when shooting Kingdom of the Crystal Skull, stopped down as far as T11.
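
To put some rough numbers on why sensor size and stop matter so much, here is a quick depth-of-field sketch using the standard thin-lens formulas. The lenses, distances and circles of confusion below are illustrative assumptions of my own, not figures from any of the films mentioned.

def depth_of_field(focal_mm, stop, subject_m, coc_mm):
    # Total depth of field in metres, from the usual hyperfocal-distance
    # approximation. coc_mm is the circle of confusion, which scales with
    # the size of the sensor or film gauge.
    f = focal_mm
    s = subject_m * 1000.0                      # work in millimetres
    hyperfocal = f * f / (stop * coc_mm) + f
    if s >= hyperfocal:
        return float("inf")                     # sharp all the way to infinity
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = s * (hyperfocal - f) / (hyperfocal - s)
    return (far - near) / 1000.0

# A 50mm lens on Super 35 (CoC ~0.025mm) versus a ~19mm lens on a 2/3" chip
# (CoC ~0.01mm) framing roughly the same shot, with the subject at 3 metres:
print(round(depth_of_field(50, 2.8, 3, 0.025), 2), "m - Super 35 at T2.8")
print(round(depth_of_field(19, 2.8, 3, 0.01), 2), "m - 2/3-inch at T2.8")
print(round(depth_of_field(50, 5.6, 3, 0.025), 2), "m - Super 35 at T5.6")
print(round(depth_of_field(50, 11, 3, 0.025), 2), "m - Super 35 at T11")

Even these crude figures show why the smaller chips of the time couldn't throw a background away, and why stopping down to T11 holds most of a room in focus.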

There was also a time when progressive scan – the recording of discrete frames rather than alternately odd and even horizontal lines to make an interlaced image – was considered cinematic. Now it is standard in most types of production, although deviations from the norm of 24 or 25 frames per second, such as the high frame rate of The Hobbit, still make audiences think of reality TV or news and reject it as “uncinematic”.

Other distinctions in shooting style between TV/low-budget film and big-budget film have slipped away too. The grip equipment that enables “cinematic” camera movement – cranes, Steadicams and other stabilisers – is accessible now in some form to most productions. Meanwhile the multi-camera shooting which was once the preserve of TV, looked down upon by filmmakers, has spread into movie production.

A direct comparison may help us drill to the core of what is “cinematic”. Star Trek: Generations, the seventh instalment in the sci-fi film franchise, went into production in spring 1994, immediately after the final TV season of Star Trek: The Next Generation wrapped. The movie shot on the same sets, with the same cast and even the same acquisition format (35mm film) as the TV series. It was directed by David Carson, who had helmed several episodes of the TV series, and whose CV contained no features at that point.

Yet despite all these constants, Star Trek: Generations is more cinematic than the TV series which spawned it. The difference lies with the cinematographer, John A. Alonzo, ASC, one of the few major crew members who had not worked on the TV show, and whose experience was predominantly in features. I suspect he was hired specifically to ensure that Generations looked like a movie, not like TV.

The main thing that stands out to me when comparing the film and the series is the level of contrast in the images. The movie is clearly darker and moodier than the TV show. In fact I can remember my schoolfriend Chris remarking on this at the time – something along the lines of, “Now it’s a movie, they’re in space but they can only afford one 40W bulb to light the ship.” 

The bridge of the Enterprise D as seen on TV (top) and in the “Generations” movie (bottom).

It was a distinction borne of technical limitations. Cathode ray tube TVs could only handle a dynamic range of a few stops, requiring lighting with low contrast ratios, while a projected 35mm print could reproduce much more subtlety. 
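
(If thinking in stops is unfamiliar: each stop is a doubling of light, so a contrast ratio converts to stops with a base-two logarithm. The figures below are ballpark assumptions for illustration, not measurements of any particular display.)

import math

def stops(ratio):
    # One stop is a doubling of light, so stops = log2(contrast ratio).
    return math.log2(ratio)

examples = [
    ("2:1 key-to-fill lighting ratio", 2),
    ("8:1 key-to-fill lighting ratio", 8),
    ("CRT in a lit living room (assumed ~40:1)", 40),
    ("35mm print in a dark cinema (assumed ~2000:1)", 2000),
]
for name, ratio in examples:
    print(f"{name}: about {stops(ratio):.1f} stops")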

Today, film and TV are shot on the same equipment, and both are viewed on a range of devices which are all good at dealing with contrast (at least compared with CRTs). The result is that, with contrast as with depth of field, camera movement and progressive scan, the distinction between the cinematic and the uncinematic has narrowed.

The cinematography of “Better Call Saul” owes much to film noir.

In fact, I’d argue that it’s flipped around. To my eye, many of today’s TV series – and admittedly I’m thinking of high-end ones like The Crown, Better Call Saul or The Man in the High Castle, not Eastenders – look more cinematic than modern movies. 

As my friend Chris had realised, the flat, high-key look of Star Trek: The Next Generation was actually far more realistic than that of its cinema counterpart. And now movies seem to have moved towards realism in the lighting, which is less showy and not so much moody for the sake of being moody, while TV has become more daring and stylised.

A typically moody and contrasty shot from “The Crown”

The Crown, for example, blasts a 50kW Soft Sun through the window in almost every scene, bathing the monarchy in divine light to match its supposed divine right, while Better Call Saul paints huge swathes of rich, impenetrable black across the screen to represent the rotten soul of its antihero.

Film lighting today seems to strive for naturalism for the most part. Top DPs like recent Oscar-winner Roger Deakins, CBE, ASC, BSC, talk about relying heavily on practicals and using fewer movie fixtures, and fellow nominee Rachel Morrison, ASC, despite using a lot of movie fixtures, goes to great lengths to make the result look unlit. Could it be that film DPs feel they can be more subtle in the controlled darkness of a cinema, while TV DPs choose extremes to make their vision clear no matter what device it’s viewed on or how much ambient light contaminates it?

“Mudbound”, shot by Rachel Morrison, ASC

Whatever the reason, contrast does seem to be the key to a cinematic look. Even though that look may no longer be exclusive to movies released in cinemas, the perception of high contrast being linked to production value persists. The high contrast of the practically-lit scenes in my Stasis project is – as best I can tell – what makes people describe it as cinematic.

What does all of this mean for a filmmaker? Simply pumping up the contrast in the grade is not the answer. Contrast should be built into the lighting, and used to reveal and enhance form and depth. The importance of good production design, or at least good locations, should not be overlooked; shooting in a friend’s white-walled flat will kill your contrast and your cinematic look stone dead. 

A shot of mine from “Forever Alone”, a short film where I was struggling to get a cinematic look out of the white-walled location.

Above all, remember that story – and telling that story in the most visually appropriate way – is the essence of cinema. In the end, that is what makes a film truly cinematic.



Goodbye Final Cut Pro

Recently, having put it off for as long as possible, I upgraded to MacOS High Sierra, the first new OS to not support Final Cut Pro 7. It was a watershed moment for me. Editing used to comprise at least half of my work, and Final Cut had been there throughout my entire editing career.

I first heard of Final Cut in early 2000, when it was still on version one. The Rural Media Company in Hereford, which was my main client at the start of my freelance career, had purchased a copy to go with their shiny Mac G3. The problem was, no-one at the company knew how to use it.

Meanwhile, I was lobbying to get some time in the Avid edit suite (a much hallowed and expensive room) to cut behind-the-scenes footage from Integr8, a film course I’d taken part in the previous summer. The course and its funding were long finished, but since so much BTS footage had been shot, I felt it was a shame not to do something with it.

Being 19 and commensurately inexperienced, I was denied time on the Avid. Instead, the head of production suggested I use the G3 which was sitting idle and misunderstood in one of the offices. Disappointed but rising to the challenge, I borrowed the manual for Final Cut Pro, took it home and read it cover to cover. Then I came back in and set to work cutting the Integr8  footage.

Editing in 2000 was undergoing a huge (excuse the pun) transition. In the back of the equipment storeroom, Rural Media still had a tape-to-tape editing system, but it had already fallen almost completely out of use. Editing had gone non-linear.

In a room next to the kitchen was the Optima suite. This was a computer (I forget what type) fitted with a low resolution analogue video capture card and an off-line editing app called Optima. In this suite you would craft your programme from the low-rez clips, exporting an EDL (Edit Decision List) onto a floppy disc when you were done. This you took into the Avid suite to be on-lined – recapturing just the clips that were needed in full, glorious, standard definition. You could make a few fine adjustments and do a bit of grading before outputting the finished product back to tape.
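
If you have never seen one, an EDL is just a plain text list of timecodes. A mocked-up fragment in the classic CMX 3600 style might look like this (the reel names and timecodes are invented for illustration): each event names the source tape, the track (video and/or audio), the cut type, the in and out points on the source tape, and where the clip lands on the master.

TITLE: INTEGR8 BTS
FCM: NON-DROP FRAME

001  TAPE01  V     C        01:02:10:05 01:02:14:20 00:00:00:00 00:00:04:15
002  TAPE03  AA/V  C        03:11:00:00 03:11:07:12 00:00:04:15 00:00:12:02
003  TAPE02  AA/V  C        02:05:00:00 02:05:06:00 00:00:12:02 00:00:18:02

The on-line system would read the list, ask for each tape in turn, and recapture only those few seconds at full quality.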

It wasn’t practical to do the whole edit on the Avid because (a) hard drives big enough to store all the media for a film at full rez weren’t really available at that time, and (b) the Avid system was hellishly expensive and therefore time on it was charged at a premium rate.

As I edited the Integr8 BTS on Final Cut Pro, I believed I was using an off-line system similar to the Optima. The images displayed in the Viewer and Canvas were certainly blocky and posterised. But when I recorded the finished edit back to tape, I couldn’t quite believe what I was seeing. Peering through the viewfinder of the Mini-DV camera which I was using as a recording deck, I was astonished to see the programme playing at the exact same quality it had been shot at. This little G3 and the relatively affordable app on it were a complete, professional quality editing system.

I looked across the office to the sign on the Avid suite’s door. It might as well have read: “DINOSAUR”.

Within a few months I had invested in my own Mac – a G4, no less – and was using FCP regularly. The next year I used it to cut my first feature, The Beacon, and three more feature-length projects followed in the years after that, along with countless shorts and corporates. Using FCP became second nature to me, with the keyboard shortcuts hard-wired into my reflexes.

And it wasn’t just me. Final Cut became ubiquitous in the no-/low-budget sector. Did it have its flaws? Definitely. It crashed more often than Richard Hammond. I can think of no other piece of software I’ve screamed so much at (with the exception of a horrific early desktop publishing app which I masochistically used to create some Media Studies GCSE coursework).

And of course Apple shat all over themselves in 2011 when they released the much-reviled Final Cut X, causing many loyal users to jump ship. I stayed well away from the abomination, sticking with the old FCP 7 until I officially quit editing in 2014, and continuing to use it for personal projects long after that.

So it was quite a big deal for me to finally let it go. I’ve got DaVinci Resolve installed now, for the odd occasion when I need to recut my showreel. It’s not the same though.

Timelines aren’t my world any more, light is, but whenever I look back on my years as an editor, Final Cut Pro’s brushed-aluminium interface will always materialise in my mind’s eye.


If Camera was Sound and Sound was Camera

“Sound has the set,” calls the 1st AD, fishing a roll-up from her pocket and heading for the fire exit.

The production sound mixer strides into the middle of the set and strokes his Hipster beard thoughtfully.

“What are you thinking, boss?” asks the gaffer, scratching at the beer belly under his Yamaha t-shirt.

The mixer points to the skylight. “Let’s have some early morning ambience coming through here – the one with the distant traffic.” With a sweeping gesture he encompasses one side of the kitchen set. “I want it to explode off the floor and reverberate throughout this whole area.”

“Hundred watt woofer?” the gaffer suggests. The mixer nods, and a spark scuttles off to the truck for the required speaker.

“Is that practical?” the mixer wonders aloud. The gaffer follows his gaze to the kettle, nods, and flicks the switch. The mixer pulls a sound meter from the pocket of his leather jacket and holds it up to the boiling appliance. “6dB under.”

“We could hide a little tweeter behind it, bring the level up a bit,” the gaffer suggests. “I’ve got half a dozen different kettle effects on the truck.”

The mixer agrees, and proceeds to point out several other positions around the set, which is soon full of busy sparks running XLR cables, rigging speakers and shaping them with sound blankets. A cacophony grows as each one is fired up.

“Does this look about right?” asks the 1st AS, steadying the Sennheiser as the grips wheel its massive Technoboom to the previously agreed spot. She holds a pair of headphones out to the mixer.

He puts them on, and a reverent hush descends upon the set. He pans the mic left, then right, then up, then down, then left and right again. Finally he takes off the cans, clutching at his SQN baseball cap to stop it coming off too. “We need to go tighter,” he pronounces. He holds up his two hands, forming a circular aperture with his fingers, and cups them around his ear. His face a picture of concentration, he squats down and listens intently through the hole in his hands. He shuffles a little to the left. “This is it. We need to be right here on the 67.”

“Copy that,” the 1st AS replies. Her 2nd drags over a massive flight case and she begins unscrewing the ME66 from the power module.

 

“OK everyone, standby for a mic rehearsal.”

At last the camera operator – who had been somehow hiding in plain sight – puts down his coffee and heaves an Alexa onto his shoulder, checking the image as the cast go through the motions.

The director presses her headphones against her ears, frowning. She turns to the mixer. “I’m not getting enough sense of where they are,” she says. “Can we go wider?”

A few moments later the 1st AS is sighing as she unscrews the ME67 and remounts the ME66.

“It’s really quiet,” a producer complains, from his canvas chair in front of the amp at sound city. “Can we turn it up a bit?”

“We’ve got to have the mood,” the mixer insists. “What you can’t hear is more exciting than what you can.”

“I’m paying to hear it!” snaps the producer. “And why is there so much hiss? I can barely hear the dialogue over it.”

“It’s atmosphere!” the mixer protests, but he can see he’s not going to win this one. Reluctantly he signals a spark to turn down the white noise generator.

 

“Cut!” calls the director, smiling in satisfaction at the cast. She turns to the mixer. “How was that for you?”

“That sounded beautiful,” he replies ecstatically.

“OK, moving on,” says the AD, reaching for the clip-list.

“Hang on a minute.”

All eyes turn to the camera op.

“The caterer walked through the back of shot.”

“Did he?” asks the AD, looking around the crew for confirmation.

“I didn’t pick him up,” says the mixer.

The camera op stares at them in disbelief. “He sauntered right across the back of the set. He was there the whole take. It’s completely unusable.”

The AD sighs. “I guess we’d better go again.”

“Can we ask people not to walk through the frame? This lens will pick up literally anyone that walks in front of it.”

The director thinks about this. “Have you got a different lens you can use?”

“Can’t you put Go Pros on them?” asks the AD, gesturing to the cast.

“I’d rather not use Go Pros,” a new voice chimes in. Everyone turns with surprise to see the director of photography blinking in the light. She almost never moves from the shadowy corner where she sits with LiveGrade and a monitor which is rumoured to display mostly rugby matches.

“We can’t afford to lose any more takes because of camera,” says the AD. “What’s wrong with Go Pros anyway?”

“The image just isn’t as good. The dynamic range…”

But the AD cuts her short. “Well, it’s either that or AVR.”

“I just think if we took thirty seconds to find a new position for the Alexa…”

As the producer strides over to stick his oar in, the sound assistants exchange knowing looks: this could go on for a while. The pair lean on the Magliner and check their phones.

“Have you ever worked with a Nagra?” the 2nd AS asks, conversationally. “I still think they sound better than digital.”


Diagnosing a Pharma Hack

Today’s post is not about filmmaking, but I hope it will be of use to other WordPress bloggers who have been the victims of so-called Pharma Hacking.

A few weeks ago I started to notice strange things happening on this site.

The first thing was that I couldn’t log in. At the top of the login screen there would be an error message similar to this one:

Warning: Cannot modify header information - headers already sent by (output started at /home/trustjho/public_html/blog/wp-content/themes/adspress/functions.php:74) in /home/trustjho/public_html/blog/wp-login.php on line 302

I googled the message and found various suggested solutions, but in the end the only one that worked was to reinstall WordPress.

The next issue was that the media gallery wouldn’t load. When I tried to upload a new image for a post it wouldn’t work, and I couldn’t see any of the images I’d uploaded previously. I tried all the usual WordPress troubleshooting – deactivating plug-ins and themes, which did nothing, and again reinstalling the core files. After the reinstall the problem went away for a little while, but soon came back.

The third thing I noticed was line breaks appearing after links in many of my posts. I checked the html code of the posts, but couldn’t see any reason for this behaviour.

Fourthly, and most worryingly, I started coming across a couple of weird sentences at the bottom of several blog posts – sentences which I didn’t write. It was always the same:

“Here what I remember even at that time when I sleep it Cialis Dosage which has to be fixed and can’t be. Cialis dose it is an important element of reception. Which it is necessary to remember.”

Both instances of the word Cialis were hyperlinks to a site selling the drug.

After hours of googling I figured out that I had been Pharma Hacked. Pharma Hacking involves uploading rogue code to your WordPress site which then inserts text and links into your posts. It also inserts javascript into the posts which renders the text and links invisible to human viewers, while still being visible to search engines. The result is that the linked drug site rises in search engine rankings because all these invisible links to it have been maliciously inserted into unsuspecting WordPress sites. Because the text is invisible, readers of the victim’s site and even the owner of the site may be completely unaware that it has been hacked.

When I looked at the infected posts in the ‘text’ view mode (as opposed to ‘visual’) I could see two additions, one at the start of the post:

<script type="text/javascript">// <![CDATA[
function get_style6610 () { return "none"; } function end6610_ () { document.getElementById('database6610').style.display = get_style6610(); }
// ]]></script>

And one at the end:

<p id="database6610">Here what I remember even at that time when I sleep it <a href="http://cialisdosage.biz/index.html">Cialis Dosage</a> which has to be fixed and can't be. <a href="http://cialisdosage.biz/index.html">Cialis dose</a> it is an important element of reception. Which it is necessary to remember.</p>
<script type="text/javascript">// <![CDATA[
end6610_();
// ]]></script>

Together the two pieces of javascript ensured that the text and links were not displayed: the first defines a function that returns “none”, and the second calls it to set the display style of the paragraph with the matching “database” ID to none, hiding it from human visitors while leaving it in the page source for search engines to index. I’m still not sure why I was able to see the text on some of my posts when viewing my site’s front end, but it was lucky that I could, otherwise I might never have diagnosed the problem.

After some more googling I downloaded Wordfence, a plug-in that scans your site for malicious code. Wordfence identified around eight or ten malicious files, which I immediately deleted. Straight away the media gallery started working again and the rogue line breaks after links disappeared.

Unfortunately Wordfence isn’t able to remove the text from your posts. I googled around for something that could, and in the end used a plug-in called Search and Replace. This was able to delete all instances of the sentence “Here what I remember….” and its hyperlinks, which turned out to be in over 900 of my 1,100 blog posts. I can’t remove the javascript, because the ID number in it (6610 in the example above) changes with every post, and I can’t find a search and replace plug-in that can handle a wildcard like that. However, without the text and links the javascript does nothing.
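
If you do want to tidy away those harmless leftovers, a regular expression can cope with the changing ID where a literal search-and-replace can’t. The sketch below is only an illustration of the approach: it assumes you have exported your posts to a file (the filename is invented), it is written against the exact code shown above, and you should back everything up before running anything like it.

import re

# Matches the hack's two injected <script> blocks whatever numeric ID
# they carry (6610, 6611, ...), based on the code shown above.
INJECTED = re.compile(
    r'<script type="text/javascript">//\s*<!\[CDATA\[\s*'
    r'(?:function get_style\d+|end\d+_)\b.*?'
    r'//\s*\]\]></script>',
    re.DOTALL,
)

def clean(post_html):
    # Strip every injected script block from a post body.
    return INJECTED.sub("", post_html)

# Work on a copy of an exported posts file, never the live database.
with open("posts-export.xml", encoding="utf-8") as f:
    cleaned = clean(f.read())
with open("posts-export-clean.xml", "w", encoding="utf-8") as f:
    f.write(cleaned)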

I still don’t know how my site got infected in the first place, but apparently the most likely route would have been through one of the old, out-dated plug-ins I was running. Evidently it is very important to regularly update not just WordPress but all of your plug-ins to make sure there are no security loopholes. And I will be performing regular Wordfence scans from now on to check for anything slipping through again.


Why Make Films?

Shooting Mini-DV in 2003

When I went freelance at the end of the last century, it felt like anything was possible.  If you had the talent, you could go out there and make a great short that could win awards at festivals and get you a good agent, or you could go out and make a feature which made the industry sit up and take notice and hire you on a fully-budgeted production. Call me old and cynical, but that now feels like a ridiculous pipe-dream.

15 years ago, the Mini-DV revolution was just kicking off. Since then we’ve had the DSLR revolution, not to mention the collapse of expensive celluloid as the only accepted acquisition and distribution format for “proper” movies. The technology has removed every barrier to entry, and now the world is swamped with filmmakers.

This is great, but it has had two highly destructive side effects.

Firstly, as a filmmaker, it’s virtually impossible to stand out any more amongst the thousands of micro-budget movies that get made every year, short-form and long. Would I get coverage in The Guardian today for making a fantasy feature on £20,000? I think not.

Shooting on a DSLR in 2013

And although there is now a huge number of film festivals around the world, there are so many people entering them that the odds of getting in are tiny, and the odds of winning awards even smaller. So once you’ve made a film, what do you do with it? Putting it online is the only option left. Except there are so many films, and other forms of video content, on the internet that you have to be incredibly lucky to get any reasonable number of people to watch yours.

Secondly, as jobbing crew, though there are plenty of productions to work on, most of them are unpaid. Because there’s no more money to go around than there was 15 years ago – it’s just more thinly spread. When I started out, unpaid work was something you did for a couple of years until you could get enough paid work to live on. Now it’s entirely possible to do unpaid gigs for decades without it ever leading to enough paid work to quit your day job.

In a nutshell, the industry has become a farce.

Which brings me back to my question, “Why make films?”

The only answer left, and perhaps the only one that ever truly mattered, is, “Because I love it.”

Do not become a filmmaker because you think you can break into Hollywood. Don’t do it because you want to get rich. Don’t expect to see your work on cinema release, to win Oscars, or to work with the stars. Don’t even expect to reach wide audiences or make a good living.

Just do it because it’s the only thing you want to do with your life, and be happy with that. I know I am.
