Pinhole Results

In my last couple of posts I described making and shooting with a pinhole attachment for my 35mm Pentax P30T SLR. Well, the scans are now back from the lab and I’m very pleased with them. They were shot on Fujifilm Superia X-tra 400.

As suspected, the 0.7mm pinhole was far too big, and the results are super-blurry:

See how contemptuous Spike is of this image. Or maybe that’s just Resting Cat Face.

The 0.125mm hole produced much better results, as you can see below. My f/stop calculations (f/365) seem to have been pretty close to the mark, although, as is often the case with film, the occasions where I gave it an extra stop of exposure produced even richer images. Exposure times for these varied between 2 and 16 seconds. Click to see them at higher resolution.

I love the ethereal, haunting quality of all these pictures, which recalls the fragility of Victorian photographs. It’s given me several ideas for new photography projects…


Adventures with a Pinhole

Last week I discussed making a pinhole for my Pentax 35mm SLR. Since then I’ve made a second pinhole and shot a roll of Fujifilm Superia X-tra 400 with them. Although I haven’t had the film processed yet, so the quality of the images is still a mystery, I’ve found shooting with a pinhole to be a really useful exercise.

My Pentax P30T fitted with a 0.125mm pinhole attachment

 

A Smaller Pinhole

Soon after my previous post, I went out into the back garden and took ten exposures of the pond and the neighbour’s cat with the 0.7mm pinhole. By that point I had decided that the hole was almost certainly too big. As I noted last week, Mr Pinhole gives an optimal diameter of 0.284mm for my camera. Besides that, the (incredibly dark) images in my viewfinder were very blurry, a sign that the hole needed to be smaller.

Scans of my two pinholes

So I peeled the piece of black wrap with the 0.7mm pinhole off my drilled body cap and replaced it with another hole measuring about 0.125mm. I had actually made this smaller hole first but rejected it because absolutely nothing was visible through the viewfinder, except for a bit of a blur in the centre. But now I came to accept that I would have to shoot blind if I wanted my images to be anything approaching sharp.

The 0.125mm(ish) pinhole magnified in Photoshop

I had made the 0.125mm hole by tapping the black wrap with only the very tip of the needle, rather than pushing it fully through. Prior to taping it into the body cap, I scanned it at high resolution and measured it using Photoshop. This revealed that it’s a very irregular shape, which probably means the images will still be pretty soft. Unfortunately I couldn’t see a way of getting it any more circular; sanding didn’t seem to help.

Again I found the f-stop of the pinhole by dividing the flange focal distance (45.65mm) by the hole diameter, the result being about f/365. My incident-light meter only goes up to f/90, so I needed to figure out how many stops away from f/365 that is. I’m used to working in the f/1.4-f/22 range, so I wasn’t familiar with how the stop series progresses above f/90. Turns out that you can just multiply by 1.4 to roughly find the next stop up, so after f/90 it’s 128, then 180, then 256, then 358, pretty close to my f/365 pinhole. So whatever reading my meter gave me for f/90, I knew that I would need to add 4 stops of exposure, i.e. multiply the shutter interval by 16. (Stops are a base 2 logarithmic scale. See my article on f-stops, T-stops and ND filters for more info.)
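If you’d rather let a computer do that arithmetic, here’s a minimal Python sketch of the same calculation. The flange distance, pinhole size and meter limit are the figures quoted above; the metered time is just a made-up example reading.

```python
import math

# Pinhole f-number = flange focal distance / hole diameter
flange_focal_distance_mm = 45.65   # Pentax K-mount flange depth
pinhole_diameter_mm = 0.125        # measured hole size
f_number = flange_focal_distance_mm / pinhole_diameter_mm   # ~f/365

# How many stops darker is this than the meter's f/90 limit?
# Exposure varies with the square of the f-number, and stops are base 2.
meter_max_f = 90
stops_beyond_meter = 2 * math.log2(f_number / meter_max_f)  # ~4 stops

# Each extra stop doubles the shutter time.
metered_time_s = 1.0                         # hypothetical reading at f/90
multiplier = 2 ** round(stops_beyond_meter)  # 16x
adjusted_time_s = metered_time_s * multiplier

print(f"f/{f_number:.0f}: {stops_beyond_meter:.1f} stops past f/90, "
      f"so multiply the shutter time by {multiplier} -> {adjusted_time_s:.0f}s")
```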

 

The Freedom of Pinhole Shooting

I’ve just spent a pleasant hour or so in the garden shooting the remaining 26 exposures on my roll with the new 0.125mm pinhole. Regardless of how the photos come out, I found it a fun and fascinating exercise.

Knowing that the images would be soft made me concentrate on colour and form far more than I normally would. Not being able to frame using the viewfinder forced me to visualise the composition mentally. And as someone who finds traditional SLRs very tricky to focus, it was incredibly freeing not to have to worry about that, not to have to squint through the viewfinder at all, but just plonk the camera down where it looked right and squeeze the shutter.

Of course, before squeezing the shutter I needed to take incident-light readings, because the TTL (through the lens) meter was doing nothing but flash “underexposed” at me. Being able to rely solely on an incident meter to judge exposure is a very useful skill for a DP, so this was great practice. I’ve been reading a lot about Ansel Adams and the Zone System lately, and although this requires a spot reflectance meter to be implemented properly, I tried to follow Adams’ philosophy, visualising how I wanted the subject’s tones to correspond to the eventual print tones. (Expect an article about the Zone System in the not-too-distant future!)

 

D.I.Y. Pinhole Camera

On Tuesday night I went along to a meeting of Cambridge Darkroom, the local camera club. By coincidence, this month’s subject was pinhole cameras. Using online plans, Rich Etteridge had made up kits for us to construct our own complete pinhole cameras in groups. I teamed up with a philosophy student called Tim, and we glued a contraption together in the finest Blue Peter style. The actual pinholes were made in metal squares cut from Foster’s cans, which are apparently something Rich has in abundance.

DIY pinhole camera

I have to be honest though: I’m quite scared of trying to use it. Look at those dowels. Can I really see any outcome of attempting to load this camera other than a heap of fogged film on the floor? No. I think I’ll stick with my actual professionally-made camera body for now. If the pinhole photos I took with that come out alright, then maaaaaaybe I’ll consider lowering the tech level further and trying out my Blue Peter camera. Either way, big thanks to Rich for taking all that time to produce the kits and talk us through the construction.

Watch this space to find out how my pinhole images come out.


Colour Rendering Index

Many light sources we come across today have a CRI rating. Most of us realise that the higher the number, the better the quality of light, but is it really that simple? What exactly is Colour Rendering Index, how is it measured and can we trust it as cinematographers? Let’s find out.

 

What is C.R.I.?

CRI was created in 1965 by the CIE – Commission Internationale de l’Eclairage – the same body responsible for the colour-space diagram we met in my post about How Colour Works. The CIE wanted to define a standard method of measuring and rating the colour-rendering properties of light sources, particularly those which don’t emit a full spectrum of light, like fluorescent tubes which were becoming popular in the sixties. The aim was to meet the needs of architects deciding what kind of lighting to install in factories, supermarkets and the like, with little or no thought given to cinematography.

As we saw in How Colour Works, colour is caused by the absorption of certain wavelengths of light by a surface, and the reflection of others. For this to work properly, the light shining on the surface in the first place needs to consist of all the visible wavelengths. The graphs below show that daylight indeed consists of a full spectrum, as does incandescent lighting (e.g. tungsten), although its skew to the red end means that white-balancing is necessary to restore the correct proportions of colours to a photographed image. (See my article on Understanding Colour Temperature.)

Fluorescent and LED sources, however, have huge peaks and troughs in their spectral output, with some wavelengths missing completely. If the wavelengths aren’t there to begin with, they can’t reflect off the subject, so the colour of the subject will look wrong.

Analysing the spectrum of a light source to produce graphs like this required expensive equipment, so the CIE devised a simpler method of determining CRI, based on how light from the source reflected off a set of eight colour patches. These patches were murky pastel shades taken from the Munsell colour wheel (see my Colour Schemes post for more on colour wheels). In 2004, six more-saturated patches were added.

The maths which is used to arrive at a CRI value goes right over my head, but the testing process boils down to this:

  1. Illuminate a patch with daylight (if the source being tested has a correlated colour temperature of 5,000K or above) or incandescent light (if below 5,000K).
  2. Compare the colour of the patch to a colour-space CIE diagram and note the coordinates of the corresponding colour on the diagram.
  3. Now illuminate the patch with the source being tested.
  4. Compare the new colour of the patch to the CIE diagram and note the coordinates of the corresponding colour.
  5. Calculate the distance between the two coordinates, i.e. the difference in colour under the two light sources.
  6. Repeat with the remaining patches and calculate the average difference.
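As a rough illustration of that averaging step, here’s a short Python sketch with made-up colour shifts for the eight patches. (The real CIE maths works in a special colour space and includes a chromatic-adaptation correction, so treat this as the general idea rather than the actual method.)

```python
# Hypothetical colour shifts (delta E) for the eight standard patches,
# i.e. how far each patch's colour moved between the reference
# illuminant and the test source.
delta_e = [2.1, 3.4, 1.8, 2.9, 4.2, 2.5, 3.0, 2.2]

mean_shift = sum(delta_e) / len(delta_e)

# The score is scaled so that a source producing zero shift scores 100;
# 4.6 is the scaling constant used in the CIE method.
cri_ra = 100 - 4.6 * mean_shift
print(f"Mean shift {mean_shift:.2f} -> CRI (Ra) of about {cri_ra:.0f}")
```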

Here are a few CRI ratings gleaned from around the web:

Source                 CRI
Sodium streetlight     -44
Standard fluorescent   50-75
Standard LED           83
LitePanels 1×1 LED     90
Arri HMI               90+
Kino Flo               95
Tungsten               100 (maximum)

 

Problems with C.R.I.

There have been many criticisms of the CRI system. One is that, because the score is a mean average, a lamp with mediocre performance across all the patches can score the same CRI as a lamp that renders one colour terribly but all the others well.
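To see the problem, compare two hypothetical lamps: one is merely mediocre on every patch, the other butchers a single colour but is excellent everywhere else. The average shift, and therefore the score, comes out the same.

```python
# Two hypothetical lamps with identical average colour shift.
lamp_mediocre = [3.0] * 8            # equally so-so on every patch
lamp_lopsided = [1.0] * 7 + [17.0]   # great everywhere except one colour

for name, shifts in [("mediocre", lamp_mediocre), ("lopsided", lamp_lopsided)]:
    ra = 100 - 4.6 * sum(shifts) / len(shifts)
    print(f"{name}: worst patch shift {max(shifts):.0f}, CRI (Ra) {ra:.0f}")
```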

Demonstrating the non-continuous spectrum of a fluorescent lamp, versus the continuous spectrum of incandescent, using a prism.

Further criticisms relate to the colour patches themselves. The eight standard patches are low in saturation, making them easier to render accurately than bright colours. An unscrupulous manufacturer could design their lamp to render the test colours well without worrying about the rest of the spectrum.

In practice this all means that CRI ratings sometimes don’t correspond to the evidence of your own eyes. For example, I’d wager that an HMI with a quoted CRI in the low nineties is going to render more natural skin-tones than an LED panel with the same rating.

I prefer to assess the quality of a light source by eye rather than relying on any quoted CRI value. Holding my hand up in front of an LED fixture, I can quickly tell whether the skin tones look right or not. Unfortunately even this system is flawed.

The fundamental issue is the trichromatic nature of our eyes and of cameras: both work out what colour things are based on sensory input of only red, green and blue. As an analogy, imagine a wall with a number of cracks in it. Imagine that you can only inspect it through an opaque barrier with three slits in it. Through those three slits, the wall may look completely unblemished. The cracks are there, but since they’re not aligned with the slits, you’re not aware of them. And the “slits” of the human eye are not in the same place as the slits of a camera’s sensor, i.e. the respective sensitivities of our long, medium and short cones do not quite match the red, green and blue dyes in the Bayer filters of cameras. Under continuous-spectrum lighting (“smooth wall”) this doesn’t matter, but with non-continuous-spectrum sources (“cracked wall”) it can lead to something looking right to the eye but not on camera, or vice-versa.

 

Conclusion

Given its age and its intended use, it’s not surprising that CRI is a pretty poor indicator of light quality for a modern DP or gaffer. Various alternative systems exist, including GAI (Gamut Area Index) and TLCI (Television Lighting Consistency Index), the latter similar to CRI but introducing a camera into the process rather than relying solely on human observation. The Academy of Motion Picture Arts and Sciences has more recently developed a system, the Spectral Similarity Index (SSI), which involves measuring the source itself with a spectrometer rather than the light it reflects off test patches. At the time of writing, however, we are still stuck with CRI as the dominant quantitative measure.

So what is the solution? Test, test, test. Take your chosen camera and lens system and shoot some footage with the fixtures in question. For the moment at least, that is the only way to really know what kind of light you’re getting.
