Tuesday, February 21, 2006

Frame rates ...

You know, you've just got to love the Internet ... for producing so much good and bad data at the same time :-) I've just been listening to Major Nelson's latest podcast, where they were discussing HDTV and frame rates. Anyway, here is some background for you on why (or why not) 24...

In the beginning, when people started making moving images on film, there was no standard frame rate; different films were run at different speeds. They were all viewed in a dark room, however, and to portray continuous movement the eye/brain only needs a fairly small number of new images per second. If you get too few updates, the distance objects move from image to image becomes too large and your brain spots the disconnect, so frame rates fell in the range of 16-28 frames per second. Unfortunately, when you view images at these rates in the dark, you see flicker (the origin of the phrase 'the flicks'?).

As a solution, somebody came up with the idea of using a bladed shutter in the projector to show each frame more than once. This doubles (or triples) the flicker rate, so you no longer notice it. Importantly, this effect depends on the light level of the viewing conditions: the brighter the conditions, the higher the refresh rate you need to hide the flicker.
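The shutter arithmetic is simple enough to sketch. A minimal illustration (the function name is mine, not standard terminology): the perceived flicker frequency is just the frame rate multiplied by the number of shutter blades.

```python
# A 2-blade shutter shows each frame twice, a 3-blade shutter three times,
# so the light-interruption (flicker) frequency is fps * blades.
def flicker_rate(fps, shutter_blades):
    """Effective flicker frequency in Hz for a bladed projector shutter."""
    return fps * shutter_blades

print(flicker_rate(24, 2))  # 48 - sound-era film with a double-bladed shutter
print(flicker_rate(16, 3))  # 48 - slower silent-era film needs a third blade
```

Note how a slower film rate needs an extra blade to reach the same flicker frequency, which is exactly the trade-off the early projectionists were playing with.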

So when we get to adding audio to the cinema experience, we need to maintain a constant rate of motion, because the sound is going to be encoded as part of the film itself; to keep a constant pitch they had to standardise on a frame rate. The theory goes that the higher the speed of motion the better the audio will sound, as you can get a better signal on the film, and the same goes for the pictures: more pictures looks in some ways more 'real'. So they appear to have taken a pragmatic approach to the problem: they took an average of the played rates and came up with a compromise between quality and cost of 24 FPS (1.5 ft of film per second), with each frame displayed twice giving a 48 Hz refresh rate.
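Where does the 1.5 ft per second figure come from? A quick sketch, assuming standard 4-perf 35 mm film, which carries 16 frames per foot:

```python
# Standard 4-perf 35 mm film has 16 frames per foot of stock,
# so at 24 frames per second the film travels 24/16 feet per second.
FRAMES_PER_FOOT = 16  # 4-perf 35 mm
fps = 24
feet_per_second = fps / FRAMES_PER_FOOT
print(feet_per_second)  # 1.5
```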

When we move to television, we have a similar problem: we want to make televisions as cheap as possible for consumers to buy. In the beginning we have the problem that the transmitter and receiver need to run synchronously, i.e. at the same frame rate. The circuitry required to do this independently was seen as too expensive/unreliable, so the engineers decided to use the AC power frequency to generate the required sync/refresh signal; in the UK that gives us 50 Hz, and elsewhere 60 Hz.

But we don't want to shoot over twice the film and use twice the bandwidth in transmission, so we look back at flicker vs continuous motion and decide we can do something similar to the bladed shutter in a film projector: we transmit each image in two halves (fields), every other line in each update, and we call this interlace.
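The field split can be sketched in a few lines. A toy illustration (the frame is just a list of line labels): odd lines go in one field, even lines in the other, so each update carries only half the picture.

```python
# Toy model of interlacing: split a frame's scan lines into two fields,
# one holding the even-numbered lines and one holding the odd-numbered lines.
frame = [f"line {n}" for n in range(6)]  # a tiny 6-line "frame"
field_a = frame[0::2]  # even lines: 0, 2, 4
field_b = frame[1::2]  # odd lines:  1, 3, 5

print(field_a)  # ['line 0', 'line 2', 'line 4']
print(field_b)  # ['line 1', 'line 3', 'line 5']
```

Two fields per frame means the field rate (50 or 60 Hz) handles flicker, while the full-frame rate (25 or 30 Hz) handles motion and bandwidth.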

Now this works quite well for 50 Hz: you only need to speed the film up by about 4%, and maybe, if you're lucky, re-pitch the audio. But in the 60 Hz world you'd need too great a speed-up, so instead we do another weird hack that adds extra fields to turn the 24 into 60, and this is 3:2 pulldown/pullup.
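Both numbers can be sketched quickly. Running 24 FPS film at 25 FPS for 50 Hz territories is a 25/24 speed-up (about 4%); for 60 Hz, the 3:2 cadence shows film frames for 3 fields, then 2, alternately, so 4 film frames fill 10 fields and 24 FPS becomes exactly 60 fields per second:

```python
from itertools import cycle

# 50 Hz: just run the film faster. 25/24 is roughly a 4% speed-up.
speed_up_percent = (25 / 24 - 1) * 100
print(round(speed_up_percent, 1))  # 4.2

# 60 Hz: 3:2 pulldown - alternate showing each film frame for 3 fields,
# then 2 fields, so every 4 film frames produce 10 video fields
# (24 fps * 10/4 = 60 fields per second).
def pulldown_32(frames):
    fields = []
    for frame, repeats in zip(frames, cycle([3, 2])):
        fields.extend([frame] * repeats)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven 3-2-3-2 cadence is also why pulldown material shows "judder" on motion, and why good deinterlacers try to detect and reverse it (the pullup half of the name).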

For the really technical, of course, we then needed to add colour, and that put another spanner in the works. When adding colour to the black-and-white signal we didn't want to transmit two separate signals, one black and white and one colour, so we stuffed the colour information into the original format. This forced the 'NTSC' world to shift its rates slightly, by a factor of 1000/1001, giving 59.94 Hz instead of 60 Hz, but that is a story for another time...
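The 1000/1001 factor is easy to verify, and it's also where the familiar 29.97 FPS frame rate comes from:

```python
# The colour subcarrier hack shifts the nominal 60 Hz field rate
# by a factor of 1000/1001.
ntsc_field_rate = 60 * 1000 / 1001
print(round(ntsc_field_rate, 2))  # 59.94

# Two interlaced fields per frame gives the famous 29.97 fps.
ntsc_frame_rate = ntsc_field_rate / 2
print(round(ntsc_frame_rate, 2))  # 29.97
```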