With The Hobbit, audiences are finally going to see a high-frame-rate production in wide distribution.  There have been several attempts to bring higher frame rate imaging to the big screen over the years, but this is the first large-scale push for something higher than the standard 24 frames-per-second we've seen for decades.  At first glance, the implications of a higher frame rate may seem obvious:  clearer motion, sharper action, and more realistic imagery.  But audience reaction to this imaging tells us something more subtle is going on. 

When we talk about frame rate, the best way to understand it from a signal point of view is to say that the frame rate determines the maximum bandwidth capacity of the imaging signal.  So when a movie is shot at 48 frames-per-second (fps), it has twice the bandwidth capacity of a traditional 24 fps movie.  But that doesn't mean that it has twice the actual signal.  Nor does the frame rate tell us how much noise is embedded in the signal.
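As a quick illustration of the bandwidth point (this is just the sampling-theorem arithmetic, nothing specific to any camera), the highest temporal frequency a frame rate can represent is half that rate:

```python
# Sampling theorem: a sequence captured at `fps` frames per second can
# represent temporal frequencies only up to fps / 2 (the Nyquist limit).
def nyquist_limit_hz(fps: float) -> float:
    return fps / 2.0

print(nyquist_limit_hz(24))   # 12.0 Hz of temporal bandwidth capacity
print(nyquist_limit_hz(48))   # 24.0 Hz -- twice the capacity
```

Doubling the frame rate doubles this capacity, but says nothing yet about how well the capacity is filled.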

Spatial aliasing noise, or a Moiré pattern.

Noise is a tricky subject.  When we think of noise, we usually think of "snow" or graininess--randomly distributed amplitude noise from photon statistics or electronic readout.  But there are other sources of noise as well.  For instance, on a cheap TV camera, the weatherman's striped shirt may cause new, false patterns to appear in each frame.  This is known as spatial aliasing, or a Moiré pattern.  It is a resonance between the pattern of the shirt and the regular pixel grid of the sensor.  It's clearly an incorrect signal and can therefore be considered noise.

In the time domain, this same source of noise exists.  This is known as temporal aliasing, and it's what causes judder in motion imaging.  Just as the stripes in the shirt resonate with the sensor grid, straight lines in panning shots interact in time with the discrete framing of the camera.  It's exactly the same math, but in the time dimension instead of the two spatial ones.  For a longer explanation of temporal aliasing, see our explanation page.

So does increasing the frame rate of acquisition cure temporal aliasing?  Alas, no.  Just as increasing sensor resolution does not fix Moiré patterns spatially, increasing the frame rate does nothing to reduce temporal aliasing or judder.  It does, however, move the noise around.

First, let's look at what's known as the baseband, the signal bandwidth as recorded.  A 48fps image sequence has twice as much bandwidth as a 24fps sequence.  But there's another aspect to the data: how crisply image data is recorded into that bandwidth.  This is controlled by the camera's shutter, and is represented here as the Modulation Transfer Function (MTF) of the camera.  The camera's MTF tells us how much contrast the camera records for optical signals at different temporal frequencies, just as a lens MTF tells us how much contrast the lens preserves for line pairs of different sizes.  Ideally, the MTF would be 1 (full contrast) over its entire range.  Where it dips down, that represents loss of contrast, or undesirable blurring of the action being recorded.  A shutter that is open longer per frame causes more motion blur, and therefore reduces the recorded contrast of moving objects.  So it's important to discuss the shutter along with frame rate whenever discussing overall motion signal response.
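For readers who want to plot this themselves: the textbook model of a shutter open for a fixed time T per frame is a rectangular temporal aperture, whose MTF is |sinc(f·T)|.  A minimal sketch under that idealized box-shutter assumption (real shutters differ in detail):

```python
import numpy as np

def shutter_mtf(freq_hz, fps, shutter_angle_deg):
    """MTF of an idealized box shutter: |sinc(f * T)|, where T is the
    open time per frame, T = (shutter_angle / 360) / fps."""
    t_open = (shutter_angle_deg / 360.0) / fps
    return np.abs(np.sinc(freq_hz * t_open))   # np.sinc(x) = sin(pi x)/(pi x)

# 24 fps / 180 degrees and 48 fps / 360 degrees both give T = 1/48 s,
# so their MTF curves coincide; the 48 fps baseband simply extends
# further into the rolled-off region.
f = np.linspace(0.0, 24.0, 7)
print(shutter_mtf(f, 24, 180))
print(shutter_mtf(f, 48, 360))
```

In this model the contrast at 12 Hz (the 24fps Nyquist limit) is about 0.90, falling to about 0.64 at 24 Hz, the top of the 48fps baseband.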

In the following figure, we compare a traditional movie with an HFR scenario.  Typical movies are almost always shot with a 180-degree shutter angle, meaning a 1/48th-second shutter at 24 fps.  Many HFR productions use a 360-degree shutter angle, meaning the shutter is open for the entire frame time.  This causes more motion blur, but at 48fps the shutter time is still 1/48th of a second, which allows easy creation of a typical 24fps/180-degree version of the film simply by skipping every other frame of the 48fps/360-degree master.  It comes at a cost, however: there is substantial contrast loss in the new bandwidth the 48fps rate provides.  So while we got twice the bandwidth by increasing the frame rate, the new bandwidth was filled with some very smeary data.
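The every-other-frame relationship can be sketched with simple array slicing (integers stand in for frames here; this shows only the selection logic, not a real conform pipeline):

```python
# A 48 fps / 360-degree master: each frame's exposure spans its full
# 1/48 s frame time.  Dropping every other frame yields 24 fps material
# whose 1/48 s exposures cover half of each new 1/24 s frame time --
# exactly a 180-degree shutter at 24 fps.
master_48fps = list(range(48))       # one second of frames, numbered 0..47
conform_24fps = master_48fps[::2]    # keep every other frame

print(len(conform_24fps))            # 24 frames for the same second
print(conform_24fps[:5])             # [0, 2, 4, 6, 8]
```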

It turns out that human perception of motion is much more strongly affected by shutter angle than by frame rate.  TV is shot at a higher frame rate, but also with a 360-degree (1/60th-second) shutter, whereas movies are almost always shot at 24 fps with a 180-degree shutter.  The 360-degree shutter used in The Hobbit contributes strongly to the softer appearance, and may actually be more dominant than the frame rate.

Modulation Transfer Functions for 24fps with a 180-degree shutter and 48fps with a 360-degree shutter showing the increased bandwidth but decreased contrast response of this particular HFR situation.

Now the MTF tells us about the amount of data stored within the bandwidth provided by the frame rate, but what about the aliasing noise?  To see the aliasing noise, we need to look at the MTF of the signal before it is sampled by the camera.  In the following figure, the initial MTF of a 180-degree shutter is shown, including the area in frequencies beyond the capability of the frame rate to capture.  Notice what happens to signals in this region.  These signals are folded down into the baseband by the camera, and become noise blended into our signal there.  This is temporal aliasing, and the root cause of judder.  Nearly 50% of the signal acquired by a camera with a 180-degree shutter can be aliasing noise, regardless of the frame rate at which it is acquired. 
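That "nearly 50%" figure can be sanity-checked numerically under two simplifying assumptions: a flat input spectrum and an idealized box shutter.  With a 180-degree shutter, roughly half the captured energy comes from above the Nyquist frequency, at any frame rate (this is a back-of-envelope model, not a measurement of any real camera):

```python
import numpy as np

def aliased_fraction(shutter_angle_deg, fps):
    """Fraction of the energy passed by an idealized box shutter that lies
    above the Nyquist frequency fps/2, assuming a flat input spectrum."""
    t_open = (shutter_angle_deg / 360.0) / fps       # shutter open time, s
    f = np.linspace(0.0, 500.0 * fps, 400_001)       # frequency grid, Hz
    power = np.sinc(f * t_open) ** 2                 # shutter energy response
    return power[f > fps / 2.0].sum() / power.sum()  # grid spacing cancels

print(round(aliased_fraction(180, 24), 2))   # roughly half at 24 fps...
print(round(aliased_fraction(180, 48), 2))   # ...and the same at 48 fps
```

Because the shutter time at a fixed angle scales with the frame period, the fraction comes out the same at any frame rate, which is exactly the article's point.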

Signal at frequencies higher than the camera's frame rate can record is folded into the baseband, becoming noise there.  This is temporal aliasing.

So why doesn't increasing the frame rate cure temporal aliasing?  First, at a fixed shutter angle the shutter time scales with the frame period, so the MTF is the same when frequency is measured relative to the frame rate; the ratio of signal to aliasing noise therefore doesn't change as the frame rate goes up.  Second, increasing the sample rate never pushes the aliasing band into a region where there is no signal left to alias.  Audio is different: there we can increase the sample rate to the point that the air and microphones no longer carry any frequency content that could alias and cause noise in the recording.  In the optical situation, however, the real-world frequency content is effectively unlimited, so there are always frequencies present to cause aliasing.
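The first point can be made concrete with an idealized box-shutter model (an assumption for illustration, not a claim about any specific camera): the contrast remaining at the Nyquist frequency depends only on the shutter angle, because the frame rate cancels out of the math:

```python
import math

def mtf_at_nyquist(shutter_angle_deg, fps):
    """Box-shutter MTF sinc(f * T) evaluated at the Nyquist frequency fps/2.
    With T = (angle/360)/fps, the argument reduces to angle/720: fps-free."""
    t_open = (shutter_angle_deg / 360.0) / fps
    x = (fps / 2.0) * t_open                  # equals shutter_angle_deg / 720
    return math.sin(math.pi * x) / (math.pi * x)

for fps in (24, 48, 120):
    print(fps, round(mtf_at_nyquist(180, fps), 4))   # 0.9003 at every rate
```

Whatever the frame rate, a 180-degree shutter delivers the same signal-to-aliasing balance within its own baseband.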

To sum up, high frame rate imaging will yield more realistic imagery because of the increased bandwidth, but in that bandwidth the aliasing ratio will remain unchanged.  While audiences have become used to aliasing at 24fps, the new judder frequencies introduced by 48 or 60fps acquisition and playback may be very unfamiliar.  

See our gallery for examples of how the Tessive Time Filter addresses temporal aliasing.  Regardless of frame rate, the Tessive Time Filter corrects judder and substantially improves signal to noise ratio from aliasing.