I have seen a lot of debate on social media lately about frame rates. What frame rate do you prefer? What frame rate is better? Why is it better? Well, here is my answer: there is no right or wrong frame rate.
One frame rate is not suitable for all applications or projects. You can’t just say 24p is the best frame rate for everything, because it isn’t.
Nobody wants to watch sport in 24p, and conversely, no one wants to watch a movie at 60p. The reason we find 24fps so visually pleasing when watching movies is that it is what we have always seen and what our brains are accustomed to dealing with. This is why if you suddenly see a movie at 60fps it doesn’t quite look right. It’s not that it is wrong (although you could argue aesthetically that it is), it is because we are not used to seeing material presented that way in a theatre. It is almost like stepping onto a moving walkway that isn’t turned on. Your brain is expecting it to be moving, and it feels strange when you step on it and it isn’t.
The history of 24fps
24fps is the de facto frame rate we are used to when watching theatrical releases in a movie theatre. You will often hear cinematographers talking about the magic of 24fps. One of the main reasons we have 24fps is that it was needed to capture sound!
William Dickson, who was working for Thomas Edison (I’m sure that name rings a bell), came up with a solution where you could shoot at 16fps and show each image three times by using a three-bladed shutter in the projector (hence 3 × 16 gives 48 flashes per second). Why 16fps? Well, that was the minimum number of frames per second required to create the illusion of a moving image.
This was fine in the early days of silent films, but what about when sound came along? Well, back in 1927 you couldn’t record sound at 16fps, so they went to the next submultiple of 48, which was 24. So instead of 16fps shown three times with a three-bladed shutter, they ended up with 24fps and a two-bladed shutter, still giving 48 flashes per second. This is how 24fps was born.
24fps proved to be the perfect compromise. At 24fps with a half-open (180-degree) shutter, your exposure time is 1/48th of a second. This is the critical factor, because it is the shutter that actually creates the motion blur that makes 24fps with a 1/48th shutter suitable for moving images, while the projector’s 48 flashes per second were deemed fast enough that you could no longer see flicker. If you take a photograph of something moving (say, a person walking in the street) at a 1/48th shutter, it will most likely result in a blurred image. The same is true with moving images, but because 24 of them are shown in sequence every second, the blurring isn’t nearly as noticeable.
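If you want to play with those numbers yourself, here is a minimal sketch in Python (purely illustrative; the function name and defaults are my own, not from any camera SDK) of the relationship between frame rate, shutter angle, and exposure time:

```python
# Rotary-shutter exposure: the fraction of the frame interval the shutter is open.
def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Return the exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) / fps

print(exposure_time(24))   # 0.0208... seconds, i.e. 1/48th at 24fps with a 180-degree shutter
print(exposure_time(60))   # 0.0083... seconds, i.e. 1/120th at 60fps with a 180-degree shutter
```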
Progressive & Interlaced
As you are probably well aware, video can either be interlaced or progressive. So how do they differ? With progressive, each refresh period updates all of the scan lines in each frame in sequence. When displaying a natively progressive broadcast or recorded signal, the result is optimum spatial resolution of both the stationary and moving parts of the image. Interlaced, on the other hand, was primarily invented as a way to reduce the flicker you would see on CRT video displays without increasing the number of frames per second. Interlacing also retains detail while requiring lower bandwidth compared to progressive scanning.
So, how does interlaced video actually work? In layman’s terms, it is a way of doubling the number of images shown per second (fields) while effectively keeping the same number of frames. The horizontal scan lines of each complete frame are treated as if they are numbered consecutively and captured as two fields: an odd field (upper field) made up of the odd-numbered lines and an even field (lower field) made up of the even-numbered lines.
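As a rough illustration of that odd/even line split (and not how any particular broadcast chain implements it), here is a small Python/NumPy sketch using a tiny made-up frame:

```python
import numpy as np

# A tiny 6-line "frame" standing in for real video.
frame = np.arange(12).reshape(6, 2)

upper_field = frame[0::2]   # odd-numbered lines (1, 3, 5 in 1-based counting)
lower_field = frame[1::2]   # even-numbered lines (2, 4, 6)

# Weaving the fields back together reconstructs the frame exactly, but only
# because nothing "moved" between the two fields. In real interlaced video the
# fields are captured at different moments in time.
rewoven = np.empty_like(frame)
rewoven[0::2] = upper_field
rewoven[1::2] = lower_field
assert (rewoven == frame).all()
```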
On analog display devices such as CRT monitors, interlaced footage effectively doubles the frame rate as far as perceptible flicker is concerned. Hence it was, and still is, widely used for broadcast television.
The problem with interlacing is that it wasn’t designed for any of our modern-day viewing devices. Almost all modern-day viewing is done on progressive displays. Your TV at home, your computer screen, your iPhone: they are all progressive devices.
If you try to display a natively interlaced signal on a progressive display, the overall spatial resolution of the image is degraded by the line doubling. This is why you will see flickering artifacts, and why native interlaced material looks horrible on modern-day viewing devices. To counteract this we need to use a process referred to as deinterlacing, so that interlaced material can be optimized to look a lot better on modern-day viewing platforms, which are all progressive. The problem with the whole deinterlacing process is that it still can’t magically match the quality of true progressive scan material.
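To make that concrete, here is a deliberately naive sketch of the simplest form of deinterlacing, where a single field is just stretched back to full height by repeating its lines. Real deinterlacers are motion-adaptive or motion-compensated and far more sophisticated, but the sketch shows why half the vertical detail simply isn’t there to recover:

```python
import numpy as np

def line_double(field: np.ndarray) -> np.ndarray:
    """Naive deinterlace: repeat every line of one field to fake a full-height frame."""
    return np.repeat(field, 2, axis=0)

field = np.random.rand(540, 960)   # one field of a hypothetical 1080i frame
frame = line_double(field)         # 1080 lines tall, but only 540 lines of real detail
print(frame.shape)                 # (1080, 960)
```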
A lot of TV stations still send out interlaced signals because they require significantly less bandwidth than progressive. Often you will find that material that was originally produced in a progressive format will be converted and sent out as 50i or 60i by broadcasters.
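As a back-of-the-envelope illustration of the bandwidth difference (raw pixel rates before compression; real broadcast bitrates depend heavily on the codec, so treat these numbers as illustrative only):

```python
def pixels_per_second(width: int, height: int, rate: int, interlaced: bool = False) -> int:
    """Raw pixels transmitted per second; interlaced sends only half the lines per field."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

print(pixels_per_second(1920, 1080, 50, interlaced=True))    # 1080i50: ~51.8 million pixels/s
print(pixels_per_second(1920, 1080, 50, interlaced=False))   # 1080p50: ~103.7 million pixels/s
```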
Interlaced will eventually go away because it only exists at HD and lower resolutions. There is no such thing as a UHD or 4K interlaced signal.
The argument for high frame rates and not higher resolution
There is a definite argument that you get much better image quality by doubling the frame rate than by adding more pixel resolution. ‘How dare you mention higher frame rates,’ I can hear the purists screaming already. But before you grab your pitchforks and come to burn down my house, let me explain.
I don’t think high frame rates are suitable for everything, but I’m also not a believer that 24fps is suitable for everything either. I would much prefer to watch sports or a wildlife documentary at 120p because I want to see detail and not have motion blur. I want to feel like I am right there. Conversely, there is a certain amount of magic about seeing a theatrical release at 24fps, because it suspends disbelief and transports you to another world. It isn’t supposed to be real, and the fantasy aspect of movies works well at 24fps.
Shooting at 24fps helps mask a certain level of sharpness, softens motion, and helps with set design, costumes, and make-up. At higher frame rates, you can see flaws in sets, and make-up and costumes can look not quite right. The ultra-sharp look of using higher frame rates and a higher shutter speed can take you out of the viewing experience when watching a movie.
Peter Jackson’s ‘The Hobbit‘ was shot at 48fps in 3D with dual RED EPICs. Three different versions of the movie were available to see: 3D HFR, standard 3D, and 2D. It immediately created a heated debate in the film industry. Audiences and critics complained about the ultra-sharp look of the HFR version. Subsequently, Jackson was forced to change the look of the follow-up films in the series. While Jackson may well have been technically right that the viewing experience of 3D at 48fps was better, audiences just weren’t ready for it.
Here is the thing though, younger audience members who were more accustomed to viewing high frame rate material when playing video games didn’t have nearly as many issues with it as the older generation. Again, it is what we are used to that greatly affects our perceived interpretation of what we are viewing. You can also put this down to the differences between TV and film. We associate a certain look with TV and so if we see a movie that looks like TV it subconsciously makes us react in a different way.
Director Ang Lee has also shot several movies at 120p, including Billy Lynn’s Long Halftime Walk and Gemini Man. Neither movie was well received and both were box office flops. That hasn’t stopped him from continuing to do it. I’m not angry about him doing it, as experimentation and pushing boundaries should never be frowned upon.
Now, resolution is directly affected by frame rate. Anyone who tells you otherwise doesn’t know what they are talking about.
Perceived image fidelity is not simply about resolution. When objects are moving or appear only very briefly, your eye cannot perceive the difference between a 1K and a 4K image. For fine detail to be resolved at higher resolutions, your eye needs time to rest on the image. If the image is moving or cut too quickly, the extra resolution cannot be perceived. Only if you had perfect eyesight and were sitting in the first couple of rows of the theatre could you actually enjoy the benefits of 4K. Anyone sitting further back would not be able to perceive the difference in resolution. To get the benefit of 4K at home you either need to be sitting very close to the screen or have a very large 4K TV.
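As a rough back-of-the-envelope example (assuming the commonly quoted figure of roughly one arcminute of detail for 20/20 vision; the screen size and function name are my own illustration, not a standard):

```python
import math

def max_resolving_distance(screen_width_m: float, horizontal_pixels: int) -> float:
    """Farthest distance (metres) at which individual pixels can still be distinguished,
    assuming ~1 arcminute of visual acuity."""
    pixel_size = screen_width_m / horizontal_pixels
    one_arcminute = math.radians(1 / 60)
    return pixel_size / math.tan(one_arcminute)

# For a TV roughly 1.4m wide (about 65 inches):
print(round(max_resolving_distance(1.4, 3840), 2))   # ~1.25m for 4K detail
print(round(max_resolving_distance(1.4, 1920), 2))   # ~2.51m for HD detail
```

In other words, sit much further than a metre or so from a 65-inch screen and, under this simple model, the extra 4K detail is beyond what your eye can resolve.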
If you are filming in 2K or 4K at 24 frames per second with a 180-degree shutter angle (1/48th of a second) and the camera is moving quickly, the amount of motion blur means the image will look the same in 2K as it does in 4K. The only way to perceive a difference between 2K and 4K when the camera is moving is to use a higher frame rate and shutter speed, with less resulting motion blur. If you want higher resolution for motion pictures where objects are moving and the camera is moving, then you need to use higher frame rates for capture and display to see a difference.
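To put a rough number on why motion blur washes out the extra resolution, here is a simplified sketch; the pan speed is an invented example, and real-world blur also depends on the lens, subject distance, and how the camera moves:

```python
def blur_in_pixels(speed_px_per_s: float, fps: float, shutter_angle: float = 180.0) -> float:
    """How many pixels an edge smears across during a single exposure."""
    exposure = (shutter_angle / 360.0) / fps
    return speed_px_per_s * exposure

# An object crossing a 3840px-wide 4K frame in two seconds moves at 1920 px/s.
speed = 3840 / 2

print(blur_in_pixels(speed, 24))    # 40.0 px of smear at 24fps with a 1/48th shutter
print(blur_in_pixels(speed, 120))   # 8.0 px of smear at 120fps with a 1/240th shutter
```

With 40 pixels of smear, fine single-pixel detail is long gone, which is why the 2K and 4K frames end up looking the same on fast movement.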
Within the industry, the jury still seems to be out on whether HFR material looks at all like film and whether strobing artifacts are needed for it to retain its film look. An argument can be made that if we don’t move to higher frame rates, then we need to settle for images that are more static, or forget about higher resolutions altogether.
US broadcast TV is moving forward with 60p as the standard, and this makes a lot of sense. 4K TV transmission is done using the Rec. 2020 standard, which only supports progressive frame rates, not interlaced. 24p or 30p would not be options, as the amount of motion blur would make sports events almost unwatchable. There are now digital cinema projectors that are capable of 120 progressive frames per second, so the technology is in place, but the question remains: will the aesthetics be accepted by directors, the industry, and most importantly the audience?
I would strongly argue that higher resolution requires significantly higher capture frame rates in anything but static shots, and there aren’t a lot of static shots in most of today’s movies. If we want to stick to 24fps for movies then the endless resolution push seems a bit pointless to me.
Final Thoughts
There is no such thing as one frame rate to rule them all. Anyone who thinks you should shoot everything at 24p is making a very dumb argument. Different frame rates, including high frame rates, have their place. Dismissing high frame rate capture and display is not the correct thing to do.
Technology is constantly changing, and the audience’s perceptions of what they like will also evolve along with the technology. We have been watching 24fps for so long that we have become accustomed to it, and that is not a bad thing. I like the look of 24fps for movies because it doesn’t look like real life. I don’t want to see every little detail, and in a strange way, it is what I can’t see that helps create the illusion of another world. If I am watching a fictional movie then I want to be taken on a journey, and that is half the magic of 24fps.
However, if I am watching live sport or a nature documentary I want it to be as real as possible and to feel like I am there. This is where I would quite comfortably watch 4K at 60p or 120p.
For higher frame rates and resolutions to coexist and evolve, the language of filmmaking may well have to change too. This is not something that is likely to happen soon, especially when it comes to shooting movies.
What are your thoughts on the whole frame rate debate? Let us know in the comments section below or on the forum.