This is another of those questions that comes up frequently at workshops and online.
What frame rate is the best one to use?
First – there is no one “best” frame rate. It really depends on how you want your video to look. Do you want the slightly juddery motion of a feature film or do you want silky smooth motion?
You also need to think about and understand how your video will be viewed. Is it going to be watched on a modern TV set or will it be watched on a computer? Will it only be watched in one country or region or will it be viewed globally?
Here are some things to consider:
TV in Europe is normally 50Hz, either 25p or 50i.
TV in North America is 60Hz, either 30p or 60i (both actually running at 29.97fps).
The majority of computer screens run at 60Hz.
Interlaced footage looks bad on most LCD screens.
Low frame rates like 24p and 25p often exhibit judder.
Most newer, mid-price and above TVs use motion estimation techniques to eliminate judder in low frame rate footage.
If you upload 23.98fps footage to YouTube and it is then viewed on a computer, it will most likely be shown at 24p, as you can't show a fraction of a frame on a 60Hz computer screen.
Let's look first at 25p, 50i and 50p.
If you live in Europe or another 50Hz/PAL area, these are going to be frame rates you are familiar with. But are they the only frame rates you should use? If you are doing a broadcast TV production then there is a high chance you will need to use one of these standards (please consult whoever you are shooting for). But if your audience is going to watch your content online on a computer screen, tablet or mobile phone, these are not good frame rates to use.
Most computer screens run at 60Hz and very often this rate can’t be changed. 25p shown on most computer screens requires 15 frames to be shown twice and 10 frames to be shown 3 times to create a total of 60 frames every second. This creates an uneven cadence and it’s not something you can control as the actual structure of the cadence depends on the video subsystem of the computer the end user is using.
The odd 25p cadence is most noticeable on smooth pans and tilts, where the pan speed will appear to jump slightly as the cadence flips between the 10-frame ×3 and 15-frame ×2 segments. This often makes what would otherwise be smooth motion appear to stutter unevenly. 24p material doesn't exhibit this same uneven stutter (see the 24p section). 50p material will exhibit a similar stutter, as again the number of repeated frames is uneven, although the motion should be a bit more fluid.
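The 15-frames-twice / 10-frames-three-times arithmetic above can be sketched in a few lines of Python. This is purely illustrative arithmetic, not a real display API: frame i is shown on every 60Hz refresh tick during which its timestamp is the most recent one reached.

```python
# Sketch: how a 60Hz display must repeat frames to show lower-rate video.
# Not a real display API - just the arithmetic behind the cadence.

def repeat_counts(fps, refresh_hz):
    """For each source frame in one second, count how many refresh ticks show it."""
    counts = [0] * fps
    for tick in range(refresh_hz):
        frame = (tick * fps) // refresh_hz  # frame visible at this refresh tick
        counts[frame] += 1
    return counts

print(repeat_counts(25, 60))  # mix of 2s and 3s -> the uneven 25p cadence
print(repeat_counts(30, 60))  # every frame shown exactly twice -> even cadence
```

Running this for 25p gives 15 frames held for two ticks and 10 frames held for three, exactly as described above, while 30p maps perfectly onto 60Hz.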
So really 25p and 50p are best reserved for material that will only ever be seen on televisions that are running at 50Hz. They are not the best choices for online distribution or viewing on computers etc.
24p, 30p or 60p (23.98p, 29.97p)
If you are doing a broadcast TV show in an NTSC/60Hz area then you will most likely need to use the slightly odd frame rates of 23.98fps or 29.97fps. These are legacy frame rates specifically for broadcast TV. The odd rates came about in the early days of analog color TV, when the frame rate was lowered slightly from 30fps so that the new color subcarrier would not interfere with the sound carrier.
If you show 23.98fps or 29.97fps footage on a computer it will normally be shown at the equivalent of 24p or 30p to fit with the 60Hz refresh rate of the computer screen. In most cases no one will ever notice any difference.
23.98p and 24p footage shown on a 60Hz screen uses a 2:3 cadence, where the first frame is shown twice, the next 3 times, then 2, then 3 and so on. This is the same way feature films are shown on TV in 60Hz areas and it doesn't look too bad.
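The 2:3 cadence can be sketched the same way. This is just the hold-time pattern, nothing more; the function name is my own:

```python
# Sketch of 2:3 pulldown: 24 source frames fill 60 display refreshes
# by holding each frame alternately for 2 then 3 ticks.

def pulldown_schedule(frames=24):
    """Return how many 60Hz ticks each of `frames` source frames is held for."""
    return [2 if i % 2 == 0 else 3 for i in range(frames)]

sched = pulldown_schedule()
print(sched[:6])   # [2, 3, 2, 3, 2, 3]
print(sum(sched))  # 60 ticks = one second at 60Hz
```

Twelve frames held twice plus twelve held three times gives exactly 60 refreshes per second, which is why 24p maps onto 60Hz screens with only a mild, regular unevenness.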
30p or 29.97p footage will look smoother than 24p: all you need to do is show each frame twice to get to 60Hz, so there is no odd cadence, and the slightly higher frame rate will exhibit a little less judder. 60p will be very smooth and is a really good choice for sports or other fast action. But higher frame rates do require higher data rates to maintain the same image quality. This means larger files and possibly slower downloads, which must be considered. 30p is a reasonable middle-ground choice for a lot of productions, not as juddery as 24p but not as smooth as 60p.
24p or 23.98p for “The Film Look”.
Generally, if you want to mimic the look of a feature film, you might choose 23.98p or 24p, as films are normally shot at 24fps. If your video is only going to be viewed online then 24p is a good choice. If your footage might get shown on TV then 23.98p may be the better choice, as 23.98fps works well on 29.97fps TVs in 60Hz/NTSC areas.
BUT THERE IS A NEW CATCH!!!
A lot of modern TVs feature motion compensation processes designed to eliminate judder. You might see things in the TV's literature such as "100Hz smooth motion" or similar. If this function is enabled, the TV will take any low frame rate footage such as 24p or 25p and use software to create new frames, increasing the frame rate and smoothing out any motion judder.
So if you want the motion judder typical of a 24fps movie and you create a 24fps video, you may find that the viewer never sees this juddery, film-like motion, as the TV will do its best to smooth it out! Meanwhile someone watching the same clip on a computer will see the judder. So the motion in the same clip will look quite different depending on how it's viewed.
Most TVs that have this feature will disable it when the footage is 60p, as 60p footage should look smooth anyway. So a trick you might want to consider is to shoot at 24p or 30p and then export a 60p file, as this will typically cause the TV to turn off the motion estimation.
In summary, if you are doing a broadcast TV project you should use the frame rate specified by the broadcaster. But for projects that will be distributed via the internet I recommend the use of 23.98p or 24p for film style projects and 30p for most other projects. However if you want very smooth motion you should consider using 60p.
8 thoughts on “More on frame rate choices for today’s video productions.”
Thanks for this article. Indeed the widespread adoption of motion compensation makes things more complicated, or at least different, given that it is now enabled by default on most TVs.
I would add that if you use 60p in a country with 50Hz mains power, you will get bad flickering whenever you are shooting under certain very common types of electric lighting, so this is not advisable, at least when shooting indoors at night in an uncontrolled environment. The same is true whenever computer screens appear in your frame.
Just a question: if you shoot for broadcast only, will you shoot 50i or 50p given that the broadcaster asks for 50i? For example with an FS7?
If shooting 60p in a 50Hz country just use a 1/100th shutter. If shooting 50p in a 60Hz country use a 1/60th shutter. Either way you won’t get flicker.
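The rule behind these shutter recommendations is simple arithmetic: mains-powered lighting flickers at twice the mains frequency, and a shutter time that spans a whole number of flicker cycles exposes every frame to the same amount of light. A quick sketch (the helper function is hypothetical, not from any camera API):

```python
# Sketch: a shutter speed avoids mains flicker when the exposure time is
# a whole multiple of the light's flicker period (2 x mains frequency).

def flicker_free(shutter_s, mains_hz):
    """True if an exposure of shutter_s seconds spans whole flicker cycles."""
    flicker_period = 1.0 / (2 * mains_hz)      # e.g. 1/100 s on 50Hz mains
    cycles = shutter_s / flicker_period
    return abs(cycles - round(cycles)) < 1e-9  # whole number of cycles?

print(flicker_free(1 / 100, 50))  # True  - 1/100 shutter under 50Hz mains
print(flicker_free(1 / 60, 60))   # True  - 1/60 shutter under 60Hz mains
print(flicker_free(1 / 60, 50))   # False - 1/60 under 50Hz mains flickers
```

This is why 1/100th works for 60p in a 50Hz country and 1/60th works for 50p in a 60Hz country, as stated above.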
If the broadcaster wants 50i then I would shoot 50i. I know that 50p makes great 50i, but broadcasters hate it if they ask you for one thing and you do something different, even if different might actually be better.
Well, technically that is correct I guess, but a 1/100 shutter deprives you of some light.
Regarding the 50i, I was imagining shooting 50p but delivering 50i so the broadcaster would not notice. I guess shooting at 50i you have less compression, so the image quality will theoretically be better.
Thanks Frédéric for bringing this up, as I have the same question, but a little more tricky. Nowadays most cameras can either record HD 50p or 4K 25p. So, if the broadcaster asks for 50i, what is better:
-Shooting in HD 50p and putting it on a 50i timeline for delivery
-Or shooting in 4K 25p and putting it on a 50i timeline for delivery
I did two tests and shot the same scene in 4K 25p and HD 50p, and then put both clips into a FCPX timeline with the settings HD 1920×1080 50i. Exported the timeline. I felt that the 4K material looked better, even if it was shot in 25p. What do you think?
Generally, shooting 50p will provide the best 50i. If you shoot 4K 25p you will never have the 50Hz temporal resolution, so the motion will not be as expected for 50i.
I have a 1080/60i camcorder and watching such interlaced video on my TV is perfect, while my PC/LCD monitor has problems properly deinterlacing it. Why is that? Is the TV showing the video “naturally” interlaced, i.e. like the original CRT TVs? Aren’t there any PC video cards that would behave identically to a TV, i.e. showing half-frame after half-frame?
Computer monitors are always progressive and cannot show interlace correctly. LCD TVs do a better job with interlace by working at double the frame rate, but even so it’s still a progressive display, and the resolution will be slightly reduced with interlaced material compared to progressive.
Thanks for your reply!
Well, modern TV sets are for sure progressive, exactly like PC monitors, and yet they are able to cope with interlaced sources much better. This is what I don’t understand. It can’t be due to a higher refresh rate, as my TV and PC monitor both have just a 60Hz refresh rate. It has to do with how the electronics “stream” the video signal to the LCD panel.