For Science!

6 Frame Rate Facts You Didn’t Know


Gamers will vociferously defend the superiority of 60 frames per second over 30. Moviegoers will lament the loss of that cinematic feel when watching The Hobbit at 48 FPS rather than the traditional 24.

With frame rate issues once again in the headlines, I thought this an opportune time to revisit the great frame rate debate in video games and cinema. But we’ll take a different angle – rather than try to convince you that something is better or worse, here are six facts about frame rates that you may not have known. They may just change your opinion – or further entrench you.

1. The FPS “limits” of the human eye

First off, a question people often ask is: how many frames per second is enough? At what point can we no longer perceive any difference in FPS? The answer, unfortunately, isn’t black and white.

In their book, Restoration of Motion Picture Film, authors Paul Read and Mark-Paul Meyer write, “The human eye and its data reception and transmission system can form, transmit and analyse 10-12 images per second.”

Now hold on – does that mean that we cap out at 12 FPS? No. The key word in that quote was “analyse” (British spelling). The mind can interpret up to a dozen separate images shown sequentially. Beyond that, we begin to perceive a continuous animation.

“The vision centre in the brain retains each individual image for one-fifteenth of a second. If [it] receives another image during this fifteenth of a second, the sight mechanisms will create the sensation of visual continuity.”

So 10-12 FPS is effectively what distinguishes animation from a slideshow, forming our lower limit. But what about an upper limit?

“In laboratory testing, it has been found that the human sense of sight can distinguish up to 48 flashes of light per second, the switching from light to dark not being detected when they take place at higher rates of speed.”

What this means is that, above 48 FPS, the effect we call “flicker” should no longer be noticeable. That said, vision is a complex system. In very bright rooms, flicker can be noticeable at higher frame rates. Larger displays (or small displays that you get very close to) produce more noticeable flicker, because our peripheral vision is most sensitive to this effect. The sweet spot where flicker becomes unnoticeable to most people under most conditions is 70-90 FPS.

Why is flicker important to this discussion? More on that later, but sadly, this still isn’t an upper limit, just another lower limit. It turns out that, when dealing with still frames that do not have motion blur – which we’ll get to as well – more frames per second will always result in a smoother visual. In fact, the faster the on-screen movement, the more we are able to distinguish “choppiness,” and thus the higher the FPS we need to achieve a smooth image.
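To see why faster on-screen movement demands more frames, consider how far an object jumps between two consecutive frames. Here's a quick back-of-the-envelope sketch in Python (the pixel speeds are just illustrative numbers, not measurements):

```python
# How far does an on-screen object move between consecutive frames?
# A bigger per-frame jump reads as "choppier" motion to the eye.

def jump_per_frame(speed_px_per_s, fps):
    """Distance (in pixels) an object travels between two frames."""
    return speed_px_per_s / fps

for fps in (24, 30, 60, 120):
    slow = jump_per_frame(300, fps)   # a slow camera pan (illustrative speed)
    fast = jump_per_frame(3000, fps)  # a fast action scene (illustrative speed)
    print(f"{fps:>3} FPS: slow pan jumps {slow:5.1f} px, action jumps {fast:6.1f} px")
```

At 24 FPS the fast-moving object leaps 125 pixels between frames; at 120 FPS, only 25. Same speed, same scene – the higher frame rate just closes the gaps our eyes would otherwise notice.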

2. Animation is the result of the phi phenomenon


5. Long exposure time makes 24 FPS smooth

If 24 FPS is good enough for film, then surely, 30 FPS is good enough for games, right? No one ever complains about movies being choppy.

The reason movies don’t appear choppy at 24 FPS is that they are recorded at 24 FPS. At its most basic, a movie is a sequence of photographs, and a film camera takes 24 of them every second.

If you’ve ever attempted to photograph a moving subject, you’ve no doubt seen the resulting blurry mess in the developed picture. And you’ve probably wondered how professional photographers manage to capture those glorious, unblurred action shots. It all comes down to a camera’s shutter speed, or the image’s exposure time. Essentially, the longer the camera’s shutter stays open while recording an image, the more blurred any motion in the frame will be.
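That relationship is easy to quantify: the streak a moving subject leaves on a frame is roughly its speed multiplied by the exposure time. A minimal sketch, assuming a film camera with the traditional 180-degree shutter (an exposure of half the frame interval, so 1/48 of a second at 24 FPS) – the subject speed is an illustrative number:

```python
def blur_length(speed_px_per_s, exposure_s):
    """Approximate streak length a moving subject leaves on one frame."""
    return speed_px_per_s * exposure_s

# 24 FPS film with a traditional 180-degree shutter:
# exposure = half of one frame interval = 1/48 s.
film_exposure = (1 / 24) / 2

# A sports photographer freezing action with a fast 1/1000 s shutter.
sports_exposure = 1 / 1000

speed = 2400  # subject speed in pixels per second (illustrative)
print(f"film frame blur:  {blur_length(speed, film_exposure):.1f} px")
print(f"sports shot blur: {blur_length(speed, sports_exposure):.1f} px")
```

The film frame smears the subject across 50 pixels, while the 1/1000 s shot smears it across barely 2 – which is exactly why the sports photo looks frozen and the film frame looks fluid in motion.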

24 FPS footage introduces a lot of motion blur, and it’s this blur that helps the illusion of a continuous, flowing sequence of animated images. If you take a screenshot of a movie during an action scene, you’ll see motion blur everywhere. But if you take a screenshot of a game during an action scene, you won’t see any blur at all – or rather, some minimal blur rendered in post-processing.

A game running at 24 FPS would look choppy, especially when there is a lot of movement on screen, because there isn’t any motion blur to help guide our eye toward the movement. So why not just make games render motion blur? Because that requires additional processing power, and if you have the power to render motion blur, then you have the power to render more frames per second, making motion blur unnecessary!


6. Your display limits your FPS

Okay, this is something you probably know, but I couldn’t let you walk away from this discussion without ensuring you do. More FPS in a video game is always better, right? Wrong.

In gaming, it doesn’t matter how powerful your computer or video card is. It doesn’t matter if your system can render 200+ FPS. If your monitor operates at a refresh rate of 60 Hz, then you’re only getting 60 FPS.

“Refresh rate” is a measure of how many times per second your monitor updates the image it displays. Hz – or hertz – is simply a unit that counts how many times something happens per second. So if your monitor is only showing you 60 images per second, but your computer is rendering 120 FPS, then it’s dropping 60 frames every second – frames it spent computational power rendering. What a waste!
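The arithmetic in that example can be sketched as follows – a simplification that assumes a fixed refresh rate with no frame synchronization or variable-refresh tricks:

```python
def displayed_and_dropped(rendered_fps, refresh_hz):
    """With a fixed refresh rate, the display can show at most
    refresh_hz frames per second; any extra rendered frames are wasted."""
    displayed = min(rendered_fps, refresh_hz)
    dropped = rendered_fps - displayed
    return displayed, dropped

print(displayed_and_dropped(120, 60))  # -> (60, 60): half the GPU's work wasted
print(displayed_and_dropped(200, 60))  # -> (60, 140)
print(displayed_and_dropped(45, 60))   # -> (45, 0): now the GPU is the bottleneck
```

Note the last case: when the GPU renders fewer frames than the monitor can show, nothing is dropped – but now it’s the game, not the display, limiting your experience.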

That said, because there are demanding areas in games where computers tend to “slow down” and frame rates drop, you want to ensure that your frame rate never dips below your monitor’s refresh rate if you want a consistently smooth experience. Just keep in mind, when building that monster PC, that you may want to consider a monitor with a higher refresh rate as well.

So where do you stand in the frame rate debate? Is 30 FPS enough in video games? Is 60 enough? What about movies? Should we move away from the motion-blurred 24 FPS and move onto greater frame rates? Let us know in the comments!
