The Great Debate. Why 60 over 30?


The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like (not sluggish 1-10 fps because of low-end hardware). But I want to know what justifies complaining when a game is 30fps and not 60. I'm not asking for a cussing match, and I appreciate arguments on both sides; I'm more curious as to why.

30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

A game plays better if it's at 60 FPS, that's a fact: there's less input lag, and this has a real impact on the way people play.
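For a rough sense of the numbers behind that claim, here is a minimal sketch in Python (illustrative only; it assumes input is sampled once per frame and shown on the following frame, which simplifies real engine and display pipelines):

    # Frame time and a simplified worst-case input-to-response delay.
    # Assumption: input sampled once per frame, result visible one frame later.
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (30, 60):
        ft = frame_time_ms(fps)
        print(f"{fps} fps: {ft:.1f} ms per frame, "
              f"up to ~{2 * ft:.1f} ms from input to visible response")

Under those assumptions, the worst case at 30 fps is roughly 67 ms before your action shows up on screen, versus roughly 33 ms at 60 fps, which is the input-lag difference being described.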

Dead Century:
30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

Fair play! It's nice to know there are people who are willing to accept a compromise for their preferences. I've seen a lot of people going on about how it must be 60fps/1080p, and accepting nothing else. I lean more towards the console gamer in this debate, as I can understand PC users flipping out considering the money spent on hardware; they should be handed the best experience for the price paid.

Well, the question I would ask myself is: would I want the moving images I'm looking at to update every time my monitor does, or only half the time? I prefer them to update 60 times per second for gaming because that's my monitor's refresh rate.

If I had a console I would prefer it to output 100fps, since that's the refresh rate my last TVs had.

If I'm watching a movie I don't really care about the fps, because it's not my GPU that has to draw every image anyway.

JettMaverick:

Dead Century:
30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

Fair play! It's nice to know there are people who are willing to accept a compromise for their preferences. I've seen a lot of people going on about how it must be 60fps/1080p, and accepting nothing else. I lean more towards the console gamer in this debate, as I can understand PC users flipping out considering the money spent on hardware; they should be handed the best experience for the price paid.

I've never expected consoles to be graphical powerhouses.
The biggest selling points for me are convenience, ease of use, and couch co-op.
If I want to pretty everything up to the max, I'll just use my PC.

Well, 60fps is undeniably smoother. That said, how much it matters really comes down to what genre of game you're playing.

NuclearKangaroo:
A game plays better if it's at 60 FPS, that's a fact: there's less input lag, and this has a real impact on the way people play.

This is very insightful, thanks for sharing :)

JettMaverick:

Dead Century:
30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

Fair play! It's nice to know there are people who are willing to accept a compromise for their preferences. I've seen a lot of people going on about how it must be 60fps/1080p, and accepting nothing else. I lean more towards the console gamer in this debate, as I can understand PC users flipping out considering the money spent on hardware; they should be handed the best experience for the price paid.

I seriously doubt most PC gamers want 60 FPS just for the sake of it. Like I showed you, 60 FPS objectively plays better than 30 FPS.

JettMaverick:

NuclearKangaroo:
A game plays better if it's at 60 FPS, that's a fact: there's less input lag, and this has a real impact on the way people play.

This is very insightful, thanks for sharing :)

Happy to help. You know, TotalBiscuit made a similar video a few days ago where he goes a little more technical. He also says there's no reason why console games shouldn't AT LEAST provide the option to play at 60 FPS, as a simple graphical option: high detail/30 FPS or low detail/60 FPS.

And honestly, I agree.

NuclearKangaroo:

JettMaverick:

Dead Century:
30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

Fair play! It's nice to know there are people who are willing to accept a compromise for their preferences. I've seen a lot of people going on about how it must be 60fps/1080p, and accepting nothing else. I lean more towards the console gamer in this debate, as I can understand PC users flipping out considering the money spent on hardware; they should be handed the best experience for the price paid.

I seriously doubt most PC gamers want 60 FPS just for the sake of it. Like I showed you, 60 FPS objectively plays better than 30 FPS.

JettMaverick:

NuclearKangaroo:
A game plays better if it's at 60 FPS, that's a fact: there's less input lag, and this has a real impact on the way people play.

This is very insightful, thanks for sharing :)

Happy to help. You know, TotalBiscuit made a similar video a few days ago where he goes a little more technical. He also says there's no reason why console games shouldn't AT LEAST provide the option to play at 60 FPS, as a simple graphical option: high detail/30 FPS or low detail/60 FPS.

And honestly, I agree.

Y'know, curiously, whenever I've opened a console game's options and there's a 'display/video' sub-option, I've always jumped in thinking there 'might' be some form of adjustment besides brightness etc. I agree with your agreement on this, totally.

Why turn on AA? Why use higher resolutions? Because it looks better.

There are some technical aspects to the way films are made (involving motion blur) that make lower frame rates a lot more acceptable to the eye. It's something about the way a film captures a moving image from real life and turns it into discrete frames, whereas a game builds up a series of images designed as discrete frames.

NuclearKangaroo:

Happy to help. You know, TotalBiscuit made a similar video a few days ago where he goes a little more technical. He also says there's no reason why console games shouldn't AT LEAST provide the option to play at 60 FPS, as a simple graphical option: high detail/30 FPS or low detail/60 FPS.

And honestly, I agree.

Here is that video, for those so inclined. Sums it up for me. https://www.youtube.com/watch?v=eXJh9ut2hrc

Even then, 60FPS is only the current preference because of refresh rates.

If higher refresh rates became popular, those would be preferred.

At present, 120/144Hz monitors hold only a small share of the market.

Getting a computer that can run new games at 120/144FPS, however... that's a lot harder to do.

Linus explains it well in this video.

Generally, it is superior. You see, movies may be 24 fps, but you do NOT play movies, and they have a lot of DoF and motion blur effects.
Games with good DoF and motion blur effects (Metro 2033 and Last Light, for example) are easier on the eye at 30 fps and 24 fps.
But response times are better at 60 fps. It is smoother. 120 fps and 144 fps are even better than that.
120 fps, with a game that has good DoF and motion blur effects, would easily be the best possible combo.

Anyway, for PC gaming I prefer having options. A drop-down menu with 30 fps, 60 fps, 75 fps, 120 fps, 144 fps and, let's say, 240 fps :P (future proofing) would be best.
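To put those targets in perspective, here's a quick illustrative calculation (plain Python, nothing engine-specific) of the rendering budget each one leaves per frame:

    # Render budget in milliseconds for common frame-rate targets.
    for fps in (30, 60, 75, 120, 144, 240):
        print(f"{fps:>3} fps -> {1000.0 / fps:6.2f} ms per frame")

Halving the frame time from ~33 ms (30 fps) to ~16.7 ms (60 fps) is why the "high detail vs. high frame rate" choice is a real trade-off rather than a free option.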

JettMaverick:
The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like.

Define 'movie like'. Like all those other movies that happen to be shot at 24 fps? I can understand that 48 fps movies might look strange if you are used to 24 fps, but the argument is ultimately circular. If movies were shot at 100 fps, 100 fps would look more 'movie like' than 24 fps.

The main reason studios didn't use higher frame rates was that making copies cost money, so they just picked the lowest frame rate that didn't visibly stutter. They didn't compare 24 fps to 100 fps and declare that 24 fps looked better. If neither money nor technical issues were a factor, they would probably have gone for 100 fps or more.

Anyway, when it comes to games, high framerates tend to play better. The eye can track fast moving objects more easily at higher frame rates, and players can respond faster. Once you start playing, you will therefore have more fun if the framerate is good, whereas graphical fidelity doesn't actually matter that much.

Well, as I see it 30fps vs 60fps is like a Ford Mustang GT vs a Ferrari F12-berlinetta.

Sure I'd prefer the Ferrari, since it objectively is the best, but that doesn't preclude me from enjoying the Mustang.

TotalBiscuit summed it up very well recently:

Basically, 60fps leads to significantly less input lag, which is why it is objectively better for gaming. As for films, they use interpolation, which means there are computer-generated frames in between each real frame, making the film appear smoother. Video games (outside of some cutscenes) can't do this because the computer can't predict what is going to happen in the next frame.
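As a toy illustration of the interpolation idea, a generated in-between frame can be as simple as a blend of the two real frames around it (real motion interpolation in TVs is motion-compensated and far more sophisticated, so treat this as a sketch of the concept only):

    import numpy as np

    # Blend two *already known* frames; t=0.5 gives the midpoint frame.
    # A game can't do this for upcoming frames, because the next real frame
    # doesn't exist until the player's input has been processed.
    def in_between(frame_a, frame_b, t):
        return (1.0 - t) * frame_a + t * frame_b

    a = np.array([[0.0, 0.2], [0.4, 0.6]])   # tiny 2x2 "frame"
    b = np.array([[0.2, 0.4], [0.6, 0.8]])   # the next "frame"
    print(in_between(a, b, 0.5))              # the interpolated middle frame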

Apparently film fps and game fps don't match because motion blur doesn't look natural in games while it does in movies. More fps just makes games look and feel smoother.

The human eye sees at about 60 FPS, and in films the frame rate is just high enough that we don't see the individual frames, and the blur makes up for the rest. With games, there isn't that same motion blur, which makes 30 FPS look awkward. Unless the game has a forced motion blur or doesn't have a large depth-of-field it needs to be as close to what our eyes are used to seeing as possible in order to feel right.

Games are not movies. That's why.

Lower fps in something you watch still results in a smooth experience (provided the shutter speed is set right, etc.). In games, it results in an experience that is less responsive, even sluggish. Basically, the difference is that the game has to REACT to your input, and the quicker it reacts, the better the experience is.

To get a movie-like experience in a game, you can just add some motion blur or something.

Risingblade:
Apparently film fps and game fps don't match because motion blur doesn't look natural in games while it does in movies. More fps just makes games look and feel smoother.

More to the point, film does not render in real time.

Film is a prearranged succession of still images. Games render the world in real time and re-render according to the player's input; the higher the frame rate, the shorter the gap between an input being made and the response being seen. Since responsiveness (or the lack of it) is one of the biggest factors in player immersion, that makes FPS more important than most players seem to consciously realise.

Alternatively, as the infamous Quake 3 experiment showed, players with access to a higher FPS win...

Games at 30FPS have significant visual stuttering in many situations. Film =/= games in any way; real-time rendered graphics are not the same as optically exposed film. No offense to anyone, but if you've actually spent a lot of time around games at higher frame rates then you KNOW why higher frame rates are better, ESPECIALLY in the range of 25 to 60+ FPS.

In games where ultra-rapid input is paramount, like fighting games, 30FPS is unplayable. There is no real 'debate'; there is just a lot of PR spin lately. If you are like me and own a PC built for gaming, you know from experience that 60FPS is about where you want to aim. Even then it can be a compromise. If you've played regularly on a monitor you know just how much your frame rate affects your experience. There is a reason they sell 120Hz monitors. There is a reason the frame rate of CGI movies is being pushed higher in the future.

I also like a higher FPS for those times you enter a part of a game where things get kicked up a notch. Ordos in WoW would be the best example for me currently.
So many people and so many effects take a bit more of a toll than the usual goings-on, and if my ~60 FPS gets kicked down to ~40 FPS because of that, it's not so bad. But if 30 FPS gets kicked down to 10, that's a fair amount more troublesome to me, tbh.

Obviously I was pretty naive about the subject as a whole. This thread has really opened my eyes to the topic, and all your input has been really insightful. Thank you guys :)

Lilani:
The human eye sees at about 60 FPS

The human eye works at neither 30 nor 60 FPS. The human eye does not work in frames at all.

DoPo:

Lilani:
The human eye sees at about 60 FPS

The human eye works at neither 30 nor 60 FPS. The human eye does not work in frames at all.

I'm an animator. I know what frames are, and I know the human mind does not work in frames.

However, frames are in essence a measure of motion over time. Humans do not have an unlimited capacity for perceiving things clearly in motion: stuff can move so fast that our mind can only interpret a blur, if it can interpret anything at all. So frames per second may not accurately reflect how the human eye and brain actually work, but it can at the very least act as a rudimentary reference for how fast the mind can process visual stimuli and at what point things begin to blur.

As a general rule, 60 FPS games run smoother than 30 FPS ones. It's not about responsiveness; they just feel more consistent and solid for whatever reason.

I've seen the argument before that this is due to frame-rate stutter, or more specifically, relative frame-rate stutter. Both frame rates are subject to slow-down; the difference is that the average person starts to notice individual frames when the rate drops below 20 FPS or so (which is below what films and TV run at, but unlike games, those ALWAYS have a rock-solid frame rate). That is plausible for a 30 FPS game to hit when under strain, but a 60 FPS game needs to be under a LOT more strain to drop below that threshold.

Both 30 FPS and 60 FPS games slow down at points... you're just much less likely to notice it happening in a 60 FPS game, while it can happen surprisingly frequently in a 30 FPS game (especially a poorly optimized one).
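Putting rough numbers on that headroom (using the ~20 FPS threshold mentioned above, which is an approximation and varies by person and content):

    # How far each target can drop before crossing a ~20 fps stutter threshold.
    THRESHOLD = 20.0
    for fps in (30.0, 60.0):
        headroom = (fps - THRESHOLD) / fps * 100
        print(f"A {fps:.0f} fps game can lose ~{headroom:.0f}% of its frame rate "
              f"before dipping below {THRESHOLD:.0f} fps")

So a 30 fps game has only about a third of its performance in reserve before the stutter becomes visible, while a 60 fps game has about two thirds.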

JettMaverick:
I always preferred a lower frame rate

In games? I know I (and most people) immediately feel the difference between 24 and 30 fps in video, and strongly associate them with film and TV/soaps respectively.

But are you really saying you prefer playing games at 30 fps instead of 60 fps?

Frames don't blur together in games; they're just a series of perfectly sharp stills played like a stop-motion animation. We're only just starting to get per-object motion blur in games (it's a very performance-intensive effect), which could ease the pain of playing at 30 fps, but it'll still be worse than 60+ fps because...

...In films you control neither the action nor the camera. Directors have to take great care to make an action scene look good on film at 24 fps. They control the camera, its movements, its framing and the actors/stuntmen very carefully so the scene doesn't look like shit. Even then it doesn't always work; surely you've heard some moviegoers and critics call out bad action scenes and "shaky cam" by now?

There's more to a game than how it looks; it has to respond to your commands and show you the result as fast and seamlessly as possible. It's just a mathematical fact that a 60 fps game can be twice as responsive as the same game at 30 fps.

Lilani:
However, frames are in essence a measure of motion over time. Humans do not have an unlimited capacity for perceiving things clearly in motion

Yes, however, I was objecting to the statement that humans see at 60 FPS. They simply do not. FPS is a measure but not a measure of how people see.

DoPo:

Lilani:
However, frames are in essence a measure of motion over time. Humans do not have an unlimited capacity for perceiving things clearly in motion

Yes, however, I was objecting to the statement that humans see at 60 FPS. They simply do not. FPS is a measure but not a measure of how people see.

And I agreed with you and clarified what I meant by that. I simply took the liberty of assuming that most people were aware of the fact that the human eyes aren't camera lenses and that the brain is not a camera.

I don't notice a huge difference in visuals between 30 and 60 fps. I'm happy to watch movies at 24 fps. My big problem only comes into play when input is part of the equation. Mouse input just sucks at 30 fps. Every freelook game I've played that was capped at 30 frames made my mouse feel like a cinder block.

MrFalconfly:
Well, as I see it 30fps vs 60fps is like a Ford Mustang GT vs a Ferrari F12-berlinetta.

Sure I'd prefer the Ferrari, since it objectively is the best, but that doesn't preclude me from enjoying the Mustang.

I don't know. It's not that 30 FPS is really good; it's just the BARE MINIMUM. It's more like a Mustang GT compared to a Fiat 1: the Fiat is a serviceable car, but far from ideal.

Now with 60 FPS and anything higher, your car comparison works: 60 FPS is pretty great, but anything higher is obviously going to be better.

Lilani:
I simply took the liberty of assuming that most people were aware of the fact that the human eyes aren't camera lenses and that the brain is not a camera.

I've seen people make the following claims:
- eyes work at 24 FPS ("because of movies, duh")
- eyes work at 30 FPS ("it's the minimum games need, everything else is irrelevant, duh")
- eyes work at somewhere between 40 and 50 FPS but not as high as 60 FPS ("because 30 is the minimum but more is better; 60 is over-provisioning in case there happens to be a frame-rate drop", or some similar bullshit)

More than one person has made each of these claims. There are multiple videos (some posted in this thread) and articles devoted to dispelling the illusion that "eyes only see X FPS". I think that is evidence enough that clearly not enough people know it.
