The Great Debate. Why 60 over 30?

 

JettMaverick:
The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like (not sluggish 1-10 fps because of low-end hardware). But I want to know what justifies complaining when a game is 30fps and not 60. I'm not asking for a cussing match, & I appreciate arguments on both sides; I'm more curious as to why.

this link explains a lot
http://www.100fps.com/how_many_frames_can_humans_see.htm

Given the definition, lighting, movement, panel type, and refresh rate of normal monitors, 30 fps looks ugly, feels laggy, and in high-precision games like fighters and FPSes actually makes you less accurate.

On a console, playing on a big TV at a large distance, when you're gimped with a gamepad to start with, you might honestly not notice or care about the 30/60/120 debate. But unless you're sight-impaired, there's just no debate on PC.

30 FPS is perfectly playable and is just fine for most games, with the exception of a couple of heavily reflex-reliant genres. The issue is that at a lower framerate, the delay between you pressing a button and the corresponding action appearing on screen is doubled. It is still not long enough to cause significant issues, and I tend to go for 30 FPS when fiddling with the settings on my computer, so I don't exactly subscribe to the 60-fps hysteria. In fact, for a long time I was on the side of the debate that claims framerate is mostly irrelevant as long as it doesn't drop below 30.

HOWEVER, there is a very large number of people to whom the difference is important and to whom the (even slight) change in responsiveness does make a world of difference. And even I, regardless of the fact that I find 30 FPS perfectly acceptable, have to admit that 60 fps does look better and does play better. I do not find the difference noticeable enough to matter in most genres but many people do and I have to respect that.

The thing is, in order to raise FPS, you have to sacrifice some graphical fidelity. And this is where the argument gets complicated. Prioritising the boost in responsiveness against the quality of graphics is a matter of personal preference, and I do not think it is possible to choose one approach without alienating the part of the audience that would prefer the other. I rarely agree with Totalbiscuit on anything, but he hammered this one point home the other day: the best approach in this situation is giving the choice to the player. Let them switch between a lower setting that runs at 60 FPS and a higher setting that runs at 30 FPS. This is very much a given on PCs, and there is very little reason for the option not to be available on consoles as well.

Frames per second are important. I've been running games at over 60FPS for about ten years. It's noticeable when it's lower.

Each frame, your game loop updates the state of the world; the GPU then renders the scene just once.

During the update, input is polled, physics calculations are performed, collision detection runs, artificial intelligence is evaluated, and so on.
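For anyone curious what that loop looks like in code, here's a minimal fixed-timestep sketch in Python (a generic illustration, not any real engine's code; the function names and the 60 FPS target are just assumptions for the example):

```python
import time

TARGET_FPS = 60          # a console targeting 30 would halve this
FRAME_TIME = 1.0 / TARGET_FPS

def update(dt):
    # poll input, run physics, collision detection, AI, and so on
    pass

def render():
    # the GPU draws the scene exactly once per frame
    pass

def game_loop(max_frames=3):
    frame = 0
    while frame < max_frames:
        start = time.perf_counter()
        update(FRAME_TIME)
        render()
        # sleep off whatever remains of this frame's time budget
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)
        frame += 1

game_loop()
```

If `update` plus `render` ever take longer than the frame budget, the loop simply runs late, which is exactly when a 60 FPS target drops to 30.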

PS4: "I can do all of this easily 60 times every second!"
XB1: "I can only do it 30 times every second. Gosh, I'm out of breath."

With the PS4 being able to easily handle 60FPS, it's saying "games may not be taking full advantage of me right now, but there's room for improvement!" and that's a good thing - it's like having an elastic waistband on your favorite pair of pants.

I've been playing Resident Evil 4 HD Edition on Steam lately and some of the animations are quite jerky, presumably because they're ported from a system that expects a certain locked frame-rate, but my PC is running it significantly faster, and the result means you get zombies stumbling "slowly" towards you at two or three times their regular speed, like when you emulate a really old Amiga game on a modern PC or something.
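That speed-up happens when animation advances a fixed amount per frame instead of per second. A toy sketch of the two approaches (the numbers are hypothetical and have nothing to do with RE4's actual code):

```python
def animate_per_frame(frames, speed_per_frame=0.5):
    # old locked-framerate style: the character moves a fixed amount
    # every frame, so doubling the frame rate doubles its speed
    return frames * speed_per_frame

def animate_per_second(frames, fps, speed_per_second=15.0):
    # time-based: movement depends on elapsed time, not frame count
    return frames / fps * speed_per_second

one_second_at_30 = animate_per_frame(30)   # 15.0 units moved
one_second_at_60 = animate_per_frame(60)   # 30.0 units: twice as fast!

# time-based movement is identical at any frame rate
assert animate_per_second(30, 30) == animate_per_second(60, 60)
```

A port that keeps per-frame animation but runs at a higher frame rate gets exactly the "enemies stumbling towards you at two or three times their regular speed" effect described above.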

Ok let's break it down again then:
The human eye does not see frames; it sees an endless stream of light hitting its sensor array, and that sensor array has its limits. Our sensors need a certain accumulation of light before an object is clearly visible, so when objects move, the light coming from them gets distributed across our view, and as movement gets faster the form loses definition, i.e. speed blurs objects, up to the point where we can't spot their presence at all.
But despite all that, we never have a cut-off moment where we stop getting new information.

Movies are captured in a similar way; this time the light is captured with a digital sensor array or film that also has limits as far as light/data accumulation goes. Objects that move fast will again leave a fainter trace of their form across the scene. The difference is that movies do need a cut-off point: they need to store their data picture by picture for the tech to work, and that's where we get frames.
So 24 FPS was worked out to look smooth enough. That makes each picture stay statically on display for about 42 ms, which sounds like a small amount of time, but compared to our normal vision, which has 0 ms of static pictures, it's quite a gap.
Luckily, the shortfalls of sensors/film are exactly what makes movies so compatible with our vision: that 42 ms picture might be static, but in the time it took to make it, nearly 42 ms of incoming light was recorded, so all the movement in that window actually was captured and we lost very little information.

Games, however, work from the complete opposite end: renderings do not capture ongoing scenery, they create the scene from scratch one moment at a time. Every frame they create comes from an absolute standstill of the scene, so if you render at 24 FPS, that 42 ms static image is not an accumulation of anything between frames; you completely lost everything that happened in the past 42 ms... which again is a minute span of time, but to our vision it is a gigantic information gap.
So we go up to 30 FPS, which makes a 33.3 ms information gap; at 60 FPS we get a 16.7 ms gap; 120 FPS brings it down to 8.3 ms; 240 FPS, 4.2 ms; and so on.
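Those gaps are just 1000 ms divided by the frame rate; a quick sketch for checking the figures:

```python
def frame_gap_ms(fps):
    # how long each static image stays on screen
    return 1000.0 / fps

for fps in (24, 30, 60, 120, 240):
    print(f"{fps:>3} FPS -> {frame_gap_ms(fps):.1f} ms per frame")
# matches the figures above: ~41.7, 33.3, 16.7, 8.3 and 4.2 ms
```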
But wait, my monitor doesn't swap images that fast, so why would we even go there? Because most game engines have their mechanics locked to visual frames, so the information gap isn't just visual; it also affects input.
At 30 FPS there are 33.3 ms gaps where the game has no clue what you are telling it; then the game jumps to the next moment and you need to compensate for whatever it missed or got wrong. In slow games with slow control schemes that gap is mostly covered by the game's inherent delayed response, but as the need for precision and speed goes up, that gap becomes an unavoidable hindrance to the entire experience.
Does 60 FPS then fix all the problems? No, it just makes them half as bad, and 120 would make them a quarter as bad.

zumbledum:
On a console, playing on a big TV at a large distance, when you're gimped with a gamepad to start with, you might honestly not notice or care about the 30/60/120 debate. But unless you're sight-impaired, there's just no debate on PC.

Plus, playing on a TV gimps the experience in the first place anyway: even the fastest-responding TVs are far slower than a budget monitor. Huge grey-to-grey pixel response and long image-processing times mean you can be as many as 15-30 frames behind, no matter the FPS the device is pushing out.

Baron Teapot:
PS4: "I can do all of this easily 60 times every second!"
XB1: "I can only do it 30 times every second. Gosh, I'm out of breath."

With the PS4 being able to easily handle 60FPS, it's saying "games may not be taking full advantage of me right now, but there's room for improvement!" and that's a good thing - it's like having an elastic waistband on your favorite pair of pants.

Actually, the PS4 can't handle it. One or two mediocre-looking games have come close (not locked, either), but everything else has needed compromises in either AA or resolution to get that high, and some games are simply not bothering. Some are even shovelling bullshit about wanting a game to look "filmic".

For one, better response time between pushing a button and shit happening. For two, it's much more fluid and pleasant to the eye.

30FPS - Looks okay to me. My character feels like a tank, but otherwise I can manage.
60FPS - Sure, it looks smoother and my character controls better now, but why do I feel consistently seasick even with an FOV over 90?

*shrug* Until I can afford a rig capable of the mythical 60FPS/1080p at Ultra settings, this debate is a non-issue for me. Unfortunately, being a poor student means I have to be happy with what I have.

NuclearKangaroo:

MrFalconfly:
Well, as I see it 30fps vs 60fps is like a Ford Mustang GT vs a Ferrari F12-berlinetta.

Sure I'd prefer the Ferrari, since it objectively is the best, but that doesn't preclude me from enjoying the Mustang.

I don't know, 30 FPS is not really good, it's just the BARE MINIMUM. It's like a Mustang GT compared to a Fiat Uno: the Fiat is a serviceable car, but far from ideal.

Now for 60 FPS and anything higher, your car comparison works: 60 FPS is pretty great, but anything higher is obviously going to be better.

Well, compared to the Ferrari, the Mustang GT is bare minimum.

And by bare minimum, I mean what's minimally acceptable for having fun (I don't consider a Fiat Uno to be fun on the roads).

This is a debate? As in, people are actually making long posts about a subject that's so simple?

1. It looks nicer.

2. It controls smoother.

That's it. The end. You might argue about whether or not 30 FPS is adequate, or if 60 should be mandatory, but there really isn't any debate as to why 60 is better than 30.

Because it looks nicer and smoother, and it plays better.

60fps is objectively better than 30fps, there is no debate here.

What there is, is that people who have been used to consoles for a very long time are accustomed to the slow pace of 30fps, and they just go with it because they don't know any better (no offense). Things would be different if you could tweak your settings to your liking to reach a higher fps, but consoles are locked-down machines and can't do that.

Simple as that.

Lilani:

I'm an animator. I know what frames are, and I know the human mind does not work in frames.

However, frames are in essence a measure of motion over time. Humans do not have an unlimited capacity for perceiving things clearly in motion--stuff can move so fast that all our mind can interpret is a blur, if it can interpret anything at all. So frames per second may not accurately reflect how the human eye and brain actually work, but it can at the very least act as a rudimentary reference for how fast the mind can process visual stimuli and at what point things begin to blur.

Framerate in gaming is nothing like framerate in video.
Watching animation and moving a 3D object in realtime are not the same.
Framerate in gaming is all about responsiveness.
I.e. At a lower framerate the game will lag more when I walk around in-game or move the camera.
Nothing to do with the human eye.
Motion-blur and FOV are completely different things that you can adjust in the options menu.

JettMaverick:
The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like (not sluggish 1-10 fps because of low-end hardware). But I want to know what justifies complaining when a game is 30fps and not 60. I'm not asking for a cussing match, & I appreciate arguments on both sides; I'm more curious as to why.

Sure, in film 25 to 30 is optimal, but when nuanced interactivity and pattern mastery from the player come into the picture, it really does become a whole new ballpark, m'afraid. In a lot of games, avoiding input lag or latency is pretty important and directly shapes the player's performance.

Any game where split-second reaction time is required of the player (usually fighters and shooters with certain skill gaps) benefits, since you can better anticipate attacks.

Better frame rate, better visual feedback, better response time, better game play experience.

There is not much of a debate. There are just people going apeshit when someone criticises something they are hyped about / something they like.

I would also not draw a connection between fps in movies and fps in games. As the audience of a movie, you are static, have no control over the camera, and the director has all the tools of his trade to make 24 fps appear smooth. In games, I control the camera; I am dictating the action. It is not a movie: I am more involved, and I am much quicker to notice a low frame rate when I need to do a 180 and my camera stutters like shit.

Personally, I'm okay with 30fps for games. If I can get a higher fps without compromising my graphical settings then so much the better. I can't really see the difference too well between 30 and 60 fps (which is probably why I'm ok with 30) and I've had a lot of gaming consoles over the years so i'm used to the response delay (what there is of it).

Basically, I think that the option for either 30fps with a higher graphical setting or 60 with lower settings (and any higher settings) should be made available and leave it to the player to decide.

I thought I'd put this to an experiment: I tried Modern Warfare with its 60fps, then Battlefield 3 (both on X360, both released the same year), & I can totally see the response being kind of sluggish at 30fps.

TheKasp:
There is not much of a debate. There are just people going apeshit when someone criticises something they are hyped about / something they like.

I know it's not really a debate per se, but it's been a topic of heavy discussion recently, & I wanted to open it up (you guys at the Escapist are a friendly bunch, so I thought this would give a decent, objective perspective).

I was playing KI this morning too, which runs at 60fps, & it totally dawned on me how important a higher frame rate is; I wouldn't want it to drop to 30 at all (before this thread, I was undecided about it). It's an actual graphical discussion that doesn't just stick to the topic of visuals, it affects gameplay as well, & thus I can totally see its importance & demand.

MrFalconfly:

NuclearKangaroo:

MrFalconfly:
Well, as I see it 30fps vs 60fps is like a Ford Mustang GT vs a Ferrari F12-berlinetta.

Sure I'd prefer the Ferrari, since it objectively is the best, but that doesn't preclude me from enjoying the Mustang.

I don't know, 30 FPS is not really good, it's just the BARE MINIMUM. It's like a Mustang GT compared to a Fiat Uno: the Fiat is a serviceable car, but far from ideal.

Now for 60 FPS and anything higher, your car comparison works: 60 FPS is pretty great, but anything higher is obviously going to be better.

Well, compared to the Ferrari, the Mustang GT is bare minimum.

And by bare minimum, I mean what's minimally acceptable for having fun (I don't consider a Fiat Uno to be fun on the roads).

Pfff, I'd love to have such a "bare minimum" car.

A game's framerate is not the same as a movie's.

Movies are a series of still images shot at a lower rate but with more blur between frames, which makes them smoother than a game drawn by a GPU at the same frame rate, because the blur helps your brain fill in the gaps.
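The "blur fills in the gaps" point can be illustrated numerically: a film camera averages an object's position over the shutter's open time, while a renderer samples one frozen instant. A toy 1-D sketch (all numbers here are invented for illustration, not from any real camera or renderer):

```python
def film_frame(pos_at, frame_start, exposure=1/48, samples=16):
    # a camera accumulates light while the shutter is open: approximate
    # that by averaging the object's position over the exposure window
    ts = [frame_start + exposure * i / (samples - 1) for i in range(samples)]
    return sum(pos_at(t) for t in ts) / samples

def game_frame(pos_at, frame_start):
    # a renderer draws one mathematically frozen instant
    return pos_at(frame_start)

speed = 240.0                    # units per second: a fast-moving object
pos = lambda t: speed * t

print(film_frame(pos, 0.0))      # the smear's midpoint, 2.5 units in
print(game_frame(pos, 0.0))      # exactly 0.0: no motion recorded at all
```

The film frame "contains" the motion from the whole exposure window; the game frame contains none of it, which is why equal frame rates don't look equally smooth.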

If there were no difference, people wouldn't have found The Hobbit weirdly smooth, and I wouldn't be able to tell the difference between 30Hz and 60Hz refresh rates.

But I can.

LinusTechTips on youtube has done a number of videos on this and I would direct you that way for more detail.

And this still goes on. OK, some people really do see the difference all the time and can't get acclimated to it. Fine, no problem with that. In an ideal world every game would run at 120fps (you can still see a difference between 60 and 120fps when they're shown side by side).

But we are playing with limited resources: limited by hardware, but also limited by development time. All the power gets split between graphics, gameplay systems, the foundation, and the various assorted options a particular game has. Then, out of the part left for graphics, you balance graphics quality, graphics complexity, optimization time, and framerate. And graphics sell; that much is proven by the hype machine. Even Rise of the Robots sold, and that had nothing but graphics. On the other side, nobody has proven that framerate sells.

Ideally, 60 fps is always better (although some games do not benefit; just the opposite. A better framerate made sure that no fighting game ever felt as cleanly brutal to me as the first two Double Dragons. The jumpy animation just felt right.) But it's not always beneficial enough to justify compromises on every other front. And just to remind you, as many developers have said already, going from 30 to 60 FPS doesn't merely require twice as much power. Overhead already eats so much that it actually requires, comparatively, 4-5 times the power, or in other words, a significant simplification of the graphics.

Less input lag is better, but outside of few genres it's not critical.

For me, the difference is something I feel rather than see. If I look at side-by-side comparisons of 30 fps and 60 fps, I cannot make out any meaningful difference between the two no matter how hard I try. And yes, I have looked at the website that TotalBiscuit linked in his video about the issue, and yes, I have seen plenty of other websites that do the same. The only time I can make out a visible difference is when they include a clip running at 15 fps, and the difference is only relative to that 15 fps clip. However, when I play a game and actually have a feel for how the game responds to my input, then I can definitely feel the difference. This is why I prefer 60 fps, especially in games built on precision and difficulty. I can handle 30 fps, but 60 fps is definitely preferable for playing games. However, if I'm just passively watching something, then 30 fps versus 60 fps doesn't make a difference to me.

NuclearKangaroo:

Pfff, I'd love to have such a "bare minimum" car.

Well maybe not bare minimum.

A Toyota GT86, or Mazda MX5 would be bare minimum.

But then the Mustang GT would equate to 45fps, a Ferrari 458 Italia 60fps, and a LaFerrari 120fps.

The only reason this is even an issue is because the "next-gen" consoles are so absurdly underpowered; they need bullshit excuses to hide the fact that the hardware of the PS4 and Xbone is so weak it can't do decent graphics at a solid 60 fps.

The higher the better, but I'm happier at 1080p/30fps than I would be at 720p/60fps. I think it comes down to individual preference. So long as there are no sudden fps drops, I'm mostly happy.

If you worked in film, then you should know that 24fps works for films because it leaves the brain to imagine the missing information, and this makes it seem less like actors performing in front of a camera on a set and more like an imaginative work.
With games, however, you are expected to interact, and for that you don't want to be imagining information but actually have it. Imagine having to drive a car at 24fps in real life. It would suck horribly if you got into a tight situation where fast reaction is necessary.
So that's why 60 fps is superior in interactive media: it strengthens the interactive portion.

JettMaverick:

NuclearKangaroo:

JettMaverick:

Fair play! It's nice to know there are people willing to accept compromise for their preferences. I've seen a lot of people going on about how it must be 60fps/1080p, accepting nothing else. I lean more towards the console gamer in this debate, though I can understand PC users flipping out considering the money spent on hardware; they should be handed the best experience for the price paid.

I seriously doubt most PC gamers want 60 FPS just for the sake of it; like I showed you, 60 FPS objectively plays better than 30 FPS.

JettMaverick:

This is very insightful, thanks for sharing :)

Happy to help. You know, TotalBiscuit made a similar video a few days ago; he goes a little more technical, and he also says there's no reason why console games shouldn't AT LEAST provide the option to play at 60 FPS, as a simple graphical option: high detail/30 FPS or low detail/60 FPS.

And honestly, I agree.

Y'know, curiously, whenever I've opened a console game's options and there's a "display/video" sub-option, I've always jumped in thinking there might be some form of alteration besides brightness, etc. I agree with your agreement on this, totally.

The first BioShock had some neat console "graphic options". I believe you could turn off V-sync and change the way textures were loaded, or something like that. The game actually seemed able to run at 60FPS at times.

JettMaverick:
The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like (not sluggish 1-10 fps because of low-end hardware). But I want to know what justifies complaining when a game is 30fps and not 60. I'm not asking for a cussing match, & I appreciate arguments on both sides; I'm more curious as to why.

First and foremost, the whole idea that 30 fps is "more cinematic" is complete crap. Movies are filmed at 24 fps, but also have each frame blurred in places to give the illusion of more motion than can actually be captured at 24 fps.

Why 60 fps is "better" is that it is a lot, lot, LOT smoother. Some see this as the game looking "faster", but it is actually going at the same speed. It's just a jarring experience going from 30 fps to 60 fps (somewhat like the weirdness people experienced watching The Hobbit at 48 fps).

Lilani:

DoPo:

Lilani:
However, frames are in essence a measure of motion over time. Humans do not have an unlimited capacity for perceiving things clearly in motion

Yes, however, I was objecting to the statement that humans see at 60 FPS. They simply do not. FPS is a measure but not a measure of how people see.

And I agreed with you and clarified what I meant by that. I simply took the liberty of assuming that most people were aware of the fact that the human eyes aren't camera lenses and that the brain is not a camera.

That's a bad gamble; most people do indeed think the eye acts like a camera, if they ever wonder how the eye works at all.

Interesting eye fact.

The most common type of chromophore[1] found in the eye is the rhodopsin chromophore. The time it takes to detect light and change its molecular structure in response is about 200 femtoseconds[2]. So we can detect changes in light fantastically quickly; it just takes a lot longer for us to make sense of it.

OT

60fps is just perceived as being smoother to the human eye than 30fps. That's all there really is to it. Though it's not about your eyes really, they can detect changes in light very fast indeed. It's more about how your brain interprets the data your eyes are sending it.

[1] a molecule that detects light, kind of like the CCD in a camera
[2] A femtosecond is 0.000000000000001 of a second (10^-15 s).

MrFalconfly:

NuclearKangaroo:

Pfff, I'd love to have such a "bare minimum" car.

Well maybe not bare minimum.

A Toyota GT86, or Mazda MX5 would be bare minimum.

But then the Mustang GT would equate to 45fps, a Ferrari 458 Italia 60fps, and a LaFerrari 120fps.

The most fun I've had recently was in a 40+ year old Morris Minor 1000, although that was on a racetrack, so saying you can't have fun in a "basic" car is just silly; you can if it weighs nothing and has rear-wheel drive.

But silly and totally incorrect car analogies aside, 60fps is objectively superior to 30fps. I'm not saying 30 is terrible; it just isn't as good. Of course things like RTSes and turn-based games don't need it, but for anything from a player perspective, be it driving, TPS, FPS or other, a lower framerate means worse control and gameplay.

Dead Century:
30fps is acceptable, but I like a solid 60fps. Even if I sacrifice graphics to do so. It's just a personal preference. No justification needed.

Justification is actually needed, because the question specifically asked for it. There really are not that many reasons I can think of:

1) Smoother animation. Film and TV get away with relatively low frame rates (24 and 25fps) because they have a built-in mechanism, motion blur, that helps the image seem continuous rather than a series of discrete pictures played in rapid succession. Mimicking this effect well is computationally expensive, so aiming for a higher frame rate in general (thus reducing the time you spend looking at any particular image) is an easier way to hide the discreteness. Basically, the correct frame rate in this case is the minimum rate at which you can look at a sequence and not perceive it as a series of discrete still images. This number varies considerably from person to person. Going beyond this minimum offers further improvements to perceived quality until you reach that person's limit (a biological factor easily in the hundreds of fps).

2) Input Lag. Regardless of how fast your input device polls you'll have to wait until the next frame to see the effect of your input. In the vast majority of games and for the vast majority of players, you'd be hard pressed to argue this as an actual problem. 1/60 of a second (the maximum extra time you'd have to wait at 30fps vs 60fps) is so insubstantial compared to actual human response time as to be negligible at best. Now, if you are dealing with a game with very narrow timing windows and your frame rate fluctuates between two extremes, then you'd have a pretty good basis for complaint. Here the problem is less about the lag itself than it is about variability of lag.
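The arithmetic in that paragraph, worked out explicitly (worst case: an input arrives just after a frame was drawn and has to wait one full frame to show up):

```python
def worst_case_wait_ms(fps):
    # input that just misses a frame waits a whole frame to be seen
    return 1000.0 / fps

extra = worst_case_wait_ms(30) - worst_case_wait_ms(60)
print(f"30 FPS worst case: {worst_case_wait_ms(30):.1f} ms")  # 33.3 ms
print(f"60 FPS worst case: {worst_case_wait_ms(60):.1f} ms")  # 16.7 ms
print(f"extra wait at 30 vs 60 FPS: {extra:.1f} ms")          # 16.7 ms = 1/60 s
```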

I've personally seen people go to some silly lengths for framerate though. Turning off all settings in Quake 3 to achieve rates above 130 fps was common which struck me as strange given that the display could only show the player 60 of those 130 frames.

g7g7g7g7:

MrFalconfly:

NuclearKangaroo:

pfff, id love to have such a "bare minimum" car

Well maybe not bare minimum.

A Toyota GT86, or Mazda MX5 would be bare minimum.

But then the Mustang GT would equate to 45fps, a Ferrari 458 Italia 60fps, and a LaFerrari 120fps.

Most fun I have had recently was in 40+ year old Morris Minor 1000, although that was on a racetrack, so to say you can't have fun in a "basic" car is just silly, you can if it weighs nothing and has rear wheel drive.

But silly and totally incorrect car analogies aside,60fps is objectively superior to 30fps, I'm not saying 30 is terrible it just isn't as good. Of course things like RTS and turn based games don't need it but for anything in a player perspective be it driving, TPS, FPS or other having a lower framerate means worse control and gameplay.

Didn't say "you can't have fun in a basic car". Obviously you can. I just doubted the Fiat Uno's capacity for fun on the roads.

Although if I was to go for a Hothatch, that'd probably be something like a Peugeot 205 GTi.

Eclectic Dreck:

I've personally seen people go to some silly lengths for framerate though. Turning off all settings in Quake 3 to achieve rates above 130 fps was common which struck me as strange given that the display could only show the player 60 of those 130 frames.

That's not as bad as the Counter-Strike framerate junkies who insist on having 300FPS. I think with some of those old shooters, the speed the game ran at affected the physics engine, and there were certain exploits that could only be done at ridiculously high framerates. The Counter-Strike crowd insists on 300FPS because anything lower seems to screw up their ability to move faster via bunny-hopping.

Supernova1138:

Eclectic Dreck:

I've personally seen people go to some silly lengths for framerate though. Turning off all settings in Quake 3 to achieve rates above 130 fps was common which struck me as strange given that the display could only show the player 60 of those 130 frames.

That's not as bad as the Counter-Strike framerate junkies who insist on having 300FPS. I think with some of those old shooters, the speed the game ran at affected the physics engine, and there were certain exploits that could only be done at ridiculously high framerates. The Counter-Strike crowd insists on 300FPS because anything lower seems to screw up their ability to move faster via bunny-hopping.

I don't think that applied to Quake 3 which allowed for fairly absurd exploitations of the bunny hop. In that case, the only advantage I saw was that characters were very highly contrasted with the background which, I suppose, might somehow make it very slightly easier to see and shoot them.
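The frame-rate-dependent physics both posts describe typically comes from stepping a simple Euler integrator once per rendered frame, so the timestep changes the outcome. A toy jump simulation illustrating the effect (the constants are invented for the example, not Quake's or Counter-Strike's actual movement code):

```python
def jump_apex(fps, v0=270.0, gravity=800.0):
    # explicit Euler integration, stepped once per rendered frame:
    # larger steps overshoot, so the apex depends on the frame rate
    dt, y, v = 1.0 / fps, 0.0, v0
    while v > 0:
        y += v * dt          # move first with the old velocity...
        v -= gravity * dt    # ...then apply gravity (explicit Euler)
    return y

print(jump_apex(30))    # apex height at 30 FPS
print(jump_apex(125))   # a measurably lower apex at 125 FPS

# identical inputs, different physics purely from frame rate
assert jump_apex(30) != jump_apex(125)
```

With physics decoupled from rendering (a fixed simulation timestep), both calls would return the same height, which is how modern engines avoid these exploits.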

I do not care and I want people to shut up about it. But they won't. Leave it to gamers to complain about the smallest shit. Sure 60 looks better than 30, but the difference shouldn't conjure this much damn vitriol. Ugh, whatever.

My only real defense for 60 FPS is that, unlike movies, we react to what's happening in a game. Therefore the more we can follow the onscreen action in depth, the better. And that's what higher FPS does.

But that doesn't justify some of the vitriol about it.

It looks and feels better; it's that simple.

All the stuff about 30fps looking smoother, or the human eye not being able to see above 30fps, is just excuses to make consoles look less terrible.

That said, it should be noted that 30fps on PC is terrible due to being up close to a monitor; it's far less of a problem for someone sat back using a TV.

JettMaverick:
The concept of the argument eludes me. I used to work in film, and having worked in mediums where films are shot at 23.976/25 fps, up to 30 for PAL screening, I always preferred a lower frame rate, because the progression of frames feels more movie-like (not sluggish 1-10 fps because of low-end hardware). But I want to know what justifies complaining when a game is 30fps and not 60. I'm not asking for a cussing match, & I appreciate arguments on both sides; I'm more curious as to why.

I think that lower framerates are more noticeable in games than in most films. Probably because filmmakers have had decades to come up with camera techniques which avoid most of the key problems: fast objects get blurred onscreen, and camera movements tend to be either shaky-cam or steady, consistent movement, both of which reduce the obviousness of frame changes.
Games, by and large, don't do that yet. Sure, certain games experiment with motion blur, but it isn't able to completely alleviate the issue.
Perhaps a part of it is based on pure perception as well. It might be that lower framerates are more noticeable on a monitor-sized screen a foot away than on a TV-sized screen a few metres away. Or perhaps it's something to do with the wider field of view games use to simulate peripheral vision. Or it might be input lag issues, which aren't a concern in a film.
It might even be that one concentrates more as an active participant in a game than when watching a film (which is ultimately a passive, observational activity), which keys up one's reaction times, and thus one notices lower fps. Or perhaps we more readily treat in-game camera movement as a facsimile of our own physical movement and so more readily notice differences from real life, while maintaining a degree of conscious separation from the disembodied, uninvolved camera that shows us a film.

That's all educated guesswork, but all I can say is that it is massively noticeable to me personally. Anything below about 45-50FPS feels sluggish, and anything below 30-35 is often downright unplayable for me (on PC at least), not to mention giving me massive headaches.
I've got no problem with people who don't mind it, I don't care that the consoles might be locked to 30 (doesn't affect me), but my PC games need to play at a decent framerate for me to enjoy them properly!



This thread is locked