The Great Framerate Debate


Shamus jumps into the Xbox One vs. PlayStation 4 discussion of framerate vs. resolution.

Read Full Article

Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.

Shamus Young:
Note the uneasy case where (for whatever reason) the game runs at something oddball like 40fps. You draw a frame and show it. Then you're only two-thirds of the way done when the refresh comes, so it repeats the previous image. Then you finish an image, but the refresh isn't ready yet. Then when the refresh happens the current image is a bit old, but the new one is only one-third done. Then on the next cycle the frame and the refresh are in sync again. The result is this strange stutter-step where it feels like the game keeps shifting between 60fps and 30fps.

Is this with or without any V-Sync modes on?

Falterfire:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.

Have another look in full-size: http://cdn.themis-media.com/media/global/images/library/deriv/737/737897.png

Falterfire:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.

Don't look at Rainbow Brite; focus on the brown background. You get weird banding in the bottom image. This would be much, much more pronounced full-screen and in motion. Those bands jumped all over the place; it was pretty ugly. The difference between 720p and 1080p is often pretty subtle; but video of this is pretty bad.
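
For anyone curious what that banding actually is, here's a minimal sketch (my own toy example, not from the article) of what happens when a smooth 24-bit gradient is crammed into 16-bit color (RGB565) with no dithering:

# Why a smooth gradient shows bands at 16-bit color depth:
# 24-bit color stores 8 bits per channel; 16-bit (RGB565) keeps only the
# top 5 bits of red and blue and 6 of green, so neighboring shades collapse.

def to_rgb565_and_back(r, g, b):
    """Truncate 8-bit channels to 5/6/5 bits, then expand back for display."""
    r5, g6, b5 = r >> 3, g >> 2, b >> 3
    return (r5 << 3, g6 << 2, b5 << 3)

# A smooth gradient, one shade per column.
gradient = [(i, i * 3 // 4, i // 2) for i in range(256)]
quantized = [to_rgb565_and_back(*px) for px in gradient]

print("24-bit shades:", len(set(gradient)))    # 256
print("16-bit shades:", len(set(quantized)))   # far fewer, so runs of columns
                                               # share one color: visible bands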

There is yet one more compounding factor to the debate: we have reached a point where increased graphics resolutions and textures yield increasingly diminishing returns (i.e., spending tons of memory and CPU for teeny-tiny boosts), yet all people ever talk about is how important those tiny boosts are, while ignoring the truly groundbreaking GPU additions.

The biggest changes we will see in graphics engines in the near future are rendering and post-processing effects. This includes fancy lighting, reflections, smoke, particle effects, anti-aliasing, ambient occlusion, tessellation... terms that most gamers may have heard of but never knew what they were (because no single effect is a GIGANTIC change). These are the features that set apart the true 'next-gen' game engines such as Crytek's CryEngine. These effects are also at the root of major recent gaming news such as the Watch_Dogs downgrade.

CryEngine 3 can give a medium-spec machine such a pretty picture not because it has high resolution or fps, but because every single element mentioned above was considered in the engine's design from the beginning, giving the devs time to optimize rendering for them. Watch_Dogs got downgraded because the engine was designed to print a basic image to the screen at solid fps. They then put in some of these effects and cranked them up to 11 to see how pretty it could be at E3, only to realize months later that the engine was never designed to handle all of them at once. You can only optimize for them so much when you don't take them seriously into account - when they aren't included on the ground floor. These features end up as a tiny afterthought when they should be integral, and not a single one of them is a gamebreaker by itself.

The same happened with Assassin's Creed 4: it was built on the Screed 3 engine and had some pretty lights tacked on that it was never meant to handle (Screed 3 was based off of Screed 2, etc.). The result: turning god rays on in Black Flag tanks your fps. While Ubisoft is known for having terribly optimized games, this is a broader trend with most companies. Tomb Raider trumpeted TressFX for Lara's hair, but how did the game handle it? The result: turning TressFX on in TotalBiscuit's mega rig tanked his fps by over half. The new inFamous ran at 30fps but still looked better than many games that can run at 60, and that's because they dedicated early development to lighting and reflections. The game had some issues, but very few people complained about it not looking good.

CryEngine 3 is considered one of the truest "next gen" engines, and it was released FIVE YEARS AGO. Just sayin'.

Something I remember about the old 3DFX cards is that on some of the later cards (Voodoo 3 I think) the processor worked at 24 bit internally and dithered the output. So you did not actually see banding unless the textures were terribly designed.
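
A rough illustration of the dithering trick being described (a toy ordered dither of my own, not how 3dfx actually did it): nudge each pixel by a small position-dependent offset before truncating, and the hard band edges break up into a fine pattern the eye averages out.

# Toy ordered (Bayer) dithering before quantizing an 8-bit channel to 5 bits.
BAYER_2X2 = [[0, 2],
             [3, 1]]   # classic 2x2 Bayer matrix, values 0..3

def dither_to_5bit(value, x, y):
    """Quantize with a position-dependent offset instead of plain truncation."""
    offset = BAYER_2X2[y % 2][x % 2] * 2   # 0, 2, 4 or 6: less than one 5-bit step (8)
    return min(255, value + offset) >> 3

row_plain    = [v >> 3 for v in range(64)]                        # long runs -> bands
row_dithered = [dither_to_5bit(v, x, 0) for x, v in enumerate(range(64))]

print(row_plain[:16])      # [0]*8 + [1]*8: a hard band edge
print(row_dithered[:16])   # levels start alternating near the edge, softening it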

Also, 40 fps does not look weird. It looks smoother than 30fps, and less smooth than 60fps. That the frames are visible for varying lengths of time does not matter. The eye cannot make out the individual frames, it just sees that there are more of them. Fluctuating fps is only noticeable when, for example, you go from a corridor into a big room and your framerate drops from 60 to 30.

And something else should be mentioned regarding framerate: it's a pity YouTube is stuck at 30 fps. This means that on YouTube a 30 fps game will always look just as good as it would at 60 fps. And that's 90% of what people will look at before buying a game. If everyone went to Twitch instead, which does 60 fps streams, I think a lot of devs would target 60 fps instead.

senkus:

Shamus Young:
Note the uneasy case where (for whatever reason) the game runs at something oddball like 40fps. You draw a frame and show it. Then you're only two-thirds of the way done when the refresh comes, so it repeats the previous image. Then you finish an image, but the refresh isn't ready yet. Then when the refresh happens the current image is a bit old, but the new one is only one-third done. Then on the next cycle the frame and the refresh are in sync again. The result is this strange stutter-step where it feels like the game keeps shifting between 60fps and 30fps.

Is this with or without any V-Sync modes on?

Um, yes? Turning V-sync off will not affect the refresh rate of the monitor. It will, however, allow the frame to change in the middle of a screen refresh, causing "tearing", where part of the screen is showing one frame and the rest of the screen is showing the next frame. ...I hate that.
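
To make the 40 fps stutter-step concrete, here's a minimal sketch assuming a 60 Hz display, a perfectly constant 25 ms render time, and v-sync on (the numbers are illustrative, not from the article):

# Which rendered frame does each 60 Hz refresh get to show?
REFRESH_MS = 1000 / 60   # ~16.7 ms between display refreshes
FRAME_MS = 25            # one new frame every 25 ms = 40 fps
                         # (assume frame k is finished at time k * FRAME_MS)

def frames_shown(refresh_count):
    shown = []
    for r in range(refresh_count):
        refresh_time = r * REFRESH_MS
        latest_finished = int(refresh_time // FRAME_MS)
        shown.append(latest_finished)
    return shown

print(frames_shown(12))
# -> [0, 0, 1, 2, 2, 3, 4, 4, 5, 6, 6, 7]
# Each frame is held for either two refreshes or one, so motion alternates
# between 30 fps-feeling and 60 fps-feeling steps: the stutter Shamus
# describes. With v-sync off, the new frame would instead replace the old one
# partway down the screen during a refresh, which is the tearing above.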

It's a shame that your columns are shuffled off the main page by The Escapist, Shamus. As for this article, the body of it doesn't support your conclusion, which I thought strange. In any case, I don't see how your analysis shows it to be subjective. If anything, it only supports the conclusion that some people are better equipped than others to make a judgment about graphical fidelity. If this article were about art, then your conclusion would have been that which artistic medium is best is subjective, in part, because some people viewing the art need corrective lenses.

It probably also comes down to what you are used to. I have a pretty big tolerance for low frame rates. I normally notice the difference between 720p and 1080p, I just don't really care as much. I still find high frame rates and higher resolutions to be simply better and more enjoyable but I don't think it is that big of a deal. But then again I grew up without high frame rates or even high graphics settings and couldn't even run high resolutions if I wanted to because our screens were always pretty bad.

In the end, style also plays a huge role. If the art style is done well, the game can run at a low resolution, and if the game is designed around it, it can even run at low frame rates. It can still be enjoyable and visually very pleasing.
If the art style isn't designed to deal with a lower resolution, or the gameplay isn't designed around allowing low frame rates, it can still diminish the game. For example, I don't think 30 fps at 720p is particularly suited to fast-paced multiplayer shooters that aim to get as close as possible to photorealism. As long as the frame rate doesn't dip below 30, I'll still prefer them on the PC. The only saving grace for them on consoles is probably the aforementioned greater distance from the screen.
They are still playable IMO, as I said my tolerance is pretty high, and if I had a console and no PC to adequately play them on, I'd still get them for the console. A single game (or even a few) isn't really worth investing in an additional platform for.

I wonder if there are noticeable differences between generations. E.g. ~40-year-olds (who grew up with ugly TVs and very abstract graphics) having a higher tolerance, ~25-year-olds (who grew up with acceptable TVs and bad graphics) having a medium tolerance, and the young folk (who grew up with HD TVs and good graphics) puking their guts out when things get jumpy.

90sgamer:
It's a shame that your columns are shuffled off the main page by The Escapist, Shamus.

Uhm, isn't it on the main page? Over on the right-hand side, where Yahtzee's, Bob's and whoever-does-Critical-Intel's articles also go? The main site presents the article twice to me at the moment: one medium-sized link directly above the news window, and a smaller one over on the right side where all the articles go. Or do you mean the banner thingy? The space on there seems very limited. Depending on the topic they could maybe push it into the science & tech banner. This article certainly has a very large tech aspect and is certainly very informative about how display tech works/worked.

rofltehcat:

90sgamer:
It's a shame that your columns are shuffled off the main page by The Escapist, Shamus.

Uhm, isn't it on the main page? Over on the right-hand side, where Yahtzee's, Bob's and whoever-does-Critical-Intel's articles also go? The main site presents the article twice to me at the moment: one medium-sized link directly above the news window, and a smaller one over on the right side where all the articles go. Or do you mean the banner thingy? The space on there seems very limited. Depending on the topic they could maybe push it into the science & tech banner. This article certainly has a very large tech aspect and is certainly very informative about how display tech works/worked.

I don't know about anyone else, but when I'm on this site I'll routinely check the main banner on the front page and scroll through that, then check down to see the news. Anything else on the screen is random crap that I'll maybe notice if I'm paying attention.

They've relegated this column to the random crap section, so I don't always notice it as quickly as I otherwise might, which is unfortunate.

Bad Jim:

Also, 40 fps does not look weird. It looks smoother than 30fps, and less smooth than 60fps. That the frames are visible for varying lengths of time does not matter. The eye cannot make out the individual frames, it just sees that there are more of them. Fluctuating fps is only noticeable when, for example, you go from a corridor into a big room and your framerate drops from 60 to 30.

Careful about saying what the eye can or can't make out. For quite a few people, sure, they won't be able to notice any sort of difference, but others might, especially if the effect is pointed out to them. It's likely a small thing either way, but to be honest small things are often the worst. They nag at your brain and tell you "something is off about this", but your conscious brain won't be able to pin down why, causing some dissonance.

I've always held this viewpoint, because to me the FPS or refresh rate doesn't matter. I can't honestly top this well-written viewpoint, so I'll defer to Mr. Young (man, that's hard to type, because my private high school principal, who was responsible for me getting kicked out over BS reasons, was also named Mr. Young).

Falterfire:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.

Part of the problem for me is that I can notice the difference but I'm hard pressed to care. I heard Watch Dogs was being compared to GTA V and thought "great!"

I should probably grab a tunic, too.

Xeorm:

Careful about saying what the eye can or can't make out.

He said the eye can't make out individual frames at that frame rate, and it's true. The human eye can only pick out a few discrete images in a second. He also said that you will notice that there are more or fewer of them.

The closest thing to a contestable part was his use of an example, and even that's true if taken as an example rather than as the only rule.

I'm also not 100% sure the people claiming they can see the difference on higher frame rates really can. I used to have access to a studio with high end equipment, and the people who claimed they could hear the difference between analogue and digital or between higher-quality MP3s and lossless were full of it. This wasn't the most scientific of experiments, but it's close enough for this level of discussion.

I'd be surprised to find it was any different with high frame rates.

I started gaming in the 8-bit era, so I'm fine with anything. If it's fun, that's what matters. IMO, even the most high-detail, high-res, high-framerate video game character still looks incredibly fake to me at the moment. So they could have 'drawn' it as a sprite and I would still have enjoyed the game if it was fun.

In fact I love sprites, more games should use them.

Zachary Amaranth:

Xeorm:

Careful about saying what the eye can or can't make out.

He said the eye can't make out individual frames at that frame rate, and it's true. The human eye can only pick out a few discrete images in a second. He also said that you will notice that there are more or fewer of them.

The closest thing to a contestable part was his use of an example, and even that's true if taken as an example rather than as the only rule.

I'm also not 100% sure the people claiming they can see the difference on higher frame rates really can. I used to have access to a studio with high end equipment, and the people who claimed they could hear the difference between analogue and digital or between higher-quality MP3s and lossless were full of it. This wasn't the most scientific of experiments, but it's close enough for this level of discussion.

I'd be surprised to find it was any different with high frame rates.

Some differences are small enough that they can be detected only with a proper experiment. I can detect a 10ms audio/video lag when playing Rock Band, like most musicians, but there's no way I could tell if someone added 10ms of audio lag to an ordinary microphone. Similarly, it's easy to tell 16-bit color depth from Shamus' example, but I played Dark Souls for like 10 hours before noticing that I had accidentally set the fog effects to 16-bit color depth. (And this is a game with plenty of fog.)

How much framerate is desired depends on the game engine. The Quake I/II/III engines had a massive bug/feature where the game logic's tick rate was the same as the rendering rate. That means the game logic would move on a step only when a frame was about to be rendered. This would affect your movement, so for example certain jumps were just not possible unless you could run the game at 125 fps. It also meant that low fps was barely playable, since it's not just the renderer that would be slow; the whole game would become sluggish and laggy.

Other game engines (certainly all the current ones) have the game logic independent from the renderer. That means no matter how fast or how slow the game draws on the screen, you are still free to do whatever you need even though you can't see the result immediately.
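
A minimal sketch of that decoupling, assuming the common fixed-timestep pattern (the update/render functions here are placeholders, not from any particular engine):

import time

TICK_DT = 1.0 / 60.0            # game logic always advances in 1/60 s steps

def update(dt):
    """Advance the simulation by one fixed step (placeholder)."""
    pass

def render():
    """Draw the current state; may run at any speed (placeholder)."""
    pass

def game_loop(run_seconds=0.5):
    # Accumulate real elapsed time and spend it in fixed-size logic ticks,
    # so the simulation runs at the same speed whether rendering is fast or
    # slow. Rendering simply happens as often as the hardware allows.
    start = previous = time.perf_counter()
    accumulator = 0.0
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= TICK_DT:    # catch up if a frame took too long
            update(TICK_DT)
            accumulator -= TICK_DT
        render()

game_loop()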

One funny example of an odd implementation is the first few Splinter Cell games (at least the first three, IIRC). If the renderer couldn't keep up, the whole game would slow down into a bullet-time effect instead of becoming laggy and stuttery.

My point is that a game can be programmed in a way that even 30 fps is good enough, while other games can feel crappy even at 60 fps. I must say that most UE3 games are certainly perfectly playable at 30 fps, even on my PC.

BTW, the 32-/16-bit difference certainly wasn't that bad IRL. I only remember noticing it in fog and some funny images like the Q3A logo. And I'm not even sure 32-bit was ever embraced fully; didn't everyone just move to floating point? Or was that only with the last gen?

Another thing I'd like to point out: it's 2014. You can buy a 2560*1440 display, or 3 Full HD displays, for your PC for not much money, and a $200 graphics card in a $500 (or cheaper) computer will let you play anything at 60 fps. Why are we discussing something like 720p/30fps on the latest consoles? I'd understand if they were some low-cost kiddy consoles, but $400 with $60 games? That's kinda pathetic. We'll have 4K displays pretty soon; what next?

Frame rate is one of those things where, as long as it's above 30 FPS, it looks fine to me. I can only tell the difference between 30 and 60 FPS if a game is ping-ponging between them. As long as the game is running steadily I can't tell the difference. However, my eyes are not as good as they used to be.

No mention of G-sync and Freesync?
You should really add them, perhaps talk about Nvidia's Frame Capture Analysis Tool (FCAT) as well!!! :\

TiberiusEsuriens:
There is yet one more compounding factor to the debate: we have reached a point where increased graphics resolutions and textures yield increasingly diminishing returns (i.e., spending tons of memory and CPU for teeny-tiny boosts), yet all people ever talk about is how important those tiny boosts are, while ignoring the truly groundbreaking GPU additions.

The biggest changes we will see in graphics engines in the near future are rendering and post-processing effects. This includes fancy lighting, reflections, smoke, particle effects, anti-aliasing, ambient occlusion, tessellation... terms that most gamers may have heard of but never knew what they were (because no single effect is a GIGANTIC change). These are the features that set apart the true 'next-gen' game engines such as Crytek's CryEngine. These effects are also at the root of major recent gaming news such as the Watch_Dogs downgrade.

CryEngine 3 can give a medium-spec machine such a pretty picture not because it has high resolution or fps, but because every single element mentioned above was considered in the engine's design from the beginning, giving the devs time to optimize rendering for them. Watch_Dogs got downgraded because the engine was designed to print a basic image to the screen at solid fps. They then put in some of these effects and cranked them up to 11 to see how pretty it could be at E3, only to realize months later that the engine was never designed to handle all of them at once. You can only optimize for them so much when you don't take them seriously into account - when they aren't included on the ground floor. These features end up as a tiny afterthought when they should be integral, and not a single one of them is a gamebreaker by itself.

The same happened with Assassin's Creed 4: it was built on the Screed 3 engine and had some pretty lights tacked on that it was never meant to handle (Screed 3 was based off of Screed 2, etc.). The result: turning god rays on in Black Flag tanks your fps. While Ubisoft is known for having terribly optimized games, this is a broader trend with most companies. Tomb Raider trumpeted TressFX for Lara's hair, but how did the game handle it? The result: turning TressFX on in TotalBiscuit's mega rig tanked his fps by over half. The new inFamous ran at 30fps but still looked better than many games that can run at 60, and that's because they dedicated early development to lighting and reflections. The game had some issues, but very few people complained about it not looking good.

CryEngine 3 is considered one of the truest "next gen" engines, and it was released FIVE YEARS AGO. Just sayin'.

I am all for making good use of the GPU, but you are wrong on part 1.
Most games DO NOT make good use of all the CPU can do. They barely use 2 cores, let alone 4 or 6 or 8.
Crysis 3 has great physics on the CPU, equal to those of PhysX, without tanking the frame rate...

Also, I disagree on the engine part.
STALKER: Lost Alpha released a month ago (an early access build) and is a fan-made official/unofficial game. And you know what?
They finally fixed the 14-year-old STALKER engine. Now it uses ALL available cores to generally good effect and loads MASSIVE, multi-level, VERY detailed maps (that would make most AAA devs commit suicide). Here is what many parts of the game look like:
http://cloud-2.steampowered.com/ugc/468682266817903805/EADFB7A0B11FED887E6AFE3F591BF4E1AC7845B8/

*Though Shamus will ignore it, as always*

So even old engines can be made to rock.

Lovely article. I like the end parts.

Until 2009, I played on a very old PC. I remember playing Metro 2033 on it (just to test it; I still had the old PC after I got the new one) at 14 fps on the lowest possible settings AND completing Ranger Hardcore difficulty.

That said, resolution must always be the LAST THING you tone down, because I am one of those people who see a gigantic, monumental difference between 720p, 900p, 1080p, 2K and 4K resolutions.
I played The Last of Us on my TV, where I usually play my PC games. It was VERY hard for me in the beginning because, whilst I can excuse the mediocre graphics, the low resolution was making everything look fuzzy.
The 22-25 fps TLOU runs at I could take; it is a slow game. True, 60 fps would have been BETTER, but 23 fps was MEH, WORKS.

Same for other slow games or games I can play slowly like Wolfenstein, Metro Last Light and Crysis.

Some single-player games I can also do at 30 or even 40-50 fps. Not too bad. 60 would be better, but I can deal with that.

However, when I play World of Tanks or Counter-Strike or even Men of War... every single frame is important.
I cannot play CS at 30 fps. I cannot play UT or Quake or other fast-paced games at lower than 60 fps.

michael87cn:
I started gaming in the 8-bit era, so I'm fine with anything. If it's fun, that's what matters. IMO, even the most high-detail, high-res, high-framerate video game character still looks incredibly fake to me at the moment. So they could have 'drawn' it as a sprite and I would still have enjoyed the game if it was fun.

In fact I love sprites, more games should use them.

So much this. While personally I don't really start to get bothered by a loss of frame rate until it hits 16 fps, I don't require every game to push the limits in that respect. I am far more interested in the art style, narrative, game mechanics, etc.

Charcharo:

TiberiusEsuriens:
snip

I am all for making good use of the GPU, but you are wrong on part 1.
Most games DO NOT make good use of all the CPU can do. They barely use 2 cores, let alone 4 or 6 or 8.
Crysis 3 has great physics on the CPU, equal to those of PhysX, without tanking the frame rate...

Also, I disagree on the engine part.
STALKER: Lost Alpha released a month ago (an early access build) and is a fan-made official/unofficial game. And you know what?
They finally fixed the 14-year-old STALKER engine. Now it uses ALL available cores to generally good effect and loads MASSIVE, multi-level, VERY detailed maps (that would make most AAA devs commit suicide). Here is what many parts of the game look like:
http://cloud-2.steampowered.com/ugc/468682266817903805/EADFB7A0B11FED887E6AFE3F591BF4E1AC7845B8/

*Though Shamus will ignore it, as always*

So even old engines can be made to rock.

Those are actually pretty solid screens. I'm not saying that old engines can't be made better, but that the important details are always thrown in as an afterthought.

I am curious how long development has been going on the STALKER mod. As you said, it's a very old engine, so the modders have had [up to] 14 years to make it sing. AAA studios can make their engines better, but as PC gamers love to point out, they simply don't try. Engines can be optimized after the fact, but it requires a LOT more work than if they had better groundwork, leading studios like Ubi to enhance Anvil repeatedly while never putting in the effort to optimize. Optimizing an engine takes a bunch more time than creating it. After each yearly iteration/addition they would have to completely re-optimize again and again.

That's about 3-4x as much effort as the Crytek example, where one very stable engine lasts 5 years, giving them time to create a new, HIGHLY optimized, very stable engine that lasts another 5-10 years. The Anvil engine only recently got what CryEngine 'perfected' years ago, and many times it feels like it barely functions. It's a big reason why the Source engine is so old but still loved - it was built to last.

As someone rocking 1440p at 114Hz with the GPU brawn to back it, I find this discussion simply adorable!

*sips wine*

In all sincerity though, framerate is pretty noticeable. I immediately notice when my graphics driver decides to mess with me and switches back to 60Hz, and frame drops below 60 actively annoy me.

And even if it's not immediately noticeable, details add up. Bits of resolution, framerate, lighting, shading, post-processing, etc. add up to the difference between lifelike and a plastic-y, stuttering mess.

No, not everyone will immediately notice different framerates, but consoles are supposed to be family machines and shouldn't be designed only with grandpa who refuses to wear his reading glasses in mind.

TiberiusEsuriens:

Charcharo:

TiberiusEsuriens:
snip

I am all for making good use of the GPU, but you are wrong on part 1.
Most games DO NOT make good use of all the CPU can do. They barely use 2 cores, let alone 4 or 6 or 8.
Crysis 3 has great physics on the CPU, equal to those of PhysX, without tanking the frame rate...

Also, I disagree on the engine part.
STALKER: Lost Alpha released a month ago (an early access build) and is a fan-made official/unofficial game. And you know what?
They finally fixed the 14-year-old STALKER engine. Now it uses ALL available cores to generally good effect and loads MASSIVE, multi-level, VERY detailed maps (that would make most AAA devs commit suicide). Here is what many parts of the game look like:
http://cloud-2.steampowered.com/ugc/468682266817903805/EADFB7A0B11FED887E6AFE3F591BF4E1AC7845B8/

*Though Shamus will ignore it, as always*

So even old engines can be made to rock.

Those are actually pretty solid screens. I'm not saying that old engines can't be made better, but that the important details are always thrown in as an afterthought.

I am curious how long development has been going on the STALKER mod. As you said, it's a very old engine, so the modders have had [up to] 14 years to make it sing. AAA studios can make their engines better, but as PC gamers love to point out, they simply don't try. Engines can be optimized after the fact, but it requires a LOT more work than if they had better groundwork, leading studios like Ubi to enhance Anvil repeatedly while never putting in the effort to optimize. Optimizing an engine takes a bunch more time than creating it. After each yearly iteration/addition they would have to completely re-optimize again and again.

That's about 3-4x as much effort as the Crytek example, where one very stable engine lasts 5 years, giving them time to create a new, HIGHLY optimized, very stable engine that lasts another 5-10 years. The Anvil engine only recently got what CryEngine 'perfected' years ago, and many times it feels like it barely functions. It's a big reason why the Source engine is so old but still loved - it was built to last.

Thanks, mate. I agree. The id Tech engines, CryEngine, the X-Ray engine and the 4A Engine are all built to last. Source too.
Those are the best we have. And I would love it if devs tried to be like them and made good use of all available tech.

As for STALKER, well, not exactly.
You see, STALKER development started in 2000. By 2001 the engine was done. And in 2007, STALKER: Shadow of Chernobyl came out; it looked very good for the time and looks nice even by today's standards (though more because of art style).
Later, in 2008, modders started work on the STALKER: Lost Alpha game/mod thing. So they had 6 years to make the old engine into what it is now; HOWEVER, they claim they made the new DX10.1 and experimental DX11 renderers only ONE year ago. The rest of the time they spent making the gigantic maps, quests and other game systems.
Also in 2008, GSC, the developers, made X-Ray 1.5 and released STALKER: Clear Sky. The by-then 7-8 year old engine looked like this:
https://www.youtube.com/watch?v=YAYLHAPPkvw&index=4&list=PLD2B82E405CF9650C

https://www.youtube.com/watch?v=gkLR2tYRubw&index=3&list=PLD2B82E405CF9650C

Not too bad either.

TiberiusEsuriens:
There is yet one more compounding factor to the debate: we have reached a point where increased graphics resolutions and textures yield increasingly diminishing returns (i.e., spending tons of memory and CPU for teeny-tiny boosts), yet all people ever talk about is how important those tiny boosts are, while ignoring the truly groundbreaking GPU additions.

The biggest changes we will see in graphics engines in the near future are rendering and post-processing effects. This includes fancy lighting, reflections, smoke, particle effects, anti-aliasing, ambient occlusion, tessellation... terms that most gamers may have heard of but never knew what they were (because no single effect is a GIGANTIC change). These are the features that set apart the true 'next-gen' game engines such as Crytek's CryEngine. These effects are also at the root of major recent gaming news such as the Watch_Dogs downgrade.

CryEngine 3 can give a medium-spec machine such a pretty picture not because it has high resolution or fps, but because every single element mentioned above was considered in the engine's design from the beginning, giving the devs time to optimize rendering for them. Watch_Dogs got downgraded because the engine was designed to print a basic image to the screen at solid fps. They then put in some of these effects and cranked them up to 11 to see how pretty it could be at E3, only to realize months later that the engine was never designed to handle all of them at once. You can only optimize for them so much when you don't take them seriously into account - when they aren't included on the ground floor. These features end up as a tiny afterthought when they should be integral, and not a single one of them is a gamebreaker by itself.

The same happened with Assassin's Creed 4: it was built on the Screed 3 engine and had some pretty lights tacked on that it was never meant to handle (Screed 3 was based off of Screed 2, etc.). The result: turning god rays on in Black Flag tanks your fps. While Ubisoft is known for having terribly optimized games, this is a broader trend with most companies. Tomb Raider trumpeted TressFX for Lara's hair, but how did the game handle it? The result: turning TressFX on in TotalBiscuit's mega rig tanked his fps by over half. The new inFamous ran at 30fps but still looked better than many games that can run at 60, and that's because they dedicated early development to lighting and reflections. The game had some issues, but very few people complained about it not looking good.

CryEngine 3 is considered one of the truest "next gen" engines, and it was released FIVE YEARS AGO. Just sayin'.

One note: software is often created long before it can be practically used, especially in an area like graphics rendering. CryEngine 3 may have been released 5 years ago, but that means literally nothing about the type of hardware required to run it - I am willing to bet that 5 years ago nothing readily available to consumers would have been able to run a CryEngine 3 game that took advantage of the features of that engine.

We have had ray tracing for years, but there is only one computer in the world, a $500,000 supercomputer at the University of Texas, that can run it in real time.

How "next gen" a piece of software is cannot be determined by when it was created, and when the engine was created often does not indicate what kind of power is required to run it.

Personally I think we should expect 1080p at minimum because it's two thousand goddamn thirteen and it's pretty ridiculous if these much-hyped systems can't even match the specs of the televisions we hook them up to, televisions that a lot of us bought before the previous generation came out. Can you imagine if a video game system from 1998 wasn't able to achieve 640p? They'd be laughed out of the industry.

I don't know where I stand on framerate. I know it's pretty distracting when TF2 dips below about 45, even if I'm just flying around scoping out a new map that I just downloaded, and I imagine it's not good for my reaction time at levels far above that. But my other games, even the newer ones that surely can't hit that level, look fine to me at whatever framerate they happen to run at (Source is the only engine where I have any clue how to check). They're all single-player, granted, and frankly if you're into twitch action multiplayer, the fact that console players are stuck using a gamepad is going to hurt their performance a lot more than framerate issues.

If Call of Duty ran at 30 fps, it would make a significant impact on most players who are used to having smooth motion and responsive actions. For twitchers that pop off headshots between blinks, it would be a nightmare.

*sigh* I'd like to jump into this lovely "framerate/resolution" debate, but at the end of the day... it's just meaningless.

I'm one of those people who CAN tell the difference between a 16-bit and a 32-bit picture (like the one in the article), I CAN tell the difference between 60 and 30 fps (anything above 60 flies over my head; I just don't notice it), and I CAN tell the difference between 720p and 1080p, and yet... I don't care about all that fluff, as long as the gameplay is smooth enough.

I play Dark Souls unmodded (!), so it's 720p (which in reality is even lower than that) and 30 fps, but I still love the game regardless, because of how smoothly it plays. In my early days of PC gaming, I was stuck at 800x600 with framerates lower than 30 fps and I still considered that playable enough.

It wasn't until the first F.E.A.R. that I started to learn the difference between framerates (thanks to its built-in benchmark). I found that anything above 15 fps is still pretty playable to me; 20 fps and up was heaven.

Heck, I still play some PS1/N64 games hooked up to my old CRT TV, and they still play perfectly fine! Freaking Ocarina of Time/Majora's Mask are stuck at 20 fps and I still love those games!

I could be here all day, but as I said, it's just meaningless to me at the end of the day.

 
