Capcom Explains Why 30 FPS Isn't That Bad

 

I don't understand. Can someone explain how the PS2 was capable of pumping out 60 fps for Devil May Cry 3, and the PS3 managed 720p at 60 fps for Devil May Cry 4, but now the graphics look shittier and the gameplay is capped at 30 fps? WTF? It's going to be Heavenly Sword all over again: slow, clunky gameplay.

me reading it:

Capcom

"60 FPS is a speed the brain and the eye can catch up with and understand," he said.

What R U Doin

"But at 30 FPS there's a technique where you take advantage of the brain's ability to fill in the blanks. So even though you have it running at 30 FPS, you create the motions and the poses in such a way that the brain will naturally fill in what would have been the extra frames."

Capcom

Itsuno pointed out that 60 FPS would be "better," but went on to claim that long gaming sessions at higher framerates have a tiring effect on players' eyes because the frames "almost shake or flash."

Stahp

This is like the Inception of ignorance, there are just so many concentric layers of dumbness in what this guy is saying.

This is just completely made-up nonsense; it bears no relation to how the human visual system or game rendering works.

Is this like some Onion Report fake story?!?!

For a fast-paced action game like DmC, 60 fps is a must. Hell, MGS Rising is running at 60 fps.
But if they do go through with it, at least ensure that it's locked at 30 fps. The real reason 60 fps is so good is headroom: you can chew through roughly 35 frames' worth of performance before stuttering becomes obvious, whereas at 30 fps you can only lose around 6. And with all the insane on-screen action going on in these games...
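
To put rough numbers on that headroom (the frame counts above are this poster's estimate; the uncontested part is the per-frame time budget), here's a minimal Python sketch:

# Per-frame time budget at common targets. At 60 fps a frame must be
# ready in ~16.7 ms; at 30 fps the budget doubles to ~33.3 ms.
for fps in (60, 30):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")

# A single dropped frame therefore costs ~16.7 ms of extra on-screen
# delay at 60 fps, but ~33.3 ms at 30 fps -- twice as noticeable.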

MegaManOfNumbers:
You know Capcom, FPS is the LAST thing I'm worrying about.

+1 internet.

The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards / placebo-effect-addled brains. Seriously. Even if you can notice a difference between 30 and 60, it's eye candy. 30 is demonstrably perfectly playable. More looks better, but it's really an optional extra.

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

WaitWHAT:

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

"gameplay"

"story"

Two words never before used in a sentence with DMC, unless laughter follows.

Also yes, if I've paid in excess of $1000 Australian Dollars for my graphics cards (that's about $1 trillion USD) I'm going to go ahead and expect that the game industry actually releases stuff that takes advantage of it, instead of endlessly pandering to the PS3 hardware.

And at the very least, I expect them not to lie by claiming there's no visual difference between the two.

ResonanceSD:

"gameplay"

"story"

Two words never before used in a sentence with DMC, unless laughter follows.

Also yes, if I've paid in excess of $1000 Australian Dollars for my graphics cards (that's about $1 trillion USD) I'm going to go ahead and expect that the game industry actually releases stuff that takes advantage of it, instead of endlessly pandering to the PS3 hardware.

And at the very least, I expect them not to lie by claiming there's no visual difference between the two.

What? Story and gameplay are less important than 60 vs 30 fps?


So...what you're saying is that if somehow, by some magic space miracle, the new DMC is the most amazing work of art created, that'll matter less than whether it has 30 or 60 fps? OK, that's unlikely, but if it did happen, guess which would be more important?

Besides, leaving aside the fact that this won't even affect PC gamers (no matter what their rig), the games industry does *not* have an obligation to kill itself to justify your latest graphics card just because you got hit by diminishing returns (an HD 7850 is ~$150, a GTX 670 is ~$300, yet the GTX 670 is nowhere near twice the card). If a game is still perfectly functional at 30fps, and it is, since filmmakers seem to have managed fine on 24 for the last umpteen years, then that's all a developer needs to do. Would you rather they spend 20% of the budget on eye candy and 80% on story and gameplay, or make it a 50-50 split, resulting in an infinitely inferior game? No, that's stupid.

Graphical fidelity is not as important as story and gameplay.

End of. Now stop being so silly.

WaitWHAT:

ResonanceSD:

"gameplay"

"story"

Two words never before used in a sentence with DMC, unless laughter follows.

Also yes, if I've paid in excess of $1000 Australian Dollars for my graphics cards (that's about $1 trillion USD) I'm going to go ahead and expect that the game industry actually releases stuff that takes advantage of it, instead of endlessly pandering to the PS3 hardware.

And at the very least, I expect them not to lie by claiming there's no visual difference between the two.

What? Story and gameplay are less important than 60 vs 30 fps?


So...what you're saying is that if somehow, by some magic space miracle, the new DMC is the most amazing work of art created, that'll matter less than whether it has 30 or 60 fps? OK, that's unlikely, but if it did happen, guess which would be more important?

Besides, leaving aside the fact that this won't even affect PC gamers (no matter what their rig), the games industry does *not* have an obligation to kill itself to justify your latest graphics card just because you got hit by diminishing returns (an HD 7850 is ~$150, a GTX 670 is ~$300, yet the GTX 670 is nowhere near twice the card). If a game is still perfectly functional at 30fps, and it is, since filmmakers seem to have managed fine on 24 for the last umpteen years, then that's all a developer needs to do. Would you rather they spend 20% of the budget on eye candy and 80% on story and gameplay, or make it a 50-50 split, resulting in an infinitely inferior game? No, that's stupid.

Graphical fidelity is not as important as story and gameplay.

End of. Now stop being so silly.

What he meant was that the previous DMC games never had story as their... strong suit, while gameplay is something DMC does best.

60fps is always better, always.
DMC4 is still a phenomenal-looking game, with far clearer graphics than most games to come out since.

I want the next GTA to look like an N64 game so they can focus on game mechanics and run it at 60fps.

WaitWHAT:

MegaManOfNumbers:
You know Capcom, FPS is the LAST thing I'm worrying about.

+1 internet.

The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards / placebo-effect-addled brains. Seriously. Even if you can notice a difference between 30 and 60, it's eye candy. 30 is demonstrably perfectly playable. More looks better, but it's really an optional extra.

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

Oh sweet bejesus....

There are about a dozen replies already on this thread about why the framerate does matter. This isn't some meaningless whining.

Every Devil May Cry game prior to this one has run at 60fps. That includes the games that ran on the PS2, a console with all the modern computing power of a sack of potatoes. The gameplay of Devil May Cry necessitates a high framerate. The timing required for the player to utilise the combat engine, especially at higher difficulties, is split-second. The player needs to be able to react to anything the instant it happens. With 60fps, you can react to an enemy's move the instant it occurs. With 30fps, reaction time is significantly impaired.
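
To put the reaction-time point in concrete numbers, here's a minimal sketch (assuming the simplest case, where a move becomes visible at the next presented frame):

# Worst-case wait before an enemy's move can appear on screen: an event
# that lands just after a frame is presented waits one full frame
# interval for the next one.
for fps in (60, 30):
    print(f"{fps} fps: move visible up to {1000.0 / fps:.1f} ms after it happens")

# At 30 fps every on-screen update can arrive up to ~33 ms late instead
# of ~17 ms, stacked on top of the player's own reaction time.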

Every fast-paced hack and slash game of note has run at 60fps. On the last generation of consoles, the DMC series and Ninja Gaiden set the standard by both running at a solid 60fps. This generation, both Bayonetta and God Of War 3 have proven that a game can look visually stunning and still run at a steady 60fps. The upcoming Metal Gear Rising, another Platinum hack-and-slash, is apparently confirmed to run at 60fps. The game prior to this instalment, DMC 4, managed to run at 60fps. If those games can all manage it, there is no reason for this instalment not to.

This has nothing to do with a placebo effect. The gameplay of Devil May Cry has always required lightning response times in order for the player to truly succeed. That can only happen if the player is given a framerate which lets them see the action at a smooth, consistent rate. I've played other hack-and-slash games that used 30fps just fine. However, I have never seen a game pull off this style of over-the-top, fast-paced combat while running at only 30fps. It simply doesn't work.

TLDR: If Bayonetta, God Of War 3 and MGS Rising can manage it, there is no reason Capcom and Ninja Theory shouldn't be able to. What's obviously happened is that they've decided to use a game engine that simply isn't suited for hack-and-slash games, and they're now coming up with the most flagrant pseudo-science to try and justify it.

WaitWHAT:

What? Story and gameplay are less important than 60 vs 30 fps?


So...what you're saying is that if somehow, by some magic space miracle, the new DMC is the most amazing work of art created, that'll matter less than whether it has 30 or 60 fps? OK, that's unlikely, but if it did happen, guess which would be more important?

Besides, leaving aside the fact that this won't even affect PC gamers (no matter what their rig), the games industry does *not* have an obligation to kill itself to justify your latest graphics card just because you got hit by diminishing returns (an HD 7850 is ~$150, a GTX 670 is ~$300, yet the GTX 670 is nowhere near twice the card). If a game is still perfectly functional at 30fps, and it is, since filmmakers seem to have managed fine on 24 for the last umpteen years,

Stop right there.

The reason films have used 24fps is that it was a standard set in the late 1920s, when film stock was expensive.

More importantly, film is a passive medium. The audience only has to watch the events of a film; they do not have to participate in them. If films actually required the involvement of the audience, then you can sure as hell bet that 24fps would be a lot less adequate.

Lastly, film has a number of inbuilt advantages that gaming does not. Chief among them: when filming real actors on a real set, motion blur occurs naturally. Motion blur is what helps suggest that a consistent, smooth action is occurring, even though you're just watching a series of still images.

Gaming does not have natural motion blur. Most games have very little motion blur at all, if any; it's hugely resource-intensive. That means a game shown at 24fps is inherently going to be choppier and less fluid than a film, because it lacks the natural effects that smooth motion in real footage. Add the fact that games like DMC require split-second reaction input from the player, and your idea that 24fps would be adequate for gaming goes out the window.
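
For reference, simulated motion blur is typically some variant of averaging several sub-frame renders per displayed frame, which is exactly why it's expensive: the scene gets drawn several times per frame. A rough sketch (the render callback and its return type are hypothetical):

def blurred_frame(render, t, shutter_s, samples=8):
    """Accumulation-style motion blur: average `samples` renders taken
    across the shutter interval. `render(t)` is assumed to return a
    flat list of pixel intensities for time t (illustrative only)."""
    acc = None
    for i in range(samples):
        frame = render(t + shutter_s * i / samples)
        acc = frame if acc is None else [a + b for a, b in zip(acc, frame)]
    return [a / samples for a in acc]

Eight samples means roughly eight times the rendering work, which is why games typically settle for cheaper screen-space approximations or skip blur entirely.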

Sorry, but as a medium based around digital simulation, gaming has completely different standards from film. Film has a bunch of advantages that come with the medium; gaming does not, so everything has to be carefully programmed by the developers. Including framerate.

WaitWHAT:

then that's all a developer needs to do. Would you rather they spend 20% of the budget on eye candy and 80% on story and gameplay, or make it a 50-50 split, resulting in an infinitely inferior game? No, that's stupid.

You do not seem to realise that framerate and graphics are completely different things. A game can have crap graphics and still have a high framerate. Likewise, a game with incredibly high-end graphics can have a low framerate. Framerate has nothing to do with the visual fidelity of an image; it simply determines how quickly images are displayed to you, the gamer.

And yes, it's pretty damn important. If you don't think so, ask someone who tried playing Skyrim on PS3.

Graphical fidelity is not as important as story and gameplay.

End of. Now stop being so silly.

DMC has never been about story. It has always centred on gameplay. And the gameplay has always centred on the fact that the player needs to react to enemy attacks and moves as they happen. A framerate drop of 50% will have a profound effect on that. Hence why people are so upset.

1: The really important bit to me here is whether or not DMC runs at a stable 30 FPS. If it does? Not ideal, but not terrible. But if the FPS drops below that? That's bad shit, yo.

2: To all the arguing about whether or not you can see the difference between 30 and 60 FPS, please go here or here.

So, this guy thinks that lying to my internet face will make me want to care about the game?

doggie015:
Am I the ONLY one in this thread that DOES NOT MIND playing at 30 FPS?

I didn't mind either, "before". Well, I did from time to time, in fast games like Shift and some shooters, but often it doesn't matter too much.

The sheer amount of filthy lying in his comments, however, is just disgusting. It's like being handed cyanide by some charlatan in 1540 as a cure for a morning boner.

So synchronization to a monitor's default frequency of 60 Hz (60 refreshes per second) is giving way to the traditional 30 FPS (every second refresh) tactic.
It is being met by a huge number of stupid fans who think it changes anything.
They try to explain it with a made-up theory of how it works without knowing squat about how the human eye or brain interprets sight.

Well, it's Capcom.... stupidity is demanded of them.

And yes, there's a huge difference between 30 and 60 fps.

Yes, it requires 2x the processing power for no gain. Synchronizing with the monitor is good and all, but it can be done at 30 (now 35, for example, would be a problem, or if you use one of those monitors that run at 80 Hz, but those are kinda extinct now).
As far as "seeing" it, if synchronization is done correctly, the only effect is psychological.
And yes, I know there are games like Quake where higher FPS gives you higher jumps; that's BAD PROGRAMMING.
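
(For what it's worth, the Quake jump quirk is a real class of bug: if the physics is integrated once per rendered frame, the integration error, and with it things like jump height, depends on the framerate. A toy illustration of the class of bug, not Quake's actual code:)

# Naive per-frame integration: the jump apex depends on the timestep,
# i.e. on whatever framerate the simulation happens to run at.
def jump_apex(fps, v0=5.0, g=9.8):
    dt = 1.0 / fps
    y, v, apex = 0.0, v0, 0.0
    while v > 0.0 or y > 0.0:
        v -= g * dt
        y = max(0.0, y + v * dt)
        apex = max(apex, y)
    return apex

print(jump_apex(30))   # one apex at 30 fps...
print(jump_apex(125))  # ...a slightly different one at 125 fps

# The usual fix is a fixed simulation timestep: render at whatever rate
# you like, but always advance the physics in constant dt increments.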

I've... never really liked 60fps. Whenever I see something moving that smoothly, it just looks... unnatural. Like, it feels like what's on the screen moves more smoothly than things off the screen? I mean, not really, but that's how it feels.

What I'm saying is high framerates trigger my uncanny valley senses.

Admittedly it just might be that I'm so used to not seeing things so smoothly that they look weird in comparison, but... I don't know. I'm not a fan.

j-e-f-f-e-r-s:
Multiple snip

At no point in your long, rambling posts do I see anything coming even remotely close to a valid argument for why 30 fps instead of 60 will ruin the game. Well, except one suggestion that lowering the framerate to 30 will somehow make it harder to play. So, if hard games can't be played at 30 fps...

I rest my case.

As someone who will sacrifice graphical fidelity for smooth gameplay on a whim, I disapprove. But I won't buy DmC anyway, so no big deal. 60 FPS does matter, though, and it does wonders for immersion.

Besides, am I the only one disturbed by their attitude of "we will hide it behind flashy effects so you won't see it"?

I was playing Serious Sam 3 earlier, and don't you even try to tell me I didn't notice when my frames went down. It's instantly a noticeable 'Oh, that's annoying, hope that smooths out again when I'm not looking at this area'. People who say you can't see a difference either literally have dysfunctional eyes that don't work the way a normal human eye does, or are just talking crap for... reasons? I dunno, maybe their shitbox PC/console/plank of wood runs at 10 frames and anything above that is incomprehensible to them.

And yeah, different games do make a difference. Heck, with this particular game it might not matter. I don't even care when I drop to like 20 frames in games like Dawn of War 2, but in shooters it's stupidly noticeable.

And yeah, I mad bro, because I'm so sick of people spreading misinformation about what the human eye can and can't see just because THEIR eyes can't see it, and then making out that people who can notice it are 'spoiled' or 'elitist', etc. But then, with the number of people saying that crap, maybe I am one of the few 'elite' humans in the world who can see higher than 30fps, in which case, fuck yeah.

WaitWHAT:
The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards

Um, I assume that comment implies you think it's PC gamers who are whining... when, if you'd read the article, you'd know that it was only the console version that had a capped FPS.

60FPS has been the industry standard for console hack and slash games since the first DMC at the start of the last generation; that's why people are complaining.

I also disagree with your assessment that story is more important than visuals in a hack 'n' slash. DMC has never had a good story, but it is still a very enjoyable series, and the visual feedback is a very important part of that. It doesn't necessarily have to have amazing fidelity, but buttery-smooth animations (much more achievable at 60FPS) and a great sense of style are very important. In games like this, the visuals enhance the gameplay.

I'm actually looking forward to the new DmC; it looks pretty good. I actually just bought the HD Collection today, so I'm having my first real DmC experience. As well as thinking the new game looks pretty awesome, I also couldn't give less of a fuck about framerates. Yeah, it's nice to have 60 FPS, but as long as the game is fun I don't care. I know lots of you are going to crucify me for this, but... well, I can't tell the difference between PC and console. Let's get one thing straight: I have a 360, a PS3 and a pretty good gaming PC. The games on my PC look a bit better in places, but I have the same level of fun, and thus don't care...

So, who's with me on DmC?

No one?

I'll see myself out.....

WaitWHAT:

MegaManOfNumbers:
You know Capcom, FPS is the LAST thing I'm worrying about.

+1 internet.

The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards / placebo-effect-addled brains. Seriously. Even if you can notice a difference between 30 and 60, it's eye candy. 30 is demonstrably perfectly playable. More looks better, but it's really an optional extra.

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

... Yeah, because you buy graphics cards for your consoles /facepalm

Damn, you have no idea what you are talking about. The difference between 30 and 60 FPS is perfectly visible, and 30 diminishes the experience.

WaitWHAT:

MegaManOfNumbers:
You know Capcom, FPS is the LAST thing I'm worrying about.

+1 internet.

The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards / placebo-effect-addled brains. Seriously. Even if you can notice a difference between 30 and 60, it's eye candy. 30 is demonstrably perfectly playable. More looks better, but it's really an optional extra.

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

Of course it'll be 60fps on PC; it's the consoles where the problem will be. How can you not understand that? What, did you think the PC version would be arbitrarily locked to 30fps too?!

And you don't know what you are talking about if you think placebo has anything to do with the difference between 30fps and 60fps. I've tested this: I know THE VERY SECOND the framerate in a game I'm playing drops to 30fps (in my case it was FRAPS randomly starting up).

And it has nothing to do with "eye candy". It's to do with GAMEPLAY! You don't just "see" the difference between 60fps and 30fps, you FEEL it, in how the game responds to your input; you just can't be as precise with the controls.

Even in Minecraft. Even in Doom. Even in the lowest-fidelity graphics, like Super Mario Bros on the NES, or even Pong... 60fps MATTERS! It's a matter of reaction time and precision. Don't come into this like an amateur; this is like someone who doesn't know the first thing about aerodynamics lecturing a pilot that they never have to worry about stall warnings. No. You have no idea what you are talking about.

Saying gameplay matters more than framerate is like saying a car's performance is more important than its traction... when one is a MAJOR influencing factor on the other.

I don't really care; the human eye can only perceive 25 frames per second, so unless your dog is playing, the rest is just wasted computing power.

WaitWHAT:

j-e-f-f-e-r-s:
Multiple snip

At no point in your long, rambling posts do I see anything coming even remotely close to a valid argument for why 30 fps instead of 60 will ruin the game. Well, except one suggestion that lowering the framerate to 30 will somehow make it harder to play. So, if hard games can't be played at 30 fps...

I rest my case.

You're clueless.

Ninja Gaiden 2, Serious Sam and Super Meat Boy are played at 60fps.

Dark Souls is hard BECAUSE OF the clunky controls combined with how powerful the enemies are! The game forces you to be extremely strategic with every move you make, as you cannot correct on the fly. Dark Souls is not a hack n' slash in the same sense as Ninja Gaiden or DMC. Super Meat Boy is a fast-paced side-scrolling platformer, a genre where 60fps has been the standard since Super Mario Bros on the NES. Fast-paced FPS games like Serious Sam are usually at home on PC, where they are benchmarked with 60fps considered the ideal framerate to achieve. If v-sync is enabled in Serious Sam 3 on PC, it will lock the framerate to the monitor's refresh rate, typically 60fps.

Question: do you even play video games? Like. At all.

Brad Calkins:
I don't really care; the human eye can only perceive 25 frames per second, so unless your dog is playing, the rest is just wasted computing power.

Who told you that? Or did you just make it up? Either way: someone lied to you, or you invented it.

http://www.100fps.com/how_many_frames_can_humans_see.htm

Strazdas:
So synchronization to a monitor's default frequency of 60 Hz (60 refreshes per second) is giving way to the traditional 30 FPS (every second refresh) tactic.
It is being met by a huge number of stupid fans who think it changes anything.
They try to explain it with a made-up theory of how it works without knowing squat about how the human eye or brain interprets sight.

Well, it's Capcom.... stupidity is demanded of them.

And yes, there's a huge difference between 30 and 60 fps.

Yes, it requires 2x the processing power for no gain. Synchronizing with the monitor is good and all, but it can be done at 30 (now 35, for example, would be a problem, or if you use one of those monitors that run at 80 Hz, but those are kinda extinct now).
As far as "seeing" it, if synchronization is done correctly, the only effect is psychological.
And yes, I know there are games like Quake where higher FPS gives you higher jumps; that's BAD PROGRAMMING.

Yeah, you're the "expert". Nintendo, Sony, Valve, id Software, John Carmack.... they're all just stupid idiots who have been wasting their whole careers, and you're so smart you don't even have to make any games; you'll just claim they're wrong without evidence, experience or precedent.

It has nothing to do with synchronisation, considering how 30fps games so often dip into the 24-29fps range, which does NOT split evenly over 60 refreshes per second.

But no, the WHOLE INDUSTRY is wrong and you are right. Apparently.
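
The "does NOT split evenly" point is easy to demonstrate: on a fixed 60 Hz display each rendered frame is held for a whole number of refreshes, so any rate that doesn't divide 60 gives an uneven hold pattern, i.e. judder. A small sketch (assuming each frame is picked up at the first refresh after it's ready):

import math

# How many 60 Hz refreshes each rendered frame is held for.
def hold_pattern(fps, hz=60, frames=10):
    ready = [i / fps for i in range(frames + 1)]
    shown = [math.ceil(t * hz) for t in ready]
    return [b - a for a, b in zip(shown, shown[1:])]

print(hold_pattern(30))  # [2, 2, 2, ...]        perfectly even holds
print(hold_pattern(25))  # [3, 2, 3, 2, 2, ...]  uneven holds = judder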

Brad Calkins:
I don't really care; the human eye can only perceive 25 frames per second, so unless your dog is playing, the rest is just wasted computing power.

This is plain false. There is a quite visible difference between 30 and 60, and I'm sure people with better eyes than mine (with my crummy -4-something eyesight) can see the difference at even higher framerates.

http://frames-per-second.appspot.com/

If you don't see the difference, then you're just lying.

MrFalconfly:
Well OK.

Personally I know squat about how many frames per second the human eye can perceive, but I do know that people don't come out of movie theaters complaining about choppy framerates (movies usually run at 24fps).

Now, personally, I think anything above 45fps is just luxury and bragging rights (kinda like having a Veyron: sure, it can do 431 km/h, but how often do you need that capability?), but hey, I could be wrong.

The human eye can't see anything higher than 60 fps, but personally things don't look bad to me until they're below 20 fps.

This, the weaker engine, the slower gameplay, and the kind of desperate "cool" factor make me not want this game at all.

Treblaine:

Strazdas:
So synchronization to a monitor's default frequency of 60 Hz (60 refreshes per second) is giving way to the traditional 30 FPS (every second refresh) tactic.
It is being met by a huge number of stupid fans who think it changes anything.
They try to explain it with a made-up theory of how it works without knowing squat about how the human eye or brain interprets sight.

Well, it's Capcom.... stupidity is demanded of them.

And yes, there's a huge difference between 30 and 60 fps.

Yes, it requires 2x the processing power for no gain. Synchronizing with the monitor is good and all, but it can be done at 30 (now 35, for example, would be a problem, or if you use one of those monitors that run at 80 Hz, but those are kinda extinct now).
As far as "seeing" it, if synchronization is done correctly, the only effect is psychological.
And yes, I know there are games like Quake where higher FPS gives you higher jumps; that's BAD PROGRAMMING.

Yeah, you're the "expert". Nintendo, Sony, Valve, id Software, John Carmack.... they're all just stupid idiots who have been wasting their whole careers, and you're so smart you don't even have to make any games; you'll just claim they're wrong without evidence, experience or precedent.

It has nothing to do with synchronisation, considering how 30fps games so often dip into the 24-29fps range, which does NOT split evenly over 60 refreshes per second.

But no, the WHOLE INDUSTRY is wrong and you are right. Apparently.

I'm not an expert, but I did my research. Nintendo was never known to be "smart" anyway, Valve never claimed that "60 fps is better than 30 fps", and so on. The explanation given in this article for why 30 fps is OK is CLEARLY false, however, so it follows that the guy is either lying or stupid.
You make assumptions about what I do and do not do, and you blame me for a lack of evidence. Funny.
The lag spike that drops the FPS is a problem for 30 fps locks, unless you program it like San Andreas did, which had no problem with a 25fps lock. 30 is more popular, however, due to monitor sinchronization. If the sinchronization is done properly, the human eye can't see the difference. The difference is seen when the game's frame generation does not match the monitor's refreshing, and that's why some people claim to "see the difference", when all they see is the game's changes failing to match the monitor's changes. V-sync is popular for a reason.
If you make a game that can run at 60 FPS with, say, lag spikes down to 50 fps, and lock it at 30 fps, you will not have lag spikes, because you don't need to generate more than 50 frames; you only generate 30. Now of course there are things like bad end-user equipment, but really, that's up to the user to sort out.
There are many people who are wrong about many things, and it's no wonder there are many in the gaming industry as well. When you take 3000 lines of code and see that it could easily be shortened to 1000 lines and take half the processing power, but they tell you to "go with what you're given" and then complain about "high system requirements", it becomes really easy to blame the industry for stupidity.
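
For what it's worth, "lock it at 30 fps" in practice means a frame limiter along these lines: finish the frame, then wait out the rest of the budget. A minimal sketch (real engines use v-sync and finer-grained timing, so treat this as illustrative):

import time

def run_capped(update_and_render, cap_fps=30):
    """Render a frame, then sleep out the remainder of the frame
    budget so output never exceeds cap_fps."""
    budget = 1.0 / cap_fps
    deadline = time.perf_counter() + budget
    while True:
        update_and_render()  # hypothetical per-frame callback
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deadline += budget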

TheKasp:

WaitWHAT:

MegaManOfNumbers:
You know Capcom, FPS is the LAST thing I'm worrying about.

+1 internet.

The game's not out yet and we're having whiners demanding that the game be a solid 60 to justify their horrendously expensive graphics cards / placebo-effect-addled brains. Seriously. Even if you can notice a difference between 30 and 60, it's eye candy. 30 is demonstrably perfectly playable. More looks better, but it's really an optional extra.

The gameplay and story are going to be 100x more important than whether the framerate is 30 or 60.

... Yeah, because you buy graphics cards for your consoles /facepalm

Damn, you have no idea what you are talking about. The difference between 30 and 60 FPS is perfectly visible, and 30 diminishes the experience.

Congratulations, Capcom has officially distracted you from the real problem.

MegaManOfNumbers:

Congratulations, Capcom has officially distracted you from the real problem.

How? I don't own a console and don't give a fuck about Devil May Cry or Capcom. But even if I did: the playability of such games is the biggest problem. I can play adventure games at 30 FPS; it's not as if anything hectic is happening on screen that I have to react to quickly. I can't play an FPS at 30 fps; it's simply not playable. The aim goes off because everything stutters.

How fluid the picture is can be the biggest issue of all, depending on the genre we're talking about.

And what is supposed to be the bigger issue with DMC and Capcom? That they changed Dante's hair?

Strazdas:

Treblaine:

Strazdas:
So synchronization to a monitor's default frequency of 60 Hz (60 refreshes per second) is giving way to the traditional 30 FPS (every second refresh) tactic.
It is being met by a huge number of stupid fans who think it changes anything.
They try to explain it with a made-up theory of how it works without knowing squat about how the human eye or brain interprets sight.

Well, it's Capcom.... stupidity is demanded of them.

Yes, it requires 2x the processing power for no gain. Synchronizing with the monitor is good and all, but it can be done at 30 (now 35, for example, would be a problem, or if you use one of those monitors that run at 80 Hz, but those are kinda extinct now).
As far as "seeing" it, if synchronization is done correctly, the only effect is psychological.
And yes, I know there are games like Quake where higher FPS gives you higher jumps; that's BAD PROGRAMMING.

Yeah, you're the "expert". Nintendo, Sony, Valve, id Software, John Carmack.... they're all just stupid idiots who have been wasting their whole careers, and you're so smart you don't even have to make any games; you'll just claim they're wrong without evidence, experience or precedent.

It has nothing to do with synchronisation, considering how 30fps games so often dip into the 24-29fps range, which does NOT split evenly over 60 refreshes per second.

But no, the WHOLE INDUSTRY is wrong and you are right. Apparently.

I'm not an expert, but I did my research. Nintendo was never known to be "smart" anyway, Valve never claimed that "60 fps is better than 30 fps", and so on. The explanation given in this article for why 30 fps is OK is CLEARLY false, however, so it follows that the guy is either lying or stupid.
You make assumptions about what I do and do not do, and you blame me for a lack of evidence. Funny.
The lag spike that drops the FPS is a problem for 30 fps locks, unless you program it like San Andreas did, which had no problem with a 25fps lock. 30 is more popular, however, due to monitor sinchronization. If the sinchronization is done properly, the human eye can't see the difference. The difference is seen when the game's frame generation does not match the monitor's refreshing, and that's why some people claim to "see the difference", when all they see is the game's changes failing to match the monitor's changes. V-sync is popular for a reason.
If you make a game that can run at 60 FPS with, say, lag spikes down to 50 fps, and lock it at 30 fps, you will not have lag spikes, because you don't need to generate more than 50 frames; you only generate 30. Now of course there are things like bad end-user equipment, but really, that's up to the user to sort out.
There are many people who are wrong about many things, and it's no wonder there are many in the gaming industry as well. When you take 3000 lines of code and see that it could easily be shortened to 1000 lines and take half the processing power, but they tell you to "go with what you're given" and then complain about "high system requirements", it becomes really easy to blame the industry for stupidity.

Absolute nonsense.

And, predictably, a COMPLETE LACK OF ANY SOURCES! Not even an explanation, just hollow claims like:

"difference is seen when the game frame generation does not match monitor refreshing, and thats why some people claim to see the difference"

Which is completely unfounded. And you're not going to convince me you are right about synchronisation when you REPEATEDLY spell it wrong as "sinchronization". That's not a typo. You genuinely don't know how to spell it.

"V-sync is popular for a reason."

Yes, to stop screen tearing. You clearly have NO IDEA what you are talking about. V-sync stands for "Vertical synchronisation" which means every refreshed frame must be a whole frame. You probably don't realise what screen tearing is.

Locking to 30fps is far worse than 60fps occasionally dropping to 50fps, as you get the inherently increased lag of the 3-frame delay. And that's not what lag spikes are, either.
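
The "3-frame delay" figure is the poster's, but the arithmetic behind the complaint is straightforward: every frame buffered in the pipeline adds one frame interval of input lag, and frame intervals are twice as long at 30fps. Under that assumption:

# Input-to-display latency if the pipeline buffers N whole frames:
# each buffered frame adds one frame interval of delay.
def pipeline_latency_ms(fps, buffered_frames=3):
    return buffered_frames * 1000.0 / fps

print(pipeline_latency_ms(60))  # ~50 ms
print(pipeline_latency_ms(30))  # ~100 ms -- same pipeline, double the lag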

You STILL don't know what you are talking about and it's embarrassingly obvious.

"its no wonder there are many in the gaming indsutry as well."

it's not "many" it's the ENTIRE INDUSTRY! They are ALL in agreement that 60fps is better and appreciably better than 30fps, it's just some want to have the higher resolution or resolution that they settle for 30fps is "minimum acceptable level".

A higher framerate is achieved by having LOWER system requirements!!! You just don't have any freaking idea about this. I pity anyone who reads your posts on this subject and believes a single word of them.

I can feel a massive difference from 30 to 60 on PC; even 45 feels a little bit off... as well as a noticeable difference in input lag when v-sync is on.
On PS3/Xbox, though, the difference is minimal.

I'm a relatively competitive FPS player, though, with currently about 1.5k hours in CSS, so I guess it depends on what you play a lot of the time as well.

EDIT: Just a note, I have also used a 120Hz monitor and FELT a difference but couldn't really see one; it's hard to explain... I guess it was just more responsive, maybe?

TheKasp:

MegaManOfNumbers:

Congratulations, Capcom has officially distracted you from the real problem.

How? I don't own a console and don't give a fuck about Devil May Cry or Capcom. But even if I did: the playability of such games is the biggest problem. I can play adventure games at 30 FPS; it's not as if anything hectic is happening on screen that I have to react to quickly. I can't play an FPS at 30 fps; it's simply not playable. The aim goes off because everything stutters.

How fluid the picture is can be the biggest issue of all, depending on the genre we're talking about.

And what is supposed to be the bigger issue with DMC and Capcom? That they changed Dante's hair?

The problem?

Gameplay, story, characters, this:

http://www.youtube.com/watch?v=BuoUfyMUQTc&list=FLjmHpfEcT1qoAXIWAVRca3A&index=12&feature=plpp_video

MegaManOfNumbers:

The problem?

Gameplay, story, characters, this:

http://www.youtube.com/watch?v=BuoUfyMUQTc&list=FLjmHpfEcT1qoAXIWAVRca3A&index=12&feature=plpp_video

Really? Story and character are a "problem" in a Devil May Cry game? Oh boy, that's like me having a problem with the characterisation of the leet skin in CS:GO compared to CS 1.6.

And most of the rant is based on what? A few videos we've seen so far. As far as I know, the game has yet to be released, and thus the full story and characters have yet to be unveiled.

And no, if the game is unplayable, the rest is not a concern to me.
