The Order: 1886 Reveals Actual Gameplay Footage

I was actually hoping for something like Metro 2033. I don't mind it being a shooter, but damn if that trailer doesn't cut in awkward ways and make it look like a 3rd person shooter (which, FYI, 3rd person views always make shooting weird). At least in The Last of Us there was mostly a stealth element, but this looks like a run-and-gun, which has almost always been better off as an FPS.

This has so many fanboys hot and bothered on other sites that it's annoying. It's not even funny anymore; it pervades so many sites.

"new innovative game? LOL NOPE 1886 TROLOLOLOLOLOLOL!"

"early access being the biggest thing in gaming? 1886 LOLOLOLOLOLOLOL"

"Last of us? Lol shit game Order 1886 LOLOLOLOLOLOL."

Some even say the game looks better than anything any other platform puts out, which is a bald-faced lie.

N4G alone is so full of hype that I genuinely feel sorry for the gamers who actually believe it. All they see is cutscenes and very linear corridors, and their minds automatically fill in what they want to see.

I predict it will be a very disappointing game, because everyone started hyping it up before they got any info at all. Even the graphics are way behind what PC has been getting for the last couple of years.

Even then, high-def burly men won't save you from what amounts to a Gears of War clone over 6 years after the fact.

Ultratwinkie:
Even then, high-def burly men won't save you from what amounts to a Gears of War clone over 6 years after the fact.

So a 3rd person cover system makes it a clone of another game?

Don't get me wrong, I'm disappointed with what I see, but being the first (or one of the first?) to rely on a mechanic doesn't magically make everything that follows a clone. Additionally, wasn't Gears of War insanely popular? I'd far rather see a clone of good games than clones of, say, Aliens: Colonial Marines. Are cover-based shooters off the table now? Looks to me like GOW3 performed really well and GOW4 is likely to do well for its platform. So why the hate? I personally didn't like the series because it felt clunky.

I'm hoping that the story will be compelling. I mean, The Last of Us and Uncharted are both 3rd person games with platforming and a cover system, and I liked those. But the shooting element was shit, and in The Last of Us that felt intentional. If this game has a legitimate story and they do a decent job with the shooting, then it could end up being fantastic. This is Ready at Dawn and Santa Monica working together; they have a decent history of good games. This can go either way. Way too early to tell. I'm just hesitant at the moment.

Who knows, maybe the rest of the game has a serious stealth component?

I want it to be good. Lord knows I'd like a well-done steampunk game set in the Van Helsing universe. But I still need more info. If the only draw is the universe they've created, then this will fall flat.

Lightknight:
So a 3rd person cover system makes it a clone of another game?

A bad 3rd person cover shooter would be GOW. That's all it really was: a really bland shooter that looked good at the time. The burly men won't mask bad gameplay this time, as Thief and Crysis found out the hard way. It was briefly popular on the Xbox and with the dudebro culture, but it became an insult very quickly.

Thankfully, COD came and stole the show, and GOW is nothing but a memory for those old enough to remember the start of the generation. Still an insult, though. A lot like Halo.

All they've shown is how the game looks like a cutscene, and they pointed that out every chance they got.

You heard me: they bragged about using cutscenes interchangeably with actual gameplay. Even the actual gameplay footage was so linear that I really began to wonder what other ways they cut the game down so it could run. Cutscenes shouldn't be used at all.

I even saw some awful slowdown on that video.

Consoles were never known for their power until the 360 and PS3. They sacrificed graphics for cheapness and, hopefully, gameplay. Focusing on graphics in a "budget" console generation is just asking for trouble, and the cracks are already showing here. This isn't 2005; they didn't shell out for top-end hardware this time.

Sweet premise, but it doesn't look to be that well executed, with some pretty bad enemy AI as well. I wait patiently for the "Ye Olde Gears of War" puns, though.

Cover-based shooting and QTEs?

Truly, next gen is finally upon us.

Ultratwinkie:
A bad 3rd person cover shooter would be GOW. That's all it really was: a really bland shooter that looked good at the time. The burly men won't mask bad gameplay this time, as Thief and Crysis found out the hard way. It was briefly popular on the Xbox and with the dudebro culture, but it became an insult very quickly.

Dudebro culture? You mean people who like FPS titles that are multiplayer intensive? Is this significant portion of the gaming community somehow inferior in your eyes? Are their opinions and tastes less valid than yours, and somehow apt for ridicule and disdain? Is it OK to use an entire culture's name like it's an insult, or is it only OK because it isn't a race, religious group, or gender?

I've been playing games my whole life. RPGs, action RPGs, MMOs, classic platformers, adventure games, RTS, everything, and I like FPS titles too. It's insulting to dismiss an entire genre of games because of "dudebros", especially when that actual group of people is no less valid a class of people than whatever category you place yourself in. And, in actuality, considering they represent a larger segment of the video game consumer market than any other single group, the argument could be made that they are an even more valid group.

Thankfully, COD came and stole the show, and GOW is nothing but a memory for those old enough to remember the start of the generation. Still an insult, though. A lot like Halo.

Ok, now you're making a Sony fan defend Microsoft IPs.

GOW 3 came out in 2011. It was the tenth best seller of the year and came in above any version of Skyrim (just under MW3, at 70k units fewer). It sold over 6.6 million units.

Halo 4 sold more units at launch than any previous main game in the series and was the second fastest-selling game of the year (third overall, behind both versions of Black Ops 2).

It sounds like you're not up to date on your facts. These games are still quite popular. I'd say GOW: Judgment taught them that they have to deliver a legitimate game to make the same sales, though. I didn't even know it existed.

You heard me: they bragged about using cutscenes interchangeably with actual gameplay. Even the actual gameplay footage was so linear that I really began to wonder what other ways they cut the game down so it could run. Cutscenes shouldn't be used at all.

I am concerned that this will be Sony's equivalent of Ryse. That would be terrible. But it already appears to have more player interaction than the Ryse of QTEs.

I even saw some awful slowdown on that video.

I think you're imagining it or perhaps you had a spot of poor internet connection while viewing the video.

Consoles were never known for their power until the 360 and PS3. They sacrificed graphics for cheapness and, hopefully, gameplay. Focusing on graphics in a "budget" console generation is just asking for trouble, and the cracks are already showing here. This isn't 2005; they didn't shell out for top-end hardware this time.

Sure, the PS4 isn't the equivalent of a machine with the latest CPU, SLI'd Titans, and 32GB of RAM. But it is well above the average gaming machine currently on the market.

You really shouldn't expect a console to be the absolute bleeding edge of gaming. Not without breaching $1,000. However, keep in mind that the PS3 had 512MB of RAM divided into two 256MB sections, with a CPU/GPU that are now 7 years old. It even had purposely built obstacles that prevented developers from easily developing for it (the current CEO's own words back when he was on that project: he said the purpose was to prevent developers from unlocking the full potential of the machine in year one and leaving nowhere to go after that... they should have been so lucky). And yet, somehow, it plays BioShock Infinite at the equivalent of medium settings when the recommended specs for the game are 4GB of RAM with a modern quad core and a newer GPU. It plays Skyrim and The Last of Us with extremely pretty graphics (though the asset categorization forced onto the PS3, plus asset bloat, really hurt it with Bethesda games). The reason it can play those games now, despite the PC requirements being 4 times the PS3's specs or more, is hardware optimization. That's the thing about consoles: you can do more with less.

With all this in mind, remember that the PS4 is 10x the power of the PS3. 10x. Look at the prettiest games the system currently offers and imagine the new console adding 10 times the processing that was required to play that game, and that game may not even have been pushing the PS3 to its full capability. Also, the architecture is now x86 and not the proprietary obstacle crap, which means it's actually even more powerful than it would appear on paper, because developers don't have to deal with trying to get all their assets to fit into individual categories that each have a maximum size.
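
To put a number on why fixed-size categories hurt, here is a toy Python sketch (asset sizes invented, and this is not Sony's real allocator) of a workload that fits a unified 512MB pool but cannot be packed into two fixed 256MB pools:

def fits_unified(assets, total=512):
    # One pool: only the total matters.
    return sum(assets) <= total

def fits_split(assets, pools=(256, 256)):
    # Two fixed pools: attempt a first-fit-decreasing packing.
    free = list(pools)
    for size in sorted(assets, reverse=True):
        for i, space in enumerate(free):
            if size <= space:
                free[i] -= size
                break
        else:
            return False  # this asset fits in neither pool
    return True

assets = [200, 150, 100, 60]  # MB, hypothetical
print(fits_unified(assets))   # True: 510 <= 512
print(fits_split(assets))     # False: no split into 256 + 256 works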

10x means better graphics, yes. But more to the point, it means better physics, which will make a world of difference in making things look real. The truth is, we aren't going to see many more leaps and bounds in pure graphics, not when the last generation already started looking so much more realistic before we even hit the 8th gen.

Understand, you're not talking to a console elitist. I own a gaming PC with a new i7, a new Radeon, and 16GB of 1866MHz RAM (upgradable to 32GB when necessary). I just understand that the PS4 is a great deal for $400 and that optimisation will have it playing games long after my PC has to be upgraded. It's simply the way things are. A budget $1,500 PC in 2005 had 2GB of RAM, a dual-core processor, and an ancient GPU: something that could play Skyrim on its lowest settings and not much more. Consider that consoles actually have an advantage, because you can optimise the hell out of hardware when all the hardware is fixed and enough people own that set of hardware to make optimizing for it worthwhile.

The days of getting clothing physics to work correctly are close at hand. That is what I'm looking forward to the most about this game.

Dishonored has all the Dirty England I'll ever need.

Hey, it is only a few minutes of gameplay; there could be more to it. I think this mostly from looking at the past games they've worked on; it feels unlikely that this will turn out to be nothing more than a brown shooter with nothing else to offer.

This is actually a big reason why I'm staying out of console gaming for the moment. Aside from my Wii U of course, but ever since I've fully completed the Wind Waker remake it's become just a dedicated Netflix machine. All the "style over substance" of the AAA console scene is really starting to grow stale. For every standout, there are a dozen games that have squandered good premises in the name of making something broad and generic to justify their production cost. Maybe the next two years or so will end up changing my mind, but I'm going to hold off on buying an Xbox One or PS4 until I have to port a game to them.

Casual Shinji:
It's a bit too jumbled to really get a good idea of what the gameplay has to offer. Obviously shooting, but that could mean anything. Looks like kind of a rush job trailer.

These were my thoughts exactly. It feels like they're showing off the "safe footage" -- the kind they show to investors. Now, give me a trailer like the one shown at E3, and we'll be cooking.

Lightknight:
Dudebro culture? You mean people who like FPS titles that are multiplayer intensive?

GOW was big muscley men in big impractical armor with big guns. It became an absolute joke, to the point where it was an insult. It was popular with all the "dudebros" that gamers loved to hate. Same with Halo.

It got the same hate that COD does now.

It was when COD came around and broke sales records that, in the grand scheme of things, they became irrelevant. The staggering difference in sales was a nail in the coffin for exclusive games.

It was what killed Microsoft's exclusive superiority over the PS3, because their exclusives were one-upped by a franchise everyone had. So Xbox had no counter-argument to the PS3, and the PS3 became the better console.

GOW was only popular for a short time, so any attempt to make lightning strike twice, but on Sony's side, comes 7 years after the fact. They'd be better off doing something other than another pretty cover-based shooter. If anything, they should be copying the Metro series, because it had equal parts graphics and gameplay.

At least the gaming formula would be fresh. Not the same rehashed formula dating all the way back to when the 7th generation was "new."

Secondly, on a power scale you are wrong on both counts. If they wanted physics, they would have played ball with Nvidia to get actual PhysX tech. They paid for AMD, the bargain bin of PC gaming. AMD's TressFX is more limited, and rarely used in games. It's hair physics rather than a full physics system. Hell, PhysX is more supported and covers more things than TressFX. You get what you pay for.

The average desktop PC on Steam, which is 70% of PC gaming, has an Nvidia 660. The CPUs average over 2.3GHz and are quad core, with 1GB of system RAM and 1GB of VRAM. A 660 is actually better than the 750 that matches the consoles in power. So the average gaming PC is already more powerful than the next-gen consoles running on very old tech.

Not counting overclocking, of course.

The card that everyone is switching to is the 760, which blows even the 750 Ti out of the water in a huge way. So on every front the average PC on Steam is more powerful.

And lastly, your arguments are the same cookie-cutter arguments from the 1990s. Any PC gamer knows that as long as you beat consoles in power, you can still play games, because consoles are the bare minimum. It's only with PC exclusives that you have to worry, and graphics-intensive PC exclusives are rare. You are not forced to upgrade until a new console generation comes around. Hell, on PC gaming subreddits there are loads of videos of modern games running on an 8800 on high. That card is ancient. Their optimization means nothing, because AMD and Nvidia pay loads of money to ensure their cards are optimized with drivers. That has been the case for the last 10 years. There is nothing special about consoles, at least not anymore. Even coding to the metal doesn't matter anymore when the difference in power is staggering beyond the bare-minimum entry level.

Fynik:
I'm not sure I'm comfortable playing an ornately decorated knight with literal gold neck protection, shooting rebel rabble who are only really "rebels" because they don't want to be worked to death in a factory owned by some noble. I mean, that angle is a little... odd. I get that rebelling against the Queen's peace is kind of a bad thing for the Britishly inclined, but literally going out there and machine-gunning working-class sods for breaking martial law when I could (and should) be focusing on fighting werewolves?

I wouldn't be surprised if it turns out to be a setup for some Dramatic Twist - The Order discovers it's on the wrong side of the fight midway through the game and switches sides to help the underclass fight the half-breeds, who are in fact being used as a tool by the elite to keep the rest of the world under their thumbs. Not like we haven't seen that before.

Hmm, QTEs, corridors, and linear gameplay? No thank you. We got enough of that nonsense this generation and last.

Ultratwinkie:
GOW was big muscley men in big impractical armor with big guns. It became an absolute joke, to the point where it was an insult. It was popular with all the "dudebros" that gamers loved to hate. Same with Halo.

It got the same hate that COD does now.

That "gamers love to hate"? You mean "other gamers" and you didn't mean to exclude "dudebros" from the gamer category, right? Or do you have a cogent argument to rob these individuals of the title of gamer because they enjoy games you or the general gaming populace don't believe are worthy? Do you see how this is sounding?

Is this (that people dislike other people who express a different preference) a valid complaint to levy against the games' relevancy? People frequently hate on popular things. That doesn't make those things bad or worth less somehow. Heck, we've already determined that many of the "haters" are also the players.

I would posit that people really mean that they aren't real "nerds", or that they aren't part of the commonly associated nerd culture that generally appreciates games. This is the same kind of mindset that screams that women aren't really gamers or nerds or whatever. It is an offensive assumption, and it belittles people like me who have been gaming their entire lives and enjoy FPS titles as well. It is dismissive of what is clearly one of the largest gaming demographics of all, if not THE largest.

It was when COD came around and broke sales records that, in the grand scheme of things, they became irrelevant. The staggering difference in sales was a nail in the coffin for exclusive games.

No, a large disparity in sales doesn't mean crap when the other games still sell several million copies. Like it or not, Halo and GOW still sell tremendously well with huge profits, and ergo are still quite relevant and well liked. Unless you know of a ton of other games with 6+ million units sold. Halo 3 is the 360's 7th best seller of all time and Halo 4 is its 11th. How does this translate into irrelevant to you? Even Halo: Reach is the 9th best seller and Halo 3: ODST is the 17th. Even Halo Wars sold 2.4 million units, and that was a strategy game...

GOW 2 is the 360's 16th best seller. GOW 3 is its 19th. All of these are more than 6 million units sold (if you grant me the .01 million that GOW 3's 5.99 is shy of and has likely already reached this week).

So, let me be clear: by NO parameters are these games irrelevant. The big-ticket items for producers to sell ARE these FPS dudebro games, by a fair margin, unless you're a publisher with the GTA trademark.

http://www.vgchartz.com/platform/7/xbox-360/

Of the top 20 games, 7 are either Halo or GOW. COD is another 7 itself. Only 5 games are non-shooters (Red Dead Redemption, 2 GTAs, one Kinect game that I think was bundled, and Skyrim), with a single Battlefield 3 in the mix.

That makes 75% of the best-performing games shooters. Halo and GOW are essentially the equivalent of Modern Warfare and Black Ops alternating development lines (since those are two studios as well). Yes, COD has performed stronger by being higher up on the list, but irrelevant? Gross misuse of the term.

I am somewhat sick of defending these MS titles since, as I said, I'm more of a Sony fan, but let's get our facts straight. They have recent games that continue to perform excellently.

It was what killed Microsoft's exclusive superiority over the PS3, because their exclusives were one-upped by a franchise everyone had. So Xbox had no counter-argument to the PS3, and the PS3 became the better console.

I would kind of agree with this. It's actually why I'm a Sony fan. I can get my FPS fix from COD, and Sony has some AMAZING story-driven IPs that dragged me in. Honestly, this 1886 gameplay can be crappy 3rd person cover-based shooting, and if the story is anywhere near as good as something like The Last of Us, it'll be instantly forgiven, and rightly so. The gameplay just has to work properly, as long as the writing is there in spades.

When I think of Microsoft's major exclusives, I think GOW, Halo, and...? And I have both systems. When I think of Sony's, I've got a great list: Heavy Rain, inFamous, Journey, God of War, Uncharted, The Last of Us, LittleBigPlanet, etc.

But this isn't because of COD. This is because Microsoft basically has just one major genre that they are accommodating. COD provides a legitimate FPS title to play on any system, so it does drastically weaken the draw of GOW and Halo as system sellers, but Microsoft's real mistake is not having other genres represented, more so than anything else.

GOW was only popular for a short time, so any attempt to make lightning strike twice, but on Sony's side, comes 7 years after the fact. They'd be better off doing something other than another pretty cover-based shooter. If anything, they should be copying the Metro series, because it had equal parts graphics and gameplay.

You are wrong. Every GOW game has performed better than the first one. GOW sold 6.03 million copies and came out in 2006. GOW 2 came out in 2008 and sold 6.67 million copies. GOW 3 came out in 2011 and is already at 5.99 million (almost equal with GOW 1 in less than half the time). This is lightning striking three times.

At least the gaming formula would be fresh. Not the same rehashed formula dating all the way back to when the 7th generation was "new."

I disagree. I think they have good shooters. GOW: Judgment was crap, but they're fine on the shooter front. The only thing Microsoft has to do is find major exclusives in other genres. I think capturing Dead Rising was a good start, if it sticks.

Secondly, on a power scale you are wrong on both counts. If they wanted physics, they would have played ball with Nvidia to get actual PhysX tech. They paid for AMD, the bargain bin of PC gaming. AMD's TressFX is more limited, and rarely used in games. It's hair physics rather than a full physics system. Hell, PhysX is more supported and covers more things than TressFX. You get what you pay for.

I'm unsure what you're disagreeing with.

First off, whatever is in the console will be supported by developers, because they like money like any business does. So your "more supported" bit is a non-argument. It'd be like being in space and arguing that, before you were in space, you didn't need a tank to breathe, and then suffocating because things have changed.

Secondly, your argument is that the consoles could have been more powerful. I don't disagree with that. Hell, why didn't they slap in 4 Titans and call it a day? The problem is that they had to come in at a reasonable price for the market at hand. This is a hell of a lot of bang for the buck.

Lastly, hair physics is physics. Do you mean it doesn't support particle physics or something like that? We're a fair bit off from rendering full particle physics in real time. All this is is a significant step in the right direction. I don't even know where you get the claim that the GPU can't support particle physics or any other kind. It may take longer to render, but video games aren't going to demand particle-level physics rendered in real time regardless. Not this generation, and perhaps not the next.
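
For what it's worth, "hair physics" really is particle physics under the hood. Here is a minimal Python sketch of the usual approach (a strand as a chain of point masses stepped with Verlet integration plus distance constraints; every number is invented, and this is not TressFX's or PhysX's actual code):

GRAVITY, DT, SEGMENT = -9.8, 1.0 / 60.0, 0.1  # made-up units

def step(curr, prev):
    # Verlet integration: next = 2*pos - prev_pos + accel * dt^2
    nxt = [(x + (x - px), y + (y - py) + GRAVITY * DT * DT)
           for (x, y), (px, py) in zip(curr, prev)]
    nxt[0] = curr[0]  # the root particle stays pinned to the scalp
    for _ in range(5):  # relax the segment-length constraints
        for i in range(len(nxt) - 1):
            (x1, y1), (x2, y2) = nxt[i], nxt[i + 1]
            dx, dy = x2 - x1, y2 - y1
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            push = 0.5 * (dist - SEGMENT) / dist
            if i > 0:  # never move the pinned root
                nxt[i] = (x1 + dx * push, y1 + dy * push)
            nxt[i + 1] = (x2 - dx * push, y2 - dy * push)
    return nxt, curr

strand = [(0.0, -SEGMENT * i) for i in range(8)]  # 8 particles hanging down
curr, prev = strand, strand
for _ in range(120):  # two simulated seconds at 60 fps
    curr, prev = step(curr, prev)

Multiply that one strand by tens of thousands and you get exactly the kind of embarrassingly parallel workload GPUs are good at, which is why hair systems run on the GPU in the first place.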

What I am saying is that this is more powerful than the average gaming PC at the moment and 10x more powerful than the PS3, which was already capable of some pretty nice-looking physics rendering. It will also outperform a PC with specs identical to the PS4's, because of the huge advantages of fine-tuned optimisation: the kind that makes 512MB systems magically work like they have 2GB and newer hardware.

The average desktop PC on Steam, which is 70% of PC gaming, has an Nvidia 660. The CPUs average over 2.3GHz and are quad core, with 1GB of system RAM and 1GB of VRAM. A 660 is actually better than the 750 that matches the consoles in power. So the average gaming PC is already more powerful than the next-gen consoles running on very old tech.

That'd be pretty impressive, since only 52.38% of Steam PCs have an Nvidia card at all according to Steam's latest data. I strongly doubt that all 52.4% are using an Nvidia 660, and it's impossible for 52.4% of the market to cover 70% of the market unless we're working on Anchorman logic, where 70% of the time it works all the time. Maybe you just worded your claim incorrectly, or I misunderstood you somehow?

http://store.steampowered.com/hwsurvey/
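
To make the arithmetic explicit, this is the two-line sanity check I'm applying (the Nvidia share is from the survey linked above; the 70% is the claim as I read it):

nvidia_share = 0.5238   # fraction of surveyed Steam machines with ANY Nvidia GPU
claimed_share = 0.70    # the "70% have a 660" reading I'm disputing
# No single Nvidia model can exceed Nvidia's overall share:
print(claimed_share <= nvidia_share)  # False: the claim can't hold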

Perhaps your stats come from a different month? But if the months vary that much, what's the point?

Anyways, the current stats are:
52.38% of Steam users use Nvidia
Over half use the Win7 64-bit OS
The most common CPU is a dual core (47.5%, with quad cores catching up at 44%)
Average RAM is 5GB+, with 4GB machines at 45% of the market

Keep in mind, developers aren't going to forget half of their target market, so the lower half only has to be close for it to hold the rest of the market back from getting more demanding games. As long as a significant portion of the market sits at the lower spec, games will continue to be made with that spec in mind. You could have the latest CPU, 4 of the latest GPUs in CrossFire, and 32GB of RAM, and you're still likely to do no better than someone with a mediocre CPU, a single new card or 2 decent crossfired ones, and 8-16GB of RAM, because games are made for the minimum-requirement machines; higher specs just polish up the graphics to the point of diminishing returns without really changing anything under the hood of the game engine.

One thing you're going to have to realise about CPUs is that they're mostly glorified switchboard operators now. Yes, they still do some processing, but most of the heavy work gets offloaded to the cheaper processing that GPUs provide. So I couldn't care less what the average CPU is, and you shouldn't either, unless you use a lot of software that doesn't offload its processing properly. Heck, take a look at your task manager right now. Odds are your CPU is sitting under 10% unless something is wrong.

And lastly, your arguments are the same cookie-cutter arguments from the 1990s. Any PC gamer knows that as long as you beat consoles in power, you can still play games, because consoles are the bare minimum. It's only with PC exclusives that you have to worry, and graphics-intensive PC exclusives are rare. You are not forced to upgrade until a new console generation comes around. Hell, on PC gaming subreddits there are loads of videos of modern games running on an 8800 on high. That card is ancient. Their optimization means nothing, because AMD and Nvidia pay loads of money to ensure their cards are optimized with drivers. That has been the case for the last 10 years. There is nothing special about consoles, at least not anymore. Even coding to the metal doesn't matter anymore when the difference in power is staggering beyond the bare-minimum entry level.

No, again, let's use Skyrim.

Minimum Specs:

Windows 7/Vista/XP PC (32 or 64 bit)
Processor: Intel Dual Core 2.0GHz or equivalent processor (AMD Sempron @ 2.4 GHz)
2GB System RAM
6GB free HDD space
DirectX 9.0c compliant video card with 512 MB of RAM
DirectX compatible sound card

Recommended Specs:

Windows 7/Vista/XP PC (32 or 64 bit)
Quad-core Intel or AMD CPU
4GB System RAM
6GB free HDD (Hard disk drive) space
DirectX 9.0c compatible NVIDIA or AMD ATI video card with 1GB of RAM: Nvidia GeForce GTX 260 or higher; ATI Radeon HD 4890 or higher
DirectX compatible sound card

The PS3 had 512MB of RAM that was stupidly divided into two compartments of 256MB. The GPU was the equivalent of the Nvidia 7800. A PC built identical to the PS3 would flunk the game's minimum requirements test. Yet the PS3 and 360 can both play Skyrim at the equivalent of medium settings. Not even low.
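
Spelled out as a toy check against the minimum specs listed above (RAM in MB; the PS3-equivalent figures are its 256MB XDR + 256MB GDDR3 split, with everything else simplified away):

skyrim_minimum = {"system_ram_mb": 2048, "vram_mb": 512}
ps3_equivalent = {"system_ram_mb": 256, "vram_mb": 256}  # 256MB XDR + 256MB GDDR3
meets_minimum = all(ps3_equivalent[k] >= skyrim_minimum[k] for k in skyrim_minimum)
print(meets_minimum)  # False: an identical PC flunks the minimum spec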

My response is the cookie-cutter response because it's true. The optimization advantage of consoles is significant at the moment. You can't say "you're just saying what most people say" as a way to dismiss my argument. If anything, that merely gives more credence to my statement and makes your claim the odd man out.

What's more, a PC in 2005 that roughly met those minimum requirements cost around $1,500. There is no reason to believe that things have drastically changed since the previous generation. The only difference is that this generation we'll be able to directly measure the advantage of optimisation, thanks to the move to x86 environments.

Last year the average GPU was some crappy laptop GPU.

Lightknight:
That "gamers love to hate"? You mean "other gamers" and you didn't mean to exclude "dudebros" from the gamer category, right?

It's the same reason people hate COD. Gamers hate it because it's mainstream. It's the hipster effect: it's new so it's cool, then it becomes mainstream, so we must hate it to be "a cut above." Whether or not dudebros are gamers is irrelevant. If GOW 2 was its peak, how is 5.9 million an achievement? It did less than GOW 1, which was a brand-new IP. A brand-new IP should never beat a sequel, ever.

Lastly, you are still wrong. Steam makes up 70% of the PC market. 52% is Nvidia. The most common card on that end is the 660. Which beats the PS4. Which supports PhysX, which does way more than just hair and which is exclusive to Nvidia GPUs only. If you try it with an AMD card, you get a hit in performance. Also, AMD can't use it because they can't get the license. So the best you'd ever get is hair physics, when Nvidia simulates way more.

Physics is AN EXTRA. Developers are not FORCED to support TressFX. They are not REQUIRED to use PhysX. PhysX is more supported than TressFX and is more versatile. The fact you don't even grasp this is staggering. Licenses make up a huge part of any business, and tech is no different. GPUs can support different things depending on what their manufacturers can offer.

Without Nvidia, physics is out of the question for the next-gen consoles. The reason why is clear: Nvidia will help you implement physics instead of you doing it all by yourself. Beyond hair physics, you have to do it yourself. Unless, of course, you crawl over to Havok and beg for physics. It's easier to just go to a manufacturer and request help.

We are not far off from real-time particle physics. We are already here. We have been here for years. It's here for over 50% of PC gamers, who don't even have the newest and greatest hardware. The fact you think full particle physics doesn't exist yet shows you have no idea how far tech has come. PhysX has been around since 2008, when Nvidia bought it, and if you count how long it's been around period, it dates from 2004.

There is nothing actually stopping this on consoles other than licensing issues and the inability to pay Nvidia's fees for making your hardware. AMD is willing to work for pennies; Nvidia actually asks for money.

Don't rely on what consoles can do to see what technology as a whole can do. They don't dictate anything. Technology progresses as much as it pleases. We can go very far; it's just that the suits made very horrible decisions years ago.

If they had stayed within their means for the last 7 years, we wouldn't have this tech whiplash cutting everything to the bone. Consoles would have been better off with some forethought.

Facts are not democratic. No matter how many people spout the same thing, that doesn't make it true. Technology has progressed a lot, and consoles can't keep up, because nobody wants to pay more after last gen almost bankrupted everyone. This is a budget generation; there's no other way around it.

If we actually did things right, this wouldn't have happened.

And the last gen used cutting-edge hardware for its time. This gen is the equivalent of a 750 Ti: an entry-level card. In fact, the hardware they are using in the next-gen consoles is TABLET-BASED silicon from AMD. That alone should tell you how desperate things have gotten.

So yes, this gen IS different. It's not as powerful as the last gen was at launch. The low-end to mid-range PC has already left next gen behind in pure power. The number of cheap options that beat the consoles is staggering. It's gotten to the point where you can build your own console (a Steam box) for the same price, with the same power, and with none of the drawbacks.

Consoles are going back to what they used to be: cheap and weak. The way they traditionally were until the 7th gen. They can't invest in cutting-edge, unproven technology anymore. Cell alone was a huge headache.

This, however, is risky, because people are used to a device that can do something new. There is no HD DVD/Blu-ray war anymore. There is no big dichotomy to ride on. There is only 4K, and no one can run that. The closest we have is 1440p (2K), and no one cares about that either.

They already introduced the idea of all-powerful consoles in people's minds. There is no going back anymore.

And let's also point out that you read the chart wrong. 8GB is standard on Steam: it's the biggest percentage of users, and it has the HIGHEST adoption rate. You bring up laptops, but people browse Steam on laptops.

It isn't "how many are using it for gaming." The survey scans your computer to see what you have, regardless of whether you are playing or just browsing a sale. The average desktop is not as weak as you think it is. The only thing laptop GPUs prove is that a lot of people log onto Steam with a laptop, which is not necessarily gaming.

You also overestimate the power of last gen. Mobile devices are now more powerful than last-gen consoles (Tegra K1). You might find Skyrim amazing, but the fact is anything can run it now. You don't even need a graphics card anymore; an APU can handle it, at least on minimum.

In fact, Intel HD Graphics 2000 can run it. Which means console optimization is bullshit, because an even weaker integrated GPU can run the game.

And speaking of Skyrim: no, consoles do not run it on medium. They run it on low, or close to it, at an upscaled 720p.

This is the PS3 vs the PC on max plus the official texture patch that Bethesda released, which was meant to fix the low-resolution textures problem at launch:

Now here is the comparison of medium to ultra:

As you can see, your argument about Skyrim running on medium is bullshit. The PS3 looks nothing like medium, due to how low-resolution the textures are and the differences in lighting. Not to mention the resolution difference between the PC's 1080p and the console's upscaled 720p.
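
The pixel counts alone make the point (plain arithmetic, nothing assumed):

pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_720p = 1280 * 720    # 921,600 pixels
print(pixels_1080p / pixels_720p)  # 2.25: 1080p pushes 2.25x the pixels of 720p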

The models aren't the issue here. It's the resolution of the textures, brought on by the low RAM of the systems. It's why a dedicated graphics card with more RAM has better textures.

And you know what? I am going to rub salt in the wound to prove a point here. Here is an 8800, fully ancient, running Skyrim at higher detail than the consoles with 50+ FPS and AA enabled. It can even run Arkham City on high. So no, you don't need to upgrade, and god-like optimization is a myth. A console will never outpace anything even slightly more powerful than it, nor will it keep up. It also won't outpace something at the same level of power, as long as the versions are exactly the same.

If they are at the same level, they will perform at the same level. Unless something went wrong, but we are assuming everything went right.

The only thing that could change that would be overclocking, which a console should never, ever do unless you want to start a house fire or create a space heater.

Skyrim isn't as graphics-intensive as you think. And to top all this off, the 7800 can run Skyrim on low-to-medium at roughly the same resolution:

http://www.uesp.net/wiki/Skyrim:System_Requirements

So that means even the equivalent GPU is actually better. PC games are more demanding, often because of texture sizes. If the PS3 isn't on medium and runs near minimum, the equivalent PC card is actually slightly better.

Consoles are not on some special plane of existence. They follow the same rules all other tech does. If they had any benefit, and they once did, they lost it when manufacturers and game devs emulated it.

OpenGL, DirectX, Mantle: all of that threatens the idea of console optimization. Not to mention manufacturer drivers. Or rather, it did, before the playing field leveled.

There is a reason I said they aren't special anymore.

Ultratwinkie:
It's the same reason people hate COD. Gamers hate it because it's mainstream. It's the hipster effect: it's new so it's cool, then it becomes mainstream, so we must hate it to be "a cut above." Whether or not dudebros are gamers is irrelevant.

So then why did you use the term "dudebros" to dismiss the validity of a gaming genre? It's as ridiculous as saying that horror games/movies don't count because horror fans like them.

If GOW 2 was its peak, how is 5.9 million an achievement? It did less than GOW 1, which was a brand-new IP. A brand-new IP should never beat a sequel, ever.

Did you catch where I said that GOW 3 has sold almost as much as GOW 1 in less than half the time? Games don't stop selling after the first year. GOW 1's 6.03 million copies were accumulated over 8 years. Last year GOW 1 sold another 56.5 thousand units, in its 8th year. GOW 2 sold over 100k units last year, in its 6th year. GOW 3 is already reaching GOW 1's total sales in its 3rd year.
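
Put as a back-of-the-envelope rate comparison (using only the sales figures quoted above, calendar years, and ignoring seasonality):

gow1_per_year = 6.03 / 8  # million units/year across GOW 1's lifetime so far
gow3_per_year = 5.99 / 3  # million units/year across GOW 3's first 3 years
print(gow3_per_year / gow1_per_year)  # ~2.7: GOW 3 is selling far faster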

In GOW 1's third year, it had sold 500k fewer (total) than GOW 3 has. GOW 3 is even 100k units above where GOW 2 was at this point.

And how is it an achievement? The 360 had fewer than 20 titles in its entire library that reached 6 million. How can you claim it isn't an achievement, even without accounting for the next few years of sales? Even if all their sales were magically cut short, these three games are all in the top 20 sellers of all time on the 360. I don't like defending Microsoft's titles, but you're dead wrong here. I'm not sure why it's all that important to you, but calling the top 20 biggest performers in a console generation failures is silly.
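
To make the apples-to-apples point explicit, here's the comparison sketched in Python. The figures are the rounded ones cited in this thread, so treat them as illustrative rather than authoritative:

    # Compare cumulative sales at the SAME age, not lifetime totals.
    gow1_lifetime = 6.03e6            # GOW 1 over 8 years
    gow3_year3    = 5.9e6             # roughly where GOW 3 sits in its 3rd year
    gow1_year3    = gow3_year3 - 5e5  # GOW 1 trailed that figure by ~500k at year 3

    print(f"GOW 1 at year 3: {gow1_year3 / 1e6:.2f}M")
    print(f"GOW 3 at year 3: {gow3_year3 / 1e6:.2f}M")
    # The same-age comparison favors GOW 3 even though GOW 1's 8-year total is larger.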

Lastly, you are still wrong. Steam makes up 70% of the PC market. 52% of it is Nvidia. The most common card on that end is the 660.

Do you have any kind of backing for this statement that the most common card is a 660? That card is within the top 30 cards on the high-end card market according to Passmark. Saying that this is the average card sitting in most gamers' PCs is laughable at best. Not quite like saying that all car owners own a Ferrari, but like saying the average car owner owns a sports car.

What's more, the 660 is actually slightly less powerful than the PS4's GPU, not better. Originally the PS4's GPU was estimated to be somewhere between a Radeon 7850 and a Radeon 7870. Guess what else literally benchmarks between those cards? Your 660.

http://www.videocardbenchmark.net/high_end_gpus.html

But after the PS4 was released and actually tested, it turned out to be a modified 7870.

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs

So even if your statement were true, your conclusion would still be wrong. With the modifications, the modified 7870 should be more powerful than the 660 Ti, or comparable at the very least. I actually had no idea it was that powerful until just now, when I researched it to respond with actual data.

Additionally, do you know why I participated in the Steam Hardware Survey? Because my PC is awesome. You do realise that this is an opt-in survey and not an actual cross-section of all Steam users. This is just a cross-section of Steam users who were willing and able to fill out the survey. If I'd had a crappy computer I wouldn't have done it. Can you see how the results could be biased that way?

Likewise, considering that there are hundreds of video cards on the market, doesn't that mean a card could be the most common card with only 5% or less of the market? Again, one thing Steam does report on is how many cores its users have. Two cores is still the most common, so why do you think the 660 would be the one gamers go for if they aren't even springing for a quad core?
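
If the selection-bias point isn't clicking, here's a toy simulation in Python. Every number is invented; the only point is the mechanism by which an opt-in sample skews high:

    import random
    random.seed(0)

    # Toy population: a GPU "score" for 100k gamers, most of them modest.
    population = [random.lognormvariate(0, 0.5) for _ in range(100_000)]

    # Opt-in effect: owners of beefier rigs are more likely to take the survey.
    sample = [g for g in population if random.random() < min(1.0, g / 3)]

    avg = lambda xs: sum(xs) / len(xs)
    print(f"true average:   {avg(population):.2f}")
    print(f"survey average: {avg(sample):.2f}  <- skewed upward by who opts in")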

Which beats the ps4.

I know I said this above, but I'm going to say it again for emphasis. The PS4 uses a modified 7870. A regular 7870 is already above the 660 (a minor improvement).

Physics is AN EXTRA. Developers are not FORCED to support TressFX. They are not REQUIRED to use PhysX. PhysX is more supported than TressFX and is more versatile. The fact you don't even grasp this is staggering. Licenses make up a huge part of any business, and tech is no different. So GPUs can support different things depending on what their manufacturers can offer.

When the standard hardware uses something else, developers WILL use something else. No one has to be forced to do anything. If the console uses AMD GPUs the developers are sure as hell going to develop with them in mind.

Now, do you have any guesses as to what the modifications to the PS4's 7870 do for performance? Because I have no idea. But one thing I do know is that it has 10x the physics capability of the PS3, and the PS3 was already capable of displaying some nice stuff.

Without Nvidia, physics are out of the question for next gen consoles.

That's nonsensical. "Physics" have been present in gaming forever. Even Pong had physics in its most basic form. The physics in games get better as the hardware improves, because developers make and use their own engines that include their own physics code. There were more expensive and better graphics cards that could have done more, but the hardware is already 10x that of the last generation, and whatever card you think they should have gone with wouldn't have been another 100% increase or necessarily worth the price increase.

We are not far off from real-time particle physics. We are already here. We have been here for years. It's here for over 50% of PC gamers, who don't even have the newest and greatest hardware. The fact you think full particle physics doesn't exist yet shows you have no idea how far tech has come. This has been around since 2008, when Nvidia bought it. If you count how long it's been around, period, it's from 2004.

Steam and dust aren't "physics" any more than a table in a game is. The physics is in how a thing interacts with variables in the environment. The steam is usually just an asset that you can't interact with at all. Real particle physics is stuff like the wind generated by you walking past causing some of the steam to follow you, or some of the steam particles cooling and dropping due to the pressure differential with the hotter steam. Or shooting a wooden plank and seeing realistic splintering that corresponds with the type of wood and the dryness of the material.

You've really got to watch some of the physics simulations. They're extremely impressive. Just note that many of these take powerful machines hours or days to render, and you're just watching the end result:

So I'm not talking about cute little atmospheric dust particles following a preordained path.
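
For anyone unclear on the distinction, here it is in miniature as a hedged Python sketch: simulated particles respond to forces instead of replaying a canned path. Every constant below is invented purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class Particle:
        x: float
        y: float
        vx: float
        vy: float

    def step(particles, wind_x, dt=1/60, buoyancy=0.4, drag=0.98):
        # Each particle reacts to the environment instead of following a fixed path.
        for p in particles:
            p.vx = (p.vx + wind_x * dt) * drag    # a passing player acts as a gust
            p.vy = (p.vy + buoyancy * dt) * drag  # hot steam rises
            p.x += p.vx * dt
            p.y += p.vy * dt

    steam = [Particle(0.0, 0.0, 0.0, 0.0) for _ in range(1000)]
    for frame in range(120):
        step(steam, wind_x=0.5 if frame < 10 else 0.0)  # brief gust, then decay
    print(steam[0])  # the plume drifted and rose; a static sprite never would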

and the last gen used cutting edge hardware of the time.

Not really. They were just better than average. People still had PCs with 1GB of RAM in 2005. Do you expect the consoles to have a Titan in them or something?

This gen is equivalent to a 750 Ti. An entry level card.

First off, a 750 Ti isn't an "entry level card". As of today, it's the 37th highest-performing card on the market. I specify 37th because it's around the mid-upper side of that list. Secondly, it is lower than even the lowest estimate ever given for the PS4's GPU, which started at above a 7850.

Now, if by "entry level card" you just mean that this is the kind of card a gamer would buy as their first card now given decent funds, sure. But not entry level as in bottom of the barrel and we CERTAINLY aren't talking about the average card currently in gaming pcs.

In fact, the hardware they are using in the next gen consoles is TABLET BASED, from AMD. That alone should tell you how desperate things have gotten.

I'm going to explain this again. CPUs are no longer the processing go-tos they once were. Nowadays they mostly push processes to RAM and the GPU for rendering while handling much smaller background processes. They are the PC equivalent of switchboard operators. Look at your processor right now, even if you're in a game. It isn't going to be working anywhere near as hard as your video card and RAM are, not unless you've tapped them out. Most processes are now offloaded to RAM and GPU, so the CPU simply doesn't need to be that powerful. I'm sorry, but that CPU will more than accomplish the job it needs to do. If you're a PC gamer you should know the answer to this question: "What makes a bigger difference, upgrading my CPU or my video card?" Unless your CPU is old as hell, the answer is almost always going to be the video card.

Here's a decent article on the subject:

http://lifehacker.com/5891007/do-i-even-need-to-care-about-processors-anymore
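
As a rough analogy (a sketch, not a benchmark methodology; it assumes you have NumPy installed), compare the CPU grinding through work in pure Python versus delegating the same job to an optimized path, the way games hand rendering off to the GPU:

    import time
    import numpy as np

    data = [float(i) for i in range(1_000_000)]

    t0 = time.perf_counter()
    slow = sum(x * x for x in data)   # the CPU doing every step itself
    t1 = time.perf_counter()

    arr = np.array(data)
    t2 = time.perf_counter()
    fast = float(np.dot(arr, arr))    # the same job, delegated to optimized code
    t3 = time.perf_counter()

    print(f"doing it yourself: {t1 - t0:.3f}s, delegating: {t3 - t2:.3f}s")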

This is the PS3 vs the PC on max plus the official texture patch that Bethesda released, which was meant to fix the low-resolution texture problem at launch.

My statement is regarding vanilla Skyrim. It's what they say it is comparable to.

Those links are terrible. Why in the world would the first video have mostly night scenes? And what was the point? A different texture set with shinier rocks and better lighting? Yes, we know that the PC with the ultra HD textures is better than the console version. Skyrim is practically the reason I bought a PC.

Here's a comparison of the 360, ps3 and PC all together.

It simply looks pretty on any console. I mean, I installed a ton of mods and I do laugh a little when I see that comparison. But I also played Skyrim on the PS3 (especially at the start, when it was buggy to the point of being broken) and I know it still looked pretty. It is somewhat telling that you did not compare the PS3 game with PC medium settings, and instead slapped the PS3 next to ultra settings with a detached version beneath it with no comparable scenes.

The most boggling part, which you keep failing to address, is that the minimum PC hardware requirements for low settings on this game are double the PS3's, and yet the PS3's hardware was capable of playing the game.

OpenGL, DirectX, Mantle. All of that undermines the idea of console optimization. Not to mention manufacturer drivers. Or rather, it did before the playing field leveled.

Not really. Those allow for some standardization in protocol, but they don't change the fact that every GPU and CPU combination has its own unique compatibility issues and its own limits. With just one hardware configuration, the developer is able to test and push the hardware to the limit while ensuring they don't go over. They can do this in a way that would choke some computers configured one way and underutilize the components of a computer configured a slightly different way.

The benefits of optimization are hardware-based. Even the individual components are optimized and fully tested to work perfectly with each other right out of the gate (hopefully, anyway). The only way hardware is going to get standardized is if we stop having competition. As long as your PC can have an AMD or Intel CPU combined with one of two different video card brands, you've already got a significant number of combinations that prevent real optimization, because there's no standard. Start throwing in the different models of each brand of CPU and GPU and the number of permutations explodes, as the sketch below shows.
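
To see how fast that test matrix explodes, here's a crude count in Python. The part counts are invented; the point is the multiplication:

    # Hypothetical part counts -- the exact numbers don't matter, the product does.
    cpus, gpus, ram_configs, drivers, oses = 20, 40, 6, 10, 4
    configs = cpus * gpus * ram_configs * drivers * oses
    print(f"{configs:,} distinct PC configurations")  # 192,000 -- vs exactly 1 console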

I was almost interested, then it turned into a third-person shooter with quick-time events and I was thinking "never mind".

PS4 ransomware doesn't really interest me, though you have to admire the character models. But that's about it. The rest seems very uninspired. I predict its campaign is no longer than 4-6 hours of gameplay and runs at like 900p.

Ultratwinkie:

The average desktop PC on Steam, 70% of PC gaming, is an Nvidia 660. Their CPU is over 2.3 GHz on average, and a quad core. 1 GB of system RAM, and 1 GB of VRAM. A 660 is actually better than the 750 that matches the consoles in power. So the average gaming PC is already more powerful than the next gen consoles running on very old tech.

Not counting overclocking, of course.

The card that everyone is switching to is the 760, which blows even the 750 Ti out of the water in a huge way. So on every front the average PC on Steam is more powerful.

You are telling me that the average PC on Steam uses what is currently a $200 video card, and that everyone is switching to a $250 video card? Just the video card? Tell me again how putting together a PC is cheaper than consoles? Let's face it, a low-end computer with a Win7 OS, 2 GB RAM and a 1 TB hard drive is going to cost at least $600, which is more than any console on the market. Making anything more powerful than that ups the price considerably, getting closer to the $1000 range, especially if you go for a high-end video card, and even worse if you get more than one for Crossfire or SLI.

Sorry but once you start talking about the performance of an "average" PC over a console and I start looking at prices, I think your definition of "average" is crap.

I would say it's disappointing to see this be yet another slow-paced, linear, cover-based shooter, but from the E3 trailer, it was entirely expected. It's a shame really. It seems like a neat setting.

Lightknight:
*snip*

GOW selling less than the first GOW. That isn't an achievement. All you proved is that the sales were front-loaded and failed to reach penetration-level sales.

And the cards I would have picked would have been 100x better. If they picked the 660 or even the 760 we would have much more range. We would be licensed to use PhysX. These are not expensive cards; the 660 is $180. Since it beats the 750, the next gen equivalent of the PS4, the last generation 660 is more powerful than the next gen consoles.

High-end cards are $250-300.

Hell, the 660 isn't actually high end. It's lower tier on the Nvidia scale. It's even middle tier on AMD's scale. This card is old. The reason it's in the high-end card range is because Nvidia consistently blows AMD away on the majority of cards. There is no contest. There was even evidence of that on the Steam stats page that I referenced.

And I chose the videos because they compared medium, ultra, and the consoles. The textures were much higher resolution on medium than the consoles allow. Consoles run on low, just like the equivalent 7800.

And it's hilarious how you miss that the Skyrim specs have proven you wrong. The 7800 is the same as the PS3, and it runs the game at the same level. No optimization ever made the console better than the dedicated card. Which means your idea of console optimization is grade A bullshit. Again.

And you want to know why? Because console versions of the game are cut down. Their textures are lower resolution to fit in the restrictive RAM limit. It's always been this way.
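
To put rough numbers on the RAM point, here's the texture math in Python (uncompressed RGBA for simplicity; real games compress textures, so this is purely illustrative):

    def texture_mb(width, height, bytes_per_pixel=4):
        # Uncompressed RGBA size; halving the resolution quarters the memory.
        return width * height * bytes_per_pixel / 2**20

    print(f"2048x2048 texture: {texture_mb(2048, 2048):.0f} MB")  # 16 MB
    print(f"1024x1024 texture: {texture_mb(1024, 1024):.0f} MB")  #  4 MB
    # A PS3 with two 256MB pools can't afford many 16MB textures; a PC with 2GB+ can.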

If you actually watched the videos in 1080p, and looked closely, you'd notice the differences in texture. I couldn't find a PS3/medium video, so I found a PS3/ultra video and a medium video. Cross-compare the goddamn videos instead of running around spouting bullshit. It's right fucking there. You can play both and spot the difference in real time, for Christ's sake.

I didn't use the video you posted because there was no guarantee of it being on medium. So I used multiple videos from multiple perspectives. I EVEN TOLD YOU THIS WHEN I POSTED THEM.

It's not the models or even the open world that is the limitation. Skyrim cuts those into small chunks the consoles can handle. What the consoles can't handle is high-resolution textures. Which are present in the PC version.

Not to mention that the hardware at the time of launch was cutting edge. Something you refuse to acknowledge. A PC from 2005 didn't have access to anything comparable until a little later, because both manufacturers splurged on tech that later became a pain in the ass. The tech in this console gen is entry level at best.

No matter how you spin it, these consoles are entry level. They don't want to splurge anymore, so they cut everything to the bone.

However, when equivalent hardware caught up to the 7th gen, it performed exactly the same as the 7th gen consoles. Which the minimum specs proved to you.

In fact, you don't even need a graphics card. You can run Skyrim on low on Intel HD 2000 graphics. Same performance. No card in sight. Where is the optimization you're so fond of now?

and the "combinations" arguments again? Is this 1989?

Okay, listen up. Manufacturers release drivers that optimize the hardware. They optimize games too with those drivers. They spend millions of dollars on driver optimization. You don't need to account for processors or anything else.

https://en.wikipedia.org/wiki/Device_driver

Read all of that. It shows that "different parts = bad" is grade A bullshit from someone who doesn't even live in this century. We fixed that problem long ago, yet here you are crawling back to it. The hardware we use is insanely standardized; you don't need one company, because multiple companies now work together.

Every device has a driver that takes care of all of that. PC doesn't have one company optimizing the parts. It has multiple, and they dump huge cash into it. PC and console have the same level of optimization now.

Your idea of special console optimization is a myth. It's been a myth for over 10 years now.
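
If you want the shape of that argument in code, here's a hedged Python sketch. The class names are invented and real drivers sit far lower in the stack, but this is the uniform-interface idea in miniature:

    class GPUDriver:
        """The uniform interface a game codes against, regardless of vendor."""
        def draw(self, mesh):
            raise NotImplementedError

    class NvidiaDriver(GPUDriver):
        def draw(self, mesh):
            print(f"nvidia-optimized path for {mesh}")

    class AMDDriver(GPUDriver):
        def draw(self, mesh):
            print(f"amd-optimized path for {mesh}")

    def render_frame(driver):
        # The game never cares which parts are installed; the driver handles it.
        driver.draw("castle_mesh")

    for drv in (NvidiaDriver(), AMDDriver()):
        render_frame(drv)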

And Passmark rankings mean nothing. It's an older card from Nvidia, and their cards are always powerful. That chart only shows which cards are good for gaming by being powerful. Which is mostly dominated by Nvidia.

COMaestro:
*snip*

I got the average from non-laptop PCs on Steam stats, before it went down. The most common Nvidia card is a 660. In fact, I think the most common cards Nvidia ever had are always the x60s. I didn't grab the most common ATI card, though their new stuff is powerful in its own way.

The average PC isn't looking to be a console. It's looking to be something beyond that, because consoles set the minimum standard. Nvidia is popular, and they are expensive because of all the extras. AMD is for people on a budget, and their R series beats consoles on the cheap too.

The next gen consoles are basically 750 Tis. $150 at launch, and they will only get cheaper. The processor isn't important because console processors are anemic anyway; an old i3 will probably do the job. People don't shoot for console-level power because then they'd be stuck with console-level performance, which decays as the years go on. Which is why the 7th gen is stuck at 720p.

So PC gamers start climbing the power ladder, with consoles being the bare minimum. There is no requirement to do so, but they do it because they want to, not because they need to.

There are people running 2007-era graphics cards and still running games on near max, 7 years on.

Not to mention PC gamers have had powerful rigs for years, thanks to the drought of new consoles before late 2013 that made a lot of parts cheap. All they have to do is swap a graphics card and they are done, for most PCs now. It's what I did. You might need to upgrade the RAM, but that isn't all that expensive anymore.

I got a 770 and spent less than a PS4 costs. That's the only upgrade I had to do to join the next gen. Not to mention PhysX and all the other goodies that Nvidia gives you. You shouldn't throw your PC away every gen, because most likely it's still good. It's like throwing away your car every year because new cars are coming out; it doesn't make sense.

And that price tag you just claimed is just awful. Never buy prebuilt. Always build it yourself, because the 30% markup at retail will come back to bite you.

In fact, here:

http://www.reddit.com/r/pcmasterrace/wiki/builds#wiki_the_.22next-gen.22_crusher

See the build on the list that says "next gen crusher"? Replace the 660 with a 2GB 750 Ti and save $30. So PS4-level power for less than a PS4.

Unless of course you want more power than next gen, in which case keep the 660.

And before you complain about the subreddit name, it's a satirical joke. A lot like The Onion, but they do have good guides here and there.

The reason I even have a 770 is that I wanted to really future-proof the PC and get to play Metro: Last Light on max.

Ultratwinkie:
*snip*

I never DO buy pre-built, but when I factor in all the prices for motherboard/processor, RAM, HDD (more if SSD), video card, power supply, case, and OS, you get at least $600, and that is low to mid end. Anything better and you are looking at a higher cost.

Sure, you can upgrade your existing equipment, I'm not contesting that, but how often are you upgrading your RAM and/or video card(s)? If it's more than once in a 5-8 year period, which is the approximate length of a console generation, you are probably paying more than you do for a console. Also, it seems every 5-8 years you need to upgrade your motherboard, as new tech comes out that the old one does not support. I'm looking at how PCI was the norm, then AGP came along for video, and now that's been replaced by PCIe. Or PATA changed to SATA. How long before something else comes along to replace either of them? Or has it already happened, since I haven't built a computer in 4 years or so?

I'm not going to argue that a computer is more adaptable and ultimately puts out higher end graphics than a console, but I will always contest the opinion that it is cheaper to do so, and nothing you have said suggests otherwise to me. I will also contest that there is no reason to have a console, which is what your arguments pretty much always come down to. That is purely a matter of opinion, and you are never going to be able to convince some people otherwise, nor apparently will we convince you of our side of things.

Also, you continually miss the point Lightknight is making regarding Skyrim. According to the minimum requirements, Skyrim should NOT be able to run on a console. The fact that it does, even with "low" graphics, essentially proves that optimization based upon the console hardware DOES work and DOES matter.

COMaestro:
*snip*

I just posted a computer that works for less than $400. Where are you getting $600? It only gets expensive if you make it expensive. The list I just posted was a fully working gaming computer. You could hook it up to the TV and you'd have a pseudo-console smart TV.

Things have slowed down on PC. Companies no longer want to change anything unless a completely new generation of tech like DDR4 comes around. There are no "lightning fast changes"; it's just a bunch of similar-sounding buzzwords meant to scare you. Tech doesn't change as fast as you think it does.

As it is, the graphics cards now are old graphics cards that have been overclocked. The only difference from each generation is about a 10-20% difference in performance, which you mostly could have gotten by overclocking the older one yourself.

And Lightknight is still wrong. He argues that consoles have special optimization when the PS3's equivalent (the 7800) does the same thing. In fact, an old and weak Intel HD 2000 APU that is weaker than the consoles could run Skyrim.

Consoles cannot be special if the same game runs on equivalent and weaker hardware the exact same way. When he realized that, he changed his tune to talk about how "all the different parts" are confusing for developers on PC to code for, when drivers do all the hardware optimization for them.

So basically he is saying more than just that console optimization matters. He is saying its optimization is on a whole new level, which is false.

Ultratwinkie:
GOW selling less than the first GOW. That isn't an achievement. All you proved is that the sales were front-loaded and failed to reach penetration-level sales.

What? "Sales were front loaded"? Every game sells the most copies in the first couple of years and then fewer and fewer as the years go on with a few spikes here or there. All of the GOW games did the same and GOW 3 is further ahead at this many years in than any of the others.

What do you think "penetration level sales" are? GOW 3 is the 19th best seller on the 360 for the entire generation. That's out of the 3,500 games it sold more than 3,481 of them. It is one of only 6 games post 2010 to be in the top 20. Had it been released on the ps3 it would have been 15th on the list (ironically, also being just better than Red Dead Redemption there too).

In what world is that not "penetration level sales"? And what is the big difference between number 18, which is GOW 1, which you admit did splendidly, and number 19, which is GOW 3? They're even likely to switch places within the month. I get it if you don't want to admit that you're wrong, but the numbers couldn't be less in your favor. You can't look at a game that was one of the best sellers and somehow claim that it didn't sell well enough. That is axiomatically wrong.
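
For the record, the arithmetic behind that ranking (quick Python, using the numbers above):

    rank, catalog = 19, 3500
    outsold = catalog - rank  # titles it sold more than
    print(f"outsold {outsold:,} of {catalog:,} games")
    print(f"top {rank / catalog:.1%} of the generation")  # top 0.5%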

And the cards I would have picked would have been 100x better. If they picked the 660 or even the 760 we would have much more range.

Why would we be better off with weaker cards? I think you're just arbitrarily favoring one brand over another without any admission that both brands are legitimate companies, that development studios design their own game engines anyway, and that it's the game engines that determine the physics and rely on the power of the card to process them. PhysX is just a physics engine. Consoles, PCs and iOS devices use Havok as their physics engine, which is a legitimate competitor to PhysX.

But, and I can't drive this point home enough for you, the PS4 DOES have PhysX and APEX support. NVIDIA decided to give it to them anyways (because it would be dumb/silly for them not to).

http://www.ign.com/articles/2013/03/07/nvidia-announces-physx-support-for-playstation-4
http://www.vg247.com/2013/03/08/ps4-nvidia-pledges-physx-support/

So, with AMD cards offering better overclocking, higher memory bandwidth and more video RAM, I can't think of a better scenario for the consoles; it's the best of both worlds. Why would NVIDIA do that? Because it means more PC games will utilize PhysX on PCs.

Hell, the 660 isn't actually high end. It's lower tier on the Nvidia scale. It's even middle tier on AMD's scale. This card is old. The reason it's in the high-end card range is because Nvidia consistently blows AMD away on the majority of cards. There is no contest. There was even evidence of that on the Steam stats page that I referenced.

It doesn't matter. The 660 is a less powerful card than the HD 5870 and the ps4 uses a modified 5870.

What you've got to understand is that both consoles going AMD means that development is going to start supporting that card brand a lot more. Nvidia had benefitted from people relying on its CUDA cores, which is also what made Nvidia cards so much more expensive, but with these major companies going with AMD, developers will start relying on OpenCL (which AMD backs), which is great. You're way overblowing the benefits of PhysX, especially when Havok (used by games like The Last of Us, Uncharted, and Bioshock) exists and is so excellent as is. Even with machines currently having PhysX, far fewer games use it than Havok.

The GTX 750 Ti is slightly slower than the 5870 too, by the way. You're literally arguing for a cheaper, shittier card just because you like the company more. But since Havok is CPU-based and PhysX is video-card-based, it simply doesn't matter whether the video card has PhysX, since Havok is there. But I've already made this a moot point.

I'm more than a little upset that I didn't research the topic a little more before getting dragged into this or I could have sidestepped this part days ago.

And it's hilarious how you miss that the Skyrim specs have proven you wrong. The 7800 is the same as the PS3, and it runs the game at the same level. No optimization ever made the console better than the dedicated card. Which means your idea of console optimization is grade A bullshit. Again.

You do realize that there are specs in a computer other than the GPU, right? Like RAM? Skyrim's minimum requirement = 2 GB; the PS3's RAM = two 256MB pools.

I don't know about you, but even on low, Skyrim found its way above 1GB on my system. While having a lot of extra RAM doesn't really improve performance, having too little is crushing. They have to make up for it elsewhere, and that gap is spanned by optimizations and special accommodations made specifically for consoles in a way that would never be done for PC owners.

and looked closely,

If you have to look closely in 1080p then who the hell cares? We're not looking for ghosts in the window at the back of a picture.

No matter how you spin it, these consoles are entry level. They don't want to splurge anymore, so they cut everything to the bone.

Do you remember what happened to the PS3 last generation? It came in at $600, still lost money, and sales tanked because people weren't willing to pay for it. This is market-driven.

I'll agree that it isn't as advanced relative to the overall market as last gen's hardware was, but it's still well above the average gaming machine now and a heck of a lot for the price point.

Lightknight:
*snip*

The 5870? You're saying the next gen card is a 5870? A card from 2009? The 760 beats that.

http://www.hwcompare.com/14880/geforce-gtx-760-vs-radeon-hd-5870/
http://www.futuremark.com/hardware/gpu/NVIDIA+GeForce+GTX+760/benchmarks

Even the 660 does too.
http://www.futuremark.com/hardware/gpu/NVIDIA+GeForce+GTX+660/benchmarks

Even the 750 Ti is better, 2,000 vs 3,000:
http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+750+Ti&id=2815
http://www.videocardbenchmark.net/high_end_gpus.html

The reason you use 1-2GB of RAM is the texture difference. Which is why I said look closely at the textures and you'll see the difference. RAM isn't power related, it's memory related. More RAM means more things to keep track of, which, as I already told you, the PC has a lot of. Not counting a much higher draw distance.

So Skyrim has a reason it uses 2GB of RAM, and that's because it wasn't cut down to play on consoles. Which you refuse to believe; you think the PC and console versions are the same thing. They aren't. And you refuse to look at anything else.

Even the PhysX support won't go as far as it could have if they had chosen higher-end parts. They cut The Order down into a corridor without PhysX; how do you think their limited hardware will handle PhysX without cutting it down to almost nothing? Truth is, it won't.

And this isn't because the PS3 cost $600. This was because they wanted the best technology had to offer.

In equivalent terms, it would be like Sony putting a 780 Ti into the PS4. Never mind that the 770 delivered about 60-80% of the power for much less cost.

It's not about the best, it's about versatile tech. It's like they bought a Lambo, found it too expensive, then threw the Lambo in the garbage and went back to a scooter.

This isn't black and white; there is a middle ground they completely ignored. They could've bought a family sedan instead of the Lambo, then bought another family sedan when it got too old.

It's all about coasting on the gradient of moderation, not racing to the absolute best like they did with the PS3. They failed to coast on that gradient, got drunk, and now they are pouring all the alcohol down the drain because "alcohol is evil."

I would have been happy if they had gone with AMD's R series. The R series is one of the few areas where AMD can actually compete with Nvidia on a power scale; AMD is known for cheap parts, not their power. If they had gone with a 660, 760, or 750 I would be happy. The highest I could ever see anything going would be an undervolted 770, at absolute max. Instead they started penny-pinching and cutting everything down to the bare minimum. And the only people who are going to suffer are gamers, publishers, and devs.

How? Low tech, low future-proofing. They want this gen to last 10 years, and this tech won't be able to carry them there. They will scrape the barrel of power long before the end of the 10-year generation.

We saw how bad it got in the 7th gen when devs and publishers got desperate for power. Costs shot up, DRM reared its ugly head, and there was tacky DLC aplenty. At this rate, we might have about 4-6 years before that happens again. Possibly less, because developers are already boasting about how they've maxed out the next gen and how they need to compromise to get the graphics they want.

I take it this means you now agree that GOW 3 wasn't a failure after all. Glad we could come to an agreement on something over the internet. It's pretty hard to keep claiming that the 19th top-selling game of the entire 360 console generation somehow underperformed. I am most pleased that I can stop defending a series I don't like. Hopefully that line of discussion puts into perspective that this being a cover-based shooter isn't an auto-fail.

Ultratwinkie:
The 5870? You're saying the next gen card is a 5870? A card from 2009? The 760 beats that.

Clearly a typo. We were discussing the AMD 7870 (the PS4's equivalent GPU, now that it has finally been reverse engineered). Sorry for the confusion, but surely you'd have been able to figure that out from context.

Anyways, the PS4, it turns out, is actually a modified 7870. So it should actually be more powerful even than the stock 7870, which itself is the 27th most powerful card on the consumer market at the moment (at least among those Passmark has tested).

The difference between the 7870 and the 760 Ti is negligible. It's 4,258 vs 4,387, so around a 120-point difference when both are over 4,000. Even then, the 760 Ti is OEM-only. Quick, go find a link to purchase it. As far as I can tell it still isn't available, despite having been made some time ago. From what I was told, it was supposed to just be a rebranded 670, which is a great card, but it certainly isn't performing like one.

So please explain why Sony would spend 33% more on that card for less than a 3% increase, when 7870s have all the advantages of AMD GPUs and can still support PhysX, which you were touting as Nvidia's big dynamite difference? This is all win and no lose.

Not only that, but all of these cards are in the top 30 performers at this time (4 months after release). You really don't have an argument here. The PS4 isn't cutting edge, but it's darn nice for the price point. Do you recall what happened to PS3 sales when Sony reached for that 33% more expensive card?
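
Spelling out that cost/benefit arithmetic (quick Python, using the Passmark scores already cited):

    ps4_gpu, gtx_760ti = 4258, 4387
    gain = (gtx_760ti - ps4_gpu) / ps4_gpu
    print(f"performance gain: {gain:.1%}")  # ~3.0%
    # Against a roughly 33% higher card cost -- a terrible trade for a console.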

The reason you use 1-2GB of RAM is the texture difference. Which is why I said look closely at the textures and you'll see the difference. RAM isn't power related, it's memory related. More RAM means more things to keep track of, which, as I already told you, the PC has a lot of. Not counting a much higher draw distance.

Here's where things get weird: the PS3 is a bit more proprietary than that. You are technically right here, but the PS3 also created asset categories to help deal with this kind of thing. The asset categories were generally responsible for tracking objects, while the RAM would have assisted with asset quality (aka texture). You may recall Skyrim having a significant problem on the PS3 for something like 6 months while Bethesda scrambled to fix the issue (again, this was the reason I built my PC).

The actual issue was that the individual asset categories were getting too bloated and the PS3 was crashing because of it. I've worked as a QA engineer on software for some time, and the problem was known pretty early, so since I had the game, I tested it. On release, no assets were resetting. You could go back into any dungeon you'd cleared, after years of in-game sleeping, and still see bodies and arrows and objects littering the ground. Most outdoor bodies/enemies would refresh, but not dropped weapons. The flowers would also refresh. Nirnroots were stacking blooms each time you picked one, and enemies you encountered but did not engage were still being tracked across the world.

The large file sizes people were talking about weren't the problem; they were just indicative of the issue. Basically, the PS3s were crashing because if any one of those asset categories got too bloated, the system crashed.

Now, those categories aren't RAM, but they are responsible for keeping track of the objects. RAM would have been better/cheaper, but I guess Sony had already spent so much on the card that more RAM would have broken their backs, with the initial price point already being $600 back then.

I wonder if the PS4's GDDR5 will impact texture quality at all. From what I've read it will make a difference, but it should be minor.

So Skyrim has a reason it uses 2GB of RAM, and that's because it wasn't cut down to play on consoles. Which you refuse to believe; you think the PC and console versions are the same thing. They aren't. And you refuse to look at anything that says otherwise.

It wasn't cut down per se... The console versions were deeply flawed in ways the PC version wasn't (such as failing to reset dungeons). What actually happened is that they later improved the PC version with the HD texture pack. But because they couldn't get asset bloating under control on the PS3 version (they didn't have this issue with the 360, FYI), they couldn't add anything. This is why Skyrim went so long without DLC on the PS3. Additionally, both consoles had actual texture glitches. Not even a resource problem, but actual glitches in the code. It's easy, now that we're entering a new generation of x86 consoles, to forget that previous generations all had proprietary hardware and that porting wasn't anywhere near as simple as it will be going forward. No, the 360 and PS3 versions were different codebases (the 360 less so, because it was a lot closer to x86 than any other console).

Now then, do you have examples comparing the post-texture-patch 360 version to the PC? I am interested in seeing that.

Have you looked at the Bioshock Infinite comparisons I posted? They basically all looked the same, except the 360 was actually hazier. This is because Bioshock Infinite didn't have any asset-bloating issues.

So perhaps Skyrim was a particularly poor choice for me to use, considering that its general handling of assets is why I built my machine. But the requirements should all still have been for vanilla Skyrim, not post-patch/upgrade Skyrim.

Even the PhysX support won't go as far as it could if they'd chosen higher-end parts. They cut The Order down into a corridor without PhysX; how do you think their limited hardware will handle PhysX without cutting it down to almost nothing? Truth is, it won't.

That's silly. PhysX is just software. If they're given the physics engine, then it'll function the same as it would on an Nvidia card. The only thing to worry about here is the power of the card, and the 7870 is no slouch. Again, it isn't a Titan or a 780 Ti, but it also isn't a 5870. There has to be some kind of balance. You can't make the console $1,000 and have people actually buy it. Sony already learned that lesson. Do you have an argument for why consoles have to be cutting edge? I don't think "because they were last generation" is a valid argument when both companies showed losses in the billions in the one generation they shot much higher than normal.

And I'm not sure if you're aware of this, but Sony and Microsoft didn't go with Nvidia because they needed their GPU supplier to offer an SoC (System on a Chip) option. That's also what took Intel out of consideration for the CPU or GPU.

Here's a Forbes article explaining that. If you want an x86 SoC, you go AMD. PC users don't have to have an SoC, so it's not really a concern for us. It is for consoles.

And this isn't because the PS3 cost $600. This was because they wanted the best technology had to offer.

In equivalent terms, it would be like Sony putting a 780 Ti into the PS4. Never mind that the 770 gave about 60-80% of the power for much less cost.

I don't have the numbers to compare the historical pricing and performance of those GPUs, so I can't agree or disagree with what you're saying. However, if you can figure out the price of the Nvidia 7800 at the time of purchase, then we can account for inflation or deflation in the video card market and figure out what the actual comparable card would have been, rather than wild guessing. I can't find the price history, so I have no idea what the Nvidia 7800 cost back then.

It's not about the best, it's about versatile tech. It's like they bought a Lambo, found it too expensive, then threw the Lambo in the garbage and went back to a scooter.

Right, because a video card that's almost exactly as powerful as the one you recommended, and is the 27th most powerful card at the moment, is a scooter.[/sarcasm] A scooter doesn't sit near the top of any performance list.

This isn't black and white; there is a middle area they completely ignored. They could've bought a family sedan instead of the Lambo, then bought another family sedan when it got too old.

Ok, now you're sending mixed signals. First you seem to be saying "boo, they didn't use a Titan," and now you're saying "they don't have to be the best." In what way does a 7870 not meet these requirements, other than not having the name NVIDIA printed on the chip? Have you actually looked at the high-performance list?

http://videocardbenchmark.net/high_end_gpus.html

Sorry, but you don't get to decide what's a Lambo and what isn't. The performance of the device speaks for itself. Look, if you gave up on the GoW 3 argument because its sales were near the top of the list, you've got to give this up too. It's like you have no regard for how things perform relative to other things (let me know if that's the case and I'll try to adjust my position). Lists matter, especially a list ordered by performance while we're discussing performance. You keep citing Nvidia cards like they're the gold standard when they keep scoring lower than the card being referenced. First your argument was that the consoles don't have PhysX so the power difference doesn't matter, and now that the PS4 does have PhysX, you're acting like the cards themselves perform better. That's just not true according to actual unbiased testing. Raw numbers.

I would have been happy if they went with the R series from AMD. The R series is one of the few areas where AMD can actually compete with Nvidia on a power scale. AMD is known for cheap parts, not their power. If they'd gone with a 660, 760, or 750, I'd be happy. The highest I could ever see anything going would be an undervolted 770. At absolute max. Instead they started penny-pinching and cutting everything down to the bare minimum. And the only people who are going to suffer are gamers, publishers, and devs.

So if they'd gone with a 660, 760, or 750 you'd have been happy? Then be happy, because the 660 is three steps under the 7870 according to Passmark (and better according to GPU Boss), and the 750 is 22 steps under the 7870 (just over 1,000 points lower on their scale). The 750 is that much weaker than the 660, which is already weaker than the 7870.

Only the 760 from your list is more powerful, and that comes at a decent (but not impossible) jump in price. However, as stated above, Nvidia couldn't produce an SoC option. As for AMD's R line, only the R9 270X is a little better at a comparable price. I do know the PS4's card is modified, so perhaps it closes that gap somehow, but I don't think the jump would have been worth it either way. Almost every card above the 7870 is significantly more expensive. I think they managed to hit the sweet spot while meeting their SoC requirement. Frankly, I don't think they could have done any better, especially not over a year ago when the prices would have been even more extreme.

How? Low tech, low future-proofing. They want this gen to last 10 years, and this tech won't be able to carry them there. They will scrape the barrel of power long before the end of the 10-year generation.

There's no such thing as future-proofing. Never has been. But keep in mind that a 10-year generation doesn't mean the next generation doesn't get released during it. That's what Sony and Microsoft are doing now, with the PS3/360 continuing to be supported for the next few years (so yes, they are hitting that 10-year target).

Will they scrape the bottom of the barrel sooner? I'd think so, unless technology slows down like people keep claiming it will each year, right before we seemingly make another breakthrough that punts the ball further. But I'm not that worried about it, because the advance in graphics is good enough to suit my needs. Honestly, the PS3/360 generation had nearly advanced enough to produce graphics that are simply good enough for me. The Last of Us and the Uncharted games were beautiful. It wasn't quite there, but it was close. I feel like this generation is powerful enough to get there, in a way that future consoles will just be polishing things up. But then again, I'm not a graphics-phile (and no fault to those who are).

But you can't expect console makers to shed billions of dollars over a generation because they took too big a hit at the start, and then expect them to do the same thing again. These aren't charities, and like it or not, they will control the gaming market and will bottleneck it when their limits are reached (same as the 7th generation did).

Lightknight:
*snip*

Nvidia can do whatever you want them to do and they'll do it well, you just need to pay them.

Nvidia even admitted that they could do what AMD did, but they weren't willing to work at a loss.

It also factors into PhysX. They are very protective of it, and they don't like the idea of losing their biggest edge. Consoles will have to deal with a performance hit that an Nvidia card wouldn't. Nvidia makes sure PhysX always works without the hassle.

AMD cards can run PhysX, but the performance hit negates any benefit. Want to know why? Device drivers and CUDA cores.

https://en.wikipedia.org/wiki/CUDA

Not to mention the higher build quality and lower heat output. Nvidia just makes better hardware; even the equivalent cards would be better from Nvidia because of how much extra work they put into them.

Consoles are all about compromise, and Nvidia cards have way more leeway than an AMD card can give. PhysX won't come to consoles unless serious cutbacks are made to support it.

The 660 is actually more powerful, depending on the version. The Ti is normally just an overclock that replaces the older card versions. They can also just be more efficiently designed versions, but people call them overclocks even though that's not always the case. Modern cards are all 660 Tis, and that beats the 7870. Which is what a console would be built with if they chose it.

Same reason the 760 Ti is not off the table. OEMs are basically businesses that buy the hardware, and Sony and Microsoft qualify.

Speaking of cards, the 7870 in the PS4 is undervolted. So it's actually less powerful. It's basically underclocked.

http://www.futuremark.com/hardware/gpu/AMD+Radeon+HD+7870/review
http://www.extremetech.com/gaming/173127-ps4-xbox-one-power-consumption-analysis-points-to-sony-advantage-and-future-efficiency-gains

GDDR5 is overkill beyond 2-4GB. GDDR5 is awful for processors, and unless they want to use 2K and 4K textures, anything beyond 4GB is overkill. I doubt even Blu-ray could hold a 4K-texture game. We sit at 50GB discs and we are far from 2K textures. Similarly, DDR3 is awful for graphics.
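
For rough scale on texture sizes (my own back-of-the-envelope numbers for uncompressed RGBA with no mipmaps or compression, not anything official):

# Illustrative Python: size of an uncompressed square RGBA texture.
def texture_mb(side_px, bytes_per_px=4):
    return side_px * side_px * bytes_per_px / 1024**2

for side in (1024, 2048, 4096):  # "1K", "2K", "4K"
    print(side, round(texture_mb(side)), "MB")
# 1024 -> 4 MB, 2048 -> 16 MB, 4096 -> 64 MB. A 50GB disc fills up
# fast if you try to ship thousands of uncompressed 4K textures.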

The PS4 is limited by its processor. The Xbox is limited by graphics. Both are on entirely different pages. Processors may not need to be powerful, but DDR3 helps when you want to do more complex stuff like AI over a large area.

But for textures, it won't make a difference. The PS4's allotment is way too much for it to matter.

The biggest problem with your list is that you assume a list entitled "high end cards" stays high end forever. The majority of the cards on there are completely ancient.

You even have a "high end" card on that list that manages a pathetic 733. In an age of 8,000+ cards. At this rate every card is a high-end card.

You also assume every card on there is what the consoles have. Which isn't true; as I just posted, the 7870's power draw would double the power draw of the PS4. It would be an underclocked 7870, whereas the 750 Ti is low power and wouldn't need to be undervolted, with just a 60W draw. There is a lot of headroom to even overclock it. It doesn't even run that hot.

http://www.futuremark.com/hardware/gpu/NVIDIA+GeForce+GTX+750+Ti/review

When you take it all in, there aren't that many cards. Past AMD and Nvidia, there really isn't anyone else, and AMD and Nvidia put out only a few cards every so often. So of course it's going to be "high" on a list of only a handful. The list wasn't that big to begin with.

Nvidia only has the 50-60-70-80 tiers. AMD isn't much better. That's around 8 cards per generation that can actually game, not counting the 90s cards that come around every so often.

Want a site that actually keeps up with the times?

http://www.futuremark.com/

It's the same people that made the 3DMark benchmark. Some of the best in the business. The Fire Strike benchmark is the true test of how a card performs.

And consoles already shed billions of dollars on research and development alone. Consoles are an expensive business, and the tech they want to use won't last.

They spent too much last generation for the best tech the world had to offer at the time, and now they cut everything down to the bargain bin because they didn't want to spend their money wisely.

If they'd gone with reasonable tech back then, and reasonable tech now, we wouldn't have run into this problem.

Ultratwinkie:
Nvidia can do whatever you want them to do and they'll do it well, you just need to pay them.

Nvidida even admitted that they could do what AMD did, but they weren't willing to be paid at a loss.

Ok? I'm unsure how this impacts anything. AMD could do it at the needed price.

It also factors into physx. They are very protective of it, and they don't like the idea of them losing their biggest edge. Consoles will have to deal with a huge performance hit than an nvidia card would. They make sure physx always works without the hassle.

Good thing it's Nvidia that's providing and designing the actual PhysX support then, unless you think they'll do a purposefully shitty job.

AMD cards can run physx, but the performance hit they get negate any benefit. Want to know why? Device drivers and CUDA cores.

https://en.wikipedia.org/wiki/CUDA

What do you think Nvidia agreeing to support PhysX on the PS4 means? It means they'll make drivers that work with the PS4.

CUDA cores, which I mentioned earlier in our discussion, are just the reason Nvidia cards typically outperform other video cards. They don't magically make a card perform above the numbers it actually benchmarks at. For example, the 7870's performance will always sit at its measured level relative to the other cards you mentioned, CUDA or not. It just comes at a cost. As long as the drivers are designed for the card they're implemented on, we shouldn't give two flying monkey shits what name is on the card.

That being said, PhysX has been around since 2005 and very few games have used it (bear in mind, games that "support it" are not necessarily games that use it). It simply isn't that widely used. You're talking something like one or two major games a year, with 4-5 games in a given year that actually use it and over 7 that "support it" without necessarily using it. What you have to realise is that a lot of AAA game studios (the kind that generally need particle physics) develop their own proprietary physics engines. When they do that, they don't use anything else.

Your overall knowledge of PhysX may be dated. I'd say 5 years ago it was the best, with Havok being clearly dated at that point. But Havok just released a new physics engine last year. It's damn nice software. I don't know which one actually pulls ahead now, but Havok is also very widely used (as I stated, the Source engine uses a modified Havok engine for its physics).

Havok can be run on OpenCL, whereas PhysX can only be run on CUDA. That's more significant than you may think, and it's a non-trivial reason that most games go with Havok.

The 660 is actually more powerful depending on what version. the Ti is normally just an overclock and replaces the older card versions. They also can be just more efficiently designed versions, but people just call them overclocks even though its not always the case. Modern cards are all 660 TIs and that beats the 7870. Which is what a console would be built with if they choose it.

AMD cards are better at overclocking than Nvidia cards. Are we going to include overclocking capabilities here too?

Same reason the 760 TI is not off the table. OEMs are basically businesses that buy the hardware, and Sony and microsoft qualify as those.

As stated, the difference in power between the 7870 and the 760 Ti is almost nonexistent, but the difference in price isn't.

speaking of cards, the 7870 is under volted. So its actually less powerful. Its basically underclocked.

Doesn't matter; the performance marks are still the performance marks, and we have no idea how Sony has actually modified the card. This comment is like telling me that some guy at the Olympics came in third but he was wearing pink. Um, ok, but that doesn't change the measurable performance.

GDDR5 is overkill beyond 2-4Gbs. GDDR 5 is awful for processors, and unless they want to use 2k and 4k textures anything beyond 4 is overkill. I doubt even blue ray could hold 4k texture games. We sit at 50GBs and we are far from 2k textures. Similarly, DDR3 is awful for graphics.

I'm not sure overkill (aka too powerful) is a valid complaint. I do wonder why they went that way, though. Maybe just a cheap (it had better be cheap) way to stand out in a lineup?

You're confusing a typical CPU with the console's combined GPU/CPU (an APU). Also, you're ignoring the super high bandwidth. It basically makes up for the slower timings by eliminating transmission latency, which they're doing thanks to the tighter connection between the RAM and the APU. As long as Sony got the GDDR5 relatively cheap, there's no downside to this and a heck of a lot of good. This configuration would have net negatives in a traditional PC, though. But this is another example of hardware optimisation.
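
As a toy model of why that works (the numbers are illustrative; I'm using the 176GB/s bandwidth figure that's been floating around for the PS4 and a latency I made up):

# Python sketch: time for a memory request = latency + size / bandwidth.
def transfer_ms(size_mb, latency_ns=100, bandwidth_gb_s=176):
    return latency_ns / 1e6 + (size_mb / 1024) / bandwidth_gb_s * 1e3

print(transfer_ms(64))     # big texture stream: ~0.36 ms, bandwidth-bound
print(transfer_ms(0.001))  # tiny 1KB CPU-style read: latency-bound
# For bulk transfers the latency term is noise; for tiny reads it's
# nearly everything. That's the tradeoff being argued over here.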

The ps4 is limited by processor. The xbox is limited by graphics. Both are on entirely different pages. processors may not need to be powerful, but DDR 3 helps when you want to do more complex stuff like AI over a large area.

They both have the same or a very similar processor, I thought.

Still, today's games really don't rely on the CPU the way they used to. This really isn't a liability for the next couple of years, because the CPU should be the last resource to be fully tapped except, again, serving as a glorified switchboard operator that routes work to RAM/HDD/GPU. But at the end of the generation, when the barrel is getting scraped, it will diminish the wiggle room they could have had. Then again, most any other CPU would have pushed the price up, and unnecessarily so for the next 4 years (maybe less). We'll begin to really notice the bottom of that barrel by year 5, and then it'll drag on until the PS5 comes out (assuming this console actually turns a profit this time, which is already looking good).

The biggest problem with your list is that you assume a list entitled "high end cards" are actually high end forever. The majority of the cards on there are completely ancient.

What? No. I just assume that they are currently high end and that's all that we can ask for.

When you take it all in, there aren't all that many cards. Past AMD, and Nvidia, there really isn't anyone else. AMD, and Nvidia put out only a few cards every so often. So of course its going to be "high" on a list of only a handful. The list wasn't that big to begin with.

Doesn't matter. The performance numbers are still comparable.

Want a site that actually changes with the course of time?

http://www.futuremark.com/

If you really like that site, then you should acknowledge that the highest performing card on it right now is the AMD 7990, and that the 7870 places even higher on this list than on the other one. Also, the cards you said you'd be happier with are lower on the list.

and consoles already shed billions of dollars just on the research and development alone. Consoles are an expensive business, and the tech they want to use won't last.

Never said otherwise. But they have to pick a spot and run with it. That spot will then continue to be supported by developers as long as the console is popular. Though this should make a PC owner a bit mad: it means most of our games are bottlenecked under the hood by console hardware, because developers aren't going to create a new engine just for PC.

They spent too much last generation for the best tech the world had to offer at the time, and now they cut everything down to the bargain bin because they didn't want to spend their money wisely.

if they went with reasonable tech back then, and went with the reasonable tech now, we wouldn't have run into this problem.

If you're ok with them going with reasonable tech, then what's your issue here? This is actually fairly high-end tech at the moment, so the fact that it's well priced too is the best of all worlds for the average consumer. All I care about is that it's a significant step in the right direction and that our PC games will finally start moving forward too.

Lightknight:
*snip*

AMD agreed because Nvidia was eating them alive at the time and they were near bankruptcy.

Even consoles couldn't help them, because consoles made up less than 20% of their revenue, for a project that demanding. Which is why Nvidia said no: they were asked for too much and offered too little.

Even AMD admitted that the consoles got outdated tech that was already on its way out. The Jaguar was already obsolete. If you need to go to the lowest-rung bargain bin of AMD, a company known for cheap cards, you have a problem.

And it's nice how you say Nvidia will help AMD. Except they won't. They haven't helped AMD on The Witcher 3, and you even admitted PhysX is restricted to CUDA.

And you read Futuremark wrong:

The 7990 is a flagship that is no longer sold. The power draw is also horrendous for the performance it gives. It's a 400W minimum draw, when 250W cards like the 270X and the 780 Ti deliver most of that power.

7870: 8090
7850: 6630
The 7870 XT isn't what's in the PS4. It's an undervolted 7870, which puts it in the 7850-7870 range, as has been said time and again.

http://www.tomshardware.com/answers/id-1689376/graphics-card-equivalent-ps4-xbox.html

So that's about 7,000-ish in performance.

660 Ti (the new version of the 660, which isn't overclocked, though people call it an overclock anyway): 8900
760 standard: 8730

And on the topic of CPUs, GDDR5 is specialized graphics RAM. It wasn't meant for the things processors do. DDR3 is processor memory, and lets you do things like keeping track of a lot of AI, pathfinding, etc. This has nothing to do with APUs or power.

If they'd kept 4GB of GDDR5 and replaced the other 4GB with DDR3, the range of the PS4 would increase. There is a reason gaming PCs use both: they do different things.

The problem with both consoles is that they are using "high numbers" to call themselves more powerful. That is not the case.

This is why I said reasonable tech. They went back to "we are special snowflakes," and we are paying for it. Microsoft thought DDR3 would be great for a console with no GDDR5, which is an awful choice. Similarly, GDDR5 by itself just bottlenecks the improvements that COULD be there, like better AI and dynamic systems that need plain RAM, improvements we won't even see this generation. Unless they want 4K textures, which current physical media won't hold unless they go full archive disc.

The card Sony went with is also a power hog, which clashes with a low power draw and a low-power processor. So they won't use their "high end card" to the fullest, thanks to their insistence on low power. Even the 760 and 770 draw less power than the 7870.

Ultratwinkie:
AMD agreed because Nvidia was eating them alive at the time and they were near bankruptcy.

If this would make AMD take a loss like it would Nvidia, why would they have agreed to it? AMD actually has a competitive advantage on SoC solutions. They've done it before, whereas Nvidia has practically no experience there. So the R&D cost was much steeper for Nvidia. Regardless of your feelings for Nvidia or against AMD, this should be clear: AMD was practically the only game in town once SoC solutions were required.

Even consoles couldn't help them, because they made less than 20% of their revenue. For a project that demanding. Which is why Nvidia said no because they asked too much and paid too little.

I'm not sure you're familiar with the hardware R&D cycle. There will always be large R&D costs up front, and developing these custom chips would have been big for either company. But now what? The chips have been produced, and now it's just a matter of making them smaller and cheaper while they've already sold around 10 million of them so far. This year should not only mean a firm profit margin for AMD; several sources think AMD could double by year's end. They are wonderfully poised thanks to this. Not only that, but with their card in the consoles, they're going to have an advantage with games made for PCs too. I can't think of a better choice for them to have made. Do you honestly believe AMD made a poor choice? This is basically the first time in recent memory where everything appears to be coming up AMD (though Nvidia doesn't seem to be taking a hit, which likely means this is bad news for Intel).

And its nice how you say Nvidia will help AMD. Except they won't. They haven't helped AMD on the Witcher 3, and you even admitted physx is restricted to CUDA.

If Nvidia is including PhysX support on the PS4/XBO to encourage developers to use the tech so it will benefit their PC gamers, why would they make that support shitty? Precious few games use PhysX already; poorly performing PhysX drivers on the PS4/XBO would just be more nails in the coffin of a product that is actually good. But Havok's new software may actually be even better than PhysX's current offering. It's much newer and looks very slick.

Are you unaware that Havok exists and is PhysX's competition? It's been much more heavily used than PhysX. Also, why are you not responding to the fact that very few games even use PhysX?

7990 is a flagship that is no longer sold. The power draw is also horrendous for the power it gives. Its a 400w minimum draw when 250 watt cards like the 270x and the 780 TI give most of the power.

You do realize that this is why AMD video cards are better at being overclocked, right? Nvidia places locks on how much power its cards can draw, so they can only go so far. AMD doesn't lock it down.

Tell me, as a gamer, do you prefer more power output or do you care more about power consumption?

7870: 8090.
7850: 6630
The 7870 XT isn't whats in the ps4. Its an under volted 7870 which means its 7850-7870 in range, which has been said time and again.

http://www.tomshardware.com/answers/id-1689376/graphics-card-equivalent-ps4-xbox.html

So thats about 7,000 ish in performance.

Ok? You found a thread of non-experts talking about something they aren't sure about, whereas I posted a link to people who actually took the damn thing apart and had the tools to know what they were looking at.

Hmm... which to believe, which to believe. Oh, here's another Tom's Hardware thread not only agreeing with me but linking to the same source I gave you: http://www.tomshardware.com/answers/id-1993056/ps4-equivalent-gpu.html

The problem is that we still don't know some of the specifics of the card, so we can't really tell how underpowered it is. But the general thinking is that it's slightly under the 7870, though much closer to it than to the 7850. So you have your 660 equivalent. Either way, still not a low-rung card, just not the best.

And on the topic of CPUs, GDDR 5 is specialized as graphics RAM. It wasn't meant for anything that processors do. DDR3 is processor memory, and allows you to do things like keeping track of a lot of AI and pathfinding, etc. This has nothing to do with APUs or power.

DDR memory is optimized for desktop use because it has ultra-low latency. GDDR5 is optimized for raw data transfer. That is the difference between the two. They're both memory, and it's not like either couldn't do what the other does (such as keep track of lots of objects or load textures); they're just not as efficient at it.

But, as I stated, Sony has taken advantage of a console's standardized hardware by almost eliminating transmission latency in their design. It won't be exactly as fast at workloads better suited to DDR3, but it will be close enough to be entirely unnoticeable while keeping all the advantages of GDDR5.

Take, for example, the fact that the XBO has all this trouble rendering games in 1080p. This is the reason the PS4 doesn't blink twice at doing so. We should also see better load times and anisotropic filtering.

As long as they really did resolve the transmission latency, there's no real downside to this. On the reverse, you can't get DDR3 to do what GDDR5 does to any realistic degree where mass data transfer is concerned.

If they kept 4Gbs of GDDR 5, and replaced the other 4 with DDR3, the range of the ps4 would increase. There is a reason gaming PCs use both, because they both do different things.

That would be interesting, and they did consider doing it. However, they stuck with one type to make the development process simpler. This is Sony speaking from experience, if you recall the PS3's split 512MB RAM setup and the frustration it heaped onto an already frustration-saturated pile of hard-to-code-for hardware.

The card that sony went with is also a power hog, which clashes with a low power draw and a low power processor. So they won't use their "high end card" to the fullest thanks to their insistence on low power. Even the 760 and 770 draw less power than the 7870.

Dude, again, this isn't just a 7870. It's a modified SoC part. Do you have a full spec sheet, or an understanding of how it has been modified? My statement is that it is around the equivalent of a 7870. It is not itself a 7870.

Lightknight:

Ultratwinkie:
AMD agreed because Nvidia was eating them alive at the time and they were near bankruptcy.

If this would make AMD take a loss like it would Nvidia, why would they have agreed to it then? AMD actually has a competitive advantage regarding SoC solutions. They've already done it before whereas NVidia has practically no insight that way. So the cost of R&D was much steeper for Nvidia. Regardless of your feelings for Nvidia or against AMD, this should be clear. AMD was practically the only game in town where SoC solutions were required.

Even consoles couldn't help them, because they made less than 20% of their revenue. For a project that demanding. Which is why Nvidia said no because they asked too much and paid too little.

I'm not sure you're familiar with the hardware R&D cycle. There will always be large R&D costs associated up front and developing these custom chips for both companies would have been big. But now what? The chips have been produced and now it's just a matter of making them smaller and cheaper to make but they've already sold around 10 million cards so far. This year should not only be a firm profit margin for AMD but several sources thing AMD can double by year end. They are wonderfully poised thanks to this. Not only that but with their card being in the consoles they're going to have an advantage with games made for pc's too. I can't think of a better choice for them to have made. Do you honestly believe that AMD made a poor choice? This is basically the first time in AMD recent memory where everything appears to be coming up AMD (though Nvidia doesn't seem to be taking a hit which likely means this is bad news for Intel).

And its nice how you say Nvidia will help AMD. Except they won't. They haven't helped AMD on the Witcher 3, and you even admitted physx is restricted to CUDA.

If Intel is including Physx support on the PS4/XBO to encourage developers to use the tech so it will benefit their PC gamers, why would they make it shitty support? Precious few games use Physx already, poorly performing Physx drivers on the PS4/XBO would just be more nails in the coffin of a product that is actually good. But Havok's new software may actually be even better than the current Physx's offering. It's much newer and looks very slick.

Are you unaware that Havok exists and is Physx's competition? It's been much more heavily used than Physx. Also, why are you not responding to the fact that very few games even use Physx?

7990 is a flagship that is no longer sold. The power draw is also horrendous for the power it gives. Its a 400w minimum draw when 250 watt cards like the 270x and the 780 TI give most of the power.

You do realize that this is why AMD video cards are better at being overclocked, right? Nvidia places locks on how much power their cards can access and so they can only go so far. AMD doesn't lock it down.

Tell me, as a gamer, do you prefer more power output or do you care more about power consumption?

7870: 8090.
7850: 6630
The 7870 XT isn't whats in the ps4. Its an under volted 7870 which means its 7850-7870 in range, which has been said time and again.

http://www.tomshardware.com/answers/id-1689376/graphics-card-equivalent-ps4-xbox.html

So thats about 7,000 ish in performance.

Ok? You found a thread of people who aren't experts talking about something that they aren't sure about whereas I posted a link to people who actually took the damn thing apart and had the utilities to know what they were looking at.

Hmm... which to believe which to believe. Oh, here's another tomshardware thread not only agreeing with me but linking to the same link I gave you: http://www.tomshardware.com/answers/id-1993056/ps4-equivalent-gpu.html

The problem is that we still don't know some of the specifics of the card to really tell us how underpowered it is. But the general thinking is that it's slightly under the 7870 but much closer to it than the 7850. So you have your 660 equivalent. Either way, still not a low rung card, just not the best.

And on the topic of CPUs, GDDR 5 is specialized as graphics RAM. It wasn't meant for anything that processors do. DDR3 is processor memory, and allows you to do things like keeping track of a lot of AI and pathfinding, etc. This has nothing to do with APUs or power.

DDR memory is otpimized for desktop use because it has ultra low latency. GDDR5 is generally considered optimized for data transfer. That is the difference between the two. They're both memory and it's not like both couldn't be used to do what the other can do (such as keep track of multiple objects or load textures), it's just that thye're not as efficient at it.

But, as I stated, Sony has taken advantage of console's standardized hardware by almost eliminating transmission latency in their design. It won't be exactly as fast at processes that would be better suited for DDR3 but it would be close enough to be entirely unnoticeable while still having all the advantages of GDDR5 which can be an advantage.

Take for example the fact that the XBO has all this trouble with rendering games in 1080p. This is the reason why the PS4 doesn't blink twice at doing so. We should also see better load times and antistophic filtering.

As long as they really did resolve the transmission latency, there's really no downside to this. On the reverse, though, you can't really get DDR3 to do what GDDR5 does to any realistic degree where mass data transfer is concerned.
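
To make the latency-versus-bandwidth distinction concrete, here's a toy C++ microbenchmark (entirely my own sketch, nothing to do with any console SDK). Sequential streaming is the access pattern GDDR5's bandwidth is built for; dependent pointer-chasing is the pattern where desktop DDR's low latency wins:

// Toy contrast of the two access patterns discussed above. Sequential
// streaming is bandwidth-bound; a dependent pointer-chase pays the full
// memory round-trip latency on every step.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t N = 1 << 24;  // ~16M elements, far larger than any cache

    // Bandwidth-bound: touch memory in order; prefetchers hide latency.
    std::vector<int> stream(N, 1);
    auto t0 = std::chrono::steady_clock::now();
    long long sum = std::accumulate(stream.begin(), stream.end(), 0LL);
    auto t1 = std::chrono::steady_clock::now();

    // Latency-bound: each load depends on the previous one (a random
    // permutation is good enough for a toy demo).
    std::vector<size_t> next(N);
    std::iota(next.begin(), next.end(), size_t{0});
    std::shuffle(next.begin(), next.end(), std::mt19937{42});
    size_t idx = 0;
    auto t2 = std::chrono::steady_clock::now();
    for (size_t i = 0; i < N; ++i) idx = next[idx];
    auto t3 = std::chrono::steady_clock::now();

    std::printf("sequential: %lld ms, pointer-chase: %lld ms (sum=%lld idx=%zu)\n",
        (long long)std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count(),
        (long long)std::chrono::duration_cast<std::chrono::milliseconds>(t3 - t2).count(),
        sum, idx);
}

Run it on any desktop and the pointer-chase is dramatically slower per element, which is the same gap that makes GDDR5 awkward for CPU-style work unless, as Sony claims, the latency is dealt with elsewhere.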

If they had kept 4 GB of GDDR5 and replaced the other 4 with DDR3, the range of the PS4 would increase. There is a reason gaming PCs use both: they do different things.

That would be interesting, and they did consider doing that. However, they stuck with one type to make the development process simpler. This is Sony speaking from experience, if you recall their divided 512 MB RAM setup and the frustration it heaped on an already frustration-saturated pile of hard-to-code-for hardware.

The card that Sony went with is also a power hog, which clashes with a low-power design and a low-power processor. So they won't use their "high end card" to the fullest, thanks to their insistence on low power. Even the 760 and 770 draw less power than the 7870.

Dude, again, this isn't just a 7870. It's a modified SoC card. Do you have a full spec sheet or understanding of the way it has been modified? My claim is that it is around the equivalent of a 7870. It is not itself a 7870.

AMD wouldn't really have lost money. They use lower quality parts and had nothing to lose at that point. Nvidia would have because they had a hold on much more lucrative markets. Basically, nvidia had better things to do and the stuff they sold couldn't be sold at the discounted prices AMD was giving out.

Why? AMD was losing the PC gaming market, and Nvidia was storming the mobile market. When it came to processors, Intel still dominated. They were running out of options, and it was better to sell a soon-to-be-outdated Jaguar than risk irrelevance.

And yes, it may be a modified 7870 but that doesn't make it impressive:

http://www.gameranx.com/updates/id/20377/article/xbox-one-vs-ps4-amd-outs-console-like-gpus/
http://www.futuremark.com/hardware/gpu/AMD+Radeon+R7+260X/review

So AMD says it's an R7 260-265.

http://www.hwcompare.com/15496/geforce-gtx-660-ti-vs-radeon-r7-260x/
http://www.hwcompare.com/14298/radeon-hd-7790-vs-radeon-hd-7870/

Speaking of graphics cards, the RAM isn't responsible for high resolutions. It's the pixel rate. The RAM is just there to hold data, and different kinds of memory hold different things.

The stuff you would use DDR3 for doesn't need to be fast. It just needs to exist.

For example, let's say we have a game where you have a lot of dynamic actions that are recorded. You killed a king, and now the cops are looking for you.

You don't need high bandwidth for the game to remember you killed someone, you just need it to be there on hand. It's cheaper to go DDR3 too. You don't need "high numbers" for this stuff.
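
To put a number on how little memory that sort of bookkeeping takes, here's a throwaway C++ sketch (struct and field names invented for illustration):

// Hypothetical world-state flags for the "killed the king" example above.
// The point: this is bytes, not gigabytes, and it needs no bandwidth at all.
#include <cstdio>

struct WorldState {
    bool kingIsDead;
    bool playerIsWanted;
    int bountyGold;
    unsigned witnesses;
};

int main() {
    WorldState s{true, true, 500, 3};
    std::printf("world state: %zu bytes (wanted=%d, bounty=%d)\n",
                sizeof(WorldState), (int)s.playerIsWanted, s.bountyGold);
}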

It's only there for marketing. Not for anything actually useful. It's just there to show high numbers so the customer feels smart.

GDDR5 is fast because graphics need to be fast, and there is only so much that you need before it becomes useless. The stuff that requires 8 GB of GDDR5 is well beyond the PS4's ability and is going into 6K textures at that point.
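
For scale, some throwaway arithmetic (my numbers, purely illustrative) on what uncompressed RGBA8 textures cost, since textures are what actually eat that kind of capacity:

// Bytes for a single uncompressed RGBA8 texture at various resolutions.
// Mip chains add roughly a third on top; block compression divides by 4-8x.
#include <cstdio>

int main() {
    const struct { const char* name; long long w, h; } sizes[] = {
        {"2048 x 2048", 2048, 2048},
        {"4096 x 4096", 4096, 4096},
        {"6144 x 6144 (6K)", 6144, 6144},
    };
    for (const auto& s : sizes) {
        long long bytes = s.w * s.h * 4;  // 4 bytes per RGBA8 texel
        std::printf("%s: %.1f MB\n", s.name, bytes / (1024.0 * 1024.0));
    }
}

A single 6K texture lands around 144 MB uncompressed, so you need a pile of them before 8 GB even starts to matter.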

Coding for two kinds of RAM isn't hard. If a bedroom coder can make games for PC, then a big publisher can make a game for a console. If it's such an issue, they can just make device drivers for it.
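
As a sketch of what "coding for two kinds of RAM" amounts to in practice (pool names and sizes made up for illustration), the engine just tags each allocation by intent and routes it:

// Minimal two-pool allocator sketch. A real console SDK would pick a
// physical heap per pool; here we only log the intent.
#include <cstddef>
#include <cstdio>
#include <cstdlib>

enum class Pool {
    LowLatency,    // DDR3-style: game state, AI, pathfinding
    HighBandwidth  // GDDR5-style: textures, vertex buffers
};

void* engineAlloc(Pool pool, std::size_t bytes) {
    std::printf("allocating %zu bytes from the %s pool\n", bytes,
                pool == Pool::LowLatency ? "low-latency" : "high-bandwidth");
    return std::malloc(bytes);
}

int main() {
    void* aiState = engineAlloc(Pool::LowLatency, 64 * 1024);
    void* texture = engineAlloc(Pool::HighBandwidth, 16 * 1024 * 1024);
    std::free(texture);
    std::free(aiState);
}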

The reason the 512 MB limit was bad was because they used the Cell architecture. It was a tiny limit on an insane architecture. This is x86, and PC gaming doesn't have that issue. There is no reason to limit RAM to one kind when you are emulating a PC.

And lastly, what does a wildly inefficient card have to do with overclocking? It's a 400 W minimum at factory default. It isn't overclocked; it's just a jokey flagship like the Titan was. AMD cards are known for their heat output. Overclocking does have a limit, and that limit is dictated by heat. There is a reason AMD cards are called "house fires."

Both power output and power consumption matter. They must be proportionate, and that card doesn't have a proportionate power-to-draw ratio.

When it comes to PhysX, it doesn't matter. It's a PC feature. Developers don't bother because they normally aren't building benchmark games. The 7th gen didn't use PhysX at all, and PC gamers still had it. Havok's new engine is still not fully used in the next gen as far as I know, and the only next-gen demonstration I've seen is them dropping tiny blue balls everywhere. Kinda like how Knack was "so intensive."

Which is a far cry from what PhysX can do. The list of 500 games that use Havok only has a handful that even touch what PhysX does normally. Most of the games only use basic physics, and very few use anything more than that.

Nvidia hasn't supported AMD on The Witcher 3; by the developers' own account, AMD users there are waiting on a miracle from Nvidia. So why would Nvidia abandon the next gen on one of its biggest games?

Oh wait, they do that all the time. Just because they say "we'll support you" doesn't mean there isn't a catch. As you even admitted, without CUDA cores PhysX is useless.
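
For reference, this is roughly how that CUDA gate looks if you follow the public PhysX SDK samples (loosely modeled on the PhysX 3.x snippets; member names and exact signatures vary between SDK versions, so treat this as a sketch, not gospel):

// Sketch: GPU physics only gets wired up if a CUDA context can be created,
// i.e. only on Nvidia hardware. Otherwise the same API silently runs on CPU.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;

    // The gate: no CUDA-capable device, no GPU dispatcher, no GPU physics.
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cuda = PxCreateCudaContextManager(*foundation, cudaDesc);
    if (cuda && cuda->contextIsValid())
        sceneDesc.gpuDispatcher = cuda->getGpuDispatcher();  // GPU path
    // else: cloth/particle effects fall back to the CPU, or get disabled

    PxScene* scene = physics->createScene(sceneDesc);
    scene->release();
    physics->release();
    foundation->release();
}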

Holding cool stuff hostage is one of the biggest reasons Nvidia is the biggest graphics card manufacturer. Beyond PC gaming and Nvidia consoles, they really don't care. If any console devs use it, it will be way more limited.

It would only be a "pay for our cards for the full effect" advert on a competitor's tech. If it comes on at all.

Nvidia has always played dirty to get what it wants. Is this really a surprise? Even Havok is mostly used at only the most basic level, and that's mostly in console games.

Nvidia plays dirty. It plays brutal. On the PC gaming side, it dominates because of all the stuff it holds hostage. Consoles are meaningless when over 50% of PC gamers use Nvidia. There's no way Nvidia would be abandoned when it is so entrenched.

Everything is multiplat now, and no one can afford to shut out any customers. Nvidia knows this. Everyone does.

And the fact AMD is passing out TressFX to nvidia, their "ace in the hole," means there is literally no reason to side with AMD beyond basic optimization. Their cards are as basic as they come.

Nvidia built your PS3, and yet AMD didn't shrivel up and die when no one was optimizing for its hardware.

Ultratwinkie:
AMD wouldn't really have lost money. They use lower quality parts and had nothing to lose at that point. Nvidia would have because they had a hold on much more lucrative markets. Basically, nvidia had better things to do and the stuff they sold couldn't be sold at the discounted prices AMD was giving out.

Let me make this clear. You are still admitting here that AMD gave Sony what they asked for at the price they asked for, and Nvidia couldn't or didn't do that for whatever reason. Any other rationalization for Nvidia is bullshit unless you were at the negotiations or are an Nvidia employee. AMD built a new facility for this. Busy didn't mean shit to them; they just created a new department and their existing teams continued on.

Why? AMD was losing the PC gaming market, and Nvidia was storming the mobile market. When it came to processors, Intel still dominated. They were running out of options, and it was better to sell a soon-to-be-outdated Jaguar than risk irrelevance.

Two things: I don't care how Nvidia is doing overall; they are a very strong company with a good product and a well-liked name. I've purchased many cards from them, but this is irrelevant to the discussion. Doing well doesn't mean they turn down huge profits and market share. They lost a bid. You should understand that Nvidia is likely unhappy about this loss.

And yes, it may be a modified 7870 but that doesn't make it impressive:

http://www.gameranx.com/updates/id/20377/article/xbox-one-vs-ps4-amd-outs-console-like-gpus/
http://www.futuremark.com/hardware/gpu/AMD+Radeon+R7+260X/review

So AMD says it's an R7 260-265.

If that article is correct, AMD says it is closest to the R7 265, a card that isn't out yet and isn't scored on the other sites we've been using. It's the XBO card that is the R7 260, and it's hilariously underclocked, but I haven't brought the XBO up because it looks like they've purposefully targeted shitty specs. Despite that picture, there's no score of the R7 265 anywhere else. Interestingly enough, Microsoft's desire to augment performance with cloud computing would make any complaint about them somewhat obsolete if you have an amazing internet connection. The PS4 may go that route too, but I would hate that.

But this is just an article where a person is saying that they think these cards are comparable. This isn't AMD saying so. Actually, as near as I can tell, this is the only source making this claim. The people who actually took the card apart and had the expertise to know what they were looking at pegged it higher. But then again, we don't have reliable performance results for the 265, so I don't know if that's better or worse, and I'm not trusting that tomshardware link when I can't find it on their site.

Speaking of graphics cards, the RAM isn't responsible for high resolutions. It's the pixel rate. The RAM is just there to hold data, and different kinds of memory hold different things.

The stuff you would use DDR3 for doesn't need to be fast. It just needs to exist.

I'm not sure what part of what I said you are disagreeing with. Did I accidentally say high resolutions? For some reason I mentally associate highly detailed textures with high resolution. Something about the detail is what makes me interchange the terms incorrectly. Perhaps I'm thinking of high-res textures?

For example, let's say we have a game where you have a lot of dynamic actions that are recorded. You killed a king, and now the cops are looking for you.

You don't need high bandwidth for the game to remember you killed someone, you just need it to be there on hand. It's cheaper to go DDR3 too. You don't need "high numbers" for this stuff.

Right, but you can't easily use DDR3 to do what GDDR5 does. So if you can get either for the same price AND avoid the latency issue that GDDR5 runs into, then it's the best of both worlds.

It's only there for marketing. Not for anything actually useful. It's just there to show high numbers so the customer feels smart.

Oh, yeah, I'm sure. It is brilliant to have RAM that seems better. However, it's more than that. They took advantage of standardized hardware to get rid of the latency issue that makes GDDR5 bad at DDR3-style tasks. This move will actually extend the console's lifespan a bit by giving the GPU access to more video RAM. Honestly, if computers could do this too, I think they would. It would be a lot more unified to have this setup, but that latency is the price we pay for the ability to pick and choose part manufacturers.

GDDR5 is fast because graphics need to be fast, and there is only so much that you need before it becomes useless. The stuff that requires 8 GB of GDDR5 is well beyond the PS4's ability and is going into 6K textures at that point.

Don't know how developers may make use of it. It'll be interesting to see.

Coding for two kinds of RAM isn't hard. If a bedroom coder can make games for PC, then a big publisher can make a game for a console. If it's such an issue, they can just make device drivers for it.

No, it's not hard. That's the standard process. But one would be easier.

The reason the 512 MB limit was bad was because they used the Cell architecture. It was a tiny limit on an insane architecture. This is x86, and PC gaming doesn't have that issue. There is no reason to limit RAM to one kind when you are emulating a PC.

There were several things wrong with it. Not just that. But that is what made it particularly insane to program for.

And lastly, what does a wildly inefficient card have to do with overclocking? It's a 400 W minimum at factory default. It isn't overclocked; it's just a jokey flagship like the Titan was. AMD cards are known for their heat output. Overclocking does have a limit, and that limit is dictated by heat.

I'm talking about AMD cards vs Nvidia cards. One of the few known advantages (look it up) that AMD has over Nvidia is the ability to overclock the cards. Not that this impacts the PS4 at all, but it does impact how they view power consumption.

FYI, I've had NVidia cards burn out too.

When it comes to PhysX, it doesn't matter. It's a PC feature. Developers don't bother because they normally aren't building benchmark games. The 7th gen didn't use PhysX at all, and PC gamers still had it. Havok's new engine is still not fully used in the next gen as far as I know, and the only next-gen demonstration I've seen is them dropping tiny blue balls everywhere. Kinda like how Knack was "so intensive."

Havok's engine is pretty damn new. It's just under a year old; we certainly shouldn't be seeing the kind of games that need it until development cycles that use it conclude and release games. Knack was supposed to be a particle physics demonstration. Shame the game sucked, but I haven't played it so I can't speak to it.

Which is a far cry from what PhysX can do. The list of 500 games that use Havok only has a handful that even touch what PhysX does normally. Most of the games only use basic physics, and very few use anything more than that.

Most games that use PhysX only do it for very minor things too. We've been over this. Very few games support PhysX at all, and even fewer make real use of it. There are a lot of games that use Havok, and most of them at least use something.

Oh wait, they do that all the time. Just because they say "we'll support you" doesn't mean there isn't a catch. As you even admitted, without CUDA cores PhysX is useless.

Ok, then PhysX will die because developers don't have a reason to develop for it. There are already fewer than 7 titles a year that support it, and around 4 or fewer per year that actually use any part of it. So that number will just get smaller if the entire x86 console market isn't properly supported.

The intention of Nvidia's offer of support was to prevent that from happening. If they do a shit job then they're ruining their name and business. We'll see though.

Holding cool stuff hostage is one of the biggest reasons Nvidia is the biggest graphics card manufacturer. Beyond PC gaming and Nvidia consoles, they really don't care. If any console devs use it, it will be way more limited.

Holding cool stuff hostage doesn't get you business when you suck on standardized hardware that around 15 million homes have right now (if we're including the Wii U, which, haha, we shouldn't; excluding it, it's already around 10 million). People are going to develop games that they can easily port from x86 consoles to PC. That's the benefit of everything going that route. So if you hold it hostage from entire platforms, you will get burned. So Nvidia is doing something smart by making sure they don't hurt their PC PhysX business. It isn't dumb.

And the fact AMD is passing out TressFX to nvidia, their "ace in the hole," means there is literally no reason to side with AMD beyond basic optimization. Their cards are as basic as they come.

TressFX is just hair physics. Who gives a damn?

Nvidia built your PS3, and yet AMD didn't shrivel up and die when no one was optimizing for its hardware.

What? That's exactly what was happening to them.

Lightknight:

[snip]

AMD was desperate, and they were taken advantage of on that basis. Nvidia had way more leverage and didn't want to waste their time on bargain-bin consoles. These guys made the PS3; if they were going to make a console, they'd want to go all in. They wouldn't stop at the bargain bin.

If the consoles are not going all in, they really don't care. It's not a loss to them because they control most of the market. It's beneath them to waste time on something that isn't another PS3.

On AMD's side, all they could actually afford to sell was the mobile hardware. They were losing everything because they couldn't figure out what made nvidia popular.

It wasn't just the power, but the extra support. AMD doesn't do that. It doesn't have the extras aimed at gamers. It's these extras that made Nvidia the fan favorite of PC gamers.

That's why PhysX exists; it's meant as marketing, an Nvidia-exclusive feature. The only thing they would do is make sure PhysX is limited on consoles, holding it over console gamers' heads so they buy a better machine to experience the full effect.

PhysX didn't shrivel up and die during the 7th gen, when it took work to port to PC, so it won't die now. The list of PhysX games is incomplete on what game uses what. The number of PhysX games that use more than basic physics is much higher than the number of games that fully use Havok.

Even when Havok makes it more accessible, developers don't care. So when PhysX games bring out a lot more, it's another marketing gimmick to drive hardware sales. Console gamers are buying consoles; they don't care about hardware. It's not the market Nvidia is selling to.

Physx is a PC feature to market to PC gamers. Console gamers are irrelevant because they wouldn't buy the cards. You seem to forget that physx was doing just fine for 7 years without console support.

It's for this reason AMD was going bankrupt: their marketing stopped at "it's cheap." Even TressFX, their ONLY ace in the hole, was given away.

You may say "it's just hair physics," but it was the only thing AMD actually put out there in terms of extras. Without exclusivity, you might as well go Nvidia.

AMD wasn't dying because devs weren't optimizing for it. Hardware manufacturers optimize their own hardware, and AMD lags behind Nvidia in that regard too. AMD had a lot of problems, and not being in a console wasn't one of them.

And lastly, the power of the PS4 will be long gone before they use the 8 GB of RAM. Anything above 4 is worthless, because that amount of RAM needs more raw power behind it or it gets bottlenecked by the actual raw power of the console.

It's like strapping a nuclear reactor to a calculator. It's just not proportionate.

No amount of "standardization" or "latency" will fix the fact that the PS4 isn't powerful enough to justify 8 GB of video RAM. It's an awful choice, because that amount of RAM would only be justified if they went with another PS3-style launch, which Sony doesn't want to do because being the absolute best at launch is expensive.
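
Some rough arithmetic on that proportionality argument (the 176 GB/s figure is the widely reported PS4 peak bandwidth; the rest is just division):

// How long it takes to touch the whole 8 GB pool once at peak bandwidth.
#include <cstdio>

int main() {
    const double bandwidthGBs = 176.0;  // commonly cited PS4 GDDR5 peak
    const double poolGB = 8.0;
    double seconds = poolGB / bandwidthGBs;
    std::printf("touching all %.0f GB once: ~%.0f ms, ~%.1f frames at 60 fps\n",
                poolGB, seconds * 1000.0, seconds * 60.0);
}

Whether ~45 ms to sweep the whole pool counts as "disproportionate" for the GPU attached to it is exactly the disagreement here.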

And by the way, Nvidia itself said consoles would waste its time and precious resources for pocket change. AMD took the deal because it was operating at a huge loss.

http://www.gamespot.com/articles/ps4-not-worth-the-cost-says-nvidia/1100-6405300/

If a person walks into Burger King and can't afford what they have to sell, and instead goes to the 99-cent store for food, is that Burger King's problem? No. Does Burger King offer less, or somehow a worse choice? No. It was the person's fault for trying to buy food without the money to do so.

Even if they did sell the food below asking price just to "get business," they would have lost money. There is no reason for Nvidia to take a deal the other party can't afford to pay fairly for.

So next time, don't call bullshit when Nvidia came out and admitted the price they were offered wasn't worth their time.
