Nvidia: Console Graphics Will Never Again Outpace PCs

Nvidia Senior Vice President Tony Tamasi says it's no longer possible for consoles to be better graphics platforms than PCs.

Let's get this out of the way early: Referring to "the PC" as if it's a standardized piece of hardware is dodgy business. Unlike consoles, PCs come in all shapes, sizes and capabilities, and so when referring to them in comparison to consoles, the general assumption has to be that it's high-end, cutting-edge equipment being discussed. And that kind of hardware, according to Tamasi, will always out-muscle anything that a console can bring to the table.

"It's no longer possible for a console to be a better or more capable graphics platform than the PC," he told PC PowerPlay. "In the past, certainly with the first PlayStation and PS2, in that era there weren't really good graphics on the PC. Around the time of the PS2 is when 3D really started coming to the PC, but before that time 3D was the domain of Silicon Graphics and other 3D workstations. Sony, Sega or Nintendo could invest in bringing 3D graphics to a consumer platform."

The Xbox 360 and PlayStation 3 were "on par" with PCs when they launched, he continued, because they're both powered by technology from either AMD or Nvidia, which is where all the innovation in graphics is now being done. "Nvidia spends 1.5 billion US dollars per year on research and development in graphics, every year, and in the course of a console's lifecycle we'll spend over 10 billion dollars on graphics research," he said. "Sony and Microsoft simply can't afford to spend that kind of money. They just don't have the investment capacity to match the PC guys; we can do it thanks to economy of scale, as we sell hundreds of millions of chips, year after year."

The other limiting factor is simply the power needed to drive the technology. Because the core graphics technology in consoles is the same as in PCs, there isn't going to be any meaningful improvement in efficiency, which means that in order to drive significantly more powerful GPUs a console would require a much beefier power supply. "Consoles have power budgets of only 200 or 300 Watts, so they can put them in the living room, using small fans for cooling, yet run quietly and cool," Tamasi explained. "And that's always going to be less capable than a PC, where we spend 250W just on the GPU. There's no way a 200W Xbox is going to beat a 1000W PC."

Of course, not everyone is going to have a 1000W PSU in their rig, nor are very many people likely to pony up for an Nvidia Titan, which costs literally twice as much as an Xbox One. But in three years, the Titan will sell for a third of what it's currently going for and some new whiz-bang hardware will be perched on the bleeding edge, while the Xbox One will still be an Xbox One.

Source: PC PowerPlay

I think the phrase 'no shit' springs to mind here.

Do you really think the Titan will depreciate that much in 3 years Mr Chalk?

No fucking shit.

And if you need a 1000W power supply for gaming you have likely been misinformed by a salesperson.

I'm not disputing what he's saying, but why do I get the feeling they're only saying this stuff because they're butthurt all the next-gen consoles are using AMD chips? Sounds like shit-stirring to me.

OF COURSE a console isn't going to outpace PC, unless console manufacturers stick a Titan in there and release a new console every year. Consoles are bound to go obsolete sooner or later.

EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.

Hazy992:

EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.

I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.

Still Life:

Hazy992:

EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.

I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.

Agreed, something like a 660 and above and you're golden for a good long while.

Still Life:

Hazy992:

EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.

I don't really see a Titan as a gaming GPU, really - It's overkill for gaming alone. I feel like the mid-range cards are usually more than adequate for most customers.

Exactly, I'm still getting by just fine on a pair of GTX 560ti cards I picked up like 2 years ago. I don't see them needing replacement any time soon.

We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.

Ed130:
I think the phrase 'no shit' springs to mind here.

Do you really think the Titan will depreciate that much in 3 years Mr Chalk?

IF the rumors about ATI's new flagship card are true, then yes. It's only slightly under the Titan, and if they price it lower, the price of the Titan will drop quickly.

Well a big "No duh" should be placed over the picture I believe.
Something's telling me he's still pissed that AMD got the bid for all 3 of the 8th Gen consoles and Nvidia didn't get a single one. Seriously, the only news I ever really seem to hear is some person from Nvidia angry at the PS4 or other game console.
Plus, I know the Titan is nice, but I like my AMD Radeon 7770 chip just fine, and it didn't cost me the soul of my best friend's unborn child, Nvidia!

Kyrdra:

Ed130:
I think the phrase 'no shit' springs to mind here.

Do you really think the Titan will depreciate that much in 3 years Mr Chalk?

IF the rumors about ATI's new flagship card are true, then yes. It's only slightly under the Titan, and if they price it lower, the price of the Titan will drop quickly.

Yay, maybe then I'll be able to afford one. Or the rumoured ATI, whichever offers the most bang for my money.

Still Life:

Hazy992:

EDIT: Better put in a disclaimer that you don't need a Titan or upgrade yearly to play PC games. It's an extreme example.

I don't really see a Titan as a gaming GPU, really - It's overkill to buy that *just* for gaming, unless you're a hardware enthusiast/overclocker. I feel like the mid-range cards are usually more than adequate for most customers.

Same here. I never buy the best GPU available when getting a new PC or upgrading. The price isn't worth it for a regular gamer. As long as I keep to a 1600x900 resolution (which looks fine to me), I can run most games with most options maxed out with my GeForce GTS 240. It did burn out a couple of times, but Dell has an excellent warranty.

Ah well, a bit obvious, but it gives me something to refer to the people who told me "Xbox One will shit all over your PC."
None of them knew why I was laughing.

Huh, I've had a 1000W PSU in my box for the last three or four years because it was just a bit more than the 750 I was looking at, and I wanted to make sure I didn't have to worry about a new PSU for a while.

I agree that the real bang for your buck isn't the top of the line cards but the ones in the middle where you can really get phenomenal value.

Oh, but this was about how PCs are going to always outpace consoles. I didn't think that was ever in question, really.

One thing a console does do, and you can't really overstate this, is it largely prevents untrained individuals from mucking up their OSes with dodgy programs and wrong drivers that make their system perform less capably than it should. I suppose that's worth something to a lot of folks. I've seen more than a few gaming PCs that were so poorly maintained (both from the software and hardware angle) that they weren't getting the performance they paid for.

Hazy992:
I'm not disputing what he's saying, but why do I get the feeling they're only saying this stuff because they're butthurt all the next-gen consoles are using AMD chips? Sounds like shit-stirring to me.

Same. I think I remember reading that their talks with Sony ended because Nvidia wouldn't come down in price. At this point they're like a little kid throwing a temper tantrum

"Fine! We don't want our amazing cards in your weak-ass console anyway!"

Adam Jensen:
We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.

And that's pretty much the only other thing I was going to say. Realistically, no matter how much I may want to upgrade computer hardware every year, I can't afford it. Just because some people are rich enough to upgrade every few years doesn't mean we all can, so it's not exactly a selling point. So I might as well buy a system that uses half the power, has a steady frame rate, doesn't require building, researching and bargain hunting, and doesn't run on inefficient software from a company that stopped giving a fuck about what its users want a decade ago.

Hopefully Steam OS will change that, but unless they offer physical media and/or no DRM, I'll probably continue to own a console for a while.

Not that games will scale to match the potential of the higher-end graphics cards. Game makers tend to produce what the consoles can run, so your better graphics cards will usually end up being overkill. And unless there's a breakthrough in production methods, game companies can't afford to take advantage of Titan-level graphical capability.

"Microsoft simply can't afford to spend that kind of money"

Did someone actually say that? Say that MICROSOFT can't afford to spend money? The company whose founder was the richest man on Earth for several years running?

Adam Jensen:
We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.

The XBox One has pretty much the same kernel, though. It's got a few modifications, but considering the XBox One is supposed to be able to run standard Windows apps with some minor modifications, they won't be anything major.

Bloat on the XBox One could easily be worse, considering it's running not just a modified Windows but also its own OS, plus a third OS to let the other two communicate. It mostly depends on how active the Windows kernel will remain while you're gaming, but considering you're supposed to be able to pause at any time and immediately start Skyping, or even do the two side by side, I don't think letting it go fully inactive is one of the options.

Granted, it's already been pretty much established that the XBox One is no longer intended as a dedicated gaming machine. But outside of PS4 exclusives, games will be made to run smoothly on both systems, so console games in the next generation will also be made to run on a bloated OS that is not designed specifically for gaming.

Ukomba:
Not that games will scale to match the potential of the higher-end graphics cards. Game makers tend to produce what the consoles can run, so your better graphics cards will usually end up being overkill. And unless there's a breakthrough in production methods, game companies can't afford to take advantage of Titan-level graphical capability.

Even if they don't scale games all that well for the PC, unless they do some dodgy shit you'll at least get 60+ fps on a PC. Developers have already said they might not be able to push 60 fps on the Xbone and PS4 and will stick with 30.

Whilst I agree with him... well this here statement will reinforce some negative stereotypes about PC gamers...
And the amount of money we spend...

Ed130:
Do you really think the Titan will depreciate that much in 3 years Mr Chalk?

No idea, "a third" just sounded good. :) But three years is quite a bit of time for a GPU to depreciate, especially one at the very top end of the price range.

As for the "no shit" part of the story, sure, it goes without saying that a bleeding-edge PC will smoke any console on the market, but I find it interesting because it reminds me a bit of auto racing: The innovation takes place at the high end, but it's the mass-market, consumer-level stuff that ultimately enjoys the benefit.

Question is, did they ever? I mean, I think the NES was the last era where a console could out-graphics a PC. Seriously. Think back. Compare the SNES version of Doom to the PC version... or even Wolf3D... The consoles were generally running the game at lower frame rates and resolutions than the average 486 of the time. Consoles were never really more graphically powerful.

PCs were always more powerful gaming-wise... the trick is, consoles were always 'easier'. Literally plug and play, as opposed to install and pray. Pray as in: pray you could free up enough conventional memory to run the game. Pray your graphics/sound/modem was supported by the game's limited built-in drivers. Pray that Windows wouldn't crash spontaneously.

You could also do nifty things like rent a game for a weekend of play as opposed to a full-on purchase, so it was also cheaper. You also didn't have to worry about whether or not games would run... all console games will run on the console they were designed for, while with PC games you prayed you had enough RAM and CPU/GFX to run the damned thing. Remember the $100 required just to play Doom 3?

PCs were always more powerful graphics-wise (after the NES generation).

Adam Jensen:
We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.

Console settings this gen = ultra low, 690p, 28 fps. Anything above low, at 30 fps and 720p, is already higher than a console setting. That is why getting a PC and putting things on medium is by no means bad...
The 7th-gen difference in optimization was about 15-25%... not that much. In the 8th gen it will be smaller, due to games being made on x86-64.

Ukomba:
Not that games will scale to match the potential of the higher-end graphics cards. Game makers tend to produce what the consoles can run, so your better graphics cards will usually end up being overkill. And unless there's a breakthrough in production methods, game companies can't afford to take advantage of Titan-level graphical capability.

Most games do not push the limits of PC cards, but they do scale quite a bit. It looks like the Xbox One and PS4 are going to have some pretty amazing games. The PC versions of those can feature higher-resolution textures, run at a faster framerate, use more AA, have more physics/AI actors on screen at once, run at a higher resolution (I don't see 4K being a thing in living rooms soon, but in a few years you might see more PC gamers using it), etc.

These advantages grow pretty quickly over the years, but they do hit a bit of a roadblock at some point. The current consoles are definitely holding things back, and buying a Titan is almost certainly a waste of money.

My 550W PC already turns my room into a pressure cooker in the summer, so I don't find the prospect of a 1000W one very appealing. It would make a nice winter space heater, though.

Adam Jensen:
We know. And we also know that PC games will never be as optimized as console games and they will always require more raw power to run games at console settings. And that's mostly because of bloated operating systems that are not designed specifically for gaming. Windows still has a shitty bloated kernel and that won't change as long as Microsoft has practically a monopoly on desktop operating systems.

Should probably cite the other reason consoles can get away with optimisation - hardware consistency. You can't optimise for PC when you don't know whether any given PC will be running an AMD or an Intel or some weird Exynos CPU, or an AMD or NVidia GPU, or what derivative of each it'll be running. Every XBone will have the same CPU and GPU (or APU) though, so you can specifically gear the code for those processors. Inconsistent hardware means reliance on inefficient, scalable, unified architectures (like DirectX).

It's the same reason why developers prefer writing apps for the iPhone rather than for Android-based devices.
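To make that fixed-hardware-versus-unknown-hardware point concrete, here is a minimal C++ sketch. It assumes a GCC/Clang x86-64 toolchain (for the __builtin_cpu_supports builtin), and the blend functions are invented purely for illustration, not taken from any real engine: a console build can be compiled straight for its one known CPU, while a PC build ships a generic path and only picks a faster one at runtime if the features turn out to be there.

```cpp
// Hypothetical illustration: the same routine built two ways.
// Assumes a GCC/Clang x86-64 toolchain for __builtin_cpu_supports.
#include <cstdio>

using BlendFn = void (*)(float*, const float*, int);

// Generic path: safe on any x86-64 CPU. On a console this would simply be
// compiled with the exact target flags, since every unit has the same chip.
void blend_generic(float* dst, const float* src, int n) {
    for (int i = 0; i < n; ++i)
        dst[i] = 0.5f * (dst[i] + src[i]);
}

// Stand-in for a hand-tuned AVX2 path; it reuses the generic code so the
// sketch stays runnable everywhere.
void blend_avx2(float* dst, const float* src, int n) {
    blend_generic(dst, src, n);
}

// PC-style runtime dispatch: one binary, unknown hardware, so probe and pick.
BlendFn pick_blend() {
    __builtin_cpu_init();
    return __builtin_cpu_supports("avx2") ? blend_avx2 : blend_generic;
}

int main() {
    float a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1};
    pick_blend()(a, b, 4);
    std::printf("%.1f %.1f %.1f %.1f\n", a[0], a[1], a[2], a[3]);  // 2.5 2.5 2.5 2.5
}
```

A console toolchain skips the probing entirely and bakes in the fastest path, which is a big part of the "optimised to the metal" advantage people cite.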

Unless the tools that enable developers to create work that utilizes the increased horsepower of high-end PCs are accessible to anyone other than larger developers with dozens of people working specifically on textures, lighting, particle effects etc., the point is going to be somewhat moot.

Consoles have *never* been on par with PCs when it comes to graphics. PCs were cranking out HD-ready (720p+) resolutions when consoles were still SD (480p). Now the consoles are 1080p at best, while PCs can show more.

Consoles' advantages are unified hardware and API, a proper plug-and-play approach, and being cheaper (but more limited in their use) than a PC. The PC's advantage is that, as long as you throw more cooling and power at a GPU, you basically have no limits - for a price. Because of Joule heating, every watt you draw ends up as heat, and a big PC case can dissipate far more of it than a console can. There's only so much heat you can remove from a small, locked-down case.

Callate:
Unless the tools that enable developers to create work that utilizes the increased horsepower of high-end PCs are feasible for anyone other than larger developers with dozens of people working specifically on texture, lighting, particle effects etc., the point is going to be somewhat moot.

That's what high-level APIs such as DirectX or OpenGL are supposed to handle. Xbox games still use DirectX. But it's true you can tweak some more on a console and use 'cheats' that you can't use on PC, because on PC the way the hardware interprets the API commands can vary. Nothing's that unsolvable, and Microsoft has done a good job of providing a great API for that - both on Windows (because they understood early that it would bolster the sales of their OS) and on the Xbox. I don't know much about Sony's APIs, though.
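As a rough illustration of what such a unified API buys you (this is a toy sketch, not how DirectX or OpenGL are actually implemented, and the backend names are made up): the game issues one set of calls, and a hardware-specific backend chosen at load time translates them for whatever GPU is present.

```cpp
// Toy sketch of a unified graphics API: game code talks to one interface,
// and a vendor-specific backend (the role a real driver plays) is chosen at
// runtime. Backend names are hypothetical.
#include <cstdio>
#include <memory>
#include <string>

struct GpuBackend {
    virtual ~GpuBackend() = default;
    virtual void draw_triangles(int count) = 0;
};

struct VendorABackend : GpuBackend {
    void draw_triangles(int count) override { std::printf("Vendor A path: %d tris\n", count); }
};

struct VendorBBackend : GpuBackend {
    void draw_triangles(int count) override { std::printf("Vendor B path: %d tris\n", count); }
};

// On a PC this choice depends on whatever card the user happens to own;
// on a console it never varies, which is where the extra 'cheats' come from.
std::unique_ptr<GpuBackend> load_backend(const std::string& detected_vendor) {
    if (detected_vendor == "A")
        return std::make_unique<VendorABackend>();
    return std::make_unique<VendorBBackend>();
}

int main() {
    auto gpu = load_backend("A");  // pretend we detected vendor A's hardware
    gpu->draw_triangles(1024);     // game code is identical either way
}
```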

Whether or not a PC can dominate the consoles when it comes to graphics...
I don't think it matters all that much. Sure, you still want the graphics to look nice, but it shouldn't be the main focus.

That's why certain games with splendid graphics haven't been heard of, or favored as much. Not saying they aren't good, but I know a few games with an excellent display that are only an 'alright experience' gameplay-wise. I really wish people focused on game mechanics and put so much more into development of the plot, rather than just wanting to beat competitors with "MY game is shinier and prettier!"

But yeah, who knows what the future of our games, or movies, or anything will be... we have to rely on the people who make them.

Johnson McGee:
My 550W PC already turns my room into a pressure cooker in the summer so I don't find the prospect of a 1000W very appealing. It would make a nice winter space heater though.

You must have a bad power supply then, because all the PSUs I've ever had run completely quiet and cool, and the latest two are 700W and 750W.

Andy Chalk:
Of course, not everyone is going to have a 1000W PSU in their rig, nor are very many people likely to pony up for an Nvidia Titan, which costs literally twice as much as an Xbox One. But in three years, the Titan will sell for a third of what it's currently going for and some new whiz-bang hardware will be perched on the bleeding edge, while the Xbox One will still be an Xbox One.

THIS ^. My crappy-as-balls computer was built using 3rd-generation parts (4GB with a GTX 560) and it is still more powerful than the consoles. In a few years, when I finally replace it, it still won't be "high end," but it will still give out more juice than I know what to do with. I still enjoy consoles, but as far as tech specs go they can only compete with your mom's Dell, and even then just barely.

Does anyone think otherwise?

Caramel Frappe:
Whether or not a PC can dominate the consoles when it comes to graphics...
I don't think it matters all that much. Sure, you still want the graphics to look nice, but it shouldn't be the main focus.

That's why certain games with splendid graphics haven't been heard of, or favored as much. Not saying they aren't good, but I know a few games with an excellent display that are only an 'alright experience' gameplay-wise. I really wish people focused on game mechanics and put so much more into development of the plot, rather than just wanting to beat competitors with "MY game is shinier and prettier!"

But yeah, who knows what the future of our games, or movies, or anything will be... we have to rely on the people who make them.

Crysis springs to mind due to the fact no one actually bought that game, but rather downloaded it and used it to test their new drive cores.

I gotta agree that effort should probably be made in other areas too. Super duper shiny graphics don't really matter much if the story is hollow and the gameplay is shallow. It's like dating a stereotypical blonde cheerleader.

OT: I'd just like to say thank you to Nvidia for spending billions of dollars on cramming a few more pixels into my screen. How could I live without you guys? :')

This thread brought to you by Captain Obvious

For whenever there's a thing that doesn't need pointing out, Captain Obvious will be there... and he'll point it out, because he's Captain Obvious, and as such feels it is his duty to point out obvious things.

RikuoAmero:
"Microsoft simply can't afford to spend that kind of money"

Did someone actually say that? Say that MICROSOFT can't afford to spend money? The company whose founder was the richest man on Earth for several years running?

He means it wouldn't be a worthwhile investment for them, since they aren't strictly a graphics company; not that Microsoft literally doesn't have the extra dough lying around to fund the research if they really wanted to.

-Captain Obvious
