The Order: 1886 Reveals Actual Gameplay Footage

 

Ultratwinkie:
AMD was desperate, and they were taken advantage of on that basis. Nvidia had way more leverage, and didn't want to waste their time on bargain bin consoles. These guys made the ps3, if they were to make a console they want to go all in. They wouldn't stop at the bargain bin.

AMD is making a profit off of every console sold; it's only Sony that isn't making a profit on the console. So what does it matter to a chip producer if the final product is cheap, as long as they still get paid for their chip? Your reasoning here is deeply flawed. It is more apt to completely isolate the card they're making from what it's being used for. A banana salesman doesn't give a crap about what type of smoothie his bananas are going to be used in as long as he gets paid for the banana.

But it's also a little more complex in another way that makes the deal more important. AMD has already sold 15 million of these chips/cards because they're in every console. This means that their chip is the one developers have in mind when they create their games. Whether you want to admit it or not, this is a problem. AMD is going to take some market share and perform a lot better over the next several years. Though I anticipate that most of their acquisition of market share will be from Intel rather than Nvidia.

So I strongly maintain that Nvidia simply lost the bid that they probably wanted.

If the consoles are not going all in, they really don't care. It's not a loss to them because they control most of the market. It's beneath them to waste time on something that isn't another ps3.

Again, you're making bullshit (aka dishonest) statements when you claim to know Nvidia's mindset.

On AMD's side, all they could actually afford to sell was the mobile hardware. They were losing everything because they couldn't figure out what made nvidia popular.

Again, you're making wildly unfounded claims. AMD was already planning on making SoC GPU/CPU combos and this deal brought in revenue for something they were already going to make because SoC is great (required) for mobile devices. As such, it is far easier for them to take smaller margins and to simply let Sony, Microsoft, and Nintendo fund the research they were already going to perform. Even if they lost money they would still be better off because it took part of the cost of research on the matter. Nvidia, on the other hand, isn't an SoC manufacturer and this would have been a major shift for them and additional cost.

AMD was simply the right company at the right time.

It wasn't just the power, but the extra support. AMD doesn't do that. It doesn't have the extras aimed at gamers. It's these extras that made Nvidia the fan favorite of PC gamers.

Nvidia does have nice extras on PCs. But neither company has ever had the same extras on consoles. All console developers want from them is their hardware and little else. I think you're operating under the misconception that I don't like Nvidia. I do. I'm just not going to blind myself to the fact that AMD is a legitimate competitor.

That's why physx exists: it's meant as marketing, an Nvidia-exclusive feature. The only thing they would do is make sure physx is limited on consoles, holding it over the heads of console gamers so they get a better machine to experience the full effect.

Again, very few games use physx and the difference is usually minor smoke effects that other physics engines can also do.

physx didn't shrivel up and die during the 7th gen, where it took work to port to PC,

Why is this relevant now that we have a bunch of x86 environments? In the 7th generation and prior, you were going to have to do a lot of work anyway if you ported across multiple platforms, so the PC was treated as if it were another proprietary console. Now all the big players use the same x86, so the work is going to be minor. This actually detracts from your point. If you can just optimise for the console hardware and easily port it over, why would you spend a ton of money supporting physx as an engine? You wouldn't, unless it benefited the console version too. That's why Nvidia is providing support, and why it would be dumb if it didn't work on the consoles.

it won't now. The list that does list physx games is incomplete on what game uses what.

Yes, both lists are incomplete. So why don't you provide data to back your comments? Most sites that discuss whether to go AMD or Nvidia bring up the point that precious few good games actually use it. The only data I really see directly comparing the two is something from Physxinfo.com in 2009. What's weird is that they actually paint Havok in a damn good light. The total titles comparison was Physx 200 to Havok 181. However, they also admit that Havok has a huge game-quality advantage over Physx:

[chart: Physx vs Havok title counts by Metacritic rating band, physxinfo.com, 2009]

Again, that chart was from a physx site, of all places. What it indicates is that in 2009 the games we cared about used Havok. The "Third Rate or Specific" rating covers any game with a Metacritic score lower than 50 or no rating at all. That is the only area where physx was beating Havok at the time, despite its huge lead in raw title count.

Here's the link just in case it disappears again: http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_rating_graph.jpg

The number of physx games that use more than basic physics is much greater than the number of games that fully use havok.

What a weird comment. "Games that partially use one piece of software outnumber games that fully use another" is a nonsense comparison. The reverse is also true: games that use havok partially far outnumber games that use physx fully. It's an obvious statement, because every game is going to have its own engine with a significant amount of physics already built in. Havok and Physx are both meant to augment those engines, not replace them, so they are almost always only partially used. To count the games that partially use one is basically to count all of them.
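The augment-not-replace relationship is easy to picture in code. Here is a purely illustrative Python sketch (the class and method names are invented, not any real engine's or middleware's API): the game keeps its own physics for core simulation and hands exactly one subsystem, ragdolls, to licensed middleware.

```python
class InHousePhysics:
    """The engine's own physics: movement, collision, projectiles."""
    def step(self, dt):
        return f"in-house physics stepped {dt:.4f}s"

class MiddlewareRagdolls:
    """Stand-in for a licensed middleware module used for one feature only."""
    def step(self, dt):
        return f"middleware ragdolls stepped {dt:.4f}s"

class Game:
    def __init__(self):
        self.core = InHousePhysics()          # always used
        self.ragdolls = MiddlewareRagdolls()  # the one licensed subsystem
    def update(self, dt):
        # The middleware augments the in-house engine; it never replaces it.
        return [self.core.step(dt), self.ragdolls.step(dt)]

game = Game()
for line in game.update(1 / 60):
    print(line)
```

Counting "games that partially use" the middleware would count every game built this way, which is why the comparison is meaningless.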

Remember that Valve's Source engine actually uses Havok physics. That's already a huge head start for Havok, just as the Unreal Engine 3 deal was for Physx.

But if it's anything like the graph above, Physx has padded their numbers with a lot of shitty games. I'd need a new graph to see otherwise. Of the games we may care about, there's only a handful that support it at all or even use a single feature.

Even when havok makes it more accessible, developers don't care.

Even Physx disagrees with you. Steam too (as a PC gamer that should mean something).

Physx is a PC feature to market to PC gamers. Console gamers are irrelevant because they wouldn't buy the cards. You seem to forget that physx was doing just fine for 7 years without console support.

It isn't for the gamers; it's to entice developers to create software that uses physx. They aren't going to do it if only some PCs may have Nvidia cards and nothing else does. Do you know what the PC market looks like to developers? 20.7% of PCs have an AMD graphics card, 16.3% have Nvidia, and the lion's share of 62.9% goes to Intel. That's from a Forbes article last month, not a distant 2009 article like the one I found from Physx admitting that lower-quality development studios pad their numbers.

Intel's numbers are certainly padded by those embedded graphics chips, but they still represent a huge segment of laptop users. The important fact is that more PCs use AMD than Nvidia, the only cards that can run Physx. This is why Physx is so rarely used by larger companies.
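To put those installed-base figures in perspective, here is the arithmetic as a quick Python sketch. The percentages are the ones quoted above from the Forbes piece, and it assumes hardware-accelerated Physx requires an Nvidia GPU:

```python
# GPU installed-base shares (percent) as quoted from the Forbes article above.
shares = {"intel": 62.9, "amd": 20.7, "nvidia": 16.3}

# Hardware-accelerated Physx runs only on Nvidia GPUs, so only that slice
# of the installed base can see the full effect.
physx_capable = shares["nvidia"]
cannot_run_physx = shares["amd"] + shares["intel"]

print(f"Can run GPU Physx:    {physx_capable:.1f}%")
print(f"Cannot run GPU Physx: {cannot_run_physx:.1f}%")
```

By these numbers, more than four out of five PCs never see GPU Physx at all, which is the developer-side problem being described.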

You may say "it's just hair physics," but it was the only thing AMD actually put out there in terms of extras. Without exclusivity, you might as well go Nvidia.

Nobody cares. Extras don't mean shit if games don't utilize them. By your logic Nvidia just gave Physx and Apex away for free. Both of these things exist to get people to use their stuff. The logic is that TressFX will work better on an AMD card than on an Nvidia card, but if every card has it then developers are more likely to utilize it. It's exactly the same reasoning Nvidia had for supporting the PS4.

You can't have it both ways. Either this is a good business practice or a bad one. Which is it? Did Nvidia make a mistake or did both companies make wise choices?

So next time, don't call bullshit when Nvidia came out and admitted the price they were offered wasn't worth their time.

Opportunity cost is different from saying they can't make a profit at it. What Nvidia actually said was that if they did a console, they had to look at what other piece of business they would have to put on hold. Nvidia said they simply didn't have the resources to do everything they were already doing and this too. Their words. How you interpreted that as the consoles being somehow beneath them is beyond me.

AMD did have the available resources and, as I stated, was already planning SoC solutions. This was simply the right time for them and the wrong time for Nvidia.

Lightknight:

*snip*

Jesus Christ, you move goalposts like a theist.

So suddenly we dropped power now? Well I guess I was right.

However, Nvidia made its views very clear:
http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible
http://www.maximumpc.com/article/news/nvidia_calls_ps4_%E2%80%9Clow_end%E2%80%9D123

or are you going to do the same shit and "speculate?" Like you have for the last fucking page?

Nvidia wanted more money than what was being paid. They refused to make consoles. AMD was desperate, so they took the contract to band-aid their money loss.

On top of this, you seem to fail to understand hardware at all.

Physx lived when PC got no support from consoles. Outdated consoles that would never handle it. For 7 years. Now that it's easier to port to PC, physx will have a much easier time. In fact, you even said that despite no support from consoles physx was still being used more than havok. Nice way to backtrack on your own damn statement. Regardless of who uses it, it's still popular on PC. In fact, the 2nd biggest community on steam is Russian.

Only PC matters to Nvidia. Physx is marketing to PC, which is why it's free for PC developers.

If the 7th generation console circle jerk didn't kill physx, nothing will. It's a PC gamer feature, and wasn't meant for consoles beyond exceedingly basic things. If physx could survive, so could TressFX, which AMD threw away even though it was the only thing they could use to contest physx's dominance on PC, the market they desperately need to stay afloat.

AMD in consoles is meaningless; developers haven't coded to the metal since the early 90s. We've been over this. Device drivers allow software to run regardless of the specifics of the hardware. As long as they make the software itself run without memory leaks and other issues, it's fine. Consoles are literally meaningless to PCs. If anything, console hardware only makes PC gamers upgrade to much more powerful cards that make consoles look like a joke, which is Nvidia's domain.

Hell, by your own logic the 7870 should be racking up sales like crazy. Except by steam's own stats it's less popular than the absolutely ancient GTX 210.

AMD may have consoles now, but their PC division is still suffering. Even Steam stats show that. They don't have the extras PC gamers want, so unless developers want to cut out 52% of PC gamers (not your numbers), they better play ball with a company that isn't nearly bankrupt. Hell, it's impossible to cut out 52% of gamers anyway, because of device drivers.

http://store.steampowered.com/hwsurvey/videocard/
http://store.steampowered.com/hwsurvey/

Consoles will not help AMD on PC with any magical "optimizations" or preferential treatment from devs. Because devs don't optimize the hardware anyway. Not to mention that preferential treatment would require someone to actually own an AMD card. By steam's own stats, very few do.

And it's hilarious how your own Forbes news post has a link to steam stats that say the exact fucking opposite of what it claimed. This is hilarious.

Physx being on the ps4 is meaningless. Without CUDA it wouldn't work the way devs wanted it to, which is the higher-end physics. In fact, Physx was licensed for the xbox 360 and ps3. When was the last time you saw physx on a console, in its full glory? Never. When was the last time you saw a 7th gen PC game with physx? Multiple times, from AAA games running a lot of what physx has to offer.

When was the last time you saw havok used fully? Rarely. Even Source isn't an ace in the hole when that engine is ancient and on its way out. Source doesn't even support the new stuff that havok supposedly uses.

Nvidia made a meaningless (to console gamers) decision to send it onto consoles. Physx won't make its debut on consoles in its full form, which is what I've been saying to you for the last 2 pages. Everything about physx is meant to market to PC gamers.

And it's working. Physx doesn't really exist on consoles, while on PC it can actually be turned on. At best it's a trojan horse to market Nvidia's tech on AMD's only recent accomplishment.

AMD, however, made a horrid decision to hand out TressFX to everyone else. They could have built onto it and actually had something to compete with physx. They could have made extras to market to the market they have been losing. Instead, they handed their greatest potential to their greatest enemy.

It doesn't matter if it's just hair; it's something they could actually market and build on. Tomb Raider used TressFX. They could even have put it on the consoles and AMD cards exclusively.

You talk about how "good" consoles are and how important it is to get "supported," but AMD, the ones making the consoles, can't get their own software supported on their own platform?

Do you have any idea how sad that is? That's like Obama being locked out of the white house and the guards won't let him in.

If AMD doesn't actually put out extras and start marketing themselves, they'll lose everything. They can't rely on consoles because everything else is dragging them down, and they can't cut it. It's prolonging the inevitable unless they actually get their act together. The consoles are just the band-aid.

Ultratwinkie:
Jesus Christ, you move goalposts like a theist.

So suddenly we dropped power now? Well I guess I was right.

However, Nvidia made its views very clear:
http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible
http://www.maximumpc.com/article/news/nvidia_calls_ps4_%E2%80%9Clow_end%E2%80%9D123

or are you going to do the same shit and "speculate?" Like you have for the last fucking page?

I cited Nvidia. They explained that they did not have the resources to undertake the project because they would have had to pull resources away from other projects. Too many irons in the fire doesn't mean they wouldn't have taken this project had they been capable of juggling it too. SoC is something Nvidia could have done, but they really haven't been going that route, so this would have been breaking new ground for them. Not to mention that they'd have to dance around AMD's patents in one of the few areas where AMD is stronger. Nvidia made $500 million off the 7th generation, and that was with only one console. That's pretty good for one chip. Let's say the profit goes down to 1/5th of that. What do you think they typically make on any given chip? AMD won all three console companies. They stand to do VERY well off of this, a fact you dismiss because AMD was on shaky financial terms at the time of the deal. Even companies that are doing poorly aren't going to take huge losses just because. Companies would declare bankruptcy, or even cash out and go out of business, before willy-nilly taking a loss.

Nvidia wanted more money than what was being paid. They refused to make consoles.

Yep. So they weren't able to produce the product in a way that was deemed worth their time. This is in contrast to AMD who was. Same argument as the first.

AMD was desperate, so they took the contract to band-aid their money loss.

By "band aid" do you mean to make a profit? Again, AMD was already geared to design more SoC solutions. If nothing else, this funded the very expensive R&D process that they were already going to undertake. It's a business no brainer for them whereas Nvidia wasn't planning to do that so you were looking at a much lower margin for them.

This isn't saying NVidia did anything bad. This is just saying that the deal was much more attractive to AMD than it was to NVidia.

Look at it this way. Nvidia would have to incur unexpected costs designing an entirely new SoC solution that they weren't planning on making. It isn't an area they're particularly strong in yet so it would have taken them even more money to get up to speed on it. Add that to what I'm sure is a lower profit margin for the card and you could be looking at a much lower profit than most of their other projects would give them.

Then look at AMD, who was already going into this area of research. Even if they take a loss on this overall, it will still significantly reduce the cost of R&D on this chip, which enables them to start releasing SoC solutions for products that would benefit from them. So this research gives them a nice edge in SoC projects going forward, whereas Nvidia's business model isn't so interested in SoC.

This deal was literally nothing but good for AMD and nothing but a risk for Nvidia at the certain high opportunity cost of losing resources for more lucrative projects. It isn't that Nvidia didn't want console business and it wasn't that AMD was desperate. It just made sense for both companies to take the route they did. The only loss to Nvidia is that their cards won't be specifically supported as much anymore but they're still an incredibly common card manufacturer so it's not like they won't be supported to the point of working. Just not optimized in several cases.

It's a common stage of loss to belittle the value of the opportunity lost. Even for teams and companies we root for, it's a simple psychological defense mechanism to pretend the "other guys" didn't really get that good a deal. Maybe you're doing that and maybe you're not, but this was a great deal for AMD even if it wasn't as good a deal for Nvidia. AMD would have been dumb not to take it, whereas Nvidia made a calculated decision.

I'm just not a fan of either company. You might as well be talking about chair companies to me. Had I found a comparable Nvidia card at a similar price in the same performance range, I would have been more likely to buy that.

On top of this, you seem to fail to understand hardware at all.

Physx is software, not hardware. The extent to which it is used, and the amount of processing it consumes, is entirely up to the developers implementing it. Think of it like particle physics: people like to use Physx for steam rendering, and you can determine how detailed the steam is according to the available resources. This is why low-to-ultra PC settings matter. You are basically telling the physics engine, along with other systems, how detailed it can be. That doesn't mean the lowest setting has no physics or detail, just that it's far less detailed in comparison.
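The low-to-ultra point can be sketched as a simple lookup: the same effect is simulated at every preset, only its detail budget changes. The preset names and particle counts below are invented for illustration, not any real game's values:

```python
# Hypothetical particle budgets per graphics preset for one steam emitter.
PARTICLE_BUDGET = {"low": 200, "medium": 800, "high": 3000, "ultra": 10000}

def steam_particles(preset):
    """Every preset still simulates the steam; only the detail scales."""
    return PARTICLE_BUDGET[preset]

for preset in ("low", "medium", "high", "ultra"):
    print(f"{preset}: {steam_particles(preset)} particles")
```

The lowest setting still has physics; it just runs the same simulation with a far smaller budget, which is the point being made above.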

Physx lived when PC got no support from consoles. Outdated consoles that would never handle it. For 7 years.

Are you now claiming that the consoles were never able to "handle" it? 7 years is the lifespan of the consoles. You seemed to indicate that the consoles were somehow cutting edge at launch.

Anyways, you are misunderstanding what physics engines are for. They aren't necessarily more resource-demanding; in fact, for certain processes they can be less demanding than the custom-made physics engines that development studios write for their games. For example, ragdoll physics: a game could lean on Physx or Havok for that if its own code is less efficient. True, they are usually meant to add effects on top of the vanilla game, but not always.

Nvidia benefits from people working with consoles where physx is still an option. It makes it easy for developers to use physx to a lesser degree on the console and then turn it way up on the PC. Metro: Last Light used Physx for persistent particles (blow a tile off a column and the pieces stay visible on the ground) and for some steam/fog interaction (the steam is there in all versions, but with physx the fog dissipates as characters move through it). These are minor things that require a more detailed engine coupled with the hardware to use it. However, if you're already developing for an Nvidia card, it does you no harm to include the feature as something that can be toggled on or off. If you are developing only for AMD cards, it does require extra steps to enable.

That is why Nvidia went ahead and contributed the drivers. Yes, this allows the use of physx on the consoles, but it also makes it far easier for developers to implement the utility if they want to. However, and this is a major issue at the moment, there's not that big of a difference between Havok's newly released physics engine and Physx.

[embedded video: Havok demo] That's real time on the ps4 hardware with a million objects.

If the new engine can already do physics this well, with a million objects in such a varied environment, then it can do everything Physx is currently being used for. There is a huge difference between Havok pre-2013 and Havok now. As stated, they released a new version last year that drastically improved performance.

Cut to around 0:48 to skip the silly fluff and see Havok 2012 compared with Havok 2013. The performance difference is staggering for the same task.

There's a reason the majority of AAA studios use Havok, if anything at all: it's a lot more user-friendly, with far more interface tools. Until 2013, Physx was the better software. Now I'm not sure which is better at all. They may even have individual strengths and weaknesses for all I know, but one isn't necessarily better than the other.

Now that it's easier to port to PC, physx will have a much easier time. In fact, you even said that despite no support from consoles physx was still being used more than havok. Nice way to backtrack on your own damn statement. Regardless of who uses it, it's still popular on PC. In fact, the 2nd biggest community on steam is Russian.

Actually, my statement has been that only something like 3 meaningful games a year actually use any component of physx. I presented Nvidia's own chart to discredit your claim that more games used physx. Havok outscores Physx in every quality band except the shit pile: the games that score less than 50 on Metacritic or don't have a score at all. How can you pretend to tout these numbers as meaningful when the games scoring more than 50 on Metacritic are firmly in Havok's corner, 154 to Physx's 109? Sure, the shit pile has 81 for Physx to Havok's 27, but really? I'd consider those all to be detractors.

But you're basically saying that Batman: Arkham Origins using PhysX is no different from, say, a game that wasn't even popular enough to get reviewed by games critics, or one that scored below 50? There is a difference and you know it. As the quality of the game goes up, the number of games made with PhysX drops drastically: from 81 in the shit pile to 8 in the Excellent tier. Havok seems to follow a bell curve, with both shit and excellent in the 20s and the majority of the titles in the middle. But still a MUCH higher weighted average than PhysX. As a PhysX fan, aren't you even a little embarrassed by that chart?

What's more is that these are only games that support those engines. They aren't games that necessarily use those engines. Most of the games that support Havok actually use it, while a sizeable chunk of the games that support PhysX do so in name only, without using any of its modules. Weird, huh?

Only PC matters to Nvidia. PhysX is marketing to PC, which is why it's free for PC developers.

And yet, the Source engine went with Havok. So Havok had its hand in Portal, Half-Life, The Stanley Parable and several other significant PC games. Do you have any more recent numbers (and perhaps any more recent quality comparisons) that would indicate a significant change of some kind?

If the 7th generation console circle jerk didn't kill PhysX, nothing will. It's a PC gamer feature, and wasn't meant for consoles beyond exceedingly basic things. If PhysX could survive, so could TressFX. Which AMD threw away, and it was the only thing they could use to contest PhysX's dominance on PC. The market they desperately need to stay afloat.

Hair physics. It is literally an engine solely devoted to how hair behaves. That is not a competing engine; that's ridiculous. It's like saying that a radiator manufacturer is in direct competition with a car manufacturer. AMD itself is in competition with Nvidia, but it really isn't anywhere close with a physics engine. AMD is not only competitive in the PC market, it actually has a larger market share than Nvidia.

AMD in consoles is meaningless; developers haven't coded for hardware since the early 90s. We've been over this. Device drivers allow software to run regardless of the specifics of the hardware. As long as they make the software itself run without memory leaks and other issues, it's fine. Consoles are literally meaningless to PCs. If anything, console hardware only makes PC gamers upgrade to much more powerful cards that make consoles look like a joke, which is Nvidia's domain.

The number of units sold means everything to the hardware manufacturer. Do you think Nvidia gives a crap about how many people use their drivers when it's their cards people actually pay for? According to that Forbes article, AMD is still outselling Nvidia in the PC market.

Hell, by your own logic the 7870 should be running sales like crazy. Except by Steam's own stats, it's less popular than the absolutely ancient GTX 210.

What logic are you talking about? Why should the 7870 be selling more, and why do you think I indicated that? The card in the ps4 is not the 7870. It's an SoC version of it.

As for the GTX 210 comment, you do realise that the GTX 650 Ti is also under it, right? The 7850 is also the 12th most common card surveyed in the month of February. What are you trying to draw from this? There are 13 cards right there within 0.1% of each other in overall market share (that's one tenth of a percent, not 10%), with the 650 Ti and 7870 both right beside that 210. So what's your point? These are single cards that each own almost a full percent of the entire market share of the Steam community. Heck, Intel's 4000 and 3000 series are the two most common cards here. If anything, that should tell you that the Steam survey means almost nothing regarding card quality. A full 36.24% of the cards surveyed aren't even on the list; they each make up less than 0.50% of the market share. Even Intel Ironlake (Mobile) is at the 1.23% mark.

What part of anything I said would indicate that a 7870 would magically sell more than other cards? I'm just surprised that the 7970 is that much higher.

And it's hilarious how your own Forbes news post has a link to Steam stats that say the exact fucking opposite of what it claimed. This is hilarious.

You do realise that they only referenced Steam to point out that Intel has the first and second spots among the most common cards. Those two Intel cards make up almost 9% of the Steam market by themselves. If you throw in the 4th most common card (HD 2000), you've got one brand already showing over 10%. The author was just saying that they couldn't leave Intel out of the discussion, and that's why.

But what the article was actually talking about was market share for the year. Actual shipment rates, not surveys. Real numbers: not what people currently have in their machines, but what people were buying:

http://jonpeddie.com/back-pages/comments/intel-and-nvidia-raised-shipments-levels-for-the-quarter/

Now, Nvidia is increasing in shipments and it looks like AMD is decreasing. But at the moment and last quarter and last year, AMD sold and shipped more units.

But I'm not sure what the point of this part of the discussion is. I still think AMD F-d up their business management. I think you're confusing my saying that AMD makes a legitimate product with my saying that it's somehow better than Nvidia. I think AMD has made significant business mistakes that will drown them if they don't make the appropriate changes. But their cards are fine.

PhysX being on the ps4 is meaningless. Without CUDA it wouldn't work the way devs wanted it to, which is the higher end physics. In fact, PhysX was licensed for the xbox 360 and ps3. When was the last time you saw PhysX on a console, in its full glory? Never. When was the last time you saw a 7th gen PC game with PhysX? Multiple times, from AAA games running a lot of what PhysX has to offer.

In its full glory? Haven't seen it yet, since Nvidia locked it to CPU processing. But Nvidia's statement was that it would be fully functional on the ps4. Time will tell if they're lying, but developers won't be fooled.

http://www.pcgamer.com/2013/03/09/is-tomb-raiders-performance-trouble-the-herald-of-things-to-come-for-nvidias-gamers/?ns_campaign=article-feed&ns_mchannel=ref&ns_source=steam&ns_linkname=0&ns_fee=0

Even Source isn't an ace in the hole when that engine is ancient and on its way out. Source doesn't even support the new stuff that havok supposedly uses.

It depends on what Valve uses to build Source 2. Considering the drastic changes to Havok in 2013, it wouldn't be crazy for them to use it, but I also wouldn't be shocked if Valve did everything on their own this time around.

Physx wouldn't make its debut on consoles, not in its full form.

It's possible that they will relegate it to just the CPU again. We'll have to see if that has changed. Nvidia could stand to lose a lot of ground if games like Tomb Raider continue to come out developed in ways that benefit AMD. I don't think it would benefit them much at all to have it available if it can't really be used.

And it's working. PhysX wouldn't actually exist on consoles, while on PC it can actually be turned on. At best it's a trojan horse to market Nvidia's tech on AMD's only recent accomplishment.

Perhaps, but PhysX has actually been used in ps3 games. It's the GPU-accelerated version that so far hasn't been.

AMD however made a horrid decision to hand out TressFX to everyone else. They could have built on it and actually had something to compete with PhysX. They could have made extras to market to the market they've been losing. Instead, they handed their greatest potential to their greatest enemy.

You know, you keep saying this and I keep not caring, because it's just hair physics. As far as I'm concerned, AMD is light years behind Nvidia in the physics engine department, and I'm actually impressed they managed to put anything competent together. The thing is, they don't have to compete there, with Havok and custom engines being all over the place. But, because I don't care, I haven't really asked what you mean by "they gave it away," which your next sentence touches on.

It doesn't matter if it's just hair; it's something they could actually market and build on. Tomb Raider used TressFX. They could even have put it on the consoles and AMD cards exclusively.

Wait, you think that because developers patched the issues with TressFX on Nvidia and Intel cards, this means AMD "gave it to them"? That's silly. What's important for AMD is that it is optimized for their cards. Or do you not remember the embarrassment that was Tomb Raider's performance on the Nvidia line?

Lightknight:

Ultratwinkie:
Jesus Christ, you move goalposts like a theist.

So suddenly we dropped power now? Well I guess I was right.

However, Nvidia made its views very clear:
http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible
http://www.maximumpc.com/article/news/nvidia_calls_ps4_%E2%80%9Clow_end%E2%80%9D123

or are you going to do the same shit and "speculate?" Like you have for the last fucking page?

I cited Nvidia. They explained that they did not have the resources to undertake the project because they would have had to pull those resources away from other projects. Having too many irons in the fire doesn't mean they wouldn't have taken this project had they been capable of juggling it too. SoC is something Nvidia could have done, but they really haven't been going that route, so this would have been new ground for them. Not to mention that they'd have had to dance around AMD's patents in one of the few areas where AMD is stronger. Nvidia made $500 million off the 7th generation, and that was with only one console. That's pretty good for one chip. Let's say the profit goes down to 1/5th of that. What do you think they typically make on any given chip? AMD won all three console companies. They stand to do VERY well off of this, a fact you dismiss because AMD was on shaky financial footing at the time of the deal. Even companies that are doing poorly aren't going to take huge losses just because. Companies would declare bankruptcy, or even cash out and go out of business, before willy-nilly taking a loss.

Nvidia wanted more money than what was being paid. They refused to make consoles.

Yep. So they weren't able to produce the product in a way that was deemed worth their time. This is in contrast to AMD who was. Same argument as the first.

AMD was desperate, so they took the contract to band aid their money loss.

By "band aid" do you mean make a profit? Again, AMD was already geared up to design more SoC solutions. If nothing else, this funded the very expensive R&D process that they were already going to undertake. It's a business no-brainer for them, whereas Nvidia wasn't planning to do that, so you were looking at a much lower margin for them.

This isn't saying Nvidia did anything bad. This is just saying that the deal was much more attractive to AMD than it was to Nvidia.

Look at it this way: Nvidia would have had to incur unexpected costs designing an entirely new SoC solution that they weren't planning on making. It isn't an area they're particularly strong in yet, so it would have taken them even more money to get up to speed. Add that to what I'm sure is a lower profit margin on the chip, and you could be looking at a much lower profit than most of their other projects would give them.

Then look at AMD, who was already going into this area of research. Even if they take a loss on this overall, it will still significantly reduce the R&D cost of this chip, which enables them to start releasing SoC solutions for products that would benefit from them. So this new research gives them a nice edge in SoC projects going forward, whereas Nvidia's business model isn't so interested in SoC.

This deal was nothing but good for AMD and nothing but a risk for Nvidia, at the certain high opportunity cost of losing resources for more lucrative projects. It isn't that Nvidia didn't want console business, and it isn't that AMD was desperate. It just made sense for both companies to take the routes they did. The only loss to Nvidia is that their cards won't be specifically optimized for as often anymore, but they're still an incredibly common card manufacturer, so it's not like their cards won't be supported to the point of working. Just not optimized in several cases.

It's a common stage of loss to belittle the value of the opportunity lost. Even for teams and companies we root for, it's a simple psychological defense mechanism to pretend that the "other guys" didn't really get that good of a deal. Maybe you're doing that and maybe you're not, but this was a great deal for AMD even if it wasn't as good a deal for Nvidia. AMD would have been dumb not to take the deal, whereas Nvidia made a calculated decision.

I'm just not a fan of either company. You might as well be talking about chair companies to me. Had I found a comparable Nvidia card for a similar price in the same performance range, I would have been more likely to buy that.

On top of this, you seem to fail to understand hardware at all.

PhysX is software, not hardware. The extent to which it is used, and the amount of processing it utilizes, is entirely up to the developers implementing it. Think of it like particle physics: people like to use PhysX for steam rendering, and you can tune how detailed the steam is according to the available resources. This is why low-to-ultra PC settings matter. You are basically telling the game how detailed the physics, along with other settings, can be. That doesn't mean the lowest setting doesn't have any physics or detail, just that it's far less detailed in comparison.
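To make that concrete, here's a toy sketch of what "scaling physics to the settings" means in practice. Nothing here is PhysX's actual API; the preset names and particle counts are invented purely for illustration.

```python
# Toy illustration: physics detail is a dial the developer sets per
# quality preset, not a fixed cost imposed by the engine.
# All names and numbers here are made up for the example.
PRESET_SCALE = {"low": 0.1, "medium": 0.35, "high": 0.7, "ultra": 1.0}

def particle_budget(max_particles, preset):
    """Scale a steam-effect particle count to the chosen quality preset."""
    return int(max_particles * PRESET_SCALE[preset])

print(particle_budget(20000, "low"))    # 2000: still steam, just coarser
print(particle_budget(20000, "ultra"))  # 20000: full detail on capable hardware
```

Same effect on every setting; only the detail budget changes, which is the whole point being made above.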

Physx lived when PC got no support from consoles. Outdated consoles that would never handle it. For 7 years.

Are you now claiming that the consoles were never able to "handle" it? 7 years is the lifespan of the consoles. You seemed to indicate that the consoles were somehow cutting edge at launch.

Anyway, you are misunderstanding what physics engines are for. They aren't necessarily more resource demanding; in fact, for certain processes they can even be less resource demanding than the various custom-made physics engines that development studios build for their games. For example, ragdoll physics: a game could lean on PhysX or Havok to handle that if its own code is less efficient. True, these engines are usually meant to add effects on top of the vanilla game, but not always.

Nvidia benefits from people working with consoles that have Nvidia cards because PhysX was still an option there. It makes it easy for developers to make use of PhysX to a lesser degree on the console and then turn it up on the PC. Metro: Last Light used PhysX for persistent particles (blow a tile off a column and the parts stay visible on the ground) and for some steam/fog interaction (the steam is still there in all versions, but with PhysX the fog dissipates as characters move through it). These are minor things that require a more detailed engine coupled with the hardware to use it. However, if you're already developing for an Nvidia card, it does you no harm to include this feature as something that can be toggled on or off. If you're developing only for AMD cards, it does require extra steps to enable it.

1. $500 million over 7 years is about $71 million per year. Garbage for hardware manufacturers. Do you have any idea how expensive hardware development is? Nvidia made this much and they came out and said it was garbage returns. AMD is falling behind Nvidia by your own words. This is chump change. The money won't change anything.

http://techreport.com/news/25527/thanks-to-consoles-amd-posts-first-profit-in-over-a-year

See that? AMD got $48 million in profits. Nvidia got $71 million. So the profit margins Nvidia hated just got smaller by around 50%. That $48 million was suspected to include $22 million from selling assets like buildings, so about $26 million in profit.

So that's 36% of the profit Nvidia got every year on one console alone, counting all the consoles of this gen for AMD's paltry income. Pathetic.

2. Source is over 10 years old. Physx didn't come around until 2007. Do you even hear yourself? Its like saying "if guns are so great, why didn't the Romans use them?"

3. Exclusives sell cards. AMD has none. Even a crappy exclusive is better than none. And they made it so it could work very well on Nvidia, to the point AMD is worthless. Nvidia held PhysX hostage, so AMD cards can't use it unless you mod it on, which makes the game run much worse than with an Nvidia card.

If they are so behind Nvidia, how can you say Sony got a good deal? You just admitted Sony took a big steaming dump in a box and sold it for $399. Because it won't have the next gen stuff you personally wanted? From a company that lives in the stone age? By your own words?

4. Havok demos that only show little blue Viagra pills in a very small area and with nothing actually happening and a promo with a very rough still photo.

Did you even see the physx demos? It does way more than that little demo did, and in actual games right now.

5. You cited supplier market share. That is like saying "the Xbox One sold more than the PS4" because it shipped millions more to retailers. If the people aren't buying, they will rot. Steam shows that people aren't going AMD, since AMD's market share is pathetic and being eaten by Intel of all companies.

On top of this,if the next gen consoles were so important how did AMD's sales NOT shoot up? Especially the equivalent card? To get all the "sweet optimization?" Oh right, because consoles are irrelevant to what people put into their PCs.

Which you refuse to explain. How is the 7870 less popular than the oldest cards ever? We've known the power for months and no PC gamer gives a shit. Nvidia is dominating the desktops and Intel dominates the mobiles.

The only card that showed growth on steam was the 7770. A pathetic card, weaker than next gen. Everything else has showed Nvidia is growing by leaps and bounds.

Oh wait, its because AMD is scaling back on PC and wants to leave it by 2016:
http://www.forbes.com/sites/sharifsakr/2013/04/23/amd-to-reduce-reliance-on-pc-market-by-2016-sell-new-arm-based-chips/

So any benefit consoles would give is out the window because they are running with their tail between their legs. This also means Mantle is utterly worthless too. And they wasted so much money on something no one will use.

Without PC support, mantle will just be another console only API in the land where OpenGL dominates. With no PC cards to actually take advantage.

Maybe you are right, maybe AMD is too far gone. Lets all just toss the next gen consoles in a big ditch in the desert next to those ET games, cover em with cement, and forget they ever existed. Because now even you are saying AMD are too far gone.

Or, alternatively, stick them in a museum because AMD won't be around to make them much longer when they start abandoning everything that isn't a mobile device.

because at this rate, AMD won't make enough money to develop their hardware enough to matter compared to the other two heavy hitters. Its an expensive business. As it is AMD seems to be an ever collapsing chain of failed companies.

luvd1:
Oo, pretty. And did I see reaction animation when a bullet bounces off the chest-high wall you're hiding behind? That's a step forward, right?

Gears of War has it, your characters will flinch or duck their head when bullets fly within a certain range.

Ultratwinkie:
1. $500 million over 7 years is about $71 million per year. Garbage for hardware manufacturers. Do you have any idea how expensive hardware development is? Nvidia made this much and they came out and said it was garbage returns. AMD is falling behind Nvidia by your own words. This is chump change. The money won't change anything.

$71 million in net profit. That wasn't revenue.

https://ycharts.com/companies/NVDA/net_income

On just one card for one console, Nvidia made $71 million in net profit per year. In 2010 Nvidia had a $99 million net loss, followed by a year where they made $256 million. That's nothing to sneeze at, but $71 million would still be more than 1/4th of that number.

$71 million is not a trivial number. Even now that they're making over $500 million annually, it would have been a significant portion of their income. That is never less than 1/10th of their net income over the entire span of the console. If you think that's garbage then you're kidding yourself. What other card do you think came close to that return on investment?

The development of that card likely led to improvements for their other cards across the board. You're talking about one card that gave them huge profits and probably nontrivial advancements. It was even based on the 7800, so it didn't necessarily take that long to develop.

What's more, you're being dishonest by dividing it by 7 years. Are you going to divide the same number by 17 years ten years from now? You divide it by the amount of time spent to develop it, which I don't think we even know. So it's incorrect at best, since the profits now for this very cheap card are likely a small fraction of what they were in the first quarter.
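For what it's worth, the split both sides keep arguing over is simple arithmetic. A quick sketch (the $500 million total and 7-year span are the thread's own figures, not audited financials):

```python
# Sanity-check the per-year and per-quarter splits of the console deal figure.
# Assumption (from the thread, not from Nvidia's filings): ~$500M total
# net profit spread over the 7 years the 7th console generation ran.
total_profit = 500_000_000
years = 7

per_year = total_profit / years   # what you get dividing by generation length
per_quarter = per_year / 4        # the same figure spread across quarters

print(f"per year:    ${per_year:,.0f}")
print(f"per quarter: ${per_quarter:,.0f}")
```

Note the thread's $17.75 million comes from rounding to $71 million per year first; the unrounded figure is closer to $17.9 million per quarter. Either way, the division assumes the profit arrived evenly, which is exactly the point under dispute.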

http://techreport.com/news/25527/thanks-to-consoles-amd-posts-first-profit-in-over-a-year

See that? AMD got $48 million in profits. Nvidia got $71 million. So the profit margins Nvidia hated just got smaller by around 50%. That $48 million was suspected to include $22 million from selling assets like buildings, so about $26 million in profit.

So that's 36% of the profit Nvidia got every year from one console alone, and that's counting all the consoles of this gen toward AMD's paltry income. Pathetic.

Two things:

1. That is one quarterly net income, not annual like your erroneous $71 million estimate was based on. If they made $48 million net each quarter that would be around $200 million in one year. $71 million divided quarterly would be $17.75 million. But averages aren't the actual quarterly figures. We'd have to see what Nvidia's net profit was when the PS3 was released. But I'll tell you this much: the PS3 really failed to hit the desired numbers, whereas the PS4 is exceeding anticipated numbers. The PS4 alone is exceeding the PS3's sales, and AMD has the XBO and the Wii U to boost those numbers even though I'd consider both of those consoles to be doing poorly. The Wii U is practically circling the drain.

2. That is the entire company performance and not card specific. So saying that they made $48 million in Q3 isn't saying that the card made them $48 million. However, this is the first profit they've made in over a year, which seems to indicate that it's the consoles that brought them up for air. The profit from the console cards could easily be over $100 million and making up for the huge losses they have been posting. However, I don't know if the losses being posted were due to R&D costs or not. Their actual difference in sales between Q2 and Q3 in the "Graphics and Visual Solutions" area was over $350 million, but that's revenue.

We do know that AMD is making $60-100 per console sold.
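Taking that per-console figure at face value, a back-of-the-envelope range can be sketched. The 15 million unit count is the ballpark used earlier in this thread, and this is chip revenue to AMD, not profit:

```python
# Rough range for AMD's console chip revenue, using the thread's
# $60-100-per-console figure. The 15M combined unit count is an
# assumption taken from the thread, not a reported shipment number.
units_shipped = 15_000_000
per_console_low, per_console_high = 60, 100

low = units_shipped * per_console_low
high = units_shipped * per_console_high
print(f"revenue range: ${low:,} - ${high:,}")
```

Margin on that revenue is the unknown here, which is why the quarterly reports matter more than the per-unit figure.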

2. Source is over 10 years old. Physx didn't come around until 2007. Do you even hear yourself? It's like saying "if guns are so great, why didn't the Romans use them?"

What? I'm just explaining that Havok is frequently used in games. I didn't say that source picked Havok because Havok is necessarily better. It's curious that you keep assuming my numbers are painting Nvidia in a negative light when I'm just portraying them realistically.

3. Exclusives sell cards. AMD has none. Even a crappy exclusive is better than none. And they made it so it could work very well on Nvidia to the point AMD is worthless. Nvidia held Physx hostage, so AMD cards can't use it unless you mod it on. Which makes the game run much worse than with an Nvidia card.

If they are so behind Nvidia, how can you say Sony got a good deal? You just admitted Sony took a big steaming dump in a box and sold it for $399. Because it won't have the next gen stuff you personally wanted? From a company that lives in the stone age? By your own words?

No, card performance compared to price sells cards. People who get a weaker card that's more expensive just because it has physx on it are being dumb. All physx offers is some minor differences in physics. You get interaction with steam and persistent debris. That's it and that's now something that Havok easily offers in real time.

4. Havok demos that only show little blue Viagra pills in a very small area and with nothing actually happening and a promo with a very rough still photo.

Did you even see the physx demos? It does way more than that little demo did, and in actual games right now.

Maybe if I say it again it will help. 1 million persistent objects dropped and rendered in real time. Consider that Physx was most commonly used to keep shattered glass from disappearing; this is the equivalent of 1 million pieces of debris. Real time. Real time means game time. You shoot a window and this happens right away. Most tech demos are rendered over hours or days. It's the real time and the number of objects that is impressive, not that they're blue objects. You can trivialize them all you want, but it is impressive.
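A rough way to see why "real time" is the hard part: at 60 fps the entire simulation step gets roughly 16.7 ms, so the time budget per object collapses as the object count grows. A toy calculation (all figures illustrative, not measured engine numbers):

```python
# Why a million real-time objects is the impressive part: the per-object
# physics budget shrinks with object count. These are illustrative
# assumptions, not measurements from any actual physics engine.
frame_budget_ms = 1000 / 60  # ~16.67 ms per frame at 60 fps

for n_objects in (100, 10_000, 1_000_000):
    # nanoseconds available per object if the whole frame went to physics
    per_object_ns = frame_budget_ms * 1_000_000 / n_objects
    print(f"{n_objects:>9,} objects -> {per_object_ns:,.1f} ns each")
```

At a million objects you are down to tens of nanoseconds per object, and that's before rendering, AI, or anything else claims a share of the frame.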

If you want something more than a million objects, each able to interact with each other, then I guess I can give you a less complex example that you may find more appealing.

Here's Havok's engine rendering tire friction, vehicle weight, and several other vectors for an eight wheeler through mud, a river and other terrain:

The once significant gulf between Physx and Havok has been drastically reduced. Physx could still maintain the edge but the difference has dropped enough to where it may not matter. Whatever engines are paired with Havok or Physx, they'll have the ability for objects to persist, water simulation (keep in mind, Physx's recent water simulation is impressive as hell but not real time by any stretch of the imagination, while Havok's new ocean sims are real time), and steam/smoke physics, but Havok already had those as well. Object collision is at an all-time best and things will only get better going forward.

So I'm sorry, but Physx just isn't the beast it used to be now that the differences will be a lot more subtle. The actual game engine using these two will matter far more in addition (of course) to the hardware.

So maybe you have Nvidia examples that show more impressive rendering but my guess is you're just being fooled with more detailed object textures which would otherwise be supplied by the game engine and not the physics engine.

5. You cited supplier market share. That is like saying "xbox one sold more than Ps4" because it sold millions more to retailers.

Maybe if it was a new launch or something. But the numbers even out after launch because retailers won't resupply if their stock isn't moving. AMD has a fairly stable reorder process benefiting them. They are currently selling in larger numbers in the market and there's really no single card pushing that. I would say, though, that Steam is inevitably going to be comprised of gamers. We do care more about our video cards and I'd posit we're more likely to go Nvidia for cutting-edge tech.

My entire point this thread is merely that a card whose performance is higher than another card is better than the other card. You will have individual games that perform better on a specific card, but we've clearly seen that go either way, and with the console race won by AMD it's going to slant more and more their way no matter how much you like Nvidia cards. But by and large, games which aren't built specifically for one card or another will perform better on the cards with the best performance. That you would ardently believe that slapping the Nvidia brand on a weaker card somehow sidesteps that because of a physics engine that has legitimate competition is misguided.

Physx, for all its qualities, is rarely supported by games, especially games anyone cares about. A trickle of games each year that can be counted on each hand. Havok is regularly supported and used on major games all the time.

Lightknight:

*snip*

Jesus, you have no idea how hardware works at all.

I divided it by the years the 7th gen was around. Which is 7 years.

1. AMD is leaving desktop cards. By 2016 they will be mostly gone. There will be no "optimization" because that word is a BUZZ WORD. No different from "cloud processing" on the xbone.

It's bullshit. They cannot optimize cards that will not exist. No dev would be stupid enough to dump 52% of PC gamers NOW and 100% of PC gamers after 2016 when the support becomes nonexistent.

Even with your buzzwords, NO ONE IS BUYING AMD. How could AMD's sales be doing well when they have been bleeding money almost to the point of bankruptcy? They have been bleeding money for YEARS. Console deals couldn't even lift AMD, and it's obvious why.

AMD is abandoning PC, and that is the final nail in their coffin. The gap will get bigger between Nvidia, Intel, and AMD. AMD won't survive the decade.

No matter how many big numbers you post, it doesn't change the fact that hardware takes billions to develop. AMD is done.

Nvidia even came out and said the profits they got from consoles were garbage returns.

2. Oh boy, 1 million tiny, low-poly, low-res triangles in a low-poly, low-res world. It may be amazing to you, but I can see the limitations they had to work around. Even the mud demo isn't all that impressive. If you want impressive, see Physx's fire breath demo or the water demo.

Unless they use it for rain physics, it's not that impressive.

3. Again with the "AMD is beast" crap. A modern 660, which is the 660 TI, beats the 7870. The 760 even more so. The 770 beats the 7870 like a stepchild.

If price were the only thing that sells cards, why does no one want AMD cards? It's the extras that everyone cares about, and it's the extras that only Nvidia offers. If price were the only factor, then try to explain how AMD's market share is collapsing.

4. More moving goalposts. First it was support, now it's "games everyone cares about." Stop moving the damn goalposts. Physx has more games, period. The fact it's younger than Havok and has more is staggering. It's a PC feature for PC devs; don't expect it on console. Which is where Havok reigns, and only in its BASIC FORM.

Evonisia:

luvd1:
Oo, pretty. And did I see a reaction animation when a bullet bounces off the chest-high wall you're hiding behind? That's a step forward, right?

Gears of War has it, your characters will flinch or duck their head when bullets fly within a certain range.

Right, this has been around for some time. Uncharted has it too, and I think The Last of Us, but I didn't get into too many gun fights there.

Ultratwinkie:
I divided it by the years the 7th gen was around. Which is 7 years.

Oh, I get what you did. I'm saying it's wrong. If anything, it's the life of the card and not the generation's years. This one card made them $500 million in net profit as of that article's date and further encouraged console developers who port to PC to use their software. What kind of return do they see on other cards? Do they really typically get $500 million back on every card they make? More? Less?

1. AMD is leaving desktop cards. By 2016 they will be mostly gone. There will be no "optimization" because that word is a BUZZ WORD. No different from "cloud processing" on the xbone.

I've seen where AMD is trying to reduce its reliance on PCs but so is every PC company, ever. Even Nvidia is trying to branch out and successfully so. AMD's SoC solutions lend themselves quite well to mobile gaming.

Do you have a citation stating that AMD announced they will be leaving the market entirely? Or do you just have one of the many links stating their desire to be less reliant on PC sales?

Here's an actual article on the 2016 strategy.

What they hope to do is expand their non-traditional PC revenue sources to be as much as half of their revenue. This doesn't have anything to do with selling less on PC. Just more everywhere else. So I'm calling shenanigans on your misappropriation of this statement.

2. Oh boy, 1 million tiny, low-poly, low-res triangles in a low-poly, low-res world. It may be amazing to you, but I can see the limitations they had to work around. Even the mud demo isn't all that impressive. If you want impressive, see Physx's fire breath demo or the water demo.

Unless they use it for rain physics, it's not that impressive.

Rain is not object physics in most cases (particularly not persistent objects), and you certainly wouldn't have a million raindrops all at once. You don't get it. 1 million objects rendered simultaneously in real time is a huge step forward. Think about what Physx was popular for. It could persist tens of objects that were broken glass or fragments off of something that was shot. Maybe they could even do hundreds before it disappeared. That was last-gen tech. This is a million objects interacting with each other. This would almost never need to be rendered; the impressiveness is in the number. I'm sorry if you don't think it's impressive. But you're also belittling Physx's claim to fame in the same breath. We've gotten to a point where rendering physics for persistent objects has far exceeded anything we'd typically see on screen in real time.

Whether you agree or not, this is a significant step in physics. If it can handle a million objects dropping on varying surfaces and bouncing off each other in real time, then it can certainly render a few fragments of broken glass and other debris.

3. Again with the "AMD is beast" crap. A modern 660, which is the 660 TI, beats the 7870. The 760 even more so. The 770 beats the 7870 like a stepchild.

*sigh* AMD is not beast. But an AMD card that performs at X is capable of X. If it's capable of X but an Nvidia card is only capable of X-n, then the AMD card is more capable. I have not now, nor have I ever, cared about "extras". All I and most consumers have ever cared about is card performance and price. Why would I buy a different card for the rare occasions where a game ever uses it?

4. More moving goalposts. First it was support, now it's "games everyone cares about." Stop moving the damn goalposts. Physx has more games, period. The fact it's younger than Havok and has more is staggering. It's a PC feature for PC devs; don't expect it on console. Which is where Havok reigns, and only in its BASIC FORM.

Laughable. By "moving goalposts" do you really mean "making valid counterpoints"? You're going to defend the Nvidia-stated fact that the only area Physx exceeds Havok is in the shit pile of less-than-50-Metacritic games or those with no score at all? I don't consider a physics engine on a card that supports shitty games to be any better than a non-card-based physics engine that supports great games. Do you honestly see no difference in quality over quantity? Do you really believe that the quality of the games referenced has no bearing on the discussion? That was a link directly from Nvidia and I still don't know why they posted it. Havok beat them in every area except the less-than-50-Metacritic category. Why would I buy a card for games I don't want to play? Why would you? I understand buying Nvidia for quality, but not for a barely touched physics engine that works great on paper but doesn't see the light of day in the vast majority of games that don't suck.

And what's this silly "basic form" nonsense? Developers use Physx and Havok for specific physics calculations. They don't use them to supply all physics. At least, I don't think they do, since all of the Physx support is usually just particle physics and not much else, and the support for Havok is a vague "physics", which could mean it's used more fully or is really just vague. So either both are solely used for augmenting physics, or Havok is the one used more thoroughly here.
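That division of labor can be sketched abstractly: the engine keeps its own core physics step and hands only the expensive extras off to the middleware. A hypothetical sketch; none of these class or method names are real Havok or PhysX APIs, they're stand-ins for the split described above:

```python
# Hypothetical sketch of an engine delegating only some physics work to
# middleware. All names here are invented for illustration.
class CorePhysics:
    """Engine-owned basics: collisions, ragdolls, movement."""
    def step(self, dt):
        return f"core physics stepped {dt:.3f}s"

class EffectsMiddleware:
    """Plugged-in extras: particles, persistent debris, cloth."""
    def step(self, dt, effects):
        return [f"{name} simulated for {dt:.3f}s" for name in effects]

def frame(dt=1/60):
    core = CorePhysics()
    fx = EffectsMiddleware()
    log = [core.step(dt)]
    # only the eye-candy goes through the middleware
    log += fx.step(dt, ["debris", "smoke"])
    return log
```

Under this split, disabling the middleware (as happens on the console or AMD path) loses the extras but leaves the core game physics intact, which matches how most titles ship.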

Lightknight:

*snip*

1. AMD was destroyed by Intel's and Nvidia's anti-competitive tactics. You can argue all you want about how AMD isn't in the shitter, but it's obvious they are looking for the door. Nvidia is just branching out because it's now poised to be the Intel of graphics cards.

AMD is branching out because it's facing a much bigger competitor with much more effective marketing. It's facing Intel, the juggernaut of CPUs. It's obvious the GPU and CPU markets are locked down. If they stay in the market, they've signed their own death certificate.

It's been losing money for years, and their cards are rotting on the shelves. Next-gen consoles didn't provide them a push, so now they are looking for a way out. Nvidia, on the other hand, dominates the desktop and is now moving to mobile to dominate that too.

Nvidia dominates everything; AMD owns nothing. It's different in each of their strategies.

There is a reason AMD is a glorified penny stock compared to its glory days of 2006. The only way they can make 50% off game consoles is if they dump EVERYTHING ELSE except the mobile division. Even the desktop processors, leaving that market to Intel.

Who will just follow them to mobile processors and crush them there too with their foothold.

20%->50% of profits from consoles/low power tablets would require cuts of their biggest liabilities, which is everything at this point.

When you read between the lines of their quarterly reports, it's obvious they will cut everything. It makes much more sense to cut and run.

So Mantle, TressFX, and consoles are essentially meaningless to their strategy now, since there won't be any cards to benefit.

2. 1 million objects is only impressive on a console. It is only impressive on PC if it was used for rain or something really complex. However, this was a demo for a console and by no means indicative of anything.

Remember when they showed those high-def faces on the PS2? That only happened in the PS3 era? The same on PS3, where they showed PS4-level faces? And none of those things happened in an actual game?

Same goes for Havok. You can argue it can do all these things, but consoles just can't. There is a reason many games don't use Havok to its full potential.

Hell, there is a reason we don't have a game that looks like the Good Samaritan demo.

By that full potential, I mean cloth physics on everything as well as particle physics and the mesh-editing mud physics. What you fail to grasp is that very few console games even do that. Beyond the very basic ragdoll physics set by Source in 2004, console devs don't strive for higher-level eye candy. In fact, they can't, or it won't run.

That is the difference. All the Havok extras, the ones that can compete with Physx, almost never get used on console. It won't win on PC because Nvidia has better support on PC and is godlike to indie devs thanks to their free support.

Now if you want impressive tech demos, look up the Fire Strike benchmark.

3. If people only cared about power, then why was AMD sliding into bankruptcy until it gave itself over to consoles in a desperate bid to survive?

Nvidia puts in the extras: CUDA cores, which turn a GPU into a pseudo-APU; the extra driver support; and their entire "GeForce Experience" suite, which includes free video capture that barely taxes your PC, unlike Fraps.

4. PhysX is supported by indie games, which don't tend to score very well outside the best of the best like Minecraft, Outlast, etc. It's given away for free to PC devs. Consoles depend on publishers.

So PhysX has more games by your own words. It's supported on PC more than console, so consoles are meaningless to its future whether they can handle it or not.

The only thing Havok beats PhysX in is console support — on three platforms. Havok is mostly in FPS or TPS games, which are console-based. Thanks to the indie scene, PC doesn't need console games to survive. Even the "trickle" of games PC gets is more than what consoles get in a year.

But you want "games people care about," so how about the Unreal Engine?

http://en.wikipedia.org/wiki/List_of_Unreal_Engine_games

Unreal Engine 3 and 4 use PhysX; in fact, even Gamebryo is listed as using PhysX. So any Unreal Engine game, and any Gamebryo game after 2008-09, has PhysX. Even Skyrim has PhysX, and that's how modders added breast, dong, cloth, and even hair physics. It's not turned on by default, but it's there.

PhysX is in a lot of games; the fancy PhysX showcases, however, tend to be in good (read: AAA) games, with the exception of Batman: Arkham Origins, which was a buggy cash grab.

Which comes back to my original point: PhysX is a Trojan horse and is destroying AMD's future by coming prepackaged with popular engines. It's not turned on unless it's on PC, which is why AMD is dying on PC and no one seems to care.

The only one who benefits from PhysX on consoles or in engines is Nvidia itself.

Or are you going to say all Unreal Engine games are games no one cares about? Or Unity engine games? Or Gamebryo, which means every Bethesda game after 2008? Hell, even COD has PhysX. Nvidia always makes it a point to get AAA games to use advanced PhysX, because it earns great PR with devs and even engine developers.

When it comes to pure use, PhysX wins whether you like indies or not. Even games that use OGRE can use PhysX, and that's the lowest of the low. It's given away for free to indie devs, for Christ's sake. That's why Nvidia has such goodwill from devs: they don't pile a price tag onto a very useful utility. If you can't pay, you can still get it, as long as it's on PC. On console, you have to pay for it.

You can argue it's not "relevant" because it's not on consoles, but PhysX was never a console utility in the first place. On PC it dominates, with 50% of PhysX games being PC titles. In fact, the fact that next-gen engines already incorporate PhysX means the potential for Nvidia cards is limitless.

