AMD's "Mantle" Promises Next Big Step in PC Performance

AMD's Mantle is a low-level API for its "Graphics Core Next" architecture that promises to bring unprecedented graphics performance to the PC.

One of the advantages consoles hold over PCs is that as standardized pieces of hardware with a long lifespan, developers can program "close to the metal" and squeeze more performance out of them. PCs, on the other hand, are generally restricted to high-level APIs like DirectX, which allows software to run across a variety of components and configurations at the cost of a fairly substantial performance hit. Muscular video cards compensate for API inefficiencies but their full power goes untapped; with Mantle, however, AMD intends to combine GPU drivers with a low-level API that will allow programmers much more direct access to the display hardware, and thus to unlock much more of their potential.

It may not seem like the most exciting topic ever, but it's actually a very big thing. Mantle supports new rendering techniques and offers direct access to all GPU features, and promises to simplify game development by "leveraging commonalities" between Graphics Core Next-equipped PCs and consoles. But the big issue is performance: High-end GPUs can process far more draw calls than high-end CPUs can typically submit, according to Anandtech, but AMD claims Mantle enables nine times more draw calls per second than current APIs, potentially revolutionizing the rendering capabilities of PCs. PC versions of multiplatform games will also be able to take advantage of many of the performance-boosting optimizations currently exclusive to consoles.
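
To see why draw-call throughput is the headline number, here is a toy back-of-the-envelope model. This is plain Python, not real graphics code; the per-call costs are invented illustrative figures, and only the roughly 9x factor is taken from AMD's claim:

```python
# Toy model of draw-call throughput (illustrative numbers only; the
# ~9x factor is AMD's claim, everything else is made up for the sketch).

FRAME_BUDGET_MS = 16.7  # CPU time available per frame at ~60 fps

def max_draw_calls(overhead_ms_per_call):
    """How many draw calls the CPU can submit in one frame's budget."""
    return int(FRAME_BUDGET_MS / overhead_ms_per_call)

high_level_cost_ms = 0.003                 # hypothetical cost per call
mantle_cost_ms = high_level_cost_ms / 9    # AMD's claimed ~9x reduction

calls_high = max_draw_calls(high_level_cost_ms)
calls_mantle = max_draw_calls(mantle_cost_ms)

# Roughly nine times as many draw calls fit in the same frame budget,
# which is why a CPU-bound scene gains so much from a thinner API.
print(calls_high, calls_mantle)
```

The point of the sketch: the GPU itself is unchanged; only the CPU-side cost of *asking* it to draw goes down, which is exactly where high-level APIs lose time.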

Id Software mastermind John Carmack noted on Twitter that Mantle could do big things for Valve's Steam Machines as well, although that may bring headaches from Sony and Microsoft. "AMD has an interesting opportunity with Mantle because of their dual console wins, but I doubt Sony and MS will be very helpful," he said, referencing the fact that both the Xbox One and PlayStation 4 make use of AMD's GCN architecture. "Considering the boost Mantle could give to a steambox, MS and Sony may wind up being downright hostile to it."

And as great as it promises to be, Mantle faces the same challenge as any low-level API in the PC arena: different hardware manufacturers with different architectures. Old-timers will remember Glide, a proprietary low-level API from 3dfx that allowed for unprecedented visual fidelity - as long as you had a 3dfx card. But it was eventually muscled out by Direct3D and OpenGL, which traded performance for compatibility and allowed for "one size fits all" game development. Mantle will face the same hurdles, although AMD claims this is actually something developers have been asking for for years; DICE put together a presentation for the Mantle unveiling, announcing that its Frostbite 3 engine will render with Mantle instead of DirectX 11 on compatible GPUs. The upcoming Battlefield 4 will be the first game to launch with Mantle support.

A far more detailed breakdown of Mantle is available at Anandtech and it's a worthwhile read for anyone interested in the nuts and bolts of this stuff, but the bottom line is that AMD could have something truly game-changing on its hands, and if it takes, some very big things could happen in the world of PC gaming in the not-too-distant future. AMD said it will reveal more about Mantle at the AMD Developer Summit in November.

Source: Anandtech, via Techspot

I'll pass. Recently AMD has acquired a habit of promising the world and delivering underwhelming products. The Bulldozer CPUs were supposed to wipe the floor with Intel, but the first generation couldn't even match the performance of the previous generation of AMD CPUs.

The only way this could ever become a real substitute for DirectX is if AMD makes it open source. We can't have just one graphics card manufacturer making use of this API. Several things could happen. Developers could refuse to use it, because by sticking with DirectX they would be supporting gamers with both AMD and Nvidia GPUs. Or they could use Mantle and abandon Nvidia, which would be stupid - we don't benefit from a monopoly. Or developers could try to make games using both. But what if Mantle ends up having better performance than DirectX? Everyone would jump on the AMD bandwagon and we'd still end up with a monopoly.

Andrew_C:
I'll pass. Recently AMD has acquired a habit of promising the world and delivering underwhelming products. The Bulldozer CPUs were supposed to wipe the floor with Intel, but the first generation couldn't even match the performance of the previous generation of AMD CPUs.

But this would be a product by ATI (bought by AMD) and they have consistently delivered competitive hardware for a number of years.

An interesting snippet from the Anandtech article:

Anandtech:
Let's be very clear here: AMD will not discuss the matter let alone confirm it, so this is speculation on our part. But it's speculation that we believe is well grounded. Based on what we know thus far, we believe Mantle is the Xbox One's low level API brought to the PC.

If indeed Mantle is the Xbox One's low level API, then this changes the frame of reference for Mantle dramatically. No longer is Mantle just a new low level API for AMD GCN cards, whose success is defined by whether AMD can get developers to create games specifically for it, but Mantle becomes the bridge for porting over Xbox One games to the PC. Developers who make extensive use of the Xbox One low level API would be able to directly bring over large pieces of their rendering code to the PC and reuse it, and in doing so maintain the benefits of using that low-level code in the first place. Mantle will not (and cannot) preclude the need for developers to also do a proper port to Direct3D - after all AMD is currently the minority party in the discrete PC graphics space - but it does provide the option of keeping that low level code, when in the past that would never be an option.

And then there's the way all this talk coincides roughly with the announcement of Steam Machines. 2014 is going to be mighty interesting.

spwatkins:

But this would be a product by ATI (bought by AMD) and they have consistently delivered competitive hardware for a number of years.

Not really, AMD integrated ATI years ago. And while they certainly remain competitive in the graphics arena, IMO it's been a while since they released anything that was clearly superior to NVidia.

Griffolion:

we believe Mantle is the Xbox One's low level API brought to the PC.

And then there's the way all this talk coincides roughly with the announcement of Steam Machines. 2014 is going to be mighty interesting.

According to one source. According to another, Mantle is actually similar to the PS4's GNM API.

I dropped my plan to buy a new PC. I don't know what to do and think right now. This has been a hell of a week for PC gaming. And it's only the beginning. Wait until we get DDR4 RAM modules with their increased performance and decreased power consumption and heat.

As if consoles weren't already hopelessly outdated, now the PC wants to become even easier to use than consoles, with all of its other perks intact as well. It's kind of ridiculous, but I love it so much.

Freedom from DirectX, you say? An API Microsoft has no interest in developing, as it is in direct competition with their own hardware, the Xbox?

FUCK YES!

I only hope Nvidia follows suit and doesn't develop yet another API of their own; that would suck for everyone. We can't all have both cards in our cases xD

Adam Jensen:
-snip-

Both use x86 and AMD GPU architectures, the similarities will be there.

Besides, I'm more likely to believe Anandtech.

I tend to stay behind the curve on PC gaming deliberately: get one-generation-behind components for cheaper that can still handle games really well. It's been working well for me for about 5 years now.

Griffolion:

Adam Jensen:
-snip-

Both use x86 and AMD GPU architectures, the similarities will be there.

Besides, I'm more likely to believe Anandtech.

Maybe. But Microsoft likes to use their DirectX. That's why that other source may not be wrong about this. We need more info to be sure. Who knows, maybe it's not based on either. Maybe it's just new.

Adam Jensen:
But what if Mantle ends up having better performance than DirectX? Everyone would jump on the AMD bandwagon and we'd still end up with a monopoly.

This is probably what I'm most worried about when reading this - it could end up like nVidia's own PhysX, where AMD hardware has to do everything on the CPU, thus impacting performance... meaning the game runs better on one brand of cards than the other, not because that card's hardware is superior, but because the software favors one of them.

This worries me. I'd rather see OpenGL and DirectX get an upgrade than return to the days of old on PC...

Charcharo:
This worries me. I'd rather see OpenGL and DirectX get an upgrade than return to the days of old on PC...

They are being upgraded. DirectX is up to 11.2, which (apparently) incorporates the stuff developed for the XBone, and the OpenGL people are off their butts at last and up to 4.4, which is roughly equivalent to DirectX 10 with some stuff from 11 (to simplify things horribly).

This plus homogeneous memory would be awesome, but anything to avoid having to "upgrade" to Win8 for the latest DirectX would be pure win.

The Rogue Wolf:

Adam Jensen:
But what if Mantle ends up having better performance than DirectX? Everyone would jump on the AMD bandwagon and we'd still end up with a monopoly.

This is probably what I'm most worried about when reading this - it could end up like nVidia's own PhysX, where AMD hardware has to do everything on the CPU, thus impacting performance... meaning the game runs better on one brand of cards than the other, not because that card's hardware is superior, but because the software favors one of them.

nVidia has been doing it for years, and it's been making AMD cards look bad even when the hardware itself is better. It's only fair AMD returns the favor.

I forever welcome the day DirectX can go get fucked.

PC gaming is going to be insane in 2014. Finally companies are getting off their ass and changing shit up.

Adam Jensen:
The only way this could ever become a real substitute for DirectX is if AMD makes it open source. We can't have just one graphics card manufacturer making use of this API. Several things could happen. Developers could refuse to use it, because by sticking with DirectX they would be supporting gamers with both AMD and Nvidia GPUs. Or they could use Mantle and abandon Nvidia, which would be stupid - we don't benefit from a monopoly. Or developers could try to make games using both. But what if Mantle ends up having better performance than DirectX? Everyone would jump on the AMD bandwagon and we'd still end up with a monopoly.

As the example with DICE shows, it's not a question of either Mantle or DirectX. I think the use of Mantle will probably end up with the AAA developers, who have the manpower and resources to make one version for DirectX and one for Mantle. I'm perfectly okay with that. I don't need games like FTL to adopt Mantle to leverage all my graphics power. However, I would like the next Crysis or Battlefield to be able to do that. So this is a feature for AAA developers to increase sales with the AMD crowd, but in no way will it replace the use of DirectX.

Charcharo:
This worries me. I'd rather see OpenGL and DirectX get an upgrade than return to the days of old on PC...

Yeah, but the latest DirectX version is a Windows 8 exclusive, just like how DirectX 10 was exclusive to Vista.

If I can get top quality graphics without having to perform an upgrade to a crappy tablet OS I say more power to AMD and OpenGL.

Ed130:

Charcharo:
This worries me. I'd rather see OpenGL and DirectX get an upgrade than return to the days of old on PC...

Yeah, but the latest DirectX version is a Windows 8 exclusive, just like how DirectX 10 was exclusive to Vista.

If I can get top quality graphics without having to perform an upgrade to a crappy tablet OS I say more power to AMD and OpenGL.

Yeah, but aren't these different APIs a good amount of work for developers to implement? I just don't like it when AMD and Nvidia have exclusive tech...

As an ATI fanboy let me just say I applaud their work in this much-needed field to break old brute-force conventions buried under 20 layers of overhead bullshit, but this might be real bad for PC gaming...
A proprietary API hinging on a single hardware manufacturer means this shit will not work properly on any other hardware, or possibly exclude other hardware altogether; worse yet, it could start cutting backwards compatibility, forcing you to buy their specific new products for each iteration (much like MS is trying to get going now).
Not to mention Nvidia has absolutely no intention of supporting ATI's plans; they also have 10x bigger budgets for everything, and this could very well spawn yet another proprietary API that would make this twice as horrible.

ATI's work could just as well have gone into OpenGL extensions: an established, freely available API that admittedly carries legacy conventions, but is there for anyone to use - a proprietary API is just the opposite.
Also, while ATI's software guys were working on new toys, their users had to contend with poor drivers and even worse OpenGL support... how about putting some time into that.

For the record: AMD said they are opening up the Mantle API, so that anyone can access the source. They also said they are open to supporting any hardware that offers the features required to use Mantle. It is not the tech from the XBONE or the PS4 - it is something similar - but their goal is to allow low-level GPU programming on any hardware that meets the minimum requirements; DirectX doesn't do this and won't, at least in the near future. nVidia is free to support it, just as it was invited to implement any other tech AMD released, such as their own physics solution. But nVidia is anti-competitive; the code they released to run on CPUs is intentionally crippled so it will be slow as hell, for example. Among the professionals, nVidia pushed CUDA for years and didn't care to implement OpenCL properly (as an example, bitcoin mining on an nVidia card is 10 times slower than on a CPU), and now that the industry is moving to OpenCL they pretty much have to play catch-up. I hope it will be the same with games too.

Mr.K.:
As an ATI fanboy let me just say I applaud their work in this much-needed field to break old brute-force conventions buried under 20 layers of overhead bullshit, but this might be real bad for PC gaming...
A proprietary API hinging on a single hardware manufacturer means this shit will not work properly on any other hardware, or possibly exclude other hardware altogether; worse yet, it could start cutting backwards compatibility, forcing you to buy their specific new products for each iteration (much like MS is trying to get going now).
Not to mention Nvidia has absolutely no intention of supporting ATI's plans; they also have 10x bigger budgets for everything, and this could very well spawn yet another proprietary API that would make this twice as horrible.

ATI's work could just as well have gone into OpenGL extensions: an established, freely available API that admittedly carries legacy conventions, but is there for anyone to use - a proprietary API is just the opposite.
Also, while ATI's software guys were working on new toys, their users had to contend with poor drivers and even worse OpenGL support... how about putting some time into that.

Mantle is open (though no mention of what the license is yet), not proprietary, so it is not a remake of the '90s with things like 3dfx Glide. And the API communicates with the kernel-mode device driver, so there is still one layer of abstraction for other manufacturers to adapt their products to be compatible.

Andrew_C:
I'll pass. Recently AMD has acquired a habit of promising the world and delivering underwhelming products. The Bulldozer CPUs were supposed to wipe the floor with Intel, but the first generation couldn't even match the performance of the previous generation of AMD CPUs.

Exactly. What we need is games built on OpenGL, as that is an open standard: you can easily move your game anywhere you want - Windows, OSX, *nix, phones, tablets, consoles, you name it. Then, as a game dev, you are not reliant on one platform to survive. It's good practice.

Mantle may be "open", and if it truly is open and can be easily ported to any platform then that's OK, but it seems to me we already have a perfectly good solution in place.

alj:
Exactly. What we need is games built on OpenGL, as that is an open standard: you can easily move your game anywhere you want - Windows, OSX, *nix, phones, tablets, consoles, you name it. Then, as a game dev, you are not reliant on one platform to survive. It's good practice.

Mantle may be "open", and if it truly is open and can be easily ported to any platform then that's OK, but it seems to me we already have a perfectly good solution in place.

OpenGL is pretty crap, performance-wise and otherwise, even compared to DirectX. Don't get me wrong, I would much prefer it if it were better, but unfortunately it's not. What Mantle would enable devs to do is basically the hacks you've seen on consoles, the ones that use every last drop of performance available to make things smoother and more pretty.

For example, companies like Naughty Dog used assembly to milk every last drop of power from the PS3; Mantle would enable similar things to be done on the PC. The biggest bottleneck to doing this on a PC has been hardware variety: to code for the PS3, you write low-level code for one piece of hardware, which lets you use code specific to that HW, whereas on PC it's obvious why you couldn't do that before. Imagine the possibilities on hardware much, much faster than last gen's, on steroids. For stuff like that, even DX is inferior, let alone OpenGL. As far as support is concerned, AMD is much more open with their tech than any other competing company. For example, nVidia is only now beginning to somewhat extend support for its own Optimus tech on Linux, and it's pretty much unusable.
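
The console-style tricks described above mostly come down to moving work out of the per-draw path. Here's a hypothetical sketch of that idea - no real graphics API is involved, and every class and method name is invented for illustration - contrasting a high-level API that re-validates state on every draw with a console-style command buffer that is validated once at record time and replayed cheaply:

```python
# Hypothetical sketch (invented names, no real graphics API): a
# high-level API pays validation cost on every draw call, while a
# low-level path records a command buffer once and replays it with
# almost no per-call CPU work.

class HighLevelAPI:
    """Every draw pays for driver-side validation again."""
    def __init__(self):
        self.validations = 0

    def draw(self, obj):
        self.validations += 1       # stand-in for per-call state checks
        return f"draw {obj}"

class CommandBuffer:
    """Low-level style: the checking is front-loaded at record time."""
    def __init__(self):
        self.commands = []
        self.validations = 0

    def record(self, obj):
        self.validations += 1       # validated once, when recorded
        self.commands.append(f"draw {obj}")

    def replay(self):
        return list(self.commands)  # submission is just a hand-off

scene = ["terrain", "tank", "tree"]

api = HighLevelAPI()
for _ in range(2):                  # two frames, high-level style
    frame = [api.draw(obj) for obj in scene]

cb = CommandBuffer()
for obj in scene:
    cb.record(obj)
frame1 = cb.replay()                # two frames, command-buffer style
frame2 = cb.replay()

print(api.validations)              # cost paid every frame: 6
print(cb.validations)               # cost paid once per object: 3
```

The design point: on fixed hardware the driver can trust prebuilt command streams, so the per-frame CPU cost collapses - which is exactly what a thin PC API tries to recover.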

Andrew_C:
I'll pass. Recently AMD has acquired a habit of promising the world and delivering underwhelming products. The Bulldozer CPUs were supposed to wipe the floor with Intel, but the first generation couldn't even match the performance of the previous generation of AMD CPUs.

I'll add my support to this sentiment with one word. Catalyst. I've been all nvidia for years ever since "upgrading" my gpu for the original Knights of the Old Republic, requiring me to use the first iteration of the catalyst drivers, and seeing immediately that the graphics performance was actually worse than in my previous generation gpu. I managed to find some non catalyst drivers that worked (posted on a board full of people equally pissed about this) and while it performed better, it still wasn't an improvement over the gpu I'd just replaced. Not too long after that, I switched to an nvidia (can't remember exactly, but I think this was during their 6-series) and was blown away. Since then, it's been 7800gt, 8600gtx, 9800gx2, and currently a pair of 560ti's.

tangoprime:

I'll add my support to this sentiment with one word. Catalyst. I've been all nvidia for years ever since "upgrading" my gpu for the original Knights of the Old Republic, requiring me to use the first iteration of the catalyst drivers, and seeing immediately that the graphics performance was actually worse than in my previous generation gpu. I managed to find some non catalyst drivers that worked (posted on a board full of people equally pissed about this) and while it performed better, it still wasn't an improvement over the gpu I'd just replaced. Not too long after that, I switched to an nvidia (can't remember exactly, but I think this was during their 6-series) and was blown away. Since then, it's been 7800gt, 8600gtx, 9800gx2, and currently a pair of 560ti's.

I know what you mean, ATI/AMD drivers have always been dodgy. I remember the time the ATI Catalyst drivers accidentally disabled 3D acceleration on AGP cards (when AGP was still the standard). And the only reason they release documentation to the developers of the open source Linux/BSD drivers is that they couldn't be bothered to write decent Linux drivers.

I'm using an old ATI Radeon HD 5770 at the moment, before that an NVidia 9600 GTX2. I have no loyalty to either brand

EDIT: and as everyone says, they should rather upgrade their drivers to support OpenGL 4.4 and put some work into the next versions of OpenGL, rather than splitting the market with yet another API.

Andrew_C:

tangoprime:

I'll add my support to this sentiment with one word. Catalyst. I've been all nvidia for years ever since "upgrading" my gpu for the original Knights of the Old Republic, requiring me to use the first iteration of the catalyst drivers, and seeing immediately that the graphics performance was actually worse than in my previous generation gpu. I managed to find some non catalyst drivers that worked (posted on a board full of people equally pissed about this) and while it performed better, it still wasn't an improvement over the gpu I'd just replaced. Not too long after that, I switched to an nvidia (can't remember exactly, but I think this was during their 6-series) and was blown away. Since then, it's been 7800gt, 8600gtx, 9800gx2, and currently a pair of 560ti's.

I know what you mean, ATI/AMD drivers have always been dodgy. I remember the time the ATI Catalyst drivers accidentally disabled 3D acceleration on AGP cards (when AGP was still the standard). And the only reason they release documentation to the developers of the open source Linux/BSD drivers is that they couldn't be bothered to write decent Linux drivers.

I'm using an old ATI Radeon HD 5770 at the moment, before that an NVidia 9600 GTX2. I have no loyalty to either brand

EDIT: and as everyone says, they should rather upgrade their drivers to support OpenGL 4.4 and put some work into the next versions of OpenGL, rather than splitting the market with yet another API.

THEY ARE NOT SPLITTING THE MARKET. Mantle is open source. NVidia can implement it should they want to. But they'd rather stick their own proprietary tech down everyone's throat until those people switch to an open standard (CUDA? OpenCL!).

Catalyst drivers are pretty bad, that's true. And the newer the driver version is, the worse they get for legacy hardware - we're coming to a point where the only drivers legacy hardware runs properly with are from 2011. This is even partly true for the HD66XX series. I own two HD6670s, and they both only work properly when I use the driver from 2011. With any newer driver, D3D performance goes down the drain and OpenGL applications crash the driver outright. I have no problems like this with the open source AMD driver for Linux.

Matthi205:
THEY ARE NOT SPLITTING THE MARKET. Mantle is open source. NVidia can implement it should they want to. But they'd rather stick their own proprietary tech down everyone's throat until those people switch to an open standard (CUDA? OpenCL!).

They are splitting the market. There are two perfectly good graphics APIs; we don't need a third. And "open" means nothing when one company controls the standard. After all, CUDA (which you obviously detest) is technically an open standard, but because NVidia controls it and would not relinquish control, OpenCL was devised and now everyone uses it - even NVidia kind of supports it.

I hope this fails. We don't need a Mantle that splits the PC market in half. Why can't people just use OpenGL anymore? That one worked, and while it's not as powerful as current DirectX, that's only because pretty much no one supports it anymore.

tangoprime:

Andrew_C:
I'll pass. Recently AMD has acquired a habit of promising the world and delivering underwhelming products. The Bulldozer CPUs were supposed to wipe the floor with Intel, but the first generation couldn't even match the performance of the previous generation of AMD CPUs.

I'll add my support to this sentiment with one word. Catalyst. I've been all nvidia for years ever since "upgrading" my gpu for the original Knights of the Old Republic, requiring me to use the first iteration of the catalyst drivers, and seeing immediately that the graphics performance was actually worse than in my previous generation gpu. I managed to find some non catalyst drivers that worked (posted on a board full of people equally pissed about this) and while it performed better, it still wasn't an improvement over the gpu I'd just replaced. Not too long after that, I switched to an nvidia (can't remember exactly, but I think this was during their 6-series) and was blown away. Since then, it's been 7800gt, 8600gtx, 9800gx2, and currently a pair of 560ti's.

Yeah, I as well was extremely disappointed by ATI (then Radeon) drivers and how badly they performed in pretty much everything, and I've been a happy camper on Nvidia's side. An Nvidia card is a prerequisite of me buying a computer now.
Another thing to note is that Nvidia actually works with game developers on optimization for, yes you guessed it, Nvidia cards. That's why, even if the raw performance isn't that great of a leap, they often perform better in actual games. Now granted, not all game developers receive this support, but they are actively trying to make it better.
Whereas AMD... continues to disappoint.
