Richard Huddy, the worldwide developer relations manager of AMD’s GPU division, says one of the biggest obstacles to PC gaming performance is simply that Microsoft’s venerable DirectX keeps “getting in the way.”
I’m not quite the PC hardware nut I used to be, so I’ll let Ben Hardwidge of bit-tech.net set the stage in his own rather poetic words. “Despite what delusional forum chimps might tell you, we all know that the graphics hardware inside today’s consoles looks like a meek albino gerbil compared with the healthy tiger you can get in a PC,” he wrote. “Compare the GeForce GTX 580’s count of 512 stream processors with the weedy 48 units found in the Xbox 360’s Xenos GPU, not to mention the aging GeForce 7-series architecture found inside the PS3.”
You don’t have to be a rocket surgeon to figure out that 512 stream thingies is a lot better than 48, so the obvious question is: why isn’t the PC pounding its console counterparts into the ground on the graphics front? PC visuals are generally accepted as being at least potentially better, but side by side the differences are usually slight, and sometimes, in terms of overall performance, the PC actually finds itself outpaced.
According to Huddy, one of the biggest stumbling blocks is the technology that Microsoft rolled out years ago specifically to make PC gaming better. “It’s funny. We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it’s very clear that the games don’t look ten times as good,” Huddy said. “To a significant extent, that’s because, one way or another, for good reasons and bad – mostly good – DirectX is getting in the way.”
Those good reasons are what led to the widespread adoption of DirectX in the first place. PC gamers of a certain age will no doubt have fond memories of messing around with VESA drivers or buying “special editions” of games that would only run on specific video hardware. But while nobody wants to go back to that era, unified APIs carry their own price tag.
“Wrapping it up in a software layer gives you safety and security, but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate,” Huddy said.
He acknowledged that “programming directly-to-metal” would make life more difficult for just about everyone: hardware manufacturers would have to ensure component stability while developers push the limits of the performance envelope, and PC enthusiasts would end up dealing with the inevitable fallout. Still, he maintained that when it comes to performance über alles, it’s the way to go. “In terms of doing the very best for the platform, that’s how they would actually achieve that,” he said.
Of course, how individual developers feel about that idea depends largely on what sort of game they’re making. “I don’t want anything to do with that, but presumably it depends on what you’re developing,” said Chris Delay, lead designer and developer at Darwinia and Defcon studio Introversion. “If you’re making Crysis 3 or something like that, then it may be exactly what you want.”
Crytek’s R&D Technical Director Michael Glueck did in fact say the idea “would appeal to us,” although he added, “It definitely makes sense to have a standardized, vendor-independent API as an abstraction layer over the hardware, but we would also prefer this API to be really thin and allow more low-level access to the hardware. This will not only improve performance, but it will also allow better use of the available hardware features.”