Victim of Technocide

Nothing will start a flamewar faster than asserting that the PC is “dead” or “dying” as a gaming platform. But I hope I can at least point out that the PC isn’t driving the industry the way it used to without PC stalwarts exploding into a state of apoplexy. I know that PC doomsayers have been wearing sandwich-board signs proclaiming “THE END IS NIGH” since long before Duke Nukem ran out of chewing gum, and the platform still won’t be dead when they release the next batch of Duke Nukem Forever screenshots in 2017. But while “death” is not an appropriate word for what has taken place on the PC, the platform’s influence isn’t what it used to be.

Ten years ago, the PC ruled the game store like the mighty T-Rex. The shelves were stacked floor-to-ceiling with PC games, PC peripherals and PC gamers – the exotic console stuff was all heaped along one wall. Now that setup is reversed. Long-time PC developers like id Software and Epic Games are beginning to see their PC efforts as secondary. Games are made for consoles first, and then (sometimes ineptly) ported to the PC as an afterthought. Yes, there are good PC games left out there. The PC platform lives on, but the glory days are over. And in the end, it was its greatest strength that led to a reversal of its fortunes: the GPU.

The main advantage of the PC as a gaming platform was that it wasn’t actually a gaming platform – not on purpose, anyway. People had PCs, and with ingenuity it was possible to make games that would run on them. Consoles were (incorrectly) seen as toys for kids and teens, and few adults would run out and buy an N64 or a Sega Genesis for themselves unless they were already gamers. What non-gamer would go out and drop $200 on a videogame system just to see if they would like it? Consoles could bring kids into the fold, but usually not adults.

On the other hand, a huge portion of the population went out and bought computers because they had computer stuff that needed doing. The computers were general-purpose devices, so gamers and non-gamers had pretty much the same machines. Even the machine Mom used to check her email was capable of running the latest games for the first year of its life.

This made PC games the gateway drug for most adults. The curious non-gamer could take a demo or a game on loan from a friend, run it on their PC and undergo the metamorphosis that transformed normal, well-adjusted people into bleary-eyed gamers who stayed up until dawn playing SimCity 2000, Dungeon Keeper, or (God help them) Civilization.

When the mighty GPU came onto the scene it made a host of new effects possible, and the first batch of games to take advantage of the technology straddled the line between the hardcore and the average gamer. You could play the game un-accelerated for a passable experience, or you could pop in one of those fancy Voodoo cards and see better effects, more detail and faster framerates. Big titles like Quake II and Unreal Tournament worked with and without hardware acceleration so that people with off-the-shelf PCs could play right along with the big spenders and hardware fetishists. Better hardware meant the game would look better, but everyone could play. This situation was beautiful, ideal and completely unsustainable.

As graphics cards became increasingly powerful, it became less and less practical to write games that would take advantage of the hardware without requiring it. Inevitably the ad-hoc PC platform was riven into the accelerated and the un-accelerated. Suddenly it was possible to buy a brand new computer that wasn’t capable of running games. Suddenly buying a graphics card was a requirement just to get in the door. At the same time, the graphics card market began to grow more byzantine. No longer were we shopping for a Voodoo 3, Voodoo 4 or Voodoo 5. Now we were shopping for specific chipsets, which were balkanized into sub-markets by memory loadouts and slot interfaces, and then further divided by vendors and brand names. Sure, there will always be people willing to do their homework at Tom’s Hardware, sink $200 into the latest pixel-accelerating toaster oven, pop open their computer and muck about installing the thing. But the number of people up for that sort of puzzle-solving will always be less than the number of people who got a computer because they needed something to plug the printer into, but who wouldn’t mind playing some games in the evening.

The sad thing is, I don’t see how this could have been averted. What was NVIDIA going to do, not sell graphics cards? Should gamers have not bought them? Should developers have just ignored the advanced hardware? Everyone acted rationally. Everyone moved forward because they wanted the PC to prosper. And yet, the changes they brought about ended up building a wall around PC gaming.

It might have mitigated the problem, though, if integrated graphics cards didn’t suck quite as badly as they do. Those built-in graphics cards on new PCs are shamefully useless. The Amish could whittle a more impressive GPU from driftwood than what comes prepackaged in the standard-issue new PC. It’s not that they run games slowly, or that games look bad – it’s that they usually can’t run new games at all. (And remember where we began: In the ’90s, new computers could run new games.) If the average integrated GPU were up to the job of running new games (albeit poorly), then the PC could still be an onramp to gaming.

Imagine how huge the Xbox 360 would be if it were as common as the personal computer. If there were classrooms full of them in every school, and you could use them for free in every internet cafe and library. This was what the world looked like for PC gamers before graphics acceleration created a distinction between “computers that can run games” and “regular computers.” PC enthusiasts – a group which includes me – often point out the openness of the hardware, the better resolution or the mouse and keyboard interface as the strengths of the platform. But none of those were the real reason PC gaming was so big for so long. The real reason PCs reigned was because everybody had one.

Nothing “killed” PC gaming. It just stopped being the all-encompassing omni-platform of ubiquity, and now sits around the retirement home drooling on itself and muttering about the good old days.

Shamus Young is the author of Twenty Sided, the vandal behind Stolen Pixels, and is often baffled by graphics hardware.

