In 1992, you could pay $40 for a forty-hour game that was unlike anything we'd ever seen before. Today, you'll pay $60 for a ten-hour game that plays much like a lot of the titles you already have on your shelf. (Assuming you can get the thing to run at all.) We're getting shorter games, less innovation, and buggier releases. All this, and developers are still having trouble keeping up financially and technologically. The constant push to improve visuals is hurting both parties, and I think it would be great if we could just call a graphical time-out and try to make the most of what we have now.
It costs a lot to jump from one generation of technology to the next. Each new graphics engine has its own tools, its own quirks, its own limitations, its own visual trade-offs. It takes time to master these tools, and for the most part we're throwing them out just when artists are getting good at them. Compare the debut PS2 titles with the games that came out near the end of its lifespan. (Which would be now, I guess. Quality PS2 titles are still coming out.) The newer games look better and run smoother, even though the hardware hasn't changed. It's possible to improve the visuals and performance of your game without changing the hardware at all, just by giving artists enough time to become adept with the tools.
What developers should do - and what should have happened years ago - is start treating the PC (and if we're lucky, the Mac) like a console. Pick a nice safe spot on the tech curve and make that your baseline target platform. Now keep it there for eight years or so. When you finish a game in 2003, make another game aimed at the same 2003-level hardware. Then another. Get three or four games out of your tech before you re-invent the wheel. Sure, it means the graphics will look a little stale the third or fourth time around, but the games will be cheaper to produce. Millions of dollars cheaper.
And at this point in the tech curve, a lot of people might not even notice you're standing still. Quake II came out five years after Wolfenstein 3D. In those five years we'd seen the world of in-game graphics revolutionized twice. (At least.) Anyone who released a game with Wolfenstein-level graphics in 1997 would have been laughed at. Yet here we are five years after the release of Doom 3, and that game barely looks dated at all. You could be pumping out games based on 2004-level technology and produce something that's commercially viable, attractive to look at, and relatively cheap to make. (Cheap compared to chasing after the next engine, anyway.) I suspect that with strong art direction and experienced artists you could actually get another five years out of that 2004 technology before you absolutely had to move to a new generation.
Yes, there are mainstream game reviewers out there who are obsessed with graphics and spend their non-gaming hours masturbating to the NVIDIA product catalog. They will indeed give you a hard time because you're not using the next-gen bling mapping. I'm sorry about those guys. But for what it's worth, some reviewers won't do that, and I think consumers will be happy to pony up for your game as long as it's fun. This might sound risky, but think about the millions you'll save in development costs. You'll be producing a game for less money that can run on a far larger portion of PCs. It will run smoother, be less of a support headache, and give gamers more value for their gaming dollar. That sounds like a winning strategy to me. All you have to do is sacrifice a bit of your graphical spectacle. The odd snarky review might cost you a few sales, but I can't imagine it will hurt you as badly as riding the bleeding edge. What are you after here? Do you want the approval of a jaded graphics fetishist or do you want to make awesome games?