Experienced Points
With Great Power...

Shamus Young | 21 Jan 2014 19:00

How does Uncle Ben's quote go? "With great power comes the responsibility to not behave like a self-destructive ninny?" Something like that. Well, people developing on next-gen consoles (and PCs) have a ridiculous amount of power now. If you've got even a mid-range computer with a decent graphics card, then your computer has more processing power than every computer that existed before the year I was born (1971). That's including the supercomputers built by world governments and all the computers involved in sending humans to the moon. That's a lot, you know?

I said last week that this new console generation threatens to break the already strained budgets that developers are working with. But this doesn't mean that more processing power is a bad thing. I'm not suggesting that consoles should never evolve or that we would have been better off if computing power had stagnated around 2004. I'm not even against better visuals per se. What I'm really against is spending more than you make in order to achieve visuals that don't enhance the game. That talking demo head was cool to look at, but it also represented a massive increase in the time and effort needed to get it all working just so. (And I still think it's got a bad case of "creepy crazy robot" when it talks.) Does it represent a massive improvement to the fun and interest level of a game? Probably not.

Well, we can't un-release these consoles, so we might as well make the best of it. What can we do with all this next-gen power? How can developers use it without going broke?

My first suggestion would be to focus more on macro detail and less on micro detail. This is tricky to explain, because when people hear "more detail" they think "more detailed objects," which take longer to produce. The trick is that stuff is getting cheaper to render but more expensive to produce. So what we want is a situation where we can fill out environments with easy-to-replicate stuff.

For years we've been making our scenery clutter more detailed: Boxes, keyboards, books, cups, clothing, containers, furniture. Twelve years ago a "keyboard" on a scientist's desk might have been a simple flat rectangle with a grid of squares, like an oblong bingo card. Six years ago it was basically keyboard-shaped, but you couldn't make out the individual keys. Today in-game keyboards look like keyboards, but if you zoom in you can see the letters are too blurry to read. If the trend continues, then I suppose next-gen games will have nice, crisply defined keyboards that look like the real thing. And most players won't notice the difference, because they run through the room and shoot all the bad guys without mashing their face up against all the objects in the room. Making that initial keyboard model is time-consuming. Copying and pasting it all over the place is cheap. Instead of making the keyboard "perfect", just add more last-gen level clutter to the room.

Instead of making more detailed stuff, continue making 2008-level stuff and just copy and paste more of it into the environment. More stuff to make the place feel authentic, and more crap to get tossed around by explosions, wizard spells, or whatever. It's cheap, it's quick, and at this stage it will make a bigger difference to how interesting the place is.
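The economics of copy-and-paste clutter come down to what graphics programmers call instancing: you pay the authoring cost once per unique asset, then scatter cheap placements of it around the level. Here's a minimal sketch of the idea; `Mesh`, `Instance`, and `fill_room` are hypothetical names for illustration, not any real engine's API:

```python
from dataclasses import dataclass

@dataclass
class Mesh:
    """Stands in for an expensively authored asset (model, textures, etc.)."""
    name: str
    vertex_count: int

@dataclass
class Instance:
    """A cheap placement of an existing asset: a shared reference plus a transform."""
    mesh: Mesh  # shared reference, not a copy
    position: tuple

def fill_room(mesh, positions):
    """Scatter many placements of one authored asset around a room."""
    return [Instance(mesh, pos) for pos in positions]

keyboard = Mesh("keyboard_2008", vertex_count=900)  # authored once
clutter = fill_room(keyboard, [(float(x), 0.0, 0.0) for x in range(50)])

# Fifty placements in the scene, but only one copy of the expensive asset.
unique_meshes = {id(inst.mesh) for inst in clutter}
print(len(clutter), len(unique_meshes))  # 50 1
```

The point of the sketch is the asymmetry: adding a fiftieth keyboard costs one more transform, while adding a fiftieth *kind* of object costs another round of modeling and texturing.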

Along the same lines, we could increase the number of ambient NPCs. What if, instead of ramping up the model detail of next-gen NPCs, we stuck with the models we've been looking at (which are already pretty good, in my opinion) and just had more of them? In GTA we could have streets packed with traffic and sidewalks packed with pedestrians.

A lot of people might suggest that we should spend these sexy new processor cycles on making smarter AI. For the last couple of generations the console makers talked about how the next generation would give us amazing new levels of sophisticated AI, but in the end it feels like we've got roughly the same grade of cannon-fodder mooks we've been fighting for years. While I wouldn't mind seeing some experimentation in this area, I'll admit this one isn't bloody likely. We're not even sure if players want smarter foes. Maybe I'll do another column on this one, but I think AI probably needs a good "killer app" to show the potential and get people excited about it again.

But while nobody is sure if we want "smarter AI", just having more AI is perfectly reasonable. Check out this game, which is going for having real-time battles with 10,000 ships on each side. Yes, that's a PC title, and the Mantle technology they're talking about is mostly part of an ongoing pissing match between hardware companies, but this is very much a direction we could go in.

We could also enjoy some larger levels with shorter load times. If we're not cranking up the polygon count, then we won't have to pull as many polygons off disk when it's time to change levels. Same goes for texture maps. Less graphics data means less time spent waiting for the game and more time playing it.

But probably the best thing for developers to do with this extra power is just cut tons of corners. A lot of testing and development time is spent looking for and fixing those spots in the game where you hit a hitch due to some I/O bottleneck, or spots where the framerate drops. But this won't be a problem if we don't increase the graphics workload. If we're not constantly redlining the machine, then not everything needs to be perfectly tuned just to keep it running smoothly. We can spend less time worrying about how a game runs and more time worrying about how it plays.

Again, graphics are nice. I'm a graphics programmer myself, and I can see the value and attraction of pushing a machine and making it do something amazing. But it's all a matter of tradeoffs. Graphics shouldn't come before entertainment value, and they really shouldn't come before good business sense.

Shamus Young is the guy behind Twenty Sided, DM of the Rings, Stolen Pixels, Shamus Plays, and Spoiler Warning.
