Experienced Points
Why Video Games Need Their Own Programming Language

Shamus Young | 9 Dec 2014 19:00

You've probably heard of Jon Blow. He's the designer of Braid and The Witness, and one of the first of the new-wave indie auteurs that began this whole indie craze a few years ago. Back in September he posted a two-hour video where he talked about the need for a new programming language designed specifically for games. He's certainly not the first to notice this need, and I'm sure he won't be the last, but I'm bringing this up because he's yet another important voice pointing out the awful situation our industry is in.

All AAA games are written in the programming language called C, or its descendant C++. (Or nearly all. If there are any exceptions to this rule, I can't think of them.) Yes, other languages are sometimes a part of game development. Minecraft is written in Java, and lots of AAA games use various scripting languages on top of C. But the C language is still the cornerstone of AAA game development.

(For the rest of this article I'm not going to bother making a distinction between C and C++. Most games are actually written in C++. The difference is pretty important to programmers, but those differences don't matter to the average non-coder. Broadly, C++ is a newer, hipper version of C. And by "newer" I mean it came out in 1983.)

We use C to make our games, our game development tools, the graphics drivers, and even the operating systems that run the games. To understand how strange this is, we need to look at what C is and what it was designed for.

In this discussion, there are two broad areas of programming: Applications programming, and systems programming. Applications programming is making software for you to use directly: Games, word processors, web browsers, MP3 players, digital delivery clients like Steam or Origin, and basically anything else you launch on your computer by clicking on an icon. Systems programming involves writing the software that supports all of that stuff: Your operating system, your device drivers, and that sort of thing. If you've ever opened up Task Manager in Windows and looked at the long list of crap that's always running, most of that stuff qualifies as "systems". It only exists to make the application stuff go.

C was designed to be a systems programming language. It was supposed to be a language for writing stuff like operating systems. It was devised in a world before personal computers. It was made in the age of Big Iron, at a time when computers had a million times less memory than they do today. There are lots of fussy details in the language designed to save a few bytes of memory that, in today's world, simply do not matter.

It was created in a world where software was less complex than it is today. Your typical AAA game of 2014 will be thousands of times more complex than entire operating systems of 1972. Consequently, the language is focused on saving memory and CPU cycles, and not focused on helping the coder manage terrifying levels of program complexity. John Carmack has said in the past that modern graphics programming is more complex than rocket science. And to be clear, he might be the only person on earth who has done both professionally.

By using C, developers are making programs that are more complex, harder to manage, more likely to have bugs, and slower to write, and in return they're saving CPU cycles and memory, which are now relatively plentiful.

The language was invented for a completely different kind of computer, in a different time, for a different purpose. So how did we end up building an entire industry of complex entertainment software on top of a language designed for writing operating systems in 1972? The obvious answer is the same problem that afflicts a lot of other areas of technology: Legacy infrastructure. People use C because people know C. People study C because all the jobs use C. And the jobs use C because that's what everyone knows. It's a self-perpetuating cycle.

Moving to a new language would mean giving up the various graphics engines, AI systems, interface tools, and coding tricks we've built up and starting over with a clean slate. It would mean trading in your experienced veterans for fumbling newbies (even if you're still using the same people) and having those newbies re-write your large, expensive C libraries.
