Xbox One vs. PS4

So the controversy over the past couple of months is that the PS4 outperforms the Xbox One in terms of frame rate and resolution. No, actually the Xbox One has a performance advantage. No, actually it doesn’t, but it doesn’t matter. But that hasn’t stopped people from compiling massive lists comparing the performance of the two platforms.

This argument is actually four-dimensional. On one axis are the merits of the PS4 vs. the Xbox One with regards to price vs. performance and features. On another axis is the argument over whether framerate or resolution is more important. Then there’s another axis where one extreme says that these numbers are critical and the other says they don’t matter. Then finally we have the debate over the evenness of the framerate and whether or not it trumps any of the other arguments. These debates form a non-Euclidean (and extremely hyperbolic) volume of debate-space, and half the work in any given argument is in determining if any two participants are even operating in the same frame of reference.

Which is to say that the whole thing is a confusing, angry mess. For the record, I don’t have a horse in this race. I loved the PS2 and still think of it as the greatest console ever. I thought the PS3 was a technological misfire. I liked the Xbox 360, but once mine bricked I never had a strong desire to replace it. And now I’ve got such a massive backlog of PC games that I can’t justify getting a current-gen console. Why pay hundreds of dollars to gain access to a small library of titles I don’t have time for?

But while I don’t have any first-hand experience with either platform, I do have a lot of experience analyzing, scrutinizing, and agonizing over frame rate. I can’t settle the debate, but hopefully this will cut down on the ambient level of misinformation and misunderstanding we have going on here.

This debate isn’t new. Way back in 1998, now-defunct graphics card manufacturer 3dfx Interactive was busy turning out hardware designed to render blazing-fast, high-resolution images using 16-bit color. At the time there was a big debate over what was more important: raw speed or color depth.

16-bit vs. 32-bit

To illustrate the difference, I took this image and created a version to simulate the difference between 32- and 16-bit color depth (seen at left).

For the nitpickers: I did this by reducing the red, green, and blue channels to 4 bits each, assuming that in the context of a game the other 4 bits might be given over to the alpha channel. I also reduced the resolution, both to make the difference easier to see and to better show off how big the pixels were back then. Of course, back then we were looking at cartoonish texture maps and not photographs of ladies cosplaying as Rainbow Brite, but in this same time period game developers were going mad with colored lighting and fog, which greatly exacerbated the problem. The point is: we can haggle over how accurate this image is, but for the purposes of this discussion it’s good enough to show the reader what the 16-bit vs. 32-bit distinction might have looked like in the context of a late-’90s PC game.
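The channel reduction described above amounts to quantizing each 8-bit channel down to 16 levels. Here is a rough sketch in Python of what that operation looks like (illustrative only; a real image tool would apply this to every pixel of the image):

```python
# Quantize 8-bit RGB channels down to 4 bits each, roughly
# approximating 16-bit color with 4 bits reserved for alpha.

def quantize_channel(value, bits=4):
    """Reduce an 8-bit channel (0-255) to `bits` bits of
    precision, then scale back into the 0-255 range."""
    levels = (1 << bits) - 1              # 15 steps for 4 bits
    return round(value / 255 * levels) * 255 // levels

def quantize_rgb(color, bits=4):
    return tuple(quantize_channel(c, bits) for c in color)

# A smooth gray ramp collapses into visible bands:
gradient = [(i, i, i) for i in range(0, 256, 8)]
banded = [quantize_rgb(c) for c in gradient]
print(len(set(banded)))  # 16 -- only 16 distinct grays survive
```

That collapse from 32 shades of gray down to 16 is exactly the banding you see in the lower image.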

Most of the rest of the industry was moving to 32-bit color, which would produce something like the top image. 3dfx was doubling down on raw speed with 16-bit color. The game would (in theory) run faster and allow for higher resolutions, but would suffer from the color banding you see in the lower image.

People debated which one was more important. Some found the color banding incredibly ugly and distracting. While I could see the color loss if I tried, it never bothered me. Maybe it was because I spent so many years looking at 4-bit CGA graphics that 16 bits seemed “good enough” to me. In any case, it was important to a lot of gamers and developers. In the end, 32-bit won out over 16-bit.

It’s important to note that frame rate and refresh rate are two different things. I see the two terms used interchangeably, and it always makes my eye twitch. Refresh rate is how often your monitor or television shows changes. This rate is fixed and completely unrelated to whatever the videogame might be doing. This gets confusing for some people, because monitor refresh rate is usually expressed in hertz, while videogame speed is expressed in frames per second. In a perfect world these two numbers would match, but because of the strange way the technology evolved, most older displays showed 30fps and ran at 60Hz. (Or if you lived in America, it ran at 29.97 frames a second, for reasons that seemed perfectly sane in 1941 but ultimately just drove everyone crazy for the next half-century.)

Back in the day, those old CRTs needed to refresh as often as possible. The Cathode Ray Tube generated images by blasting a phosphor-coated screen with electron guns, which sounds very Star Trek but is actually very low-tech by today’s standards. That system produced an image that flickered very slightly. It needed to flicker as quickly as possible, because the faster it goes, the smoother it seems to the eye. If it pulsed at only 30 times a second, then watching television would feel like staring into a strobe light (which, in effect, it is), which can cause headaches, nausea, vomiting, and seizures. Refreshing 60 times a second was deemed fast enough to keep most of us from getting sick. (Although some sensitive people, like my daughter, can still get violently ill even at 60Hz.)

30fps vs 60fps

At the same time, we didn’t really have the technology or desire to broadcast 60 unique images a second. So instead we recorded television shows at 30 frames a second, and then smeared those 30 frames over 60 updates using various schemes that sound awful but seemed to work out fine in practice. This became extra fun when we put feature films on television. Films were recorded at 24fps, which doesn’t divide nicely into 60fps the way that 30fps does.
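One of those “various schemes” for film is commonly called 3:2 pulldown: hold each film frame for three refreshes, then the next for two, and repeat, so that 24 frames exactly fill 60 updates. A toy version in Python (simplified; real pulldown juggles interlaced fields rather than whole frames):

```python
# Toy 3:2 pulldown: hold film frames for 3 and 2 refreshes
# alternately, so 24 frames per second fill 60 display updates.

def pulldown(frames):
    """Expand film frames into the sequence a 60 Hz display shows."""
    shown = []
    for i, frame in enumerate(frames):
        hold = 3 if i % 2 == 0 else 2     # 3, 2, 3, 2, ...
        shown.extend([frame] * hold)
    return shown

one_second = list(range(24))              # 24 unique film frames
print(len(pulldown(one_second)))          # 60
```

The uneven 3-then-2 hold is also why film pans on old TVs had that faint, characteristic judder.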

This all worked out well enough until the age of video games, when your framerate became this variable thing. If the computer slowed down, then you might not get many frames a second. If the computer was fast enough, it might be able to provide a completely unique image for every single refresh cycle of the monitor, which is something you couldn’t get from television and movies.

Remember that regardless of framerate, refresh rate is a fixed, immutable thing. If a programmer goes insane and makes a game that runs at 120fps on a 60Hz display, then half of the frames will simply be lost. You’ll draw a frame and show it to the user. Then you’ll draw a frame, but the physical monitor isn’t ready to refresh yet. Then you’ll draw another frame, and that one will get shown.

If a game runs at 30fps, then half the frames will be repeated. You’ll draw a frame and show it. Then you’ll begin the next frame. When you’re halfway done, the monitor will be ready for a new one. Since you don’t have one ready yet, it will just re-use the previous image. Then you finish the frame just in time for the next refresh.

Note the uneasy case where (for whatever reason) the game runs at something oddball like 40fps. You draw a frame and show it. Then you’re only two-thirds of the way done when the refresh comes, so it repeats the previous image. Then you finish an image, but the refresh isn’t ready yet. Then when the refresh happens the current image is a bit old, but the new one is only one-third done. Then on the next cycle the frame and the refresh are in sync again. The result is this strange stutter-step where it feels like the game keeps shifting between 60fps and 30fps.
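The three cases above can be modeled in a few lines of Python. This is a deliberate simplification: it assumes every frame takes exactly 1/fps seconds to draw and ignores the details of vsync.

```python
# Which frame does each 60 Hz refresh display, for a game
# producing frames at a steady `fps`? The newest completed
# frame at refresh r is floor(r * fps / refresh_hz); integer
# math avoids float rounding at the exact boundaries.

def refreshes_shown(fps, refresh_hz=60, n_refreshes=6):
    return [r * fps // refresh_hz for r in range(n_refreshes)]

print(refreshes_shown(120))  # [0, 2, 4, 6, 8, 10]  odd frames lost
print(refreshes_shown(30))   # [0, 0, 1, 1, 2, 2]   each frame twice
print(refreshes_shown(40))   # [0, 0, 1, 2, 2, 3]   stutter-step
```

Notice the 40fps pattern: a repeat, then two fresh frames, then a repeat. That alternation between effective 30fps and 60fps is the stutter-step described above.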

Quake 1

This sounds horrible, but whether or not it bothers you (or if you even notice) depends a lot on both your hardware and your wetware. When I was 27 and Quake was new-ish, I remember noticing the difference between 30fps and 60fps as I geeked around with my hardware and graphics settings. Today? I’m 43, and anything over 30fps is wasted on me. I can barely tell the difference, and it doesn’t start to bother me until it gets below 20.

That’s another thing about this debate over console framerates: It’s all very subjective. Sure, 60fps is, in a completely objective and technical sense, better than 30. And 720p is objectively worse than 1080p. But how much of a difference it makes depends a lot on who you are and where you game. How large is your screen? How far are you from it when playing? Are you directly in front of it, or slightly off to one side? Where are the lights and windows in the room? Do they create a bunch of glare? How old are your eyes, and are you the kind of person who is really “sensitive” to framerate? Do you play cautious and slow-paced games like Splinter Cell or Thief, or do you favor lightning-fast online multiplayer? Do you play for long sessions or short? Are you wide awake or at the tail end of the day when you game?

All of this impacts how your eyes perceive the game. A lot of the debate between the people insisting that framerate and resolution are SO IMPORTANT and the people saying they’re MEANINGLESS comes down to the difference between playing in a dark room two meters from the screen or a bright office where the monitor is right in your face. It’s not that some people are lying, or delusional, or haters of one platform or the other. It’s that their gaming situation is likely very different from yours.

So before you wade into the debate, ready to explain to everyone on the other side just how wrong they are about 1080p and 60fps, just remember that it’s all subjective and not nearly as important now as it used to be. (Although if VR headsets take off, framerate will become more important than ever, since frame-skipping on a screen strapped to your eyeballs can be a ticket to puke city.) In the long run, frame rate and resolution aren’t nearly as important as price, games library, and usability. All the resolution in the world won’t make a terrible game fun, and people still love the original Thief games, even though their graphics are horrible to the point of comedy. Just keep a sense of perspective.

Shamus Young is a programmer, a novelist, and a blogger.
