Experienced Points
Just How Does the Oculus Rift Work?

Shamus Young | 23 Sep 2014 19:00

One of the most annoying things about being a fan of this VR revolution is that a lot of people see no reason to be impressed, because they thought we were able to do this years ago. We've been showing it in movies and talking about it for ages. Heck, Nintendo even had that Virtual Boy thing in the 90's! Why are people acting like this virtual stuff is new?

And there's some truth to that. We have indeed been building VR prototypes since at least the late 80's, and it has always seemed like VR was right around the corner. If you heard about a VR prototype in the early 90's and you're not an avid follower of VR news, then it would be perfectly natural and reasonable to assume that it all worked out and VR was invented years ago. That's certainly how most technologies work.

But the story of VR has been sort of strange, because it's much less about machines and much more about learning how our bodies work. We've discovered that our eyes and our vestibular system are strongly linked, and there are limits on how far you can trick one before you piss off the other.

A great way to illustrate the complexity of the problem is to just go over what the Oculus Rift has to do to bring you something as simple as a pair of images. Note that I'm a software guy, not a hardware guy, so a lot of this is simplified because I don't feel qualified to discuss the finer points of accelerometers and the like. Still, this should give you a good broad-strokes overview of what this thing does when it's strapped to your face.

We begin with an OLED display. The old CRT screens (you remember those heavy old things, right?) were too heavy to use for VR. Aside from the neck strain of having one strapped to the front of your head, there's the problem that those screens flickered very, very fast. It was tolerable at normal viewing distances, but it would have been seizure-inducing in VR. An LCD screen is better, but the pixels can't turn on and off quickly enough to keep up in VR. So we need OLED screens. (OLED stands for "organic light-emitting diode", which sounds like Star Trek technobabble, but is a real thing.)

Now we render our virtual world to the screen. We've got to do it quick, which means we need a really good computer. Cutting-edge games struggle to hit 60 frames a second, but in VR that's the bare minimum. (Anything less feels awful.) 75 fps is much better. (And we probably need something like 120 fps to completely trick the eyes seamlessly.) Oh, not only do you need to hit this high frame rate target, you need to do it while rendering the whole scene twice - once for each eyeball.
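To get a feel for how tight those numbers are, here's a back-of-the-envelope frame-budget calculation. I'm assuming a single 1920x1080 panel shared between both eyes (the resolution of the Rift development kit that was current around this time); the rest is simple arithmetic:

```python
# Back-of-the-envelope VR frame budgets.
# Assumes one 1920x1080 panel split between two eyes.

PANEL_W, PANEL_H = 1920, 1080  # whole panel, shared by both eyes

def frame_budget_ms(fps):
    """Milliseconds available to render one complete frame."""
    return 1000.0 / fps

def pixels_per_second(fps):
    """Pixels the GPU must fill per second to keep the panel fed."""
    return PANEL_W * PANEL_H * fps

for fps in (60, 75, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame, "
          f"{pixels_per_second(fps) / 1e6:6.1f} million pixels/sec")
```

At 120 fps the renderer gets barely 8 milliseconds per frame, and the scene still has to be drawn once per eye within that budget.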

Now we have a split screen with an image for each eye. But we need this little six-inch screen to envelop the viewer. (Note how you don't feel "enveloped" if you just mash your face into your smartphone.) The way to do this is to set the screen a couple of inches from the eye, and then use lenses to bend the image. But these lenses also distort the image. They "pinch" everything towards the center.

This distortion is unnatural and would ruin the illusion. The solution is to run a shader on the image. This is a special full-screen effect (like full-screen anti-aliasing or depth of field, if you've heard those buzzwords before) that distorts the image by stretching it out in the center, perfectly negating the lens distortion. So not only are we rendering a video game at an insane frame rate (twice!) but we're also doing this while applying complex full-screen effects.
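The usual form of that counter-distortion is a radial "barrel" warp: each pixel is pushed away from the center of the lens by an amount that grows with its distance from the center. Here's a minimal sketch of the idea in Python rather than actual shader code; the coefficients k1 and k2 are made-up illustrative values, not the real SDK constants:

```python
# Sketch of a radial barrel warp, the general shape of the
# counter-distortion applied before the image passes through the
# lens. k1 and k2 are illustrative values, not real SDK constants.

def barrel_distort(u, v, k1=0.22, k2=0.24):
    """Push a point (u, v) away from the center by a radial
    polynomial. Coordinates are centered: (0, 0) is the middle of
    the eye's half of the screen, (+-1, +-1) the corners."""
    r2 = u * u + v * v                   # squared distance from center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# The center is untouched; points near the edge move outward more,
# pre-compensating for the lens pinching the image inward.
print(barrel_distort(0.0, 0.0))
print(barrel_distort(0.5, 0.0))
print(barrel_distort(1.0, 0.0))
```

In the real headset this runs as a full-screen shader on the GPU, once per eye, on top of everything else the renderer is doing each frame.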
