Experienced Points

Why the Oculus Rift is a Big Deal

Shamus Young | 8 Apr 2014 15:00

Maybe you're really upset that Facebook bought Oculus Rift. Maybe you're okay with it. Maybe you're wondering what the big deal is. I mean, it's just a gimmick peripheral, right? Who cares? And they've been talking about it for twenty years. What makes it so special this time?

What makes VR so special? Isn't it just like playing a game using 3D glasses? Minecraft was doing that years ago!

There's actually a lot of really cool stuff going on here that you don't get with 3D glasses. From a neuroscience standpoint, VR is talking to a much more fundamental part of the brain. Your eyeballs are not just meat cameras, and the way the brain processes images is incredibly complex and still not completely understood. When you're looking at a monitor - even a really big monitor - you're still only dealing with stuff directly in the center of your vision, which leaves out lots of the image-processing parts of your brain. Sure, you're looking at the monitor, but a lot of your brain is also busy looking at the room around you, very much aware that it's just looking at a rectangle of color while you're sitting in a chair.

By completely filling your view and extending into your peripheral vision, a VR headset gives you a level of immersion that just doesn't happen with a standard monitor, or even with a bank of monitors like in a simulator. In fact, this level of immersion is so fundamentally different from what you've experienced before that it has its own word: presence. Presence is the sensation of "being there". It makes it possible to experience vertigo and a sense of scale in a way that just isn't possible using monitors or 3D glasses.

We had VR prototypes 20 years ago, so why aren't they out already? What's taking so long?

It's true. VR headsets have been one of those "right around the corner" technologies for more than 20 years. Like flying cars, thinking computers, and jetpacks, it's a symbol of some exotic future that always seems to be just a little over the horizon but never gets any closer. Like waiting for Half-Life 3, the public's demand has intensified simply because nobody can explain what's taking so damn long.

The big problem 20 years ago was that we didn't realize how much of a challenge VR was going to be. It was assumed that we could just slap some LCD screens over our eyeballs and we'd get "3D vision". We didn't know how mind-blowing the experience would be, but we also didn't understand how hard it would be to pull off. Every time we solved one problem, a new one emerged.

The first problem was that our LCD screens weren't good enough for VR. The screen you're using to read this article - even if you're using a high-end smartphone - is vastly superior to the screens of just four years ago, and it's still not quite good enough for VR. One problem is resolution: 1280x720 is great when you're sitting back looking at a screen, but it's lacking when it fills your entire field of view. (Especially since those pixels are split between the eyes - just 640x720 for each eye.) A more serious problem is persistence: when an individual pixel changes from one color to another, the time it takes to make the transition is incredibly important. If the change takes a 30th of a second, that's fine for normal use, but it's way too slow when the screen is strapped to your eyeballs. Even the slightest lingering will cause you to see smeared, blurry images when looking around or moving quickly.
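Both of those screen problems are easy to put numbers on. Here's a quick sketch - the 1280x720 panel and the 1/30-second transition are the figures above, and the 60 fps target is the one discussed later in this article:

```python
# Back-of-the-envelope numbers for why desktop-grade LCDs fall short in VR.

PANEL_W, PANEL_H = 1280, 720

# The panel is split between the eyes, so each eye only gets half the width.
per_eye = (PANEL_W // 2, PANEL_H)
print(per_eye)  # (640, 720)

# A 1/30-second pixel transition vs. the time budget of one 60 fps frame:
transition_ms = 1000 / 30   # ~33.3 ms for the pixel to finish changing color
frame_ms = 1000 / 60        # ~16.7 ms per frame at 60 fps

# The pixel is still mid-transition for roughly two whole frames,
# which is exactly the smearing described above.
frames_smeared = transition_ms / frame_ms
print(round(frames_smeared, 1))  # 2.0
```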

Our LCD screens have been improving for years, but it wasn't until recently that they got close to good enough. And even now, our really high-end screens are just barely good enough. When it comes to VR, more is better. More pixels. More speed. Assuming VR takes off, the VR headsets we'll have in three years will be far better than anything we have now.

The next set of problems was related to head tracking. It turns out that when you immerse yourself in a scene that fills your vision, it really, really needs to act like the real world. If you turn your head to the side and your view doesn't change, it's incredibly disorienting and nauseating. But once we add the ability to track changes in orientation, the brain expects the view to also respond to positional changes. So if you move your head to the side, the in-game view needs to move as well. This means we need both motion tracking (like the Wiimote) and good gyroscopes (like the kind in your smartphone that detects you tilting the screen), and both of those technologies are pretty new.
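The two kinds of tracking do different jobs, which a toy sketch makes clear. This is a flat 2D simplification of my own, not any real headset SDK: the gyroscope supplies the direction you're facing, and positional tracking supplies where your head actually is.

```python
import math

def view_ray(head_pos, yaw_deg, distance=1.0):
    """Where the user ends up looking: head position plus a rotated forward vector."""
    yaw = math.radians(yaw_deg)
    forward = (math.sin(yaw), math.cos(yaw))  # gyroscope: which way you face
    return (head_pos[0] + forward[0] * distance,
            head_pos[1] + forward[1] * distance)

# Turning your head 90 degrees changes the direction of the ray...
print(view_ray((0.0, 0.0), 90.0))   # now looking along +x

# ...while leaning to the side shifts the whole viewpoint without
# changing the direction. Orientation alone can't reproduce this,
# which is why you need positional tracking too.
print(view_ray((0.1, 0.0), 0.0))
```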

On top of all of this, we needed more raw processing power. Most games run at 30 frames per second. That's fine for monitors, but not nearly good enough for VR. We need to bump the framerate up to 60 (120 would be even better), which means doubling the processing power required to draw the scene. On top of this we need to render the scene once for each eye, which doubles it again. You don't need to be a math major to realize we're talking about four times as much power. Actually, it's worse than that: on top of drawing twice as many views at twice the speed, you also need a full-screen distortion pass to correct for the fisheye distortion created by the VR lenses.
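That arithmetic, made explicit - the 30-to-60 fps doubling and the one-view-to-two-views doubling are from the text above, while the 1.0 baseline is just an illustrative unit:

```python
# Relative GPU cost of VR vs. a traditional game, per the reasoning above.

BASELINE = 1.0       # one view of the scene at 30 fps (illustrative unit)

fps_factor = 60 / 30  # double the framerate
eye_factor = 2        # render the scene once per eye

cost = BASELINE * fps_factor * eye_factor
print(cost)  # 4.0 -- "four times as much power"

# And that still leaves out the full-screen distortion pass needed every
# frame to undo the lens fisheye, so the real cost is worse than 4x.
```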
