Mob Rule

When Worlds Collide


As if MMOGs like World of Warcraft and Second Life weren’t already addictive enough, imagine if those virtual worlds crossed over into your real-world life.

No, not like missing your kid’s soccer game to raid Serpentshrine Cavern – I mean interacting with computer-generated avatars in a real-world space. For the last two years, faculty and students at the Georgia Institute of Technology have been conducting experiments into enhancing MMOG environments with “augmented reality” (AR) technologies. Mowing down murlocs on your morning commute has never been so tantalizingly close.

What sets AR apart from traditional (and clichéd) virtual reality? The latter refers to an artificial, interactive environment where everything that the user sees and hears is computer generated (or synthetic). “Augmented reality” refers to overlaying (or “augmenting”) a real-world object or environment with computer-generated content for the user to interact with. The user sees the virtual elements blended into their real-world surroundings by either looking at a screen or wearing a head-mounted display.
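At its core, video see-through AR comes down to blending rendered pixels over a live camera frame. A minimal sketch of that blending step, with made-up pixel values (this is purely illustrative, not code from any AR system described here):

```python
def composite(real_px, virtual_px, alpha):
    """Alpha-blend a virtual (rendered) pixel over a real camera pixel.
    alpha=0 shows only the real world; alpha=1 shows only the overlay."""
    return tuple(round(alpha * v + (1 - alpha) * r)
                 for r, v in zip(real_px, virtual_px))

camera = (120, 110, 100)   # an RGB pixel from the real-world video feed
overlay = (255, 0, 0)      # an RGB pixel of a rendered virtual avatar
print(composite(camera, overlay, 0.7))
```

A real renderer does this per pixel on the GPU, using the avatar's alpha channel so only its silhouette, not a full rectangle, covers the camera image.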

The AR Second Life project at Georgia Tech interfaces Second Life client code with AR technologies. One of the project’s experiments brings avatars out of the virtual world of Second Life and into the real world to interact with real people. Imagine wearing a head-mounted display and checking out that sexy avatar you’ve had your eye on, standing right in front of you in life-sized form.

The Escapist sat down with Blair MacIntyre, an Associate Professor at the Georgia Institute of Technology and one of the project leaders of AR Second Life, to talk about how AR technology might change how we communicate – and play.

The Escapist: Practically speaking, how can augmented reality technology benefit the user/player in an MMO environment, or enhance the gameplay?

Blair MacIntyre: Imagine a global-scale MMO. Imagine going into Google Earth, and you could author an overlay on it. So now I could be walking down Broadway in Google Earth and someone who’s standing on Broadway in New York would be able to look at their see-through display and see me walking down in exactly the same aligned perspective.

There’s all kinds of gaming applications: Maybe a fantasy roleplaying game where I’m looking at a representation of Times Square and I’m fighting the dragon over it, and people in Times Square, through head-mounted displays or their cell phones, see this unfolding. So you can start imagining games that align the physical and virtual worlds in large-scale experiences.
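The alignment MacIntyre describes boils down to mapping a shared geographic coordinate into each viewer's local frame, so everyone sees the same virtual object anchored to the same physical spot. A minimal sketch of that mapping, using a flat-Earth approximation and made-up coordinates (not the project's actual code):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def geo_to_local(lat, lon, origin_lat, origin_lon):
    """Approximate east/north offsets (in meters) of a geographic point
    relative to a viewer's position, using a flat-Earth (equirectangular)
    approximation that holds over short, street-scale distances."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    return east, north

# A shared virtual avatar "walking down Broadway" (hypothetical lat/lon):
avatar = (40.7590, -73.9845)
viewer = (40.7580, -73.9855)   # a person standing on the real street

east, north = geo_to_local(avatar[0], avatar[1], viewer[0], viewer[1])
print(f"avatar appears {east:.0f} m east, {north:.0f} m north of viewer")
```

Every client runs the same transform against its own position, so the Google Earth user and the person on the physical street both see the avatar in the same aligned spot.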


Sun has this MMOG [toolkit] called Wonderland. We have a full-scale model of the [Georgia Tech] campus and a 16-mile stretch of Peachtree Street near campus that one of the groups on campus built. What I would love to do is create an MMO that maps to a large chunk of the city and see what we could do in the gaming space.

TE: What were the goals of the AR Second Life project when you started research and development?

BM: The initial goal was to explore augmented reality [in] machinima … going from what people have been doing in the machinima world, which is analogous to animated films, and moving into having the ability to put virtual content, virtual characters and so on, in the physical world.

TE: Was the choice to use Second Life mainly due to its client code being freely available under open source?

BM: Linden Lab open-sourced the client, and that seemed to make it the right tool at the right time. We’d thought about using everything from Unreal to tools that we developed in my lab over the years. The problem is, how do you create content? How do you support this distributed collaboration, where you might want four or five people controlling avatars and objects at once?

The appeal of Second Life is content. It’s very easy to get lots of different artifacts, clothes and avatar shapes for this kind of project without having to hire designers to build them. In Second Life, it’s so easy to build things.


We started to consider more deeply how you could use this project for mixed physical-virtual reality performances. Imagine doing a play where some people are real actors and some are avatars. Or doing first-person experiences where I walk into a room and there’s virtual characters around me. [We] actually created an augmented reality version of Façade where you would walk with a head-mounted display.

The problem is rich, interactive content is really difficult to build – an infrastructure that would let us explore dramatic things that are something like Façade as opposed to the flat, menu-based interaction you get in most games. I wanted to explore design ideas without having to build these things, because it takes so long to build them.

Have you read The Diamond Age? The main plot device is that there are actors sitting in small rooms taking acting jobs where they’re controlling characters in these interactive books around the world. Avatars are being controlled in real-time by people around the world, who happen to be paid to do that for small periods of time. Second Life has the potential to let us do that.

Right now, we’re doing performance with avatars using augmented reality: You put on the head-mount, and you see other avatars who are stand-ins for actors. Then on your screen you see the lines you need to deliver and your stage directions.

TE: What were some things the AR Second Life team learned through its research?

BM: A whole bunch of mundane, technical things about what does and doesn’t work well in Second Life. And people aren’t keen on the idea of head-mounted displays. I think over time that will get better.

People talk about things like, “Wouldn’t it be great if we could both put on our head-mounts, and I could see your avatar standing across the room from me and you could see my avatar standing across the room from you?” The problem is, how do you control the avatars? In order to make my avatar do what I’m doing, I need to do motion capture, and that’s not going to be possible anytime soon… not for the home user or even for a research lab.

We’ve focused on a stage where there are real actors and virtual actors, not trying to support distributed experiences, because in Second Life the latencies make it hard to do highly interactive, distributed things. A lot of latencies in a 3-D world you don’t notice. But as soon as you’re overlaying it on the world, and you see how physical things are supposed to be triggering virtual things, you start noticing.

In a head-mounted display, you can tell the difference between 30 frames-per-second and 60 frames-per-second. Thirty frames-per-second feels really sluggish. Many latencies and other limitations of these distributed infrastructures that you don’t normally notice start being very noticeable.
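The sluggishness MacIntyre mentions is simple arithmetic: frame time is the reciprocal of frame rate, and any head motion during that interval is how far the overlay lags behind the real world. An illustrative calculation (the 100-degrees-per-second head-turn rate is an assumed figure, not a measurement from the project):

```python
def frame_time_ms(fps):
    """Milliseconds each rendered frame stays on screen."""
    return 1000.0 / fps

def head_drift_deg(fps, head_speed_deg_per_s=100.0):
    """Degrees a head turn (assumed 100 deg/s) sweeps between frames --
    roughly how far the virtual overlay lags the physical world."""
    return head_speed_deg_per_s / fps

for fps in (10, 30, 60):
    print(f"{fps:>2} fps: {frame_time_ms(fps):5.1f} ms/frame, "
          f"{head_drift_deg(fps):4.1f} deg of head motion per frame")
```

On a desktop monitor the world itself never moves, so a few degrees of lag is invisible; in a head-mount the real scene turns with your head while the overlay trails behind, which is exactly why the same frame rate suddenly feels sluggish.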

I was surprised by the interest from technology companies. Lots of companies are in Second Life to hawk their wares, and that never really pans out. But some are there because it’s potentially this big distributed platform for collaborative work. Big companies are trying to figure out how to use it as a glorified audio conferencing system. Many of them were talking to us about whether AR could make that better.

TE: Your research is trying to measure the physiological response people experience in an augmented reality environment by simulating a situation where the user stands near the edge of a pit. What has your team learned so far from this experiment?

BM: The main thing we’ve learned is that it can be amazingly compelling, even when the graphics are not as good as you would think. From the demos and informal experiments, it works really well even without high-end graphics. When you put people in the head-mounted display and they stand next to this pit and look down and see this room straight down, even though it’s relatively low resolution, some of them get freaked out. I have a fear of heights and when I stand near the edge of the pit and look down, I can feel a little bit of that sense of vertigo.

We’re now running people through the experiment, varying things like the frame rate and latencies, to see when that sense of vertigo breaks.

UNC Chapel Hill did experiments in virtual reality using this pit idea, which is where we got the idea from. The thing is, when you can see the world around you, in some ways it is less scary. When I tried [their] virtual reality pit, I couldn’t see the real world, but I could feel things, like the edge of this fake pit that they’d built, which made it scary because you just didn’t know what you might bump into.

TE: How intense are the hardware requirements to run AR Second Life? Does it need the latest CPU or video card technology to run well? Basically, is this something that can run on a typical, affordably priced PC?


BM: If you spend any time in Second Life, you’ll notice that there’s some places where the frame rate falls dramatically, even on the highest-end PC. That’s annoying, but it becomes a real problem in the augmented reality world if you’re using a head-mounted display: You’re trying to walk around in a space where it’s really jerky and slow-moving, which is disconcerting.

When I first got into Second Life, I bought this great-looking castle. It was vast, with all these great rooms. We figured we could use it for conferencing. With the AR gear on, the frame rate fell. You don’t notice that normally, because 10 frames-per-second isn’t that bad in Second Life, but when you have the head-mount on, it really is noticeable.

TE: Is the software for AR Second Life available to the public to download and try out?

BM: Right now, we haven’t even got it integrated into the latest source build. We will have a student working on it starting in the fall semester. So, hopefully September, October, we’ll start making it available.

There are some components of it that I want to contribute back immediately. For example, we repackaged the rendering engine such that we can do a bunch of effects like shading and blocking light. That would be really useful for anybody who wants to do machinima.

Aside from that, I’m interested in how we can build tools to let non-programmers experiment with augmented reality. We have an AR toolkit built on top of Macromedia Director to take people who are Flash or Director programmers, not hardcore programmers, and let them experiment with AR. That’s available on our website.

TE: How much longer will the AR Second Life project continue? What will happen with all the data you have collected?

BM: I expect this is going to go at least for another year. It looks like we’ve got funding for a student to keep working on it for at least that long.

Once we get to the point of releasing it, hopefully the technology will get integrated back into the [official Second Life] client. So you could create an interface to Second Life where you put on a head-mount.

TE: Have any game developers expressed interest in your research?

BM: Not really. We’ve done other augmented reality projects that have generated some interest from the game community on mobile phones. But nobody’s stepped up and said, “We’d like to turn this into a product.”


The big problem with augmented reality is it doesn’t seem feasible to imagine millions of people buying head-mounted displays right now. On cell phones, I think there will be interesting things happening in the near future. Hopefully, that will change as the displays get better and cheaper.

TE: To be honest, the first thing that comes to mind about adding augmented reality technology to Second Life was the possibility of more immersive cyber-sex experiences among users. Have you seriously considered or looked into this?

BM: It obviously has occurred to us. The cyber-sex in Second Life, basically I would describe as phone sex plus avatars. Imagine doing AR so that the avatar is dancing in your living room instead of on your computer. It’s not clear to me if that would be a better experience, or if people would actually pay for that. Probably. If you can imagine it, people will try it, but we have no intention of pursuing that … I’ve got only so much time to do so many things, and there’s a lot of other things I’d rather do than that. I think there are things in the collaborative work or wide-scale game sense that are more interesting.

Howard Wen plans to render himself a fancy sports car to get him through his Second mid-Life crisis.
