MIT has a project in the works that can generate holographic video in real time using Kinect.
One of the turning points for humanity will be when we can successfully replicate the hologram of Princess Leia from Star Wars: Episode IV, in which she asks Obi-Wan Kenobi for his help. In a best-case scenario come true, MIT is on the job, and is even using Kinect to help the project along.
Michael Bove's Object-Based Media Group at MIT recently presented an in-development holographic display system at the Society of Photo-Optical Instrumentation Engineers' Practical Holography conference. I know, I missed it too. The system is similar to one created by the University of Arizona, but uses only one input source instead of Arizona's 16. That source? Microsoft's Kinect.
The system works by capturing visual data through a Kinect camera attached to a PC, which sends the data over the internet to a holographic display. It currently operates at 15 frames per second, up from 7 thanks to a recent boost. This is the highest frame rate yet for holographic streaming video, but MIT says it's confident it can push the rate to 24 FPS, or even 30 FPS, matching television.
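Those frame-rate targets translate directly into a shrinking time budget for capturing, transmitting, and rendering each frame. A quick back-of-the-envelope sketch (the function and the simple reciprocal math are mine, not part of MIT's pipeline):

```python
# Per-frame time budget at the frame rates mentioned in the article.
# Plain arithmetic for illustration only, not MIT's actual code.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to capture, transmit, and render one frame."""
    return 1000.0 / fps

for fps in (7, 15, 24, 30):
    print(f"{fps:2d} FPS -> {frame_budget_ms(fps):5.1f} ms per frame")
# 7 FPS leaves about 142.9 ms per frame; 30 FPS leaves only about 33.3 ms.
```

Doubling the frame rate halves the time available to compute each hologram, which is why the jump from 7 to 15 FPS, and the hoped-for jump to 30, is a real engineering feat.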
Bove says the system is unique because the OBMG is limiting itself to off-the-shelf hardware. The group wants to create a holo-video solution that is as cheap as possible. Right now, that solution includes a Kinect, a laptop, and a receiving PC that uses three graphics processing units in tandem to compute the "diffraction patterns" that produce the final holographic image. That image is shown on a holographic display MIT designed itself, though Bove's group is working on a cheaper version.
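The article doesn't say how the three GPUs split up the diffraction-pattern work, but a common way to parallelize this kind of per-pixel computation is to divide the display into row bands, one per processor. A toy sketch of that partitioning, with the even-split scheme being my assumption rather than MIT's actual implementation:

```python
# Hypothetical even split of hologram rows across three GPUs.
# Band boundaries are illustrative; MIT's real work division isn't described.

def split_rows(total_rows: int, num_gpus: int = 3) -> list:
    """Return (start, end) row ranges, one per GPU, covering every row once."""
    base, extra = divmod(total_rows, num_gpus)
    bands = []
    start = 0
    for i in range(num_gpus):
        # Earlier GPUs absorb any leftover rows when the split isn't exact.
        end = start + base + (1 if i < extra else 0)
        bands.append((start, end))
        start = end
    return bands

print(split_rows(480))  # a 480-row display -> [(0, 160), (160, 320), (320, 480)]
```

Each GPU would then compute the diffraction pattern only for its own band, letting the three cards work in tandem as the article describes.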
The video here might not look like much, but it's an incredible indication of what's to come within the next few years. Presumably, we'll soon be using holograms not to ask for help to save the universe, but to remind our significant others to pick up milk on the way home, amongst other menial tasks.