A new prototype developed at the University of Bristol adds physical feedback out of thin air.
Tony Stark holds the award for one of the coolest computer interfaces ever created: the array of 3D holographic projectors in his garage. The closest we have come to that technology is arguably the Microsoft Kinect and its 3D camera mapping. But there is a distinct lack of physical feedback when you wave your arm around, and The Escapist's own Yahtzee Croshaw maintains that such input methods are un-immersive and detract from the experience. That might be about to change with new research from the University of Bristol, where researchers have designed a touchscreen you can feel reacting to your input, without even touching it.
The prototype uses a new technology called "UltraHaptics." An array of tiny speakers, known as ultrasonic transducers, generates ultrasound waves that create vibrations in the air with pinpoint precision. These vibrations can be felt at various distances and intensities above the device. The team then figured out how to transmit these signals through a screen, creating an invisible layer of vibrations above it that can serve as a computer interface. Haptic feedback is familiar from many smartphones, most commonly as a vibration in response to pressing an on-screen button. But where existing technology is very limited in the feedback it can generate, this new approach is a genuine breakthrough in the field.
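The article does not give implementation details, but focusing sound from a transducer array is conventionally done by phasing each element so that all the waves arrive at the chosen point in step. A minimal sketch of that idea follows; the array geometry, the 40 kHz carrier, and the helper name `phase_delays` are all illustrative assumptions, not the Bristol team's actual code.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
CARRIER_HZ = 40_000      # a typical ultrasonic transducer frequency (assumption)

def phase_delays(transducer_positions, focal_point):
    """Return a per-transducer phase offset (radians) so that waves
    emitted by every element arrive at the focal point in phase."""
    wavelength = SPEED_OF_SOUND / CARRIER_HZ
    distances = [math.dist(p, focal_point) for p in transducer_positions]
    farthest = max(distances)
    # Elements closer to the focus are delayed so all wavefronts coincide.
    return [2 * math.pi * ((farthest - d) % wavelength) / wavelength
            for d in distances]

# Example: a 4-element line array (1 cm spacing) focusing a point
# 10 cm above its centre.
array = [(x * 0.01, 0.0, 0.0) for x in range(4)]
delays = phase_delays(array, (0.015, 0.0, 0.10))
```

Because the example focal point sits over the centre of the line, the outer pair of elements (and the inner pair) end up with matching phase offsets, which is a quick sanity check on the geometry.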
Previous incarnations of haptic technology have required the user to be in physical contact with the device, which leads to many limitations. "UltraHaptics" goes some way towards addressing those issues and has a lot of future potential. "Even if you provide [haptic] feedback on a touch screen you have to fumble around pressing all the buttons, whereas with our system you can wave your hand vaguely in the air and you'll get the feeling on the hand," notes Tom Carter, a PhD student with the BIG research group. "We can give different points of feeling at the same time that feel different so you can assign a meaning to them. So the 'play' button feels different from the 'volume' button and you can feel them just by waving your hand." These perceived differences in texture are generated by varying the modulation frequency of the ultrasound or by pulsing the feedback.
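Carter's description suggests each control gets its own low-frequency modulation of the ultrasound carrier, slow enough for the skin to feel. The sketch below illustrates that idea only; the specific modulation rates and the `CONTROL_TEXTURES` mapping are made-up values, not figures from the research.

```python
import math

def modulated_amplitude(t, mod_hz):
    """Amplitude envelope of the ultrasound carrier at time t (seconds):
    a low-frequency sine the skin can perceive, scaled into [0, 1]."""
    return 0.5 * (1 + math.sin(2 * math.pi * mod_hz * t))

# Hypothetical mapping: each on-screen control modulates at a different
# rate, so a hovering hand can tell "play" and "volume" apart by feel.
CONTROL_TEXTURES = {"play": 200, "volume": 50}  # Hz (illustrative values)

# Sample the "play" texture's envelope at 8 kHz for the first millisecond.
envelope_play = [modulated_amplitude(i / 8000, CONTROL_TEXTURES["play"])
                 for i in range(8)]
```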
The technology is still in its infancy, but uses for it are already cropping up. Adding layers of tactile information above a screen opens up a wealth of potential applications, and it could also augment existing technology for users with visual impairments. Combined with current-generation motion sensors, a more physically responsive and interactive 3D interface edges one step closer.
Source: University of Bristol