
Communications of the ACM

ACM News

Ready, Aim . . . Puff?


[Image: The Aireal device. Credit: Disney Research]

Imagine shaking hands with a hologram, patting an imaginary pet, or feeling the 'thud' of a virtual ball landing in your arms. That's the promise of Aireal, a new haptic technology prototype from Disney Research in Pittsburgh.

Housed in a casing about the size of a PC speaker, this cannon-like device shoots bursts of air that form vortices, creating a physical sensation on contact with human skin. When paired with a 3D projector, it creates a remarkably believable simulacrum of a physical object.

The Aireal is just the latest example of an emerging wave of haptic technologies that promise to bring a new level of realism to virtual 3D worlds. As these tools grow more sophisticated, they may pave the way for new immersive experiences in video games, amusement parks, and other arenas where the real world blurs into the realm of make-believe.

While haptic devices have been around for years — from force-feedback joysticks to the vibrating cell phone in your pocket — they have largely been confined to physical surfaces. Now, recent innovations in acoustic engineering are opening up new possibilities for creating touchable objects out of, literally, thin air.

As far back as 1998, a Massachusetts Institute of Technology team developed a prototype stylus that could interact with a virtual object in real time. More recently, the StormRider attraction at the Tokyo DisneySea amusement park deployed an elementary form of 3D haptics by puffing air onto riders' bodies to create the sensation of riding through a violent weather system.

Researchers at the University of Tokyo have been exploring the use of ultrasound waves to create virtual tactile experiences. Professor Hiroyuki Shinoda has led a team developing the so-called Airborne Ultrasound Tactile Display, which uses multiple transducers to create a field of acoustic radiation pressure, letting users feel invisible objects in mid-air. The result is what he calls "a non-contact tactile display" capable of triggering the sense of touch.
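The principle at work is phased-array focusing: delay each transducer's output so that every wavefront arrives at a chosen point at the same instant, and the pressure there adds up constructively. The short sketch below is not Shinoda's code; it is a minimal illustration of that idea, assuming a flat transducer grid and a 40 kHz carrier (a common choice for airborne ultrasound), with all names and values hypothetical.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
    FREQUENCY = 40_000.0    # Hz; assumed 40 kHz ultrasound carrier

    def focus_phases(transducers, focal_point):
        """Phase advance (radians) for each transducer so all wavefronts
        arrive at the focal point in phase, producing a localized peak
        of acoustic radiation pressure that the skin can feel."""
        dists = np.linalg.norm(np.asarray(transducers) - focal_point, axis=1)
        wavelength = SPEED_OF_SOUND / FREQUENCY  # about 8.6 mm at 40 kHz
        # Elements farther from the focus fire with a larger phase advance,
        # canceling their longer travel time, modulo one cycle.
        return (2 * np.pi * dists / wavelength) % (2 * np.pi)

    # Example: a 10 x 10 grid at 1 cm pitch, focused 20 cm above its center.
    grid = np.array([(x * 0.01, y * 0.01, 0.0)
                     for x in range(10) for y in range(10)])
    phases = focus_phases(grid, np.array([0.045, 0.045, 0.20]))

Moving the focal point over time traces shapes in mid-air; in practice such displays also modulate the pressure at low frequency, since skin senses vibration far better than static pressure.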

Shinoda sees enormous potential for such displays to replace conventional touch panels: in an operating room, for example, doctors and nurses could control computers using touchless, germ-free interfaces. The team is also exploring possible applications for automotive interfaces and entertainment systems.

In a similar vein, a team at the University of Bristol is exploring the possibility of Haptic TV for mobile devices. By streaming haptic data alongside the TV signal, a receiver equipped with low-frequency (~40 kHz) ultrasound transmitters could project tactile sensations onto a viewer's skin.

Ultrasound approaches share an important constraint, however: they are effective only at short range, up to about 4 centimeters. The Aireal, by contrast, works at distances of up to 6 feet with 95% accuracy, and the Disney team is now working on a much larger prototype that could conceivably work at distances of 50 feet or more.

Aireal is the brainchild of Rajinder Sodhi, a Ph.D. candidate at the University of Illinois specializing in human-computer interaction and computer vision, the subfield of computer science concerned with algorithms for interpreting images and video. After some initial work on spatial augmented reality — using everyday objects as display surfaces, for example — he began to explore haptic feedback as a way to broaden the possibilities of virtual objects.

"We should be able to take any object and turn it into a display, and not have to mediate it through a phone or glasses," says Sodhi.

To that end, he began to look for ways to incorporate more physical cues into gesture-based interfaces (like the Microsoft Kinect). "One of the fundamental roadblocks to projection-based interfaces is that you can't feel them," he says.

The Aireal works by shooting air through a nozzle to create a vortex. By manipulating the strength, speed, and direction of that vortex, the device can produce pressure patterns that can be felt on contact with human skin.
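Landing a vortex on a moving target is then a matter of geometry: given a tracked target point (from a depth camera such as the Kinect mentioned above), the system needs two steering angles for the nozzle and the ring's travel time. The sketch below is a hypothetical illustration of that calculation, not Disney's code; the vortex travel speed and the function names are assumptions.

    import math

    VORTEX_SPEED = 7.0  # m/s; assumed ring travel speed, not from the paper

    def aim(device, target):
        """Pan/tilt angles (degrees) and time of flight for a vortex
        fired from `device` toward a tracked point `target` (meters)."""
        dx, dy, dz = (t - d for t, d in zip(target, device))
        pan = math.degrees(math.atan2(dy, dx))     # rotation about the vertical axis
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        tilt = math.degrees(math.asin(dz / dist))  # elevation above horizontal
        return pan, tilt, dist / VORTEX_SPEED

    # Example: a hand tracked 1.5 m out, 0.2 m to the side, 0.3 m up.
    pan, tilt, eta = aim((0.0, 0.0, 0.0), (1.5, 0.2, 0.3))

The time-of-flight term matters: at several feet, the vortex takes a noticeable fraction of a second to arrive, so timing the launch against the target's motion is part of the control problem.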

Given the sophisticated physics involved, the device is built from surprisingly low-tech parts: five ordinary computer speakers, each pushing out a modest volume of air, combined in a small enclosure. "When they all push together, it creates a large disturbance," says Sodhi.
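How five small drivers add up to one strong burst can be pictured as simple superposition: each speaker is fed the same sharp pulse at the same instant, so their individual displacements sum. A toy sketch of that idea, with the sample rate and pulse length assumed rather than taken from the prototype:

    import numpy as np

    SAMPLE_RATE = 44_100  # Hz; ordinary audio hardware, an assumption
    N_DRIVERS = 5         # the prototype combines five speaker drivers

    def burst(duration_s=0.02):
        """One smooth, unipolar pulse duplicated across all channels, so
        the drivers push together and their small displacements sum."""
        n = int(SAMPLE_RATE * duration_s)
        envelope = np.hanning(n)                  # gentle attack and decay
        return np.tile(envelope, (N_DRIVERS, 1))  # identical signal per driver

    samples = burst()  # shape (5, 882): one synchronized pulse per channel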

The major technical challenge was designing the nozzle that directs the flow of air. Sodhi and his team spent countless hours fine-tuning its size and thickness to achieve the right level of stability, settling on a flexible material that could be steered by two simple pan-and-tilt motors.

Rather than pursue advanced computational modeling techniques, Sodhi relied instead on a 3D printer to create a series of rapid iterations on the nozzle design. "This would have been very hard to do computationally," he explains. "3D printing was really crucial for us."

While this project remains in the experimental stage, it points the way toward a new range of virtual experiences that researchers are only beginning to imagine. There may be several technological paths forward, but they all point toward a not-too-distant future in which the boundaries between virtual and physical phenomena grow increasingly blurry. It may finally be time, as Sodhi puts it, to "take computing out of the box and put it into the real world."

Alex Wright is a writer and information architect based in Brooklyn, NY.


 
