Force-feedback devices like haptic gloves and gaming vests, which are peppered with electrically driven actuators, let people feel they are actually touching, or being touched by, three-dimensional (3D) objects in virtual reality (VR).
Making 3D visuals tangible does not always have to involve the use of such complex wearable technology, say three research groups that have come up with some intriguing – if not downright bizarre – new ways for people to interact with virtual environments.
Alternative haptic methods are needed, says Pedro Lopes, who leads the Human-Computer Integration Lab at the University of Chicago, because haptic gloves and vests are bulky, power-hungry and hard to build force-feedback motors into. "So people in our field, and in the haptics, VR, and AR (Augmented Reality) industries, need to bend the laws of physics and propose new types of actuators that consume less power and which are easier to integrate."
In attempting to engineer such a system, Lopes and his Ph.D. students Jasmine Lu and Jas Brooks have come up with a pretty radical idea: their new take on haptics applies a variety of chemical stimulants to a user's skin to evoke sensations matching whatever is playing out in a VR story or game at any given moment.
To apply those stimulants, the team developed a prototype wrist-worn silicone sleeve, plus a patch for the cheek, which use micropumps to deliver skin-safe topical doses of one of five chemicals, some of which sound like they'd be more at home in television medical shows like ER, Grey's Anatomy, or House: capsaicin for a warming sensation, menthol for cooling, lidocaine for numbing, sanshool for tingling, and cinnamaldehyde for stinging.
"By having the receptors in your skin generate the haptic sensations, we don't need bulky heating and cooling devices; we just need to deliver a few drops of a chemical stimulant, and your skin does the rest," says Lopes.
To test the idea, they developed a VR experience with a storyline about a failing nuclear reactor, and tried it on four Oculus Quest VR users wearing the chemical haptics kit, with a chemical reservoir mounted behind the VR headset. The narrative included a shorting-out reactor control panel whose sparks triggered the tingling stimulant on the arm, and as users ran outside, the cooling chemical was applied to the face patch. The failure of a VR arm interface that controlled a door in the story was reinforced, emotionally, by numbing the user's arm with lidocaine, while a VR "wound" triggered the stinging stimulant. Finally, as a reactor door opened, the heat issuing from it was conveyed by the warming stimulant.
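The Chicago team has not published its control code, but the mapping from story events to stimulants could be wired up along these lines. This is a minimal sketch: the MicropumpDriver class, channel numbers, and dose durations are all hypothetical.

```python
# Hypothetical event-to-stimulant mapping for a chemical haptics rig.
# Channel numbers, dose durations, and the pump driver are assumptions,
# not the Chicago team's actual implementation.

import time

# Each VR story event maps to (stimulant, pump channel, dose seconds).
STIMULANTS = {
    "sparks":       ("sanshool", 0, 1.5),        # tingling on the arm
    "run_outside":  ("menthol", 1, 2.0),         # cooling on the face patch
    "arm_failure":  ("lidocaine", 2, 3.0),       # numbing the arm
    "wound":        ("cinnamaldehyde", 3, 1.0),  # stinging
    "reactor_heat": ("capsaicin", 4, 2.0),       # warming
}

class MicropumpDriver:
    """Stand-in for a serial/GPIO driver that runs one pump channel."""
    def dose(self, channel: int, seconds: float) -> None:
        print(f"pump {channel} on for {seconds}s")
        time.sleep(seconds)

def on_story_event(event: str, pumps: MicropumpDriver) -> None:
    """Deliver the stimulant mapped to a VR story event, if any."""
    if event in STIMULANTS:
        name, channel, seconds = STIMULANTS[event]
        print(f"applying {name}")
        pumps.dose(channel, seconds)

on_story_event("sparks", MicropumpDriver())
```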
The users agreed that the stimulants deepened the immersive feel of the experience, adding a new dimension to VR, Lu told last October's virtual ACM Symposium on User Interface Software and Technology (UIST '21). The team believes hazardous-chemical industrial and lab safety training simulations, in particular, could benefit from chemical haptics.
The tech struck a nerve among human-computer interaction (HCI) specialists: "At UIST, many people got excited about chemical haptics and are eager to explore it in their own projects, speculating on what new sensations could be generated," says Lu. To that end, her colleague Brooks is exploring new stimulants and additional sensations, she adds.
However chemical haptics fares in the long run – applying chemicals to the skin, however dilute, will doubtless face regulatory hurdles – users are still going to want to "touch" virtual objects, which is where the aerohaptic feedback system developed by Ravinder Dahiya and colleagues at Scotland's University of Glasgow comes in.
Instead of being encumbered by a haptic glove or vest bristling with vibration motors or inflatable air blisters, users of Glasgow's system simply cast a hand over an object they can see in a VR display, or in front of a computer-generated hologram, and precisely aimed jets of air create the sensation of touch on their hands and fingers.
In other words, an array of air jets creates the physical profile of the object using air pressure alone. It works because a Leap Motion sensor or a depth camera detects where the user's hand is and aims the movable jets at it, with the force of the impinging air conveying the hardness or softness of the virtual surface being touched.
It's not just static, either: the pressure varies as the virtual object moves, so that "users can even push the virtual ball with varying force and sense the resulting change in how a hard bounce, or a soft bounce, feels in their palm," the Glasgow researchers say.
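A rough sketch of the idea in code: compute how far the hand has pushed into a virtual object, and set the jet pressure accordingly. The linear pressure law here (stiffness times penetration depth) and the hard-coded tracker input are plausible assumptions, not details from the Glasgow paper.

```python
# Illustrative aerohaptic pressure calculation. The hand position would
# come from a Leap Motion sensor or depth camera; here it is hard-coded.
# The stiffness-times-penetration pressure law is an assumption.

import math

def jet_pressure(hand_pos, sphere_center, sphere_radius, stiffness):
    """Return a jet pressure proportional to how far the hand has pushed
    into a virtual sphere; zero when the hand is not touching it."""
    dist = math.dist(hand_pos, sphere_center)
    penetration = sphere_radius - dist
    return stiffness * max(0.0, penetration)

# A hard ball pushes back more than a soft one for the same hand position.
hand = (0.0, 0.02, 0.0)                  # meters, from the hand tracker
center, radius = (0.0, 0.0, 0.0), 0.05   # virtual ball
print(jet_pressure(hand, center, radius, stiffness=40.0))  # hard: ~1.2
print(jet_pressure(hand, center, radius, stiffness=10.0))  # soft: ~0.3
```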
While Chicago's system produces hot and cold sensations with chemicals, aerohaptics can do that far more simply, by varying the temperature of the air stream from the jets, says Dahiya. There's one more thing it can do as well: the air also can carry the aroma of the object in question, so a virtual orange, say, would smell of the fruit.
Since revealing their aerohaptics system in an open-access paper in the September 2021 edition of the journal Advanced Intelligent Systems, Glasgow's Bendable Electronics and Sensing Technologies research group has attracted strong commercial interest. "We are having discussions with a few potential investors about the next stage of development, which is upgrading the prototype for commercialization, in entertainment, gaming, communications and education applications," Dahiya says.
Because it lets people experience touch, temperature, and aroma unencumbered by clunky wearables, Dahiya believes aerohaptics could even be a step toward building something like the holodeck virtual environment seen in Star Trek. One reason: its air jet hand-tracking system updates at a constant 115 kilohertz – less than nine microseconds per cycle – making its response fast enough to be "considered real-time by the human senses," Dahiya says. In that context, a sphere could morph into a cube, and users would feel it happening in real time.
Another fascinating haptics system that could find a role in holodecks was, like chemical haptics, revealed at UIST '21. Developed by Arata Jingu and colleagues at the University of Tokyo, it is designed to send covert notifications to a person who is either using a computer or operating in a virtual environment. The target of this system?
Human lips. Seriously.
Called LipNotif, the Tokyo group's system exploits the fact that the lips have very high tactile sensitivity, especially to mechanical disturbances at a frequency of 40 hertz – which lends them the ability to sense ultrasound pulses well. On top of that, the researchers note, people use their lips far less than their hands, so the lips are almost always free, and visible, to receive messages. Jingu's team therefore designed a phased-array ultrasound transmitter that uses a depth camera to track the position of a user's lips – and which can send a coded ultrasound message to them.
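The team's control code is not reproduced here, but the standard way to focus a phased array is to delay each transducer's signal so that every wavefront arrives at the focal point in phase. A minimal sketch, assuming a 40-kilohertz carrier (the 40-hertz figure above is the modulation rate the lips sense, not the carrier) and a toy array geometry:

```python
# Sketch of phased-array focusing: delay each transducer so that all
# wavefronts arrive in phase at the focal point (the tracked lips), then
# amplitude-modulate the carrier at 40 Hz, the rate lips sense best.
# The array layout and 40 kHz carrier are assumptions.

import math

SPEED_OF_SOUND = 343.0  # m/s in air
CARRIER_HZ = 40_000.0   # assumed airborne-ultrasound carrier
MOD_HZ = 40.0           # modulation rate the lips are most sensitive to

def phase_delays(transducers, focus):
    """Per-element delays (seconds) that bring every wavefront into
    phase at the focal point."""
    dists = [math.dist(t, focus) for t in transducers]
    farthest = max(dists)
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

def drive_signal(t, delay):
    """Delayed carrier, amplitude-modulated at 40 Hz."""
    envelope = 0.5 * (1.0 + math.sin(2.0 * math.pi * MOD_HZ * t))
    return envelope * math.sin(2.0 * math.pi * CARRIER_HZ * (t - delay))

# Four transducers along a line; lips tracked 20 cm away by depth camera.
array = [(x * 0.01, 0.0, 0.0) for x in range(4)]
lips = (0.015, 0.0, 0.20)
delays = phase_delays(array, lips)
print([f"{d * 1e6:.2f} us" for d in delays])
print(drive_signal(t=0.001, delay=delays[0]))
```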
For a user in a virtual environment, then, LipNotif could send a pulse to the left or right side of the lips to signify that another person, creature, or alien has entered from that side, while a pulse in the center of the lips could warn that something is now behind them. In effect, it gives VR users a supersense, while no one else can see or hear what they have learned.
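In code, that spatial encoding could be as simple as shifting the array's focal point; the direction names and offsets below are hypothetical.

```python
# Hypothetical mapping from event direction to a focal point on the
# lips; offsets are in meters from the tracked lip center.

def notification_target(lip_center, direction):
    """Shift the ultrasound focus left, right, or to the center of the
    lips to signal where in the virtual scene something happened."""
    offsets = {"left": -0.015, "right": 0.015, "behind": 0.0}
    x, y, z = lip_center
    return (x + offsets[direction], y, z)

print(notification_target((0.0, 0.0, 0.20), "left"))
```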
However, this is all proof-of-concept stuff: ultrasound cannot be directed at the eyes or ears on safety grounds, so any system fielded as a product would have to satisfy many safeguards, such as ensuring power levels stay within safe limits at all times, and that the tracking system is fast and accurate enough to target the lips, and the lips alone.
In advancing haptic technology, chemicals, air jets, and ultrasound might also haul the field into new regulatory territory. For instance, would chemical haptics need U.S. Food and Drug Administration approval, as well as the U.S. Federal Communications Commission approval that most electronics typically require?
"Our chemical-based haptics approach lives completely in the realm of research for now," says Lopes. "But I imagine that, like any devices that would one day go onto consumer's hands, they would certainly require such typical seals of approval."
Paul Marks is a technology journalist, writer, and editor based in London, U.K.