Interaction with mid-air floating images expands the possibilities of human-computer interfaces. Here we propose a system that superposes haptic sensation on mid-air floating images using ultrasonic standing waves. Conventional approaches using airborne traveling ultrasound could apply only weak pressure to the finger surface from limited directions, and generated air flow as a side effect.
The focused standing waves generated by our surrounding phased-array system enable the creation of spatially varying acoustic pressures in all directions. In addition, this method suppresses air flow. Such spatial variation of acoustic pressure produces the haptic feeling of an elastic surface and offers a rich haptic experience. Combined with ultrasonic beam steering and mid-air floating images generated by numerous micro-corner reflectors, the system can display a virtual spherical gadget that is “pinchable” and “movable.” Thus, an intuitive human-computer interaction can be offered.
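The core of phased-array focusing is delaying each transducer's phase by the propagation phase to the focal point, so all waves arrive in phase and interfere constructively there. The following is a minimal sketch of that computation, not the authors' implementation; the element layout, count, and 40 kHz frequency are illustrative assumptions (40 kHz is a common airborne ultrasound transducer frequency).

```python
import numpy as np

SPEED_OF_SOUND = 343.0    # m/s, air at ~20 °C
FREQUENCY = 40_000.0      # Hz, assumed transducer frequency
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
K = 2 * np.pi / WAVELENGTH  # wavenumber

def focus_phases(element_positions, focal_point):
    """Per-element emission phases (radians) that focus at focal_point.

    Advancing each element's phase by k*d cancels the propagation
    phase, so all waves arrive at the focus in phase.
    """
    distances = np.linalg.norm(element_positions - focal_point, axis=1)
    return (K * distances) % (2 * np.pi)

# Hypothetical geometry: 16 elements along x, focus 0.2 m above center.
positions = np.stack([np.linspace(-0.075, 0.075, 16),
                      np.zeros(16),
                      np.zeros(16)], axis=1)
focus = np.array([0.0, 0.0, 0.2])
phases = focus_phases(positions, focus)

# Phase on arrival at the focus: emitted phase minus propagation phase.
# All elements should agree (identical modulo 2*pi).
arrival = phases - K * np.linalg.norm(positions - focus, axis=1)
```

Steering the beam then amounts to recomputing these phases for a new focal point; in the proposed system, opposing arrays focused on the same point produce the standing wave whose pressure nodes (spaced half a wavelength apart) create the spatial pressure variation felt by the finger.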
Seki Inoue, Koseki Kobayashi, Yasuaki Monnai, Keisuke Hasegawa, Yasutoshi Makino, and Hiroyuki Shinoda, “HORN: The Hapt-Optic Reconstruction,” SIGGRAPH 2014 Emerging Technologies, Aug. 10–14, 2014.