Haptics in VR

Enhancing VR experiences with haptics and haptic illusions

Virtual Reality experiences typically focus on the visual aspects of immersion. This project has explored the challenges of creating haptic experiences for such immersive environments – first through the use of tangible devices, and then through the use of visuo-haptic illusions.

One way to create haptic experiences is through the use of tangible devices – that is, things that you can physically hold and feel. The challenge is that VR may contain fantastical items that are difficult to re-create in real life. To this end, we developed the TanGi toolkit (Feick, Bateman, Tang, Miede, & Marquardt, 2020), which provides designers a means to create physical objects that approximate what we see in the VR space. To some extent, these devices could also be moved, which added a further element of realism.

Our experience with TanGi made us realize that we were relying on a type of visuo-haptic illusion – that is, the visual aspects of an experience can override our experience of the haptic sensation. This allowed us, for instance, to use blocky building blocks to represent what would otherwise be a smoothly shaped bunny rabbit. We wanted to explore the limits of these illusions: to what extent can you approximate the physical/motor aspects of interacting with these objects without creating a “break” in the illusion? We explored this first with linear physical proxies (Feick, Kleer, Zenner, Tang, & Krüger, 2021), and later more generally with multiple variables at play (Feick, Regitz, Tang, & Krüger, 2022).

This work has also led to interesting side projects: using drones to create these illusions (Feick, Tang, & Krüger, 2022), and toolkits for measuring immersion within VR (Feick, Kleer, Tang, & Krüger, 2020). Further, we have now begun exploring physiological metrics to determine when the illusion breaks (Feick, Regitz, Tang, Jungbluth, Rekrut, & Krüger, 2023), which should one day allow us to create rich experiences without relying on subjective reports of immersion.


  1. Martin Feick, Kora Persephone Regitz, Anthony Tang, Tobias Jungbluth, Maurice Rekrut, and Antonio Krüger. (2023). Investigating Noticeable Hand Redirection in Virtual Reality using Physiological and Interaction Data. In IEEE VR ’23: IEEE Conference on Virtual Reality and 3D User Interfaces. (conference).
  2. Martin Feick, Kora Persephone Regitz, Anthony Tang, and Antonio Krüger. (2022). Designing Visuo-Haptic Illusions with Proxies in Virtual Reality: Exploration of Grasp, Movement Trajectory and Object Mass. In CHI 2022: Proceedings of the 2022 SIGCHI Conference on Human Factors in Computing Systems. (conference).
    Acceptance: 24.7% - 638/2579.
  3. Martin Feick, Anthony Tang, and Antonio Krüger. (2022). HapticPuppet: A Kinesthetic Mid-air Multidirectional Force-Feedback Drone-based Interface. In UIST ’22: The Adjunct Publication of the 35th Annual ACM Symposium on User Interface Software and Technology. (poster).
  4. Martin Feick, Niko Kleer, André Zenner, Anthony Tang, and Antonio Krüger. (2021). Visuo-haptic Illusions for Linear Translation and Stretching using Physical Proxies in Virtual Reality. In CHI 2021: Proceedings of the 2021 SIGCHI Conference on Human Factors in Computing Systems. (conference).
    Acceptance: 26.3% - 749/2844.
  5. Martin Feick, Niko Kleer, Anthony Tang, and Antonio Krüger. (2020). The Virtual Reality Questionnaire Toolkit. In UIST 2020 Adjunct: Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. (poster).
  6. Martin Feick, Scott Bateman, Anthony Tang, André Miede, and Nicolai Marquardt. (2020). TanGi: Tangible Proxies for Embodied Object Exploration and Manipulation in Virtual Reality. In ISMAR 2020: 2020 IEEE International Symposium on Mixed and Augmented Reality. (conference).
    Acceptance: 28.8% - 87/302.