360 Video Interaction

New interaction techniques for immersive video

360 video cameras provide an omnidirectional view of the world around us, and new opportunities to explore it. Videos from these devices afford us new ways to see and understand the world, as well as new freedoms to explore recorded experiences. Our work in this space has explored new ways to take advantage of the medium – both in “single user” scenarios, where we explore spaces on our own, and in “multi user” experiences, where we explore spaces with others.

One of the neat challenges and opportunities of 360 video is that it allows us, as viewers, to decide what we want to look at. Most conventional videos have an “intended viewing angle” determined by the producer or camera person; in this sense, they restrict our view and our ability to explore the space. What makes 360 videos even more interesting is that we have an additional dimension of flexibility (time), which we must navigate in addition to viewing direction. Our work has explored ways of smoothing this interaction to make it more straightforward for viewers.
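To make the two navigation dimensions concrete, here is a minimal sketch of a 360 viewer’s state: dragging changes only the viewing direction, while scrubbing changes only the playback time. The class, its names, and its clamping ranges are illustrative assumptions, not any particular player’s API.

```python
class Viewer360:
    """Sketch of the two navigation dimensions a 360 video viewer
    manages: where to look (yaw/pitch) and when (time)."""

    def __init__(self, duration_s):
        self.yaw = 0.0          # horizontal viewing angle, degrees
        self.pitch = 0.0        # vertical viewing angle, degrees
        self.t = 0.0            # playback position, seconds
        self.duration = duration_s

    def drag(self, dx, dy):
        # Dragging changes only the viewing direction: yaw wraps
        # around the full circle, pitch is clamped at the poles.
        self.yaw = (self.yaw + dx) % 360
        self.pitch = max(-90.0, min(90.0, self.pitch + dy))

    def scrub(self, dt):
        # Scrubbing changes only the time dimension, clamped to
        # the video's duration.
        self.t = max(0.0, min(self.duration, self.t + dt))


v = Viewer360(duration_s=120)
v.drag(370, 10)   # yaw wraps to 10, pitch tilts up to 10
v.scrub(-5)       # cannot scrub before the start
print(v.yaw, v.pitch, v.t)  # 10.0 10.0 0.0
```

The friction the text describes shows up directly in this sketch: `drag` and `scrub` are separate operations, so a viewer cannot adjust where they are looking and when they are looking in a single gesture.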

Exploring with Others – Navigating Time and Space Together

Most interfaces for exploring 360 video are made for single-user exploration. This presents real challenges when we try to explore 360 video scenes with other people. Some of our earliest work in this space showed that viewing 360 videos with others is somewhat cumbersome – mainly because our goals as viewers may differ (Tang & Fakourfar, 2017). We observe similar kinds of challenges when people try to communicate over live mobile video chat (Jones, Witcraft, Tang, Bateman, & Neustaedter, 2015). We designed two prototypes to explore how live video interaction over distance could be made richer through 360 video cameras: in one, we studied how a simple prototype might introduce new complexities to the interaction (Tang, Fakourfar, Neustaedter, & Bateman, 2017); in another, we attached a 360 camera to a telepresence robot to support play (Heshmat, Jones, Xiong, Neustaedter, Tang, Riecke, & Yang, 2018). These experiences revealed the complexities of interacting with remote collaborators, and paved the way for new kinds of interaction techniques to smooth those interactions.

Our most recent explorations of this research space have involved the design and evaluation of a prototype called Tourgether360 (Kumar, Poretski, Li, & Tang, 2022; Kumar, Poretski, Li, & Tang, 2022). This prototype allows people to experience virtual 360 tours together, and focuses on the design of communication and awareness tools to smooth the interactive dialogue between partners. We learned valuable lessons in this work – especially about how much of this communication is implicit and subtle.

Single-User Exploration of 360 Videos

One challenge of exploring 360 videos is that one can really only manipulate one dimension at a time: either the viewing direction, or the playback time. Controlling both at once is hard, which makes it difficult to explore 360 videos quickly. To this end, we created the Route Tapestries approach (Li, Lyu, Sousa, Balakrishnan, Tang, & Grossman, 2021), which borrows from the slit-tear approach (Tang, Greenberg, & Fels, 2008). This technique creates a continuous orthographic-perspective projection of scenes along the camera route, allowing users to scan the entire video at a glance without having to watch it play back in time.
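The core idea behind slit-scan visualization can be sketched in a few lines: take one pixel column (a “slit”) from each frame and stack the columns side by side, producing a single space-time image of everything that passed the slit. This is a simplified illustration of the general slit-scan technique, not the Route Tapestries implementation; the function and the synthetic video below are hypothetical.

```python
import numpy as np

def slit_scan(frames, slit_x=None):
    """Build a slit-scan image: take one pixel column from each frame
    and stack the columns left-to-right. The result summarizes the
    whole video in one image, with time running along the x-axis."""
    if slit_x is None:
        slit_x = frames[0].shape[1] // 2  # default: centre column
    # Each slit is (height, channels); stacking along axis=1 yields
    # an image whose width equals the number of frames.
    return np.stack([f[:, slit_x, :] for f in frames], axis=1)

# Synthetic "video": 60 frames of a bright vertical bar sweeping
# left to right across a 48x64 black background.
h, w = 48, 64
frames = []
for t in range(60):
    f = np.zeros((h, w, 3), dtype=np.uint8)
    f[:, t % w, :] = 255  # bar sits at column t in frame t
    frames.append(f)

tapestry = slit_scan(frames)
print(tapestry.shape)  # (48, 60, 3): height x num_frames x channels
```

In the resulting image, the sweeping bar appears as a single bright column at x = 32, the frame in which the bar crossed the slit. The same principle lets a viewer scan an entire route’s worth of 360 footage spatially instead of temporally.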

Publications

  1. Kartikaeya Kumar, Lev Poretski, Jiannan Li, and Anthony Tang. (2022). Tourgether360: Exploring 360° Tour Videos with Others. In EA CHI ’22: Extended Abstracts of the 2022 SIGCHI Conference on Human Factors in Computing Systems. (poster).
    Acceptance: 36.1% (261/722).
  2. Kartikaeya Kumar, Lev Poretski, Jiannan Li, and Anthony Tang. (2022). Tourgether360: Collaborative Exploration of 360 Tour Videos using Pseudo-Spatial Navigation. Proceedings of the ACM on Human-Computer Interaction (PACMHCI) 6, CSCW2. (conference).
  3. Jiannan Li, Jiahe Lyu, Mauricio Sousa, Ravin Balakrishnan, Anthony Tang, and Tovi Grossman. (2021). Route Tapestries: Navigating 360 Virtual Tour Videos Using Slit-Scan Visualizations. In UIST 2021: Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology. (conference).
    Acceptance: 25.9% (95/367).
  4. Yasamin Heshmat, Brennan Jones, Xiaoxuan Xiong, Carman Neustaedter, Anthony Tang, Bernhard E. Riecke, and Lillian Yang. (2018). Geocaching with a Beam: Shared Outdoor Activities through a Telepresence Robot with 360 Degree Viewing. In CHI 2018: Proceedings of the 2018 SIGCHI Conference on Human Factors in Computing Systems, Paper 359. (conference).
    Acceptance: 25.7% (667/2595). Notes: 10 pages.
  5. Anthony Tang, Omid Fakourfar, Carman Neustaedter, and Scott Bateman. (2017). Collaboration in 360° Videochat: Challenges and Opportunities. In DIS 2017: Conference on Designing Interactive Systems 2017, 1327–1339. (conference).
    Acceptance: 24% (110/458). Notes: Appendix material: http://dspace.ucalgary.ca/handle/1880/51950.
  6. Anthony Tang and Omid Fakourfar. (2017). Watching 360° Videos Together. In CHI 2017: Proceedings of the 2017 SIGCHI Conference on Human Factors in Computing Systems, 4501–4506. (conference).
    Acceptance: 25% (600/2400). Notes: Presentation - https://www.youtube.com/watch?v=OPc4mBD7pgw.
  7. Brennan Jones, Anna Witcraft, Anthony Tang, Scott Bateman, and Carman Neustaedter. (2015). Mechanics of Camera Work in Mobile Video Collaboration. In CHI 2015: Proceedings of the 2015 SIGCHI Conference on Human Factors in Computing Systems, ACM, 957–966. (conference).
  8. Anthony Tang, Saul Greenberg, and Sidney Fels. (2008). Exploring video streams using slit-tear visualizations. In AVI ’08: Proceedings of the working conference on Advanced visual interfaces, ACM, 191–198. (conference).