Touching the Void

Abstract

While touch technology has proven its usability for 2D interaction and has already become a standard input modality for many devices, the challenges of applying it to stereoscopically rendered content have barely been studied. In this project, we utilize hardware-based and perception-based techniques to allow users to touch stereoscopically displayed objects when the input is constrained to a 2D surface.

Approaches to handle this problem can roughly be separated into three groups:

  1. Approaches that separate the interactive and the visualization surfaces, such that the user can move the interactive surface and manipulate an object in-place.
  2. Approaches that exploit limitations of the human visual system. Here a visual illusion is used such that the virtual scene is perceived in 3D while the interaction tasks are carried out on a 2D surface.
  3. Approaches that shift the problem into the interface design space and treat 3D touch as a separate input modality with its own set of interaction techniques.

Since the third approach only partially solves the problem, we have mainly focused on the first two options and analyzed the relationship between the 3D positions of stereoscopically displayed objects and the corresponding touch points on the 2D surface.
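
To make this relationship concrete, the following minimal sketch (in Python) computes one plausible on-surface touch target for a stereoscopic object: the object center is projected onto the touch surface from each eye position, and the midpoint of the two projections is taken as the target. The plane z = 0 as touch surface, the head-tracked eye positions, and the midpoint heuristic are illustrative assumptions, not a description of our study setup.

    from typing import Tuple

    Vec3 = Tuple[float, float, float]
    Vec2 = Tuple[float, float]

    def project_to_surface(eye: Vec3, obj: Vec3) -> Vec2:
        """Intersect the ray from an eye position through the object center
        with the touch surface, assumed here to be the plane z = 0."""
        ex, ey, ez = eye
        ox, oy, oz = obj
        t = ez / (ez - oz)  # ray parameter where z reaches 0 (requires oz < ez)
        return (ex + t * (ox - ex), ey + t * (oy - ey))

    def touch_target(eye_left: Vec3, eye_right: Vec3, obj: Vec3) -> Vec2:
        """One possible on-surface target for a stereoscopic object:
        the midpoint between its left-eye and right-eye projections.
        (Heuristic for illustration; actual touch points may be biased.)"""
        lx, ly = project_to_surface(eye_left, obj)
        rx, ry = project_to_surface(eye_right, obj)
        return ((lx + rx) / 2.0, (ly + ry) / 2.0)

    # Example: object floating 5 cm above the tabletop, eyes about 40 cm above it.
    left_eye, right_eye = (-3.25, -30.0, 40.0), (3.25, -30.0, 40.0)
    print(touch_target(left_eye, right_eye, (0.0, 0.0, 5.0)))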

We have conducted a series of experiments to investigate the user's ability to discriminate small induced object shifts while performing a touch gesture. Our results indicate that slight object shifts during touch interaction make the virtual scene appear perceptually more stable than a fully static scene; consequently, applications have to manipulate virtual objects during interaction in order to make them appear static to the user. Based on these results we designed a practical interaction technique, the "attracting shift technique", which is suitable for numerous application scenarios. Furthermore, we demonstrate how multi-touch hand gestures combined with foot gestures can be used to perform navigation tasks in interactive 3D environments.
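
As an illustration of the kind of manipulation described above (a generic depth shift, not the published attracting shift technique itself), the sketch below blends an object's depth toward the touch surface while the finger approaches it, so that the object lies on the surface at the moment of contact. The linear blend and the onset height are assumptions chosen for clarity.

    def shifted_depth(original_z: float, finger_height: float,
                      onset_height: float = 15.0) -> float:
        """Blend an object's depth toward the touch surface (z = 0) while the
        finger descends, so that the object can be touched where it appears.
        All values are in the same units (e.g. cm above the tabletop); the
        linear blend and the onset height are illustrative assumptions."""
        if finger_height >= onset_height:      # finger still far away: no change
            return original_z
        blend = max(finger_height, 0.0) / onset_height  # 1.0 at onset, 0.0 at touch
        return original_z * blend              # object reaches the surface on contact

    # Example: an object 5 cm above the surface as the finger moves down.
    for height in (20.0, 10.0, 5.0, 0.0):
        print(height, round(shifted_depth(5.0, height), 2))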

Publications

D. Valkov, Multi-Touch for Stereoscopic Displays
Chapter 9 in F. Ortega, F. Abyarjoo, A. Barreto, N. Rishe and M. Adjouadi, "Interaction Design for 3D User Interfaces: The World of Modern Input Devices for Research, Applications, and Game Development", CRC Press, 2015

A. Giesler, D. Valkov, K. Hinrichs, Void shadows: multi-touch interaction with stereoscopic objects on the tabletop
SUI '14: Proceedings of the 2nd ACM Symposium on Spatial User Interaction, 104-112, 2014

P. Lubos, D. Valkov, The significance of stereopsis and motion parallax in mobile head tracking environments
SUI '14: Proceedings of the 2nd ACM Symposium on Spatial User Interaction, 155-155, 2014

D. Valkov, A. Giesler, K. Hinrichs, Imperceptible depth shifts for touch interaction with stereoscopic objects
CHI '14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 227-236, 2014

D. Valkov, A. Giesler, K. Hinrichs, Evaluation of depth perception for touch interaction with stereoscopic rendered objects
ITS '12: Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, 21-30, 2012

S. Strothoff, D. Valkov, K. Hinrichs, Triangle cursor: Interactions with objects above the tabletop
ITS '11: Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, 111-119, 2011

D. Valkov, F. Steinicke, G. Bruder, K. Hinrichs, 2D Touching of 3D Stereoscopic Objects
CHI '11: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1353–1362, 2011

D. Valkov, F. Steinicke, G. Bruder, K. Hinrichs, J. Schöning, F. Daiber, Touching Floating Objects in Projection-based Virtual Reality Environments
EGVE/EuroVR/VEC, 17-24, 2010

J. Schöning, F. Steinicke, A. Krüger, K. Hinrichs, D. Valkov, Bimanual interaction with interscopic multi-touch surfaces
INTERACT '09: IFIP Conference on Human-Computer Interaction, 40-53, 2009