Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays

ABSTRACT
In this paper, we investigate the concept of gaze-based interaction with 3D user interfaces. Stereo vision displays are becoming ubiquitous, particularly as auto-stereoscopy enables the perception of 3D content without glasses. As a result, application areas for 3D beyond entertainment in cinema or at home are emerging, including work settings, mobile phones, public displays, and cars. At the same time, eye tracking is reaching the consumer market with low-cost devices. We envision that eye trackers will in the future be integrated into consumer devices (laptops, mobile phones, displays), allowing the user's gaze to be analyzed and used as input for interactive applications. A particular challenge when applying this concept to 3D displays is that current eye trackers provide the gaze point in 2D only (x and y coordinates). In this paper, we compare the performance of two methods that use the eye's physiology to calculate the gaze point in 3D space, thereby enabling gaze-based interaction with stereoscopic content. Furthermore, we compare gaze interaction in 2D and 3D with regard to user experience and performance. Our results show that, with current technology, eye tracking on stereoscopic displays is possible with performance comparable to that on standard 2D screens.
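The core idea of recovering a 3D gaze point from 2D gaze data can be illustrated with a vergence-based sketch: cast a ray from each eye through that eye's 2D gaze sample on the screen plane and take the closest point between the two rays. This is a minimal illustration under assumed geometry (millimetre units, eyes on the x-axis, screen plane at a fixed viewing distance); all names and constants below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

SCREEN_Z = 600.0  # assumed viewing distance to the display plane, in mm
IPD = 64.0        # assumed inter-pupillary distance, in mm


def gaze_point_3d(left_gaze_2d, right_gaze_2d):
    """Estimate a 3D fixation point from two 2D on-screen gaze samples.

    Each (x, y) sample is interpreted as a point on the screen plane at
    z = SCREEN_Z; a ray is cast from each eye through its sample, and the
    midpoint of the shortest segment between the two rays is returned.
    """
    left_eye = np.array([-IPD / 2.0, 0.0, 0.0])
    right_eye = np.array([IPD / 2.0, 0.0, 0.0])
    d1 = np.array([*left_gaze_2d, SCREEN_Z]) - left_eye
    d2 = np.array([*right_gaze_2d, SCREEN_Z]) - right_eye
    # Least-squares ray parameters: minimise |(e1 + t1*d1) - (e2 + t2*d2)|
    A = np.column_stack([d1, -d2])
    t1, t2 = np.linalg.lstsq(A, right_eye - left_eye, rcond=None)[0]
    p1 = left_eye + t1 * d1
    p2 = right_eye + t2 * d2
    return (p1 + p2) / 2.0  # midpoint of closest approach between the rays


if __name__ == "__main__":
    # Crossed disparity: the left eye's gaze sample lies to the right of
    # the right eye's, indicating a target in front of the screen plane.
    p = gaze_point_3d((16.0, 0.0), (-16.0, 0.0))
    print(np.round(p, 1))  # target recovered 200 mm in front of the screen
```

In practice, noisy gaze samples mean the two rays rarely intersect exactly, which is why the midpoint of the common perpendicular (rather than an exact intersection) is used here; the paper's two physiology-based methods are evaluated on real tracker data, which this sketch does not model.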