DOI: 10.1145/2557500.2557518

Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays

Published: 24 February 2014

ABSTRACT

In this paper, we investigate the concept of gaze-based interaction with 3D user interfaces. Stereo vision displays are currently becoming ubiquitous, particularly as auto-stereoscopy enables the perception of 3D content without the use of glasses. As a result, application areas for 3D beyond entertainment in cinema or at home are emerging, including work settings, mobile phones, public displays, and cars. At the same time, eye tracking is hitting the consumer market with low-cost devices. We envision that eye trackers will in the future be integrated into consumer devices (laptops, mobile phones, displays), hence allowing the user's gaze to be analyzed and used as input for interactive applications. A particular challenge when applying this concept to 3D displays is that current eye trackers provide the gaze point in 2D only (x and y coordinates). We therefore compare the performance of two methods that exploit the eye's physiology to calculate the gaze point in 3D space, hence enabling gaze-based interaction with stereoscopic content. Furthermore, we provide a comparison of gaze interaction in 2D and 3D with regard to user experience and performance. Our results show that with current technology, eye tracking on stereoscopic displays is possible with performance similar to that on standard 2D screens.

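The two physiology-based depth-estimation methods compared in the paper are not described on this page, but the general principle of recovering a 3D gaze point from binocular eye-tracking data can be illustrated with a minimal vergence-geometry sketch. The code below is a hypothetical illustration, not the authors' implementation: it assumes the tracker reports each eye's 3D position and that each eye's 2D on-screen gaze point has already been lifted onto the screen plane in the same world coordinate frame. The function name, coordinate conventions, and the near-parallel threshold are all assumptions.

import numpy as np


def gaze_point_3d(left_eye, right_eye, left_hit, right_hit):
    """Estimate a 3D gaze point from binocular eye-tracking data (vergence sketch).

    left_eye, right_eye: 3D positions of the eyes in world coordinates.
    left_hit, right_hit: each eye's 2D on-screen gaze point, lifted onto the
        screen plane and expressed in the same world coordinates.

    Each eye defines a gaze ray from the eye through its on-screen gaze point.
    Because of tracker noise the two rays are usually skew, so we return the
    midpoint of the shortest segment connecting them.
    """
    p1 = np.asarray(left_eye, dtype=float)
    p2 = np.asarray(right_eye, dtype=float)
    d1 = np.asarray(left_hit, dtype=float) - p1
    d2 = np.asarray(right_hit, dtype=float) - p2
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)

    # Closest points between the two lines: minimise |(p1 + t1*d1) - (p2 + t2*d2)|.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # rays (almost) parallel: gaze depth is ill-defined
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2.0


if __name__ == "__main__":
    # Hypothetical example: eyes ~6.5 cm apart, 60 cm in front of the screen
    # plane (z = 0); crossed gaze points indicate a target in front of the screen.
    point = gaze_point_3d(left_eye=[-0.0325, 0.0, 0.6],
                          right_eye=[0.0325, 0.0, 0.6],
                          left_hit=[0.01, 0.0, 0.0],
                          right_hit=[-0.01, 0.0, 0.0])
    print(point)  # roughly [0.0, 0.0, 0.14]: about 14 cm in front of the screen

In the example run, the crossed on-screen gaze points place the estimated gaze point roughly 14 cm in front of the display, corresponding to stereoscopic content rendered on a layer floating above the screen plane, which is the situation layered 3D interfaces create.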

Published in

IUI '14: Proceedings of the 19th International Conference on Intelligent User Interfaces
February 2014, 386 pages
ISBN: 978-1-4503-2184-6
DOI: 10.1145/2557500

Copyright © 2014 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States



      Qualifiers

      • research-article

      Acceptance Rates

IUI '14 Paper Acceptance Rate: 46 of 191 submissions, 24%
Overall Acceptance Rate: 746 of 2,811 submissions, 27%
