"Read That Article": Exploring Synergies between Gaze and Speech Interaction


Vieira D., Freitas J. D., Acartürk C., Teixeira A., Sousa L., Silva S., et al.

17th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2015), Lisbon, Portugal, 26 - 28 October 2015, pp.341-342

  • Publication Type: Conference Paper / Full Text
  • Doi Number: 10.1145/2700648.2811369
  • City: Lisbon
  • Country: Portugal
  • Page Numbers: 341-342
  • Keywords: Multimodal, Gaze, Speech, Fusion, Multimodal Interaction

Abstract

Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. As a secondary input modality, gaze can improve understanding of the user's intention; it can also serve as the main input modality for users with permanent or temporary impairments. In this paper we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.