"Read That Article": Exploring Synergies between Gaze and Speech Interaction


Vieira D., Freitas J. D., ACARTÜRK C., Teixeira A., Sousa L., Silva S., et al.

17th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2015), Lisbon, Portugal, 26 - 28 October 2015, pp. 341-342

  • Publication Type: Conference Paper / Full Text
  • DOI: 10.1145/2700648.2811369
  • City of Publication: Lisbon
  • Country of Publication: Portugal
  • Page Numbers: pp. 341-342
  • Keywords: Multimodal, Gaze, Speech, Fusion, MULTIMODAL INTERACTION
  • Affiliated with Middle East Technical University: Yes

Abstract

Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. As a secondary input modality, gaze can improve our understanding of the user's intention; it can also serve as the main input modality for users with some level of permanent or temporary impairment. In this paper, we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.
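The record above does not include implementation details, so the following is only a minimal sketch of the kind of gaze-speech fusion the title suggests: resolving a deictic spoken command such as "read that article" against the user's most recent gaze fixation. All class names, fields, and the time-window threshold are illustrative assumptions, not the authors' actual design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeFixation:
    target_id: str      # interface element currently looked at (hypothetical id)
    timestamp: float    # seconds

@dataclass
class SpeechCommand:
    action: str                     # e.g. "read"
    is_deictic: bool                # True for "read that article"
    explicit_target: Optional[str]  # set when the target is named in speech
    timestamp: float

def fuse(command: SpeechCommand,
         fixations: list[GazeFixation],
         max_lag: float = 1.5) -> Optional[str]:
    """Resolve the target of a spoken command (simple late-fusion rule).

    Prefer an explicitly named target; otherwise, for deictic commands,
    fall back to the most recent gaze fixation within `max_lag` seconds
    of the utterance.
    """
    if command.explicit_target:
        return command.explicit_target
    if command.is_deictic:
        candidates = [f for f in fixations
                      if abs(command.timestamp - f.timestamp) <= max_lag]
        if candidates:
            return max(candidates, key=lambda f: f.timestamp).target_id
    return None

# Example: "Read that article" uttered shortly after fixating article_3.
fixations = [GazeFixation("article_1", 10.0), GazeFixation("article_3", 12.2)]
cmd = SpeechCommand(action="read", is_deictic=True,
                    explicit_target=None, timestamp=12.6)
print(fuse(cmd, fixations))  # -> "article_3"
```

A rule of this sort treats gaze purely as a disambiguation cue for speech; the prototype described in the paper may combine the modalities differently.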