17th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2015), Lisbon, Portugal, 26-28 October 2015, pp. 341-342
Gaze information has the potential to benefit Human-Computer Interaction (HCI) tasks, particularly when combined with speech. Gaze can improve our understanding of the user's intention when used as a secondary input modality, or it can serve as the main input modality for users with some level of permanent or temporary impairment. In this paper, we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios.