Integrating Mobile Multimodal Interactions based on Programming By Demonstration

Bellal Z., Elouali N., Benslimane S. M., Acartürk C.

INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, vol.37, no.5, pp.418-433, 2021 (Peer-Reviewed Journal)

  • Publication Type: Article
  • Volume: 37 Issue: 5
  • Publication Date: 2021
  • Doi Number: 10.1080/10447318.2020.1823688
  • Journal Indexes: Science Citation Index Expanded, Social Sciences Citation Index, Scopus, Academic Search Premier, ABI/INFORM, Applied Science & Technology Source, Business Source Elite, Business Source Premier, Compendex, Computer & Applied Sciences, INSPEC, Library and Information Science Abstracts, Linguistics & Language Behavior Abstracts, Psycinfo
  • Page Numbers: pp.418-433


Mobile Multimodal Interaction aims at exploiting the complementary aspects of human communication capacities and new mobile sensors. However, most mobile applications remain limited to a single basic interaction modality, the touchscreen, which restricts interaction in certain situations. In this paper, we present the On-the-Fly Interaction Editor (OFIE), an application that allows mobile end-users to define and integrate sensor-based unimodal and multimodal input interactions into their already installed applications according to their contexts. OFIE is based on Event-Condition-Action rules and the Programming By Demonstration approach, which allows end-users to demonstrate their expected action simply by performing it on the application's interface. We evaluated OFIE through a controlled user study involving 15 participants distributed across four groups according to their programming experience. Each participant was invited to integrate six input interactions (three of them multimodal). The initial results show that end-users are able to successfully integrate sensor-based input interactions using OFIE.
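The Event-Condition-Action pattern mentioned in the abstract can be sketched as follows. This is an illustrative simplification, not OFIE's actual implementation: the rule fields, the `"shake"` event, and the replayed `"tap:next_page"` action are all hypothetical placeholders standing in for a sensor event, a context predicate, and a demonstrated UI action.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Minimal Event-Condition-Action (ECA) rule sketch. In a PBD system, the
# action would be a recorded demonstration replayed on the app's interface.
@dataclass
class EcaRule:
    event: str                           # triggering sensor event, e.g. "shake"
    condition: Callable[[Dict], bool]    # context predicate over device state
    action: Callable[[], str]            # demonstrated action to replay

def dispatch(rules: List[EcaRule], event: str, context: Dict) -> List[str]:
    """Fire every rule whose event matches and whose condition holds."""
    return [r.action() for r in rules
            if r.event == event and r.condition(context)]

# Hypothetical rule: when the phone is shaken while the screen is on,
# replay a demonstrated "next page" tap.
rules = [EcaRule("shake",
                 lambda ctx: ctx.get("screen_on", False),
                 lambda: "tap:next_page")]

print(dispatch(rules, "shake", {"screen_on": True}))   # ['tap:next_page']
print(dispatch(rules, "shake", {"screen_on": False}))  # []
```

Separating the event, condition, and action lets an end-user recombine a single demonstrated action with different sensor triggers and contexts, which is the flexibility the abstract attributes to the ECA + PBD combination.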