Mobile multimodal interaction aims to exploit the complementary aspects of human communication and the sensors available on modern mobile devices. Today, however, most mobile applications remain limited to a single basic interaction modality, the touchscreen, which restricts interaction in certain situations. In this paper, we present the On-the-Fly Interaction Editor (OFIE), an application that allows mobile end-users to define sensor-based unimodal and multimodal input interactions and integrate them into their already installed applications according to their contexts. OFIE is based on Event-Condition-Action rules and the Programming by Demonstration approach, which allows end-users to demonstrate an expected action simply by performing it on the application's interface. We evaluated OFIE through a controlled user study involving 15 participants divided into four groups according to their programming experience. Each participant was asked to integrate six input interactions, three of them multimodal. The initial results show that end-users can successfully integrate sensor-based input interactions using OFIE.