Stress Detection in Human-Computer Interaction: Fusion of Pupil Dilation and Facial Temperature Features


GÖKÇAY D.

INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, vol.32, no.12, pp.956-966, 2016 (SCI-Expanded)

Abstract

To differentiate the affective state of a computer user as it changes from relaxation to stress, features derived from pupil dilation and periorbital temperature can be utilized. Absolute signal values and measurements computed from them can be fused to increase the accuracy of affective classification. In this study, entropy computed within a sliding window was used to accommodate the differing rise and fall profiles of the pupil and thermal signals over time. Two methods, a decision tree and AdaBoost with Random Forest (ABRF), were used for the classification tests. Detection accuracy of stressful states varied between 65% and 83.8%, with the best results reaching 83.9% sensitivity and 83.8% specificity. The ABRF classifier outperformed the decision tree model. This study emphasizes the importance of data fusion, particularly when physiological signals differ in their rise and fall windows across time. Entropy computed within a predefined time window provides a useful set of features to combine with the raw measurements. Furthermore, the collection of pupil and thermal data is practical because it eliminates the need for surface sensors.
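The sketch below is a minimal, hypothetical illustration (not the authors' code) of the pipeline the abstract describes: Shannon entropy computed over a sliding window for the pupil-diameter and periorbital-temperature signals, fused with the raw measurements, and classified with AdaBoost over Random Forest base learners (ABRF). The window length, histogram bin count, classifier settings, and synthetic data are all assumptions made for illustration only.

```python
# Illustrative sketch of sliding-window entropy feature fusion + ABRF classification.
# All parameter values are assumptions; the published study's settings may differ.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier


def sliding_window_entropy(signal, window=60, bins=16):
    """Shannon entropy of the signal inside each sliding window."""
    entropies = np.empty(len(signal) - window + 1)
    for i in range(len(entropies)):
        hist, _ = np.histogram(signal[i:i + window], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        entropies[i] = -np.sum(p * np.log2(p))
    return entropies


def fuse_features(pupil, thermal, window=60):
    """Stack raw values and windowed entropy for both modalities."""
    n = min(len(pupil), len(thermal)) - window + 1
    return np.column_stack([
        pupil[window - 1:window - 1 + n],             # raw pupil diameter
        thermal[window - 1:window - 1 + n],           # raw periorbital temperature
        sliding_window_entropy(pupil, window)[:n],    # pupil entropy
        sliding_window_entropy(thermal, window)[:n],  # thermal entropy
    ])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for a relaxation segment followed by a stress segment.
    pupil = np.concatenate([rng.normal(3.0, 0.1, 600), rng.normal(3.6, 0.3, 600)])
    thermal = np.concatenate([rng.normal(34.0, 0.05, 600), rng.normal(33.5, 0.2, 600)])
    window = 60
    X = fuse_features(pupil, thermal, window)
    # Label each feature row by the segment its window ends in (0 = relaxed, 1 = stressed).
    y = (np.arange(len(X)) + window - 1 >= 600).astype(int)

    # ABRF: AdaBoost whose weak learners are shallow Random Forests
    # (`estimator=` requires scikit-learn >= 1.2; older versions use `base_estimator=`).
    clf = AdaBoostClassifier(
        estimator=RandomForestClassifier(n_estimators=10, max_depth=3),
        n_estimators=20,
    )
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))
```

In a real evaluation, the features would be computed per participant and scored with held-out cross-validation rather than training accuracy; the sketch only shows how the windowed-entropy features and the fused raw measurements fit together as classifier input.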