Abstract
Current techniques to computationally detect human affect often depend on specialized hardware, work only in laboratory settings, or require substantial individual training. We use sensors in commodity smartphones to estimate affect in the wild, with no training time, based on a link between affect and movement. In the first experiment, 55 participants performed touch interactions after exposure to positive or neutral emotion-eliciting films; negative affect resulted in faster but less precise interactions, along with differences in rotation and acceleration. Using off-the-shelf machine learning algorithms, we report 89.1% accuracy in binary affective classification, grouping participants by their self-assessments. A follow-up experiment validated the findings of the first: it collected naturally occurring affect from 127 participants, who again performed touch interactions. The results demonstrate that affect has a direct behavioral effect on mobile interaction and that affect detection using common smartphone sensors is feasible.
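The classification step described in the abstract can be illustrated with a minimal sketch using off-the-shelf tooling. The feature set (interaction speed, touch precision, device rotation, acceleration), the random-forest classifier, and all data below are illustrative assumptions, not the authors' actual pipeline; the abstract only states that off-the-shelf machine learning algorithms were used on touch and motion-sensor features.

```python
# Minimal sketch of binary affective classification from touch-interaction
# features. Feature columns, classifier choice, and data are hypothetical
# stand-ins; the paper reports 89.1% accuracy with its own features and data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-participant features: interaction speed, touch precision
# (offset from target), mean device rotation, mean acceleration magnitude.
X = rng.normal(size=(110, 4))

# Binary labels derived from affect self-assessments (e.g., grouping
# participants into two affect classes), here randomly generated.
y = rng.integers(0, 2, size=110)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```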
| Original language | English |
| --- | --- |
| Title | Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing |
| Number of pages | 12 |
| Publisher | Association for Computing Machinery |
| Publication date | 12 Sep 2016 |
| Pages | 781-792 |
| ISBN (Print) | 978-1-4503-4461-6 |
| DOI | |
| Status | Published - 12 Sep 2016 |
| Event | 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing - Heidelberg, Germany. Duration: 12 Sep 2016 → 16 Sep 2016 |
Conference

| Conference | 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing |
| --- | --- |
| Country/Territory | Germany |
| City | Heidelberg |
| Period | 12/09/2016 → 16/09/2016 |