A gaze interactive textual smartwatch interface

John Paulin Hansen, Florian Biermann, Janus Askø Madsen, Morten Jonassen, Haakon Lund, Javier San Augustin, Sebastian Sztuk

9 Citations (Scopus)

Abstract

Mobile gaze interaction is challenged by inherent motor noise. We examined the gaze tracking accuracy and precision of twelve subjects wearing a gaze tracker on their wrist while standing and walking. Results suggest that it will be possible to detect whether people are glancing at the watch, but not where on the screen they are looking. To counter the motor noise, we present a word-by-word textual UI that shows temporary command options to be executed by gaze-strokes. Twenty-seven participants conducted a simulated smartwatch task and were able to reliably perform commands that would adjust the speed of word presentation or make regressions. We discuss future design and usage options for a textual smartwatch gaze interface.
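
The sketch below illustrates the kind of word-by-word (RSVP) presentation logic the abstract describes, where gaze-stroke commands adjust presentation speed or trigger a regression. It is a minimal, hypothetical example: names such as RSVPController, "speed_up", "slow_down", and "regress", and all parameter values, are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a word-by-word (RSVP) presenter with gaze-stroke commands.
# All names and parameter values are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class RSVPController:
    """Shows text one word at a time; gaze strokes adjust pace or regress."""
    words: List[str]
    wpm: float = 250.0            # presentation speed in words per minute
    index: int = 0                # position of the word currently shown
    regression_step: int = 5      # how many words to jump back on a regression

    @property
    def word_duration_s(self) -> float:
        # Display time per word, derived from the current speed setting.
        return 60.0 / self.wpm

    def current_word(self) -> str:
        return self.words[self.index]

    def advance(self) -> None:
        # Step to the next word unless the end of the text is reached.
        if self.index < len(self.words) - 1:
            self.index += 1

    def on_gaze_stroke(self, command: str) -> None:
        # A gaze stroke toward a temporary on-screen option triggers a command.
        if command == "speed_up":
            self.wpm = min(self.wpm * 1.25, 600.0)
        elif command == "slow_down":
            self.wpm = max(self.wpm / 1.25, 100.0)
        elif command == "regress":
            self.index = max(self.index - self.regression_step, 0)


if __name__ == "__main__":
    ctrl = RSVPController(
        "Mobile gaze interaction is challenged by inherent motor noise".split()
    )
    for _ in range(4):
        print(f"{ctrl.current_word():<12} ({ctrl.word_duration_s:.2f}s)")
        ctrl.advance()
    ctrl.on_gaze_stroke("regress")   # jump back a few words after a gaze stroke
    print("after regression:", ctrl.current_word())
```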
Original language: English
Title: UbiComp & ISWC'15: Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers
Number of pages: 9
Place of publication: New York
Publisher: ACM
Publication date: 7 Sep 2015
Pages: 839-847
ISBN (Print): 978-1-4503-3575-1
DOI
Status: Published - 7 Sep 2015
Event: International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction - Osaka, Japan
Duration: 7 Sep 2015 - 7 Sep 2015
Conference number: 5

Conference

Conference: International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction
Number: 5
Country/Territory: Japan
City: Osaka
Period: 07/09/2015 - 07/09/2015
