Abstract
In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani's idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged them between displays using the literal approach and three improved versions. We found that participants achieved the highest performance with automatic zooming and temporary image freezing.
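The core mechanism described in the abstract, mapping a touch on the handheld's live video image onto the remote display, can be illustrated as a planar homography between the camera image and the display. The sketch below is not the authors' implementation; the function `project_touch` and the matrix `H` are illustrative assumptions about how such a mapping could be computed.

```python
# Minimal sketch of the "touch projection" idea: a touch point in video-image
# pixel coordinates is mapped to display pixel coordinates via a 3x3
# homography H (e.g., estimated from the tracked corners of the display).
# All names here are illustrative, not from the paper.
import numpy as np

def project_touch(touch_xy, H):
    """Map a touch (x, y) on the video image to a point on the display."""
    p = np.array([touch_xy[0], touch_xy[1], 1.0])  # homogeneous coordinates
    q = H @ p                                      # apply the homography
    return q[:2] / q[2]                            # perspective divide

# Example: a pure scaling homography (display is twice the image resolution).
H = np.diag([2.0, 2.0, 1.0])
print(project_touch((100.0, 50.0), H))  # -> [200. 100.]
```

Under this view, the paper's zooming and freezing improvements change how stable and precise the input `touch_xy` is, while the projection step itself stays the same.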
Original language | English |
---|---|
Title | Proceedings of the SIGCHI Conference on Human Factors in Computing Systems |
Number of pages | 10 |
Publisher | Association for Computing Machinery |
Publication date | 2010 |
Pages | 2287-2296 |
ISBN (Print) | 978-1-60558-929-9 |
DOI | |
Status | Published - 2010 |
Published externally | Yes |
Event | The SIGCHI Conference on Human Factors in Computing Systems 2010 - Atlanta, GA, USA. Duration: 10 Apr 2010 → 15 Apr 2010. Conference number: 28 |
Conference

Conference | The SIGCHI Conference on Human Factors in Computing Systems 2010 |
---|---|
Number | 28 |
Country/Territory | USA |
City | Atlanta, GA |
Period | 10/04/2010 → 15/04/2010 |
Keywords
- augmented reality, input device, interaction techniques, mobile device, multi-display environments, multi-touch