Abstract
In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani et al.'s idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged targets between displays using the literal and three improved versions. We found that participants achieved the highest performance with automatic zooming and temporary image freezing.
Original language | English |
---|---|
Title of host publication | Proceedings of the SIGCHI Conference on Human Factors in Computing Systems |
Number of pages | 10 |
Publisher | Association for Computing Machinery |
Publication date | 2010 |
Pages | 2287-2296 |
ISBN (Print) | 978-1-60558-929-9 |
DOIs | |
Publication status | Published - 2010 |
Externally published | Yes |
Event | The SIGCHI Conference on Human Factors in Computing Systems 2010 - Atlanta, GA, United States; Duration: 10 Apr 2010 → 15 Apr 2010; Conference number: 28 |
Conference
Conference | The SIGCHI Conference on Human Factors in Computing Systems 2010 |
---|---|
Number | 28 |
Country/Territory | United States |
City | Atlanta, GA |
Period | 10/04/2010 → 15/04/2010 |