Abstract
Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf Jins MEME pair of eyeglasses, we present a pilot study that suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced-choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques together with a gyro to create an interface with a maximum input rate of 5.0 bps.
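The abstract describes a selection principle in which each option has a cursor orbiting it along a known path, and the option is chosen when the measured eye (or eye-relative-to-head) movement matches that path. As a rough illustration only, the Python sketch below shows one plausible correlation-based implementation of this idea; the gaze-signal source, window length, orbit parameters, and threshold are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of an Orbits-style selection: each target has a cursor
# orbiting it with a known frequency/phase, and a target is selected when
# the eye-movement signal correlates strongly with that cursor's trajectory.
# All numeric parameters here are hypothetical.

import numpy as np

def orbit_trajectory(freq_hz, phase, t):
    """Known x/y path of the cursor orbiting one target."""
    return (np.cos(2 * np.pi * freq_hz * t + phase),
            np.sin(2 * np.pi * freq_hz * t + phase))

def select_target(gaze_x, gaze_y, t, targets, threshold=0.8):
    """Pick the target whose orbiting cursor best matches the eye movement.

    gaze_x, gaze_y : horizontal/vertical eye-movement estimates over the
                     analysis window (e.g. derived from the EOG channels).
    targets        : list of (freq_hz, phase) pairs, one per option.
    """
    best_idx, best_score = None, threshold
    for idx, (freq, phase) in enumerate(targets):
        cx, cy = orbit_trajectory(freq, phase, t)
        # Combine per-axis Pearson correlations into a single match score.
        score = 0.5 * (np.corrcoef(gaze_x, cx)[0, 1] +
                       np.corrcoef(gaze_y, cy)[0, 1])
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx  # None means "no selection yet"

# Example: a 0.5 s window sampled at 100 Hz with two binary-choice targets
# orbiting at the same frequency but opposite phase.
t = np.arange(0.0, 0.5, 0.01)
targets = [(1.0, 0.0), (1.0, np.pi)]
gx, gy = orbit_trajectory(1.0, 0.0, t)  # eyes following target 0
noise = 0.05
print(select_target(gx + noise * np.random.randn(t.size),
                    gy + noise * np.random.randn(t.size), t, targets))
```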
Original language | English |
---|---|
Title of host publication | ICMI 2016 - Proceedings of the 18th ACM International Conference on Multimodal Interaction |
Publisher | Association for Computing Machinery, Inc |
Pages | 307-311 |
Number of pages | 5 |
ISBN (Electronic) | 9781450345569 |
DOIs | |
Publication status | Published - 2016 Oct 31 |
Event | 18th ACM International Conference on Multimodal Interaction, ICMI 2016 - Tokyo, Japan (Duration: 2016 Nov 12 → 2016 Nov 16) |
Other
Other | 18th ACM International Conference on Multimodal Interaction, ICMI 2016 |
---|---|
Country/Territory | Japan |
City | Tokyo |
Period | 16/11/12 → 16/11/16 |
Keywords
- Eye tracking
- Gaze interaction
- Wearable computing
ASJC Scopus subject areas
- Computer Science Applications
- Human-Computer Interaction
- Hardware and Architecture
- Computer Vision and Pattern Recognition