TY - GEN
T1 - Eyefeel & EyeChime
T2 - 5th Augmented Human International Conference, AH 2014
AU - Hosobori, Asako
AU - Kakehi, Yasuaki
N1 - Copyright:
Copyright 2014 Elsevier B.V., All rights reserved.
PY - 2014
Y1 - 2014
N2 - In face-to-face communication, humans convey nonverbal information to supplement verbal language. Eye gaze in particular is a critical element. While a variety of studies on communication support focusing on eye gaze have been conducted in the past, most of them have aimed to support communication between people in remote locations. In contrast, this study aims to extend and transform gaze to lower the hurdles of establishing communication, or to induce a new form of communication through gaze, in face-to-face settings. As a specific proposal, we developed two types of systems: Eyefeel, which converts and delivers another person's gaze as tactile information, and EyeChime, which produces a spatial presentation by converting events such as gazing at another person or making eye contact into sound. A preliminary study suggested that these interfaces induced communication through active use of eye gaze, increased the amount of time users spent gazing at their conversation partner, and lowered the hurdles to making eye contact. In this paper, we discuss the design and implementation of the systems as well as the details of their use.
AB - In face-to-face communication, humans convey nonverbal information to supplement verbal language. Eye gaze in particular is a critical element. While a variety of studies on communication support focusing on eye gaze have been conducted in the past, most of them have aimed to support communication between people in remote locations. In contrast, this study aims to extend and transform gaze to lower the hurdles of establishing communication, or to induce a new form of communication through gaze, in face-to-face settings. As a specific proposal, we developed two types of systems: Eyefeel, which converts and delivers another person's gaze as tactile information, and EyeChime, which produces a spatial presentation by converting events such as gazing at another person or making eye contact into sound. A preliminary study suggested that these interfaces induced communication through active use of eye gaze, increased the amount of time users spent gazing at their conversation partner, and lowered the hurdles to making eye contact. In this paper, we discuss the design and implementation of the systems as well as the details of their use.
KW - Eye gaze
KW - Face-to-face communication
KW - Interface
KW - Augmented reality
UR - http://www.scopus.com/inward/record.url?scp=84899878215&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84899878215&partnerID=8YFLogxK
U2 - 10.1145/2582051.2582058
DO - 10.1145/2582051.2582058
M3 - Conference contribution
AN - SCOPUS:84899878215
SN - 9781450327619
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 5th Augmented Human International Conference, AH 2014
PB - Association for Computing Machinery
Y2 - 7 March 2014 through 8 March 2014
ER -