TY - CHAP
T1 - A Model for Natural and Comprehensive Direction Giving
AU - Okuno, Yusuke
AU - Kanda, Takayuki
AU - Imai, Michita
AU - Ishiguro, Hiroshi
AU - Hagita, Norihiro
PY - 2017/1/1
Y1 - 2017/1/1
N2 - In Chapter ??, we introduced our field trials, in which a robot provided directions in a shopping mall, and we found such directions from a robot to be useful. A robot has a number of features appropriate for direction giving: because it is physically co-located with people, it can proactively approach a person who needs such information and then provide it “naturally” with its human-like body properties. While the directions used in the field trial were simple, we are now better prepared to consider what good direction giving involves. What constitutes good direction giving from a robot? If the destination is within visible distance, the answer might be intuitive: the robot would say “The shop is over there” and point. However, since the destination is often not visible, a robot needs to utter several sentences, and these would be expected to be accompanied by gestures. We designed our robot’s behavior to enable the listener to intuitively understand the information it provides. This chapter illustrates how we integrate three important factors (utterances, gestures, and timing) so that the robot can conduct appropriate direction giving.
AB - In Chapter ??, we introduced our field trials, in which a robot provided directions in a shopping mall, and we found such directions from a robot to be useful. A robot has a number of features appropriate for direction giving: because it is physically co-located with people, it can proactively approach a person who needs such information and then provide it “naturally” with its human-like body properties. While the directions used in the field trial were simple, we are now better prepared to consider what good direction giving involves. What constitutes good direction giving from a robot? If the destination is within visible distance, the answer might be intuitive: the robot would say “The shop is over there” and point. However, since the destination is often not visible, a robot needs to utter several sentences, and these would be expected to be accompanied by gestures. We designed our robot’s behavior to enable the listener to intuitively understand the information it provides. This chapter illustrates how we integrate three important factors (utterances, gestures, and timing) so that the robot can conduct appropriate direction giving.
UR - http://www.scopus.com/inward/record.url?scp=85132458417&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85132458417&partnerID=8YFLogxK
M3 - Chapter
AN - SCOPUS:85132458417
SN - 9781138071698
SP - 141
EP - 155
BT - Human-Robot Interaction in Social Robotics
PB - CRC Press
ER -