TY - JOUR
T1 - Empirical study of future image prediction for image-based mobile robot navigation
AU - Ishihara, Yu
AU - Takahashi, Masaki
N1 - Funding Information:
This work was supported by the Core Research for Evolutional Science and Technology (CREST) of the Japan Science and Technology Agency (JST) under Grant JPMJCR19A1.
Publisher Copyright:
© 2022 Elsevier B.V.
PY - 2022/4
Y1 - 2022/4
N2 - Recent image-based robotic systems use predicted future state images to control robots. The prediction accuracy of the future state image therefore affects the robot's performance. Most previous studies on image prediction assume that the camera captures the entire scene and that the environment is static. However, these assumptions do not always hold in real robot applications; for example, the view of a camera attached to a mobile robot changes over time. In this study, we analyzed the relationship between the performance of the image prediction model and the behavior of a robot controlled by an image-based navigation algorithm. Through mobile robot navigation experiments using front-facing and omni-directional cameras, we discussed the capabilities of the image prediction models and demonstrated their performance when applied to the image-based navigation algorithm. Moreover, to adapt to dynamic changes in the environment, we studied the effectiveness of directing the camera toward the ceiling. We showed that robust navigation can be achieved without using images from cameras directed toward the front or the floor, because these views can be disturbed by moving objects in a dynamic environment.
AB - Recent image-based robotic systems use predicted future state images to control robots. The prediction accuracy of the future state image therefore affects the robot's performance. Most previous studies on image prediction assume that the camera captures the entire scene and that the environment is static. However, these assumptions do not always hold in real robot applications; for example, the view of a camera attached to a mobile robot changes over time. In this study, we analyzed the relationship between the performance of the image prediction model and the behavior of a robot controlled by an image-based navigation algorithm. Through mobile robot navigation experiments using front-facing and omni-directional cameras, we discussed the capabilities of the image prediction models and demonstrated their performance when applied to the image-based navigation algorithm. Moreover, to adapt to dynamic changes in the environment, we studied the effectiveness of directing the camera toward the ceiling. We showed that robust navigation can be achieved without using images from cameras directed toward the front or the floor, because these views can be disturbed by moving objects in a dynamic environment.
KW - Action-conditioned image prediction
KW - Mobile robot
KW - Omni-directional camera
UR - http://www.scopus.com/inward/record.url?scp=85123043940&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123043940&partnerID=8YFLogxK
U2 - 10.1016/j.robot.2021.104018
DO - 10.1016/j.robot.2021.104018
M3 - Article
AN - SCOPUS:85123043940
SN - 0921-8890
VL - 150
JO - Robotics and Autonomous Systems
JF - Robotics and Autonomous Systems
M1 - 104018
ER -