TY - JOUR
T1 - A 2D-Lidar-Equipped Unmanned Robot-Based Approach for Indoor Human Activity Detection
AU - Bouazizi, Mondher
AU - Lorite Mora, Alejandro
AU - Ohtsuki, Tomoaki
N1 - Publisher Copyright:
© 2023 by the authors.
PY - 2023/3
Y1 - 2023/3
N2 - Monitoring the activities of elderly people living alone is of great importance since it allows hazardous events such as falls to be detected. In this context, the use of 2D light detection and ranging (LIDAR) has been explored, among other approaches, as a way to identify such events. Typically, a 2D LIDAR is placed near the ground and collects measurements continuously, and a computational device classifies these measurements. However, in a realistic environment with home furniture, it is difficult for such a device to operate because it requires a direct line of sight (LOS) to its target. Furniture blocks the infrared (IR) rays from reaching the monitored person, thus limiting the effectiveness of such sensors. Moreover, because these sensors are fixed in place, a fall that is not detected when it happens cannot be detected afterwards. In this context, cleaning robots present a much better alternative given their autonomy. In this paper, we propose to use a 2D LIDAR mounted on top of a cleaning robot. As it moves, the robot continuously collects distance measurements. Although it suffers from the same LOS limitation, by roaming around the room the robot can identify whether a person is lying on the ground after a fall, even some time after the fall event. To achieve this goal, the measurements captured by the moving LIDAR are transformed, interpolated, and compared to a reference state of the surroundings. A convolutional long short-term memory (LSTM) neural network is trained to classify the processed measurements and identify whether a fall is occurring or has occurred. Through simulations, we show that such a system can achieve an accuracy of 81.2% in fall detection and 99% in the detection of lying bodies. In comparison, the conventional method, which uses a static LIDAR, reaches accuracies of 69.4% and 88.6% on the same tasks, respectively.
AB - Monitoring the activities of elderly people living alone is of great importance since it allows hazardous events such as falls to be detected. In this context, the use of 2D light detection and ranging (LIDAR) has been explored, among other approaches, as a way to identify such events. Typically, a 2D LIDAR is placed near the ground and collects measurements continuously, and a computational device classifies these measurements. However, in a realistic environment with home furniture, it is difficult for such a device to operate because it requires a direct line of sight (LOS) to its target. Furniture blocks the infrared (IR) rays from reaching the monitored person, thus limiting the effectiveness of such sensors. Moreover, because these sensors are fixed in place, a fall that is not detected when it happens cannot be detected afterwards. In this context, cleaning robots present a much better alternative given their autonomy. In this paper, we propose to use a 2D LIDAR mounted on top of a cleaning robot. As it moves, the robot continuously collects distance measurements. Although it suffers from the same LOS limitation, by roaming around the room the robot can identify whether a person is lying on the ground after a fall, even some time after the fall event. To achieve this goal, the measurements captured by the moving LIDAR are transformed, interpolated, and compared to a reference state of the surroundings. A convolutional long short-term memory (LSTM) neural network is trained to classify the processed measurements and identify whether a fall is occurring or has occurred. Through simulations, we show that such a system can achieve an accuracy of 81.2% in fall detection and 99% in the detection of lying bodies. In comparison, the conventional method, which uses a static LIDAR, reaches accuracies of 69.4% and 88.6% on the same tasks, respectively.
KW - 2D Lidar
KW - activity detection
KW - deep learning
KW - healthcare
KW - machine learning
UR - http://www.scopus.com/inward/record.url?scp=85149573170&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85149573170&partnerID=8YFLogxK
U2 - 10.3390/s23052534
DO - 10.3390/s23052534
M3 - Article
C2 - 36904735
AN - SCOPUS:85149573170
SN - 1424-8220
VL - 23
JO - Sensors
JF - Sensors
IS - 5
M1 - 2534
ER -