TY - CHAP
T1 - Expert’s Gaze-Based Prediction Model for Assessing the Quality of Figure Skating Jumps
AU - Hirosawa, Seiji
AU - Yamashita, Takayoshi
AU - Aoki, Yoshimitsu
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - Researchers in computer vision are developing methods for Action Quality Assessment (AQA), which evaluates the quality of human actions in videos rather than merely identifying them. For figure skating, the task involves estimating final scores from a video of a short program, serving as an auxiliary assessment for judging skaters’ performances. Despite the importance of accurately predicting individual jump scores, given their substantial impact on final scores, prior studies have overlooked this aspect. Although videos concentrate on a single skater, they often include extraneous elements unrelated to quality assessment; human experts therefore discard non-essential information to make visually precise evaluations. Our research illuminates the gaze patterns of judges and skaters when assessing jumps, and we developed a jump-performance prediction model that leverages these gaze patterns to filter out irrelevant information. In addition, we enhanced its predictive precision by incorporating kinematic data from a tracking system. The findings revealed a marked contrast in gaze patterns: skaters focused mainly on the face, while judges paid more attention to the lower body. Integrating these gaze patterns into our model improved its learning efficiency, and the model achieved higher accuracy by assimilating the gaze data from both groups of specialists. Our work marks an innovative step toward merging human insight and artificial intelligence to tackle the challenge of jump-performance evaluation in figure skating, offering valuable contributions to computer vision and sports science.
AB - Researchers in computer vision are developing methods for Action Quality Assessment (AQA), which evaluates the quality of human actions in videos rather than merely identifying them. For figure skating, the task involves estimating final scores from a video of a short program, serving as an auxiliary assessment for judging skaters’ performances. Despite the importance of accurately predicting individual jump scores, given their substantial impact on final scores, prior studies have overlooked this aspect. Although videos concentrate on a single skater, they often include extraneous elements unrelated to quality assessment; human experts therefore discard non-essential information to make visually precise evaluations. Our research illuminates the gaze patterns of judges and skaters when assessing jumps, and we developed a jump-performance prediction model that leverages these gaze patterns to filter out irrelevant information. In addition, we enhanced its predictive precision by incorporating kinematic data from a tracking system. The findings revealed a marked contrast in gaze patterns: skaters focused mainly on the face, while judges paid more attention to the lower body. Integrating these gaze patterns into our model improved its learning efficiency, and the model achieved higher accuracy by assimilating the gaze data from both groups of specialists. Our work marks an innovative step toward merging human insight and artificial intelligence to tackle the challenge of jump-performance evaluation in figure skating, offering valuable contributions to computer vision and sports science.
KW - action quality assessment
KW - computer vision
KW - deep learning
UR - http://www.scopus.com/inward/record.url?scp=85195987405&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85195987405&partnerID=8YFLogxK
U2 - 10.1007/978-981-97-2898-5_5
DO - 10.1007/978-981-97-2898-5_5
M3 - Chapter
AN - SCOPUS:85195987405
T3 - Lecture Notes on Data Engineering and Communications Technologies
SP - 42
EP - 52
BT - Lecture Notes on Data Engineering and Communications Technologies
PB - Springer Science and Business Media Deutschland GmbH
ER -