Surgical Tool Detection in Open Surgery Videos

Ryo Fujii, Ryo Hachiuma, Hiroki Kajita, Hideo Saito

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Detecting surgical tools is an essential task for analyzing and evaluating surgical videos. However, most studies focus on minimally invasive surgery (MIS) and cataract surgery; mainly because of the lack of a large, diverse, and well-annotated dataset, research on open surgery has so far been limited. Open surgery video analysis is challenging because of its properties: a varying number of people with different roles (e.g., main surgeon, assistant surgeons, and nurses), complex interactions between tools and hands, and diverse operative environments and lighting conditions. In this paper, to address these limitations and difficulties, we introduce an egocentric open surgery dataset that comprises 15 open surgeries recorded with a head-mounted camera. More than 67k bounding boxes are annotated on 19k images across 31 surgical tool categories. Finally, we present a surgical tool detection baseline model based on recent advances in object detection. The results on our new dataset show that it poses interesting challenges for future methods and can serve as a strong benchmark for the study of tool detection in open surgery.
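The abstract does not specify the baseline architecture, so the following is only a minimal sketch of what such a baseline could look like: a generic, COCO-pretrained Faster R-CNN from torchvision fine-tuned for the dataset's 31 surgical tool categories. The function name `build_baseline_detector` and the image size are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: assumes a generic torchvision Faster R-CNN detector
# fine-tuned for 31 surgical tool categories (+1 background class); this is NOT
# the paper's exact baseline, which is not described in this abstract.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_TOOL_CLASSES = 31  # number of surgical tool categories in the dataset


def build_baseline_detector(num_classes: int = NUM_TOOL_CLASSES + 1):
    # Start from a COCO-pretrained detector and swap in a new box-classification
    # head sized for the surgical tool categories plus background.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model


if __name__ == "__main__":
    model = build_baseline_detector().eval()
    # One dummy egocentric frame (3 x H x W tensor with values in [0, 1]).
    frame = torch.rand(3, 480, 640)
    with torch.no_grad():
        predictions = model([frame])
    # Each prediction dict holds 'boxes', 'labels', and 'scores' tensors.
    print({k: v.shape for k, v in predictions[0].items()})
```

In practice, such a model would be fine-tuned on the dataset's bounding-box annotations (e.g., converted to COCO format) before evaluation.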

Original language: English
Article number: 10473
Journal: Applied Sciences (Switzerland)
Volume: 12
Issue number: 20
DOIs
Publication status: Published - Oct 2022

Keywords

  • deep neural network
  • egocentric camera
  • open surgery
  • surgical tool detection
  • surgical video analysis

ASJC Scopus subject areas

  • Materials Science(all)
  • Instrumentation
  • Engineering(all)
  • Process Chemistry and Technology
  • Computer Science Applications
  • Fluid Flow and Transfer Processes
