A deep learning model based on fusion images of chest radiography and X-ray sponge images supports human visual characteristics of retained surgical items detection

Masateru Kawakubo, Hiroto Waki, Takashi Shirasaka, Tsukasa Kojima, Ryoji Mikayama, Hiroshi Hamasaki, Hiroshi Akamine, Toyoyuki Kato, Shingo Baba, Shin Ushiro, Kousei Ishigami

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Purpose: Although novel deep learning software has been proposed that uses post-processed images obtained by fusing X-ray images of normal post-operative radiography with surgical sponge images, its retained-surgical-item detectability has not been sufficiently examined in relation to human visual evaluation. In this study, we investigated the association between the detectability of retained surgical items by deep learning and by human subjective evaluation. Methods: A deep learning model was constructed from 2987 training images and 1298 validation images, which were obtained by post-processing the image fusion between X-ray images of normal post-operative radiography and surgical sponges. A further 800 images were then used for testing, i.e., 400 with and 400 without a surgical sponge. The detection characteristics for retained sponges of the model and of a general observer with 10 years of clinical experience were analyzed using receiver operating characteristic (ROC) analysis. Results: The deep learning model and the observer yielded, respectively: probability cutoff values of 0.37 and 0.45; areas under the curve of 0.87 and 0.76; sensitivity values of 85% and 61%; and specificity values of 73% and 92%. Conclusion: For the detection of surgical sponges, we conclude that the deep learning model has higher sensitivity, while the human observer has higher specificity. These complementary characteristics indicate that a deep learning system could support the clinical workflow in operating rooms for the prevention of retained surgical items.
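The ROC comparison above reduces each method to a sensitivity/specificity pair at a chosen probability cutoff, plus an area under the curve. As a minimal sketch of how these quantities are computed (the labels, scores, and cutoff below are hypothetical illustrations, not the study's data), one can use the rank-based Mann-Whitney formulation of AUC:

```python
import numpy as np

def roc_metrics(labels, scores, cutoff):
    """Sensitivity and specificity at a probability cutoff, plus AUC
    via the rank (Mann-Whitney) formulation: AUC = P(score_pos > score_neg)."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pred = scores >= cutoff                 # predicted "sponge present"
    sensitivity = np.mean(pred[labels])     # true positive rate
    specificity = np.mean(~pred[~labels])   # true negative rate
    pos, neg = scores[labels], scores[~labels]
    diffs = pos[:, None] - neg[None, :]     # all positive/negative score pairs
    auc = np.mean((diffs > 0) + 0.5 * (diffs == 0))  # ties count as 0.5
    return sensitivity, specificity, auc

# Hypothetical model probabilities for 4 sponge-present and 4 sponge-absent images
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.3, 0.6, 0.2, 0.1, 0.05]
sens, spec, auc = roc_metrics(labels, scores, cutoff=0.37)
print(sens, spec, auc)  # → 0.75 0.75 0.875
```

Sweeping the cutoff over all observed scores traces the full ROC curve; the study's cutoffs (0.37 for the model, 0.45 for the observer) are the operating points at which the reported sensitivity/specificity pairs were obtained.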

Original language: English
Pages (from-to): 1459-1467
Number of pages: 9
Journal: International Journal of Computer Assisted Radiology and Surgery
Volume: 18
Issue number: 8
DOIs
Publication status: Published - Aug 2023

All Science Journal Classification (ASJC) codes

  • Surgery
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Health Informatics
  • Computer Graphics and Computer-Aided Design
