open access

Vol 95, No 3 (2024)
Research paper
Published online: 2023-10-13

Ultrasonographic diagnosis of ovarian tumors through the deep convolutional neural network

Min Xi¹, Runan Zheng², Mingyue Wang¹, Xiu Shi¹, Chaomei Chen¹, Jun Qian¹, Xinxian Gu³, Jinhua Zhou¹
Pubmed: 37842987
Ginekol Pol 2024;95(3):181-189.
Affiliations
  1. The First Affiliated Hospital of Soochow University, Suzhou City, China
  2. Suzhou MicroClear Medical Ltd., Suzhou City, China
  3. Dushu Lake Hospital Affiliated to Soochow University, Suzhou City, China

Abstract

Objectives: To develop and validate an ultrasonographic diagnostic model for ovarian tumors based on deep convolutional neural networks (DCNNs) and to compare its diagnostic performance with that of human experts.

Material and methods: We collected 486 ultrasound images from 192 women with malignant ovarian tumors and 617 ultrasound images from 213 women with benign ovarian tumors, all confirmed by pathological examination. The image dataset was split into a training set and a validation set at a 7:3 ratio. We selected five DCNN architectures to develop our model: MobileNet, Xception, Inception, ResNet and DenseNet, and compared the performance of the five models using the area under the curve (AUC), sensitivity, specificity and accuracy. We then randomly selected 200 images from the validation set as a test set and asked three expert radiologists to diagnose these images, in order to compare the performance of the radiologists with that of the DCNN model.

Results: In the validation set, the AUC of DenseNet was 0.997, compared with 0.988 for ResNet, 0.987 for Inception, 0.968 for Xception and 0.836 for MobileNet. In the test set, accuracy was 0.975 with the DenseNet model versus 0.825 with the radiologists (p < 0.0001), sensitivity was 0.975 versus 0.700 (p < 0.0001), and specificity was 0.975 versus 0.908 (p < 0.001).

Conclusions: DenseNet outperformed the other DCNNs and the expert radiologists in distinguishing malignant from benign ovarian tumors on ultrasound images, a finding that needs to be further explored in clinical trials.
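
The abstract compares five DCNN architectures but does not include implementation details in this excerpt. The sketch below shows how such a benign-versus-malignant classifier could be set up; the Keras/TensorFlow framework, the DenseNet121 variant, the 224 × 224 input size, the directory layout and the training hyperparameters are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): fine-tuning an ImageNet-pretrained
# DenseNet as a benign-vs-malignant classifier for ovarian ultrasound images.
# The directory layout, input size and hyperparameters below are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)  # assumed input resolution

# Assumed layout: data/train/<benign|malignant>/..., data/val/<benign|malignant>/...
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", label_mode="binary", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", label_mode="binary", image_size=IMG_SIZE, batch_size=32)

# ImageNet-pretrained DenseNet121 backbone with a new binary classification head
base = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False  # first stage: train only the new head

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.densenet.preprocess_input(inputs)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)  # predicted P(malignant)
model = models.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc"), "accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=20)
```

The abstract describes a 7:3 split of the image dataset; whether the split was stratified at the patient level (so that images from the same woman do not appear in both sets) is not stated in this excerpt.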

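The comparison between the DCNN models and the radiologists is reported in terms of AUC, sensitivity, specificity and accuracy. The sketch below shows how these metrics could be computed from predicted malignancy probabilities and pathology-confirmed labels; the evaluate() helper and the 0.5 decision threshold are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumed workflow, not the authors' code): computing the
# reported metrics from predicted malignancy probabilities and
# pathology-confirmed labels (1 = malignant, 0 = benign).
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score

def evaluate(y_true, y_prob, threshold=0.5):
    """Return AUC, sensitivity, specificity and accuracy at a fixed threshold."""
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "auc": roc_auc_score(y_true, y_prob),
        "sensitivity": tp / (tp + fn),  # recall on malignant cases
        "specificity": tn / (tn + fp),  # recall on benign cases
        "accuracy": accuracy_score(y_true, y_pred),
    }

# Usage (hypothetical): probabilities from the trained model on the 200-image test set
# y_prob = model.predict(test_ds).ravel()
# print(evaluate(y_test, y_prob))
```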


Keywords

ultrasound; diagnosis; ovarian tumor; deep learning; radiologist

About this article
Title

Ultrasonographic diagnosis of ovarian tumors through the deep convolutional neural network

Journal

Ginekologia Polska

Issue

Vol 95, No 3 (2024)

Article type

Research paper

Pages

181-189

Published online

2023-10-13

DOI

10.5603/gpl.94956

Pubmed

37842987

Bibliographic record

Ginekol Pol 2024;95(3):181-189.

Keywords

ultrasound
diagnosis
ovarian tumor
deep learning
radiologist

Authors

Min Xi
Runan Zheng
Mingyue Wang
Xiu Shi
Chaomei Chen
Jun Qian
Xinxian Gu
Jinhua Zhou

