Vol 19 (2024): Continuous Publishing
Review paper
Published online: 2024-05-17

open access


How can artificial intelligence be made into an ethically sound diagnostic instrument in medical practice?

Anna Maria Wojtkiewicz1, Magda Marta Piekarska1, Artur Mamcarz2, Daniel Śliż2
DOI: 10.5603/fc.98299

Abstract

The use of solutions from the field of artificial intelligence (AI) can help improve the quality of services in the healthcare sector. AI algorithms allow faster processing and analysis of data and thus a more efficient process of diagnosing patients. On closer examination, however, AI-based solutions still do not achieve full accuracy. Their algorithms are subject to biases that can exclude disadvantaged groups in society. Machine learning bias, also termed algorithmic bias or AI bias, refers to the phenomenon in which an algorithm produces systematically skewed outcomes as a result of flawed assumptions embedded in the machine learning process; in other words, a situation in which an otherwise valid algorithm excludes certain data or groups of data. The purpose of this article is to outline the issue of the ethical application of artificial intelligence in the medical sector, with a particular focus on AI bias. In medicine this is an important issue, as it translates into the quality of care patients receive and how their chances of recovery are distributed. The article addresses the legal issues and the classification of artificial intelligence by the European Union, the issue of AI bias and its risks, examples of attempts to implement artificial intelligence in the medical sector to date, and the prospects for its application, with a particular focus on cardiology. On this basis, it is recommended to continue the development of artificial intelligence, with emphasis on improving its algorithms. Despite its flaws, AI remains a remarkably helpful diagnostic tool that should be widely introduced into the daily practice of physicians.
This article was written using the method of analysis and critique of the literature, a review of national and European Union legislation, and a review of existing research on the application of artificial intelligence in medicine.
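The notion of algorithmic bias described above can be made concrete with a small illustration. The following is a minimal sketch, not taken from the article, showing one common way such bias is audited in practice: comparing a model's false-negative rate (sick patients it fails to flag) across patient subgroups. All record fields and group labels here are synthetic and purely illustrative.

```python
# Illustrative sketch (not from the article): auditing a diagnostic model
# for subgroup bias by comparing false-negative rates between groups.

def false_negative_rate(records, group):
    """FNR = missed positives / actual positives within one subgroup."""
    positives = [r for r in records if r["group"] == group and r["sick"]]
    missed = [r for r in positives if not r["flagged"]]
    return len(missed) / len(positives) if positives else 0.0

def bias_gap(records, group_a, group_b):
    """Absolute difference in FNR between two subgroups."""
    return abs(false_negative_rate(records, group_a)
               - false_negative_rate(records, group_b))

# Synthetic records: the hypothetical model under-flags sick patients in "B".
records = [
    {"group": "A", "sick": True,  "flagged": True},
    {"group": "A", "sick": True,  "flagged": True},
    {"group": "A", "sick": False, "flagged": False},
    {"group": "B", "sick": True,  "flagged": False},  # missed case
    {"group": "B", "sick": True,  "flagged": True},
    {"group": "B", "sick": False, "flagged": False},
]

print(false_negative_rate(records, "A"))  # 0.0
print(false_negative_rate(records, "B"))  # 0.5
print(bias_gap(records, "A", "B"))        # 0.5
```

A nonzero gap of this kind is exactly the pattern reported in underdiagnosis studies of chest-radiograph algorithms: the model may look accurate in aggregate while systematically missing cases in an under-served subgroup.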

