Legal Liability for Algorithmic Malpractice in Digital Medical Diagnosis

Authors

  • Teguh Santoso, Universitas Islam Bandung

DOI:

https://doi.org/10.62872/ij.v2i10.51

Keywords:

artificial intelligence, health law, legal liability, malpractice, medical algorithms

Abstract

The integration of artificial intelligence into digital medical diagnostics offers significant benefits but also introduces new risks, including algorithmic malpractice resulting from inaccurate or biased system outputs. This study examines the legal liability framework for algorithmic malpractice using a normative legal research method, analyzing statutory regulations, legal doctrines, and comparative international approaches. The findings indicate that liability remains placed predominantly on physicians, despite evidence that many algorithmic errors originate from design flaws, data bias, or technical failures outside clinical control. Hospitals and algorithm developers also contribute to systemic risks, highlighting the need for a multi-actor liability model. Regulatory reform is required to establish algorithm audit obligations, risk assessments, human oversight mechanisms, and transparency standards, and to adopt shared or strict liability regimes for developers. The study concludes that comprehensive regulation is necessary to ensure patient protection and legal certainty in the era of medical digitalization.


References

Books and Journals

Ahmad, S., & Wasim, S. (2023). Prevent medical errors through artificial intelligence: A review. Saudi J Med Pharm Sci, 9(7), 419-423.

Sidharta, B. A. (2019). Refleksi tentang Struktur Ilmu Hukum. Bandung: Mandar Maju.

European Commission. (2022). AI Liability Directive Draft. Brussels: European Union.

European Commission. (2021). EU Artificial Intelligence Act Proposal. Brussels: European Commission.

FDA. (2021). Guidance for Software as a Medical Device. Washington, DC: U.S. Food and Drug Administration.

Government of Canada. (2022). Directive on Automated Decision-Making. Ottawa: Government of Canada.

Kementerian Kesehatan Republik Indonesia. (2023). Transformasi Digital Kesehatan: Laporan 2023. Jakarta: Kemenkes RI.

Lagioia, F., & Contissa, G. (2020). The strange case of Dr. Watson: Liability implications of AI evidence-based decision support systems in health care. European Journal of Legal Studies, 12, 245.

Määttä, J., Lindell, R., Hayward, N., Martikainen, S., Honkanen, K., Inkala, M., ... & Martikainen, T. J. (2023). Diagnostic performance, triage safety, and usability of a clinical decision support system within a university hospital emergency department: Algorithm performance and usability study. JMIR Medical Informatics, 11(1), e46760.

Marzuki, P. M. (2017). Penelitian Hukum. Jakarta: Kencana.

Mrčela, M., & Vuletić, I. (2025). Navigating criminal liability in an era of AI-assisted medicine. Medicine, Law & Society, 18(1).

Singh, A. K. Evolving standards of duty of care: A critical analysis of negligence in the age of AI and automation.

Vrudhula, A., Kwan, A. C., Ouyang, D., & Cheng, S. (2024). Machine learning and bias in medical imaging: Opportunities and challenges. Circulation: Cardiovascular Imaging, 17(2), e015495.

World Health Organization. (2021). Global Report on Artificial Intelligence in Health. Geneva: WHO Press.

Zawati, M. N., & Lang, M. (2020). What’s in the box? Uncertain accountability of machine learning applications in healthcare. The American Journal of Bioethics, 20(11), 37-40.

Legal Documents

Undang-Undang Nomor 8 Tahun 1999 tentang Perlindungan Konsumen (Law No. 8 of 1999 on Consumer Protection).

Undang-Undang Nomor 17 Tahun 2023 tentang Kesehatan (Law No. 17 of 2023 on Health).

Undang-Undang Nomor 27 Tahun 2022 tentang Perlindungan Data Pribadi (Law No. 27 of 2022 on Personal Data Protection).

Undang-Undang Nomor 29 Tahun 2004 tentang Praktik Kedokteran (Law No. 29 of 2004 on Medical Practice).


Published

2025-11-28

Issue

Vol. 2 No. 10 (2025)

Section

Articles

How to Cite

Legal Liability for Algorithmic Malpractice in Digital Medical Diagnosis. (2025). Ipso Jure, 2(10), 85-95. https://doi.org/10.62872/ij.v2i10.51
