
AI & Health: Who Controls the Cure?—Part 3
The Black Box Clinic
How Opacity in AI Systems Undermines Transparency, Accountability, and Trust
When Medicine Stops Explaining Itself
Modern medicine is built on explanation. Diagnosis requires justification, treatment demands rationale, and consent presupposes understanding. Yet AI-assisted healthcare increasingly operates in direct violation of this epistemic foundation. Across hospitals, decision-support systems now issue recommendations that cannot be meaningfully interrogated by clinicians, patients, or regulators. These systems work—but they do not explain.
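
To make the abstraction concrete, here is a minimal sketch (Python with scikit-learn; the synthetic data, model choice, and variable names are illustrative assumptions, not a depiction of any deployed clinical system) of what "a recommendation that cannot be interrogated" looks like from the clinician's side: the system returns a risk score, and nothing in its interface supplies a reason.

# Minimal sketch of a "prediction without reasons": a black-box model
# emits a risk score, and the clinician-facing output contains nothing
# a person could meaningfully interrogate or contest.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for patient features; the features are opaque by
# construction, much as they often are after preprocessing pipelines.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

patient = X[:1]
risk = model.predict_proba(patient)[0, 1]
print(f"Predicted risk: {risk:.2f}")  # a number, with no stated rationale

# The model's internals are 100 regression trees over engineered
# features: faithful to the computation, yet not an account a clinician
# or patient can contest the way they could contest a stated diagnosis.
print(len(model.estimators_), "trees; no human-readable justification")

The point of the sketch is not that such a score is inaccurate; it is that accuracy arrives divorced from any justification a clinician, patient, or regulator could examine.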
London (2019) describes this as the central ethical rupture of AI in medicine: accuracy has been prioritized over intelligibility. High-performing models deliver predictions without reasons, …