
Diagnosis: Uncert(AI)n

Diagnostic artificial intelligence (AI) is developing rapidly – but concerns that the technology may reinforce pre-existing biases are keeping pace. Although there is still debate over when AI can perform reliably in a clinical setting, skepticism about the technology's overall validity appears to be growing, with training datasets in the direct line of fire.

A notable 2021 study by Seyyed-Kalantari and colleagues found that widely used models trained on chest X-ray datasets showed a disparity between ethnic groups in correctly detecting disease known to be present. Specifically, Black, Hispanic, and other underserved groups received significantly more false "healthy" classifications than their White counterparts (1). In this case, the AI models appeared to parrot – if not exacerbate – known human biases in healthcare settings.
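
The metric at issue is the per-group underdiagnosis rate: the fraction of truly diseased patients whom a model labels "no finding". A minimal sketch of how it can be computed, using hypothetical arrays rather than data from the study:

```python
import numpy as np

# Hypothetical labels: 1 = disease present / predicted, 0 = "no finding".
# These arrays are illustrative only, not from the study.
y_true = np.array([1, 1, 1, 0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 0, 0, 1, 0, 0, 1])
group  = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])

for g in np.unique(group):
    mask = (group == g) & (y_true == 1)        # patients in group g with disease
    false_healthy = np.mean(y_pred[mask] == 0)  # fraction wrongly called "no finding"
    print(f"group {g}: underdiagnosis rate = {false_healthy:.2f}")
```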

That study was a clear catalyst for wider discourse, spurring a number of papers that sought to respond to the team's findings. One comment highlighted the study's potential limitations, including that the original researchers could not identify a cause of the bias (2), and went on to note that such disparities are a likely result of applying a single prediction threshold across groups whose underlying disease prevalence differs (a "prevalence shift"), as was the case in the original study; the sketch below illustrates the mechanism.
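
To see how prevalence shift alone can produce such disparities, consider a toy simulation (our own illustration, not code from either paper): two groups share an identical disease signal, but disease is rarer in one group, so a well-calibrated model assigns lower posterior probabilities there, and a single shared threshold produces a higher false "healthy" rate for that group.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_group(prevalence, n=100_000):
    """Identical disease signal in every group; only prevalence differs."""
    diseased = rng.random(n) < prevalence
    x = np.where(diseased, rng.normal(1.5, 1, n), rng.normal(0.0, 1, n))
    # Perfectly calibrated posterior P(disease | x) under this group's base
    # rate (a model that has implicitly learned group-specific prevalence).
    lik1 = norm.pdf(x, 1.5, 1)
    lik0 = norm.pdf(x, 0.0, 1)
    posterior = prevalence * lik1 / (prevalence * lik1 + (1 - prevalence) * lik0)
    return diseased, posterior

threshold = 0.5  # one shared operating point for both groups
for name, prev in [("A", 0.30), ("B", 0.10)]:
    diseased, posterior = simulate_group(prev)
    fnr = np.mean(posterior[diseased] < threshold)  # false "healthy" rate
    print(f"group {name} (prevalence {prev:.0%}): FNR = {fnr:.2f}")
```

With these numbers, the low-prevalence group shows a substantially higher false-negative rate at the shared threshold, even though both groups are modeled with identical likelihoods, which is exactly the mechanism the comment describes.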

Like a rally in an academic tennis match, the original authors returned with a reply of their own (3). They agreed with several of the points raised, particularly with regard to prevalence shift, the difficulties of training with biased data, and the use of a natural language processing tool to assign the "no finding" label. Most notably, they reiterated the central importance of their findings: that biases are present in the datasets and must be addressed before AI can be deployed clinically.


  1. L Seyyed-Kalantari et al., Nat Med, 27, 2176 (2021). PMID: 34893776.
  2. M Bernhardt et al., Nat Med, 28, 1157 (2022). PMID: 35710993.
  3. L Seyyed-Kalantari et al., Nat Med, 28, 1161 (2022). PMID: 35710992.
About the Author
George Francis Lee

Deputy Editor, The Pathologist

Interested in how disease interacts with our world, George writes stories covering subjects like politics, society, and climate change.
