The old phrase “garbage in, garbage out” still rings true
George Lee | News
Diagnostic artificial intelligence (AI) is developing rapidly – but concerns that the technology may reinforce pre-existing biases are keeping pace. Although there is still debate over when AI can perform reliably in a clinical setting, skepticism about the concept’s overall validity appears to be growing, with training datasets in the direct line of fire.
A notable 2021 study by Seyyed-Kalantari and colleagues found that widely used models trained on chest X-ray datasets showed disparities between ethnic groups in correctly detecting disease known to be present. Specifically, Black, Hispanic, and other underserved patients received significantly more false “healthy” classifications than their White counterparts (1). In this case, the AI models were seen to parrot – if not exacerbate – known human biases in healthcare settings.
That study was a clear catalyst for wider discourse, spurring a number of papers that sought to respond to the team’s findings. One comment raised potential limitations of the study and the original researchers’ inability to pinpoint a cause of the bias (2), then went on to note that disparities are a likely result of applying a single prediction threshold across groups with differing underlying disease prevalence – a prevalence shift – as was the case in the original study.
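The prevalence-shift argument can be illustrated with a small simulation. The sketch below is hypothetical and not drawn from either paper: it assumes a Bayes-optimal risk model whose posterior probability incorporates group prevalence, two groups with different (invented) prevalences, and a single decision threshold of 0.5. Because the same radiographic evidence yields a lower posterior in the lower-prevalence group, that group accumulates more false “healthy” calls even though the model itself is well calibrated.

```python
import math
import random

random.seed(0)

def posterior(x, prev):
    """Posterior P(disease | x) under assumed Gaussian likelihoods:
    diseased evidence ~ N(2, 1), healthy evidence ~ N(0, 1)."""
    f1 = math.exp(-0.5 * (x - 2.0) ** 2)  # likelihood if diseased
    f0 = math.exp(-0.5 * x ** 2)          # likelihood if healthy
    return prev * f1 / (prev * f1 + (1 - prev) * f0)

def false_negative_rate(prev, threshold=0.5, n=100_000):
    """Among truly diseased patients, fraction classified 'healthy'
    by a single shared threshold on the posterior."""
    misses = 0
    for _ in range(n):
        x = random.gauss(2.0, 1.0)  # evidence for a diseased patient
        if posterior(x, prev) < threshold:
            misses += 1
    return misses / n

# Hypothetical prevalences for two demographic groups
fnr_high_prev = false_negative_rate(prev=0.30)
fnr_low_prev = false_negative_rate(prev=0.10)
print(f"FNR, 30% prevalence group: {fnr_high_prev:.1%}")
print(f"FNR, 10% prevalence group: {fnr_low_prev:.1%}")
```

Even with identical evidence distributions per disease status in both groups, the lower-prevalence group shows a substantially higher false negative rate – disparity arising from the single threshold alone, not from any group-specific model error.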
Like an academic tennis match, the original authors returned with a reply of their own (3). They agreed with various points raised, particularly with regard to prevalence shift, the difficulties of training with biased data, and the use of a natural language processing tool to label cases where the AI model reported “no finding”. Most notably, they reiterated the importance of their findings – that biases are present in the datasets and must be addressed before AI can be deployed clinically.
1. L Seyyed-Kalantari et al., Nat Med, 27, 2176 (2021). PMID: 34893776.
2. M Bernhardt et al., Nat Med, 28, 1157 (2022). PMID: 35710993.
3. L Seyyed-Kalantari et al., Nat Med, 28, 1161 (2022). PMID: 35710992.