Diagnosis: Digital
Liron’s landmark paper: DR Snead et al., “Validation of digital pathology imaging for primary histopathological diagnosis”, Histopathology, [Epub ahead of print] (2015). PMID: 26409165.
Around the world, the use of digital pathology is becoming increasingly common. Pathologists are no longer using this technology only for education and research; many have started to employ it in their diagnostic work. What does this look like? It includes using telepathology not just to provide second opinions, but also for primary diagnosis. It comes as no surprise, then, that in many countries whole-slide imaging (WSI) has transitioned from a technology used primarily by innovators to one embraced by many early adopters. As a result, pathologists are increasingly aware of the importance of validating WSI for clinical use.
I selected this article on validating digital pathology for primary diagnosis for three reasons. The first reason is the authors’ sample size. This group of investigators – Snead and colleagues, from Coventry in the United Kingdom – had 17 pathologists report on 3,017 cases (10,138 slides) using digital pathology tools. This makes it one of the largest WSI validation studies published to date. Prior published validation studies have included, on average, eight individuals reviewing cases. Additionally, most prior studies used between 60 and 600 cases in their validations – clearly a far less comprehensive overview than this latest study. The authors from Coventry also included a broad distribution of subspecialties in their case mix, and compared the diagnoses in their original pathology reports (based on glass slide microscopy) to those rendered using digital slides.
The second reason I selected the paper is that, before embarking on their validation study, the authors set out to establish their baseline discrepancy rate for pathologists by recording the number of variances detected at their multidisciplinary team meetings (tumor boards). They found that they were concordant 98.78 percent of the time. Importantly, this indicates that even when examining glass slides, pathologists do not always agree on a diagnosis.
The third reason is that the authors opted to use a noninferiority design for their validation. Many previously published validation studies simply asked whether or not diagnostic outcomes differed between the glass ("gold standard") and digital modalities. The noninferiority approach does not hypothesize that one of these methods is superior; rather, it establishes whether the newer method (digital slide review) is at least as effective as the better-established diagnostic modality (glass slide analysis). The results of the Coventry validation study fell within the 95 percent confidence interval for intra- and inter-observer variability, demonstrating that digital pathology is non-inferior to glass slide microscopy.
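To make the noninferiority idea concrete, the sketch below shows a generic check on concordance rates. It is not the authors' actual statistical analysis: the digital-slide concordance figure, the equal per-arm sample sizes, and the one-percentage-point noninferiority margin are illustrative assumptions; only the 98.78 percent baseline concordance and the 3,017-case total come from the study as described above.

# Minimal sketch of a noninferiority comparison of concordance rates.
# The digital concordance, per-arm sample sizes, and margin below are
# illustrative assumptions, not figures reported by Snead et al.
from math import sqrt

def diff_ci(p_new, p_ref, n_new, n_ref, z=1.96):
    """Two-sided 95% Wald CI for the difference in concordance (new - reference)."""
    diff = p_new - p_ref
    se = sqrt(p_new * (1 - p_new) / n_new + p_ref * (1 - p_ref) / n_ref)
    return diff - z * se, diff + z * se

p_glass = 0.9878     # baseline (glass slide) concordance from the tumor-board audit
p_digital = 0.9870   # hypothetical digital-slide concordance, for illustration only
n_cases = 3017       # total cases reported in the study, assumed here for each arm
margin = -0.01       # assumed noninferiority margin of one percentage point

lower, upper = diff_ci(p_digital, p_glass, n_cases, n_cases)
print(f"95% CI for difference: ({lower:.4f}, {upper:.4f})")
# Digital is declared non-inferior if the whole CI lies above the margin.
print("Non-inferior" if lower > margin else "Inconclusive")

The design choice is the key point: instead of testing for any difference between modalities, the analysis asks whether the digital arm could plausibly be worse than glass by more than a pre-specified margin.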
The take-home message of the paper is that digital pathology is non-inferior to reading glass slides for primary histopathological diagnosis. This is reassuring for the pathology community, because many of us have either already given up our microscopes or are strongly considering giving them up in exchange for digital slides.
Liron Pantanowitz is Professor of Pathology, Department of Pathology and Clinical Labs, University of Michigan, Ann Arbor, Michigan, USA.