Pathologists Versus AI
When it comes to new technology, shouldn’t we be thinking in terms of collaboration rather than competition?
Holger Lange and Cris Luengo
An automated artificial intelligence (AI) system for pathology that achieves a performance of 90 percent may appear to outperform human pathologists. But the conditions aren’t quite the same, so “pathologist versus AI” may not really be the right comparison. Instead, we may want humans and computers to work together, each performing to their own strengths.
Manual microscopy is the standard of care and current practice in pathology. Notably, microscopes are not FDA-cleared medical devices (because they pre-date the FDA); different microscopes have different optics and even light sources; and the microscopes in use are often not properly calibrated – so the human pathologist starts off on the back foot. Now, we are asking those pathologists to assess 500,000–1,000,000 cells that can show considerable heterogeneity across a slide – and then to reduce that information to a single diagnosis or summary score. Even a very simple immunohistochemical scoring task asks the pathologist to determine the percentage of cells (to be evaluated against a threshold, for example, >10 percent) of a certain cell type (for example, tumor cells) that show staining (for example, DAB that can be colocalized with hematoxylin) in a certain cell compartment (for example, the nucleus) above an absolute intensity threshold – a very challenging computational task. It seems obvious that this will lead to high inter- and intra-pathologist variation.
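The scoring task described above is trivial for a computer once per-cell measurements exist. A minimal sketch, assuming image analysis has already extracted each cell's type and compartment-level stain intensity (all field names and thresholds here are illustrative, not from any specific system):

```python
# Sketch of the immunohistochemical scoring described above, assuming
# per-cell measurements (cell type, stain intensity per compartment)
# have already been extracted by image analysis. Names are illustrative.

def ihc_score(cells, cell_type="tumor", compartment="nucleus",
              intensity_threshold=0.2, positivity_cutoff=0.10):
    """Return (percent_positive, is_positive) for one slide.

    cells: list of dicts like
        {"type": "tumor", "dab_intensity": {"nucleus": 0.35, ...}}
    """
    relevant = [c for c in cells if c["type"] == cell_type]
    if not relevant:
        return 0.0, False
    positive = [c for c in relevant
                if c["dab_intensity"].get(compartment, 0.0) > intensity_threshold]
    fraction = len(positive) / len(relevant)
    # Evaluate the percentage against the clinical cutoff (e.g., >10 percent).
    return fraction * 100.0, fraction > positivity_cutoff
```

The computer applies the same thresholds to every one of the 500,000–1,000,000 cells, every time – exactly the consistency a human estimating by eye cannot deliver.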
Furthermore, immuno-oncology, one of the major advances in drug development in recent years, now requires the pathologist to deal with even more stains, look at tissue context, and apply more complex scoring schemes. The level of analysis required is becoming impossible for a pathologist using just a microscope. A computer, on the other hand, would complete many of these tasks with little difficulty. With the increasing adoption of digital pathology, which enables computers to analyze images of histology slides, it’s time to replace the microscope with more fitting tools.
And the most fitting tool for pathologists is a pathology AI system that uses machine learning for cell classification, where pathologists provide the tissue expertise (identifying different cell types and verifying proper classification), and computers tackle the computational tasks of counting cells, calculating objective measurements, and complex scoring.
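This division of labor can be sketched in a few lines. The stand-in classifier below is a trivial rule purely for illustration – a real system would use a model trained on pathologist-annotated cells – and all names are hypothetical:

```python
# Sketch of the division of labor described above: the computer proposes a
# label per cell, the pathologist verifies or corrects, and the computer
# handles counting and scoring. All names are illustrative.

from collections import Counter

def propose_label(cell_features):
    # Stand-in for a machine-learning classifier; a trivial rule here.
    return "tumor" if cell_features["size"] > 10 else "other"

def classify_slide(cells, pathologist_corrections=None):
    """Computer proposes labels; pathologist overrides where needed."""
    corrections = pathologist_corrections or {}
    labels = {}
    for cell_id, features in cells.items():
        labels[cell_id] = corrections.get(cell_id, propose_label(features))
    # Verified labels plus objective cell counts for downstream scoring.
    return labels, Counter(labels.values())
```

The pathologist only touches the cells the classifier got wrong; the tedious, error-prone counting is left to the machine.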
Rather than going “all in” with an automated AI system intended to replace humans altogether, using AI as an aid allows pathologists to provide high-performance, high-complexity tissue analysis. Automated AI systems will have a hard time pushing their performance to 95 or even 99 percent – the minimum needed to replace a pathology AI system in which humans and computers work together.
The key problem for automated AI systems in pathology is the variation between samples. No two examples of a disease look the same – even in similar patients under similar conditions. Human pathologists know that the same cell type can have different characteristics in different patients, to the point where one patient’s healthy cells can appear similar to another’s tumor cells. An AI system would have tremendous difficulty with this concept and require exponentially more training data to become familiar with all of the potential disease presentations, especially in exceptional cases that a human might pick out by applying “common sense” rather than hard-and-fast rules.
The best way, and in our opinion the only viable way, to create enough training data for such systems (at a reasonable price point) is to have the training data automatically generated as part of the standard clinical workflow. The perfect intermediate step between manual microscopy and automated AI is to provide pathologists with the right tools – that is, a pathology AI system that, as part of its normal workflow, generates an abundance of free training data (pathologist-verified cell classification for the whole slide for all slides) that can be used to train a fully automated system. As that system’s performance improves, it becomes obvious when it’s “ready” – for deployment and, eventually, for commercialization.
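The feedback loop described above can be sketched as follows, under the assumption that every pathologist-verified label is stored during routine work and that “readiness” is tracked as the model's agreement with the pathologist (the function and field names are hypothetical):

```python
# Sketch of the feedback loop described above: every pathologist-verified
# cell label becomes free training data, and the automated system's
# readiness is tracked as its running agreement with the pathologist.

def record_case(training_set, agreement_log, proposals, verified_labels):
    """Harvest one slide's verified labels and update the agreement rate."""
    for cell_id, label in verified_labels.items():
        training_set.append((cell_id, label))      # grows with every slide
        agreement_log.append(proposals[cell_id] == label)
    return sum(agreement_log) / len(agreement_log)  # running agreement rate
```

When the running agreement rate stays above a chosen bar, the case for letting the automated system stand on its own – first in validation, then in deployment – makes itself.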
Let’s provide pathologists with the right tools – by replacing the microscope, not the pathologists.