The Road to Augmented Pathology
Paving the way for virtual and augmented reality in medicine
As a cellular pathologist, Laszlo began his career at a time when the microscope relied on a table lamp as its external light source, reflected by a mirror on its base, to illuminate histology and cytology slides. Thanks to relentless technological development, the field has evolved considerably in the space of just a single generation – and new technologies continue to make their mark in pathology today.
But what does the future hold for this precise specialty that uncovers the secrets hidden within human tissue? Pathology’s ultimate aim is to use imaging techniques to link what we see in tissue with what we know about disease, so that we can diagnose our patients and recommend the most effective treatments.
To do this successfully, information is key. We interrogate tissue to reveal any underlying information that contributes to identifying the disease or neoplasm. But there is one major constraint: we can only examine tissue taken from a single point in the course of a disease, removed from its environment, and frozen in time. The resulting sample represents only a minuscule portion of an organ or tissue. A frozen section is a tissue wafer no more than 2.5 cm in diameter and approximately 3–4 μm thick – roughly half the diameter of a red blood cell. It’s no wonder that, with this alone to go on, we can’t always find the answers we seek.
Morphology is based on tissue structure, which we visualize through our toolkit of processing methods and microscopy. Although the microscope is increasingly being replaced by digital scanning and display technologies, the basic principle of information visualization – the hematoxylin and eosin-stained section – remains the same. In our view, the future of pathology lies in viewing tissue without the need for processing, shortening the time to diagnosis. That, however, would require a huge leap for the field – into 3D visualization of tissue through augmented or mixed reality. Imagine if we could view tissues inside the patient, using in vivo microscopy to link cellular features to radiological images.
Artificial intelligence (AI) has an integral role to play in this modernization of pathology visualization. As AI matures, it takes on increasingly difficult tasks, often with remarkable precision and accuracy. But AI only does what it is programmed to do – and, currently, it carries out repetitive tasks deemed “boring” by humans, sifts through vast amounts of data to find common features and denominators, and draws conclusions based on statistical probability. Its efficacy for these tasks is undisputed – and we are now entering a new era of augmented intelligence (AuI), in which AI will work alongside clinicians in the decision-making process. However, AI still struggles with edge cases. It lacks the ability to choose its own research questions or to select the areas of a 3D object that deserve further study.
And that’s exactly where virtual reality (VR) and augmented reality (AR) will shine. Rendering whole-slide images in 3D to generate a model that can be the size of a room will completely change the game. And it is not only in the visualization of morphology that VR and AR will take over. Many AR companies dream of a world in which we abandon our desktop computers and interact spatially with floating holograms – a reality that is truly on the horizon. Many of the projects with which we are currently involved aim to develop tools for situations ripe for AR exploitation. For example, in the frozen section room, an AR headset could record the user’s actions, generate useful information from various data sources, take photos for a 3D render, transcribe dictation, and make video calls to discuss where and how to cut the sample (3D-rendered for the caller’s convenience). It could also handle patient safety tasks and even the volumetric measurement of biological samples.
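Of the capabilities listed above, volumetric measurement is already computationally routine once a closed 3D surface mesh of the specimen has been captured. Purely as an illustration – not part of any project described here, and assuming a consistently wound, triangulated mesh obtained from, say, the headset’s depth sensors or photogrammetry – the enclosed volume can be estimated with the signed-tetrahedron (divergence theorem) method:

```python
import numpy as np

def mesh_volume(vertices, faces):
    """Estimate the volume enclosed by a closed, triangulated surface mesh.

    Each triangle, together with the origin, defines a tetrahedron; the
    signed volumes of these tetrahedra sum to the enclosed volume.

    vertices: (N, 3) array of xyz coordinates (units set by the capture pipeline)
    faces:    (M, 3) array of vertex indices, consistently wound
    """
    v0 = vertices[faces[:, 0]]
    v1 = vertices[faces[:, 1]]
    v2 = vertices[faces[:, 2]]
    signed = np.einsum("ij,ij->i", v0, np.cross(v1, v2)) / 6.0
    return abs(signed.sum())

# Toy check: a unit cube built from 12 triangles has volume 1.0.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                  [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
faces = np.array([[0, 2, 1], [0, 3, 2],   # bottom
                  [4, 5, 6], [4, 6, 7],   # top
                  [0, 1, 5], [0, 5, 4],   # front
                  [1, 2, 6], [1, 6, 5],   # right
                  [2, 3, 7], [2, 7, 6],   # back
                  [3, 0, 4], [3, 4, 7]])  # left
print(mesh_volume(verts, faces))  # prints ~1.0
```

In a real cut-up workflow the mesh would come from the device’s 3D capture rather than being hand-built, but the calculation itself is this simple – the harder problems are accurate surface reconstruction and calibration of the capture scale.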
As VR headsets get lighter and computers get smaller and more powerful, we are entering another phase of human-computer interaction. AR and VR will take their place as the new canvas upon which diagnosticians work. They will be especially transformative for training the laboratory professionals of the future, throwing teaching into a whole different dimension. Imagine a scenario in which students can follow a holographic model with their phones and simultaneously capture the visuals, transcribe the audio, and record volumetric measurements through 3D modeling. Clinicians could describe and explore every aspect of the sample and even manipulate it manually.
One of the projects we have worked on involves the world’s first AR patient controlled by AI and a dialogue management system. It takes a holistic approach to teaching; you can speak to the patient and then work down to the microscopic level and watch as the layers of skin under a lesion expand out, like a computer-aided design model. You can even touch each element to receive information about it.
A bold new frontier – in which 3D data is beamed directly into a convenient headset that you can pick up as easily as your phone – is right around the corner. VR and AR open the door to personalized, easily maintainable workspaces with context-sensitive tools at your fingertips. You will no longer be tied to a desk; instead, you will be able to work standing up or on the move. In an empty (virtual) office space, you will be able to design and use tools as you see fit within an almost unlimited, entirely customizable environment. Imagine being able to replay a full 3D holographic render of a cut-up sequence, complete with volumetric measurements.
Eventually, we will even be able to record a patient’s skin lesions in a 3D render that can be played back or referred to in the future. VR and AR will enable a radical shift in the delivery of frontline medicine – and it’s going to produce a brave new (augmented) world.
Consultant Histopathologist at the Norfolk and Norwich University Hospital, Norwich, UK.
Lead Technician of the School of Education, Faculty of Social Science at the University of Sheffield, UK.