As molecular diagnostics continues to transform oncology care, significant barriers to its widespread adoption remain. From the challenge of harmonizing data across hundreds of laboratories, to the economic realities of funding precision medicine, the field stands at a critical juncture. Meanwhile, emerging technologies like liquid biopsy, AI-powered interpretation platforms, and digital twins promise to democratize access to molecular testing and accelerate clinical decision-making.
We spoke with Jurgi Camblong, founder and CEO of SOPHiA GENETICS, about the practical realities of scaling precision medicine. From the need for value-based medicine to the industrialization of data processing and the promise of digital twin technology, his views challenge conventional assumptions while offering a roadmap for accessible precision medicine for every patient.
What do you think are the most pressing needs in molecular diagnostics, and how is the diagnostics industry addressing them?
It's important to recognize that this field is still quite young. We only began routinely sequencing DNA and using next-generation sequencing (NGS) for tissue testing around 2014 or 2015. While there's strong demand for molecular diagnostics today, several key areas need strengthening to make molecular information even more valuable for patient diagnosis and treatment decisions.
Harmonization is the first critical need. The data being produced across different laboratories contain significant biases because workflows vary considerably from site to site. One of our most important missions is ensuring that every patient receives the same quality of testing outcome, regardless of where their sample is collected and analyzed.
Data harmonization must occur at multiple levels: sample preparation, data production, and especially in the computational output of molecular tests. We need to build greater trust in the fact that NGS-based molecular testing is not only powerful but also highly accurate.
The second pressing need is education. COVID-19 taught the public about DNA and RNA, but when I speak with pathologists and oncologists, I find that not everyone fully understands the value of molecular testing or how it will continue to evolve. We need to identify and support champions within anatomic pathology laboratories – pioneers in molecular biology who can educate oncologists. Similarly, we need molecular oncology champions among clinicians.
This is crucial because physicians already have enormous amounts to learn during their training, yet molecular testing in oncology has become a critically important specialty. It's the definitive way to understand the drivers of cancer. Demystifying molecular diagnostics for all stakeholders is essential.
Third, we need better budget allocation. Funding must be appropriately distributed to precision medicine and molecular testing laboratories within pathology departments. Testing for biomarkers is the most effective way to ensure patients receive the right drugs and avoid expensive, toxic therapies that won't benefit them. This requires thinking about transitioning toward value-based medicine at a global level.
One reasonable approach might be allocating a percentage of drug costs to fund precision oncology laboratories. This would give pathology labs the resources they need to operate properly, maintain adequate staffing for interpretation, and scale their capabilities appropriately.
What needs to change for liquid biopsies to be more widely adopted in oncology diagnostics?
First, let me clarify that liquid biopsy won't replace tissue diagnostics – it's complementary. Many of the same principles we discussed earlier apply here, particularly harmonization. Ensuring accurate and consistent outcomes in data production and computation is absolutely critical for liquid biopsy, and this is exceptionally challenging because the quantity of circulating tumor material in blood is minimal.
The key to the success of SOPHiA GENETICS’ liquid biopsy test has been our algorithmic capability to harmonize data after production. We've demonstrated to laboratories that the concordance of identical samples tested across different hospitals is truly robust. This decentralized approach to liquid biopsy had not been achieved before.
Pharmaceutical support is the second critical factor. The democratization of our liquid biopsy test was a joint effort between Memorial Sloan Kettering Hospital and AstraZeneca, which supported multiple hospitals in adopting the technology. This model of collaboration is essential for broader adoption.
Education remains paramount. Not everyone understands liquid biopsy – I recently had an investor ask whether there's imaging involved in liquid biopsy. We need to continually educate stakeholders that this is a complement to tissue diagnostics with specific applications: when tissue biopsy isn't feasible due to a patient's medical condition; when cancer progresses slowly, as in prostate cancer, where analyzing circulating tumor DNA years after surgery provides more current information than old tissue; or in regions with insufficient pathology resources, where blood-based testing may be more practical than surgical biopsy.
How might liquid biopsy transform disease monitoring?
The clinical applications are compelling. For monitoring, liquid biopsy is particularly valuable in metastatic patients. It's like playing poker against cancer – you can see the cancer's hand and anticipate how it will evolve. By testing every three to six months, we can detect emerging mutations before they become clinically apparent, staying ahead of the curve in treatment decisions.
For minimal residual disease (MRD) testing in early-stage, non-metastatic cases, the picture is still evolving. The most advanced evidence comes from colon cancer, where the colon sheds substantial ctDNA into the blood. We know that liquid biopsy has strong prognostic value – ctDNA-negative patients have better prognoses, while ctDNA-positive patients fare worse. The critical question is actionability: can we use this information to escalate treatment in positive cases or de-escalate in negative cases – perhaps with less aggressive surgery or reduced adjuvant therapy?
This requires a collective effort. We need to test patients for ctDNA content, measure MRD status, and systematically record clinical decisions and outcomes. By learning from what physicians do and how they modify treatment based on this information, we can refine our approach for the next generation of patients. Data collection and sharing are essential to answering these questions definitively.
What changes are needed to improve patient access to precision medicine?
This is a critical question, and it's why I've emphasized the pharmaceutical industry's role. At SOPHiA GENETICS, we work to optimize our technologies to make them affordable. However, there are limits to cost reduction. You still need sufficient statistical power in your data and robust algorithms, so you can't compress costs indefinitely.
This is where transitioning to value-based medicine becomes crucial. Perhaps we should reimburse drugs only when they're proven to work for a specific patient and withhold reimbursement when they don't. This could make healthcare systems more sustainable. When people say molecular testing is expensive, I disagree fundamentally. Think of it like automotive diagnostics: diagnosing the problem isn't expensive – fixing the car is what costs money. In precision oncology, testing isn't expensive, but treatment is. Testing might cost $800 to $1,000, but compare that to immunotherapy at $50,000 per patient per year, or targeted therapies with similar costs. We need to change the narrative, and pathologists need to lead this effort.
Beyond direct costs, there's a broader societal impact. Patients with cancer who don't receive molecular testing and targeted treatment are more likely to experience drug toxicity. That means they can't work or support their families – they cannot contribute to society. That's why we must find sustainable ways to fund diagnostic testing.
Governments need courage to have these discussions with pharmaceutical companies in the spirit of what value-based medicine was originally designed to achieve. Fortunately, pharma companies understand that cancer is becoming increasingly sophisticated, that patient stratification is essential, and that anticipating disease evolution is critical for positioning their drugs and developing new ones. They recognize the importance of real-time, real-world evidence generation.
Our partnership with AstraZeneca exemplifies this. Pharmaceutical companies want good outcomes for patients, and they've accepted their role in this ecosystem. Now governments need to engage in discussions about how the pharmaceutical industry can share responsibility for funding the testing systems that ultimately benefit everyone.
How does artificial intelligence (AI) augment molecular oncology?
SOPHiA GENETICS has used AI since the beginning. To be specific, we don't yet use transformers or generative AI; we employ deep learning, machine learning, statistical inference, and pattern recognition.
A concrete example is our AI agent trained to detect copy number variations. We've been supporting pathologists in detecting these variations in tissue for years. What's remarkable is that as pathologists discover more copy number variations, they're finding strong correlations between specific variations and cancer prognosis.
This exemplifies why AI is critical for finding signals in noisy, unharmonized data. By identifying these signals, we build knowledge that's essential for diagnosis, treatment decisions, and prognostic assessment. One public example of AI capabilities is our work with AstraZeneca on partial chromosomal instability in prostate cancer.
AI's primary role in our field is ensuring mutations are properly identified with high sensitivity and specificity – directly addressing the harmonization challenges I mentioned earlier. When your AI models have been trained on over 2.3 million genomic profiles, as ours have, they become substantially more advanced. The systems are never perfect and continue improving, but the depth of training data makes a significant difference.
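To make the copy-number idea concrete for readers less familiar with NGS analysis: one common (and deliberately simplified) approach compares a tumor sample's read depth against a normal baseline, bin by bin, and flags regions whose log2 coverage ratio suggests a gain or loss. This is a minimal illustrative sketch, not SOPHiA GENETICS' actual algorithm; the thresholds and the median normalization are assumptions chosen for clarity.

```python
import math
from statistics import median

def call_cnvs(sample_cov, normal_cov, gain=0.58, loss=-1.0):
    """Flag copy-number gains/losses from per-bin read depth.

    sample_cov / normal_cov: read counts per genomic bin for the
    tumor sample and a normal baseline (e.g. a panel of normals).
    Default thresholds correspond roughly to 3 copies
    (log2 1.5 ~ 0.58) and 1 copy (log2 0.5 = -1.0) in a pure sample;
    real pipelines must also model tumor purity and noise.
    """
    # Median-normalize each track so overall library size drops out.
    s_med, n_med = median(sample_cov), median(normal_cov)
    calls = []
    for i, (s, n) in enumerate(zip(sample_cov, normal_cov)):
        log2r = math.log2((s / s_med) / (n / n_med))
        if log2r >= gain:
            calls.append((i, "gain", round(log2r, 2)))
        elif log2r <= loss:
            calls.append((i, "loss", round(log2r, 2)))
    return calls

# A bin with triple the expected depth is a gain; a bin at 40% is a loss.
calls = call_cnvs([100, 100, 300, 100, 40, 100], [100] * 6)
```

Production-grade callers layer statistical segmentation and machine-learned noise models on top of this ratio signal, which is where the training data Camblong describes becomes decisive.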
To what extent will AI-augmented diagnostics accelerate the adoption of precision medicine?
For adoption, this means practical accessibility. A laboratory can purchase a next-generation DNA sequencer, work with the manufacturer to understand how to produce data with satisfactory statistical power, and then use AI tools confidently to assist with patient diagnosis. This removes significant technical barriers.
Another AI application is simplifying interpretation. Once signals are identified, AI helps associate them with publicly available information and other patient cases, allowing pathologists to quickly focus on actionable mutations. Machine learning is already being leveraged extensively on our platform. For approximately 80 percent of cases using our platform, pathologists should be able to make decisions within three minutes. For the remaining 20 percent of more complex cases, we provide advanced AI-powered search functionalities that typically require no more than ten minutes.
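The triage step described above — surfacing actionable mutations so a pathologist can decide in minutes — can be illustrated with a small sketch. This is a hypothetical example, not SOPHiA GENETICS' platform logic: the gene-to-tier mapping and the allele-fraction cutoff are invented for illustration.

```python
def triage_variants(variants, actionable, min_vaf=0.05):
    """Order variant calls so the actionable ones surface first.

    variants: list of dicts with 'gene', 'hgvs', and 'vaf'
              (variant allele fraction).
    actionable: dict mapping gene -> evidence tier (1 = strongest);
                a purely hypothetical lookup for this sketch.
    Variants below the allele-fraction cutoff are dropped as likely noise.
    """
    kept = [v for v in variants if v["vaf"] >= min_vaf]
    # Actionable genes first (lowest tier wins), then by descending VAF.
    return sorted(kept, key=lambda v: (actionable.get(v["gene"], 99), -v["vaf"]))

calls = [
    {"gene": "TP53", "hgvs": "p.R175H", "vaf": 0.30},
    {"gene": "EGFR", "hgvs": "p.L858R", "vaf": 0.12},
    {"gene": "KRAS", "hgvs": "p.G12C", "vaf": 0.04},  # filtered: below cutoff
]
ranked = triage_variants(calls, actionable={"EGFR": 1, "KRAS": 1})
```

In practice the "actionable" lookup is the hard part: it draws on curated knowledge bases and other patient cases, which is exactly where AI-assisted association adds value.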
This efficiency is transformative. By reducing interpretation time from what might take hours without AI support to minutes, we're making precision medicine practical at scale. That's how AI truly accelerates adoption – not through futuristic promises, but through concrete improvements in accuracy, consistency, and workflow efficiency.
Given the increasingly complex data outputs from molecular pathology, do we need more clinical bioinformaticians in the pathology workforce?
For routine clinical diagnostics, AI and bioinformatics are now embedded in platforms precisely because of scalability requirements. Bioinformaticians remain essential for complex cases and clinical research, but not for routine sample processing.
What we really need is industrialization of data production: more automation, more sequencers, and more streamlined workflows for sample enrichment. We need well-trained technicians on the production side. For interpretation, we need more molecular biologists, because volumes are growing, and the work is becoming increasingly sophisticated.
This perspective comes from serving 800 hospitals across 72 countries. I try to influence policymakers to invest in education – training technicians and professionals capable of producing interpretation reports. Yes, invest in bioinformatics training, but for translational requirements, not routine clinical samples. Otherwise, the field will never achieve the necessary scale.
Consider the numbers: we compute two petabytes of data annually. At Institut Gustave Roussy in France, one of Europe's most sophisticated cancer centers, we process 35,000 patient cases per year. All molecular data computation runs on our platform – the same applies to Guy's Hospital for certain applications, Memorial Sloan Kettering, and many others. This requires industrial solutions.
But human expertise remains vital. These industrial solutions are designed to bring information to humans more rapidly and accurately, allowing them to apply their expertise. We need technology and human intelligence working in tandem. The platform handles the computational heavy lifting, freeing molecular biologists to focus on interpretation and decision-making where their expertise truly matters. That's the model that will scale precision medicine effectively.
Which emerging trends or innovations in precision medicine are you most excited about?
Digital twins – I'm extremely excited about this. We should be able to show clinicians that a similar patient was treated in a certain way and responded to that treatment in a particular manner. We've just announced that we're launching digital twins for lung cancer. This is possible because we have over 6,300 patient cases with extensive, sophisticated data embedded in our platform through our partnership with Memorial Sloan Kettering. Through this feature, we can now offer anyone in the world the ability to see how other patients with the same characteristics have responded to specific treatments.
Looking ahead, transcriptomics will become crucial for cancer subtyping in pathology, and I believe it will be far more scalable than digital pathology. The challenge with digital pathology is that when you multiplex multiple biomarkers on an image – say 30 different markers – it becomes very difficult to distinguish the signals. Full transcriptome analysis combined with validated algorithms for cancer subtyping is inherently more scalable and interpretable.
MRD testing will also expand significantly – with the caveats I mentioned earlier. We still need to demonstrate clinical utility definitively. But in five years, I expect MRD testing to be broadly adopted around the world as we build the evidence base for actionability.
These innovations share a common theme: they leverage comprehensive data, sophisticated algorithms, and collaborative networks to provide clinicians with actionable insights at the point of care. That's the future of precision medicine – not replacing human judgment, but augmenting it with the collective experience of thousands of patients and the analytical power to find meaningful patterns in complex biological data.
