(AI) Trial and (Diagnostic) Error
How can new healthcare technology help address the number one patient safety issue?
Thomas Westerling-Bui | Longer Read
At a Glance
- Diagnostic errors were recently named the number one issue in the 2019 ECRI Patient Safety Concerns report
- Artificial intelligence (AI) can help to detect and mitigate these errors
- Using AI for image analysis frees up time for pathologists to carry out the tasks that require the most skill
- Broader adoption will help produce real-world evidence supporting the role of AI in improving patient safety
The Emergency Care Research Institute (ECRI) recently published its 2019 patient safety report outlining the top 10 concerns affecting patients across the continuum of care. Drawing on its patient safety organization database, root cause analyses, and votes from a panel of experts, the ECRI compiles the list to help healthcare organizations identify and respond to emerging patient safety threats.
For the second year running, the 2019 report names diagnostic error as the number one patient safety issue, stating, “When diagnoses and test results are not properly communicated or followed up, the potential exists to cause serious patient harm or death.” More specifically, the briefing scrutinizes the management of test results using electronic health records. “Providers have begun relying on the electronic health record (EHR) to help with clinical decision support, to track test results, and to flag issues. However, the EHR is only part of the solution,” the report says.