Sector News

Why Google, Amazon, and Nvidia are all building AI notetakers for doctors

November 15, 2020
Borderless Future

Automated medical transcription that’s actually accurate could save doctors a huge amount of time, and the tech giants are getting in on the action.

For doctors, taking notes and inputting them into electronic medical records is so cumbersome that they often have to use human medical scribes to do it for them. That’s changing as more hospital systems turn to artificial intelligence-based transcription tools.

However, some doctors feel the tools available today are just not accurate enough. "If there were a really smart voice transcription service that was 99% accurate, I would definitely use it," says Bon Ku, an emergency room doctor at Thomas Jefferson University Hospital and director of the university's Health Design Lab. "A lot of times, I feel like I'm a data-entry clerk."

For the last several years, big tech companies have been jockeying to be the one that finally delivers the kinds of tools doctors have been craving.

This week, Google launched open source machine learning software to help doctors make sense of patient medical records. The platform is composed of two tools. The first, an API for healthcare-related natural language processing, scans medical documents for key information about a patient's journey, puts it into a standard format, and summarizes it for the doctor. It can pull from multiple sources of information, such as medical records and transcribed doctors' notes. The goal is to give doctors an easy way to review a patient's past care. The second, called AutoML Entity Extraction for Healthcare, is a low-code tool kit that helps doctors pull specific data from a patient's record, like information about a genetic mutation. Both tools will be available for free until December 10, 2020 for doctors, insurers, and biomedical companies.
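For readers curious what using such an API looks like in practice, here is a minimal sketch of the kind of request a healthcare natural-language endpoint like Google's accepts. The project ID, location, and clinical note below are placeholders, and a real call would also require OAuth credentials and an HTTP client; this only assembles the request, it does not send it.

```python
import json

# Placeholder identifiers -- substitute your own Google Cloud project/region.
PROJECT = "my-project"
LOCATION = "us-central1"

# REST endpoint for the healthcare NLP entity-analysis method (sketch).
endpoint = (
    "https://healthcare.googleapis.com/v1"
    f"/projects/{PROJECT}/locations/{LOCATION}/services/nlp:analyzeEntities"
)

# The API accepts free-form clinical text, e.g. a transcribed doctor's note.
request_body = {
    "documentContent": (
        "Patient reports chest pain for two days. "
        "Prescribed aspirin 81 mg daily."
    )
}

payload = json.dumps(request_body)
print(endpoint)
print(payload)
```

The response (not shown) would list the medical entities the service recognized in the note, such as the symptom, the medication, and its dosage, each mapped to standard clinical vocabularies.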

Much of Big Tech’s enthusiasm for medicine is focused on building a better way for doctors to record their interactions with patients without having to type into a computer. Amazon, Microsoft, and Google have all created software to this effect and are increasingly creating tools for healthcare settings, likely in a quest for new sources of recurring revenue.

Even Nvidia, which has traditionally focused more on medical imaging technology, has started offering medical transcription. Earlier this year, Nvidia launched a service called BioMegatron, which is built to recognize conversational speech. The model is trained on a data set of over six billion medical terms and is 92% accurate. There is also a host of smaller companies, like Nuance (maker of Dragon), MModal, Suki AI, and Saykara, providing transcription for doctors.

AI-powered transcription is the latest push toward automating medical processes. Much of doctors' work is already electronic: many use a computer system to pull up patient data. A 2013 paper found that emergency room doctors made 4,000 clicks over the course of a busy 10-hour shift. Doctors who use the Epic electronic health record system also have a feature called "dot phrases" to make writing notes and pulling patient information faster (Epic also has an AI transcription module). The problem some doctors have with dot phrases is that they insert quick, pre-written entries about an ailment or symptom into patient records. Such shorthand is fine for medical billing, but it leaves patient records overly generalized. As a result, doctors reviewing a patient's history often don't get the context surrounding the patient's last visit.

“Most of patient records are garbage—they’re full of templates,” says Ku. “Ninety percent of our diagnoses come from the interview; it doesn’t come from diagnostic imaging or lab tests. It’s about me being able to get the story from my patient—but that becomes hindered because there’s this insane pressure to enter data into a computer.”

Doctors also spend an enormous amount of time entering data into the electronic health record. Ku says doctors speak of "pajama time," the hours they spend at home recording patient information into the system. This is why doctors would love a notetaking experience more akin to talking to Alexa. A system that extracts patient data from a conversation, or the ability to order tests by voice, would be a game changer, says Ku; he'd be able to spend far more time with patients. But the technology first needs to become accurate enough that doctors don't spend that saved time correcting what the AI got wrong.

“There has to be some safety mechanism,” says Ku.

By Ruth Reader

