How an AI-powered clinical notes API could boost telehealth
Bill Siwicki
Healthcare providers spend many hours every week writing clinical notes to document patient visits.
Kwindla Hultman Kramer, cofounder and CEO of Daily, a developer platform built on WebRTC, the open-source technology for real-time video and audio, has worked with healthcare professionals and telehealth application engineers to develop a capability aimed at reducing documentation time by 80% or more.
Daily’s new APIs enable seamless integration of SOAP notes and other clinical documentation workflows into virtual care platforms. Daily’s WebRTC developer platform offers a suite of HIPAA-compliant APIs and supports Safari and iOS for one-click, no-download telehealth video calls.
We spoke with Kramer to discuss AI's role in telehealth, documentation during virtual care visits, and the new documentation API.
Q. Why do you think telemedicine is fertile ground for the application of artificial intelligence?
A. From a technology point of view, an important thing about telehealth interactions is that all audio is already being captured digitally, ready for transcription and summarization. This makes telemedicine a good starting point for adding new AI tools into healthcare workflows.
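To make that first step concrete, here is a minimal sketch of transcribing a recorded visit, assuming the audio has already been exported to a file and using OpenAI’s Whisper transcription API as a stand-in service. The file name and model choice are illustrative assumptions; the interview does not specify which transcription stack Daily’s platform uses.

```python
# Minimal sketch: transcribe a recorded telehealth visit into plain text.
# The file path and model are placeholders, not Daily's actual pipeline.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A real integration would pull the recording from wherever the
# telehealth platform stores call audio.
with open("telehealth_visit.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

print(transcript.text)  # unstructured text, ready for summarization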
The category of AI tools getting the most attention and adoption right now is large language models, or LLMs. ChatGPT, for example, is a user interface on top of OpenAI’s large language models, GPT-3.5 and GPT-4.
The newest, state-of-the-art LLMs are quite good at taking unstructured text as input and producing structured data as output. This is something computers haven’t been able to do before, and this new capability is a big reason technologists are so excited about these new tools.
The transcript of a telehealth visit is a good example of "unstructured data" that has a lot of valuable information embedded in it. Until very recently, the only way to make sense of a telemedicine transcript was for a person to read it and then pull out some of the information for input into a medical records system.
Now, large language models can do things like automatically create clinical documentation in specific formats.
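As a rough illustration of that capability, the sketch below asks an LLM to draft a SOAP note from a visit transcript. The prompt wording, model choice and helper function are illustrative assumptions, not Daily’s actual implementation.

```python
# Minimal sketch: turn an unstructured visit transcript into a structured
# SOAP note draft for clinician review. Prompt and model are illustrative.
from openai import OpenAI

client = OpenAI()

SOAP_PROMPT = (
    "You are a clinical documentation assistant. From the telehealth "
    "visit transcript below, draft a SOAP note with four sections: "
    "Subjective, Objective, Assessment, Plan. Only include information "
    "stated in the transcript."
)

def draft_soap_note(transcript_text: str) -> str:
    """Return a first-draft SOAP note for the caregiver to approve or edit."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": SOAP_PROMPT},
            {"role": "user", "content": transcript_text},
        ],
        temperature=0,  # keep the draft close to what was actually said
    )
    return response.choices[0].message.content
```

In practice, the draft would be returned to the caregiver for approval or editing before anything is saved to the medical record.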
Telemedicine usage grew rapidly during the COVID-19 pandemic, which accelerated regulatory, billing and technology changes that had started already. Because telemedicine is now an established component of healthcare delivery, there’s an opportunity to think about ways that adding AI tools to telemedicine – thoughtfully and in a non-disruptive way – can help expand access to care, save clinicians significant amounts of time, and improve patient outcomes.
Q. What can AI do for caregivers' SOAP and other clinical notes that the caregivers themselves cannot do?
A. In general, I don’t think we should be trying to find things for AI to do that caregivers can’t do themselves. I think we should be trying to find things for AI to do that save time and allow caregivers to focus on working with patients.
Nobody is drawn to healthcare as a career because they want to spend time writing clinical documentation. SOAP notes and the like are really important. But doing SOAP notes isn’t particularly creative work and doesn’t leverage a caregiver’s unique expertise and human touch very much, particularly in comparison to interacting with patients.
An AI tool that can automatically produce a first draft of a SOAP note – for the caregiver’s approval or editing – has the potential to save an enormous amount of time. Potentially, ten hours or more every week. We want this note to be as similar as possible to what the caregiver would produce themselves, were they writing it up.
Over the long term, there may be value in exploring what new AI tools can do differently or better than humans. I don’t think we should dismiss that direction of research out of hand.
But right now, we’re starting to be pretty sure that today’s new AI tools can do a new class of relatively rote things that take up a lot of time, with perhaps minimal loss of quality. If that does turn out to be true, that means caregivers can spend more time with their patients, or more time with their families, or both. That’s a big deal.
Q. Your company just released the AI-Powered Clinical Notes API for Telehealth. Who is this API aimed at and how do they integrate it with virtual care technology? Further, what are the outcomes the API is designed to produce?
A. Our customers are software developers who build telehealth applications. We specialize in the video and audio parts of software development. So, our APIs – application programming interfaces – are designed to be used as one component of a full product or service.
As we’ve grown, we’ve expanded our feature set to include things that are adjacent to video and audio, such as transcription, analytics and now AI tools.
Our goal with everything we do is to enable the creation of new software that has value in the world. We don’t work only in healthcare, but healthcare is a particularly important part of what we do because the value of helping people get the best possible medical care is so clear and so motivating.
In the case of the new AI-powered Clinical Notes APIs that we just released, we started working on this because our customers were telling us that their customers – healthcare providers – typically spend 10 or more hours a week writing clinical documentation and asking us if we had any ideas about how to reduce that burden.
We did have some ideas, so we worked closely with several of our customers to test, evaluate and iterate. It’s obviously important to get even a first public version of this kind of thing right, in several ways: data privacy, quality of output, reliability and dependability of the service.
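For a sense of what integration might look like from the telehealth developer’s side, here is a hypothetical sketch. The endpoint URL, request fields and response shape are placeholders invented for illustration only; they are not Daily’s published API, and a real integration would follow Daily’s documentation.

```python
# Hypothetical integration sketch from the application developer's side.
# Endpoint, payload and response fields are placeholders for illustration.
import os
import requests

API_KEY = os.environ["DAILY_API_KEY"]  # placeholder environment variable

def request_clinical_note(session_id: str) -> dict:
    """Ask a (hypothetical) clinical-notes service to summarize a finished
    telehealth session into a SOAP-style draft for clinician review."""
    resp = requests.post(
        "https://api.example.com/v1/clinical-notes",  # placeholder URL
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"session_id": session_id, "format": "soap"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"subjective": ..., "objective": ..., ...}
```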
Q. How else do you see artificial intelligence being applied in telemedicine in the future?
A. Some of the things we’re seeing that seem likely to have an impact in the near term include making it easier for both patients and caregivers to prepare for visits, creating more detailed analyses of care outcomes, and various approaches to real-time "copilots" that provide access to information during a session.
Farther out and more experimentally, I’m interested in the progress being made in "multi-modal" AI models. These are large models that are trained on a huge amount of data that includes text, images and audio, and sometimes data from temperature, inertial and other specialized sensors.
There is potential to use all the data that our digital devices collect about us to help clinicians diagnose and treat patients. There are obvious privacy issues, but in general I think we do a good job regulating access to and privacy of healthcare data (much more so than we do for other kinds of data).
We can think of this, perhaps, as an expansion of telemedicine to include passive and "always on" health monitoring and diagnostics.
Follow Bill's HIT coverage on LinkedIn: Bill Siwicki
Email him: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.