ChatGPT is an artificial intelligence (AI) language model that responds to user queries in a conversational way, generating human-like responses to provide answers, advice, entertainment and, somewhat controversially, medical information.
Streamlining administrative tasks, helping HCPs answer patient questions, providing clinical decision support, and minimizing medical errors are just some of the tasks that supporters say ChatGPT can do now, or will be able to do in the future.
Skeptics caution that ChatGPT and other generative AI tools like it are an evolving technology that can potentially produce inaccurate information, and could be harmful, or even deadly, to patients if relied upon to diagnose or suggest treatment methods.
The promise of AI’s revolutionary role in healthcare has piqued the interest of doctors and other HCPs over the past decade. And while its capabilities have expanded rapidly in recent years, AI has so far done little to cut medical costs or improve health outcomes at scale, according to some medical experts.
That may be about to change with the arrival of new tech players like OpenAI’s ChatGPT platform, which is creating a buzz, generating headlines, and convincing some medical experts that AI will soon make waves in healthcare after all.
How ChatGPT may help clinicians
If you follow the latest in technology, you've probably already heard about ChatGPT’s “world-changing” potential—and there's a good chance you've tried interacting with the chatbot yourself.
So, how can it change the world of healthcare for the better?
Here are three ways in which ChatGPT could make HCPs' jobs easier, now or down the road.
1. Streamline administrative tasks
Let’s face it: You didn’t become a doctor to spend tedious hours completing paperwork each week. Yet, this is the case for many—and one of the primary drivers of physician burnout.
According to an article in Op-Med, a Doximity poll of approximately 2,000 physicians found that 46% of doctors believe cutting down on administrative tasks would most effectively mitigate burnout.
Streamlining administrative tasks is one potential benefit of ChatGPT, said John D. Halamka, MD, MS, president of the Mayo Clinic Platform, in an AMA Update video discussing the benefits and shortcomings of ChatGPT in medicine.
"If I use ChatGPT for diagnosis of a complex medical condition, high potential for harm," he said. "So I think in the short term, you'll see it used for administrative purposes, for generating text that humans then edit to correct the facts. And the result is reduction of human burden. And if we look at … the great resignation and retirement of our clinicians, burden reduction is actually a huge win."
2. Help prevent medical errors
A study published by The New England Journal of Medicine found that among 2,809 hospital admissions, at least one adverse event occurred in 23.6%. Moreover, researchers deemed 22.7% of all 978 adverse events preventable.
This is a problem that video-enabled AI may someday help solve, at least in part.
“Next generations of ChatGPT with video capability will be able to observe doctors and nurses, compare their actions to evidence-based guidelines and warn clinicians when they’re about to commit an error,” wrote Robert Pearl, MD, in a Forbes.com article.
This technology, according to Dr. Pearl, could not only prevent medical errors but could also drastically reduce hospital-acquired infections, pneumonias, and pressure ulcers among patients.
"In the same way the iPhone became an essential part of our lives in what seemed like no time, ChatGPT (or whatever generative AI tool leads the way) will alter medical practice in previously unimaginable ways," he predicted.
3. Offer 24-hour-a-day monitoring
One of the most convenient aspects of generative AI, wrote Dr. Pearl, is that, unlike humans, it can be “on” around the clock—a strength that could change the game for the 40% of Americans living with chronic illnesses.
“What these patients need is continuous daily monitoring and care,” Dr. Pearl wrote.
ChatGPT may someday be a contender to offer 24/7 monitoring and medical expertise in the absence of a human doctor.
Dr. Pearl believes that this could help to prevent the development of chronic diseases like heart disease, hypertension, and diabetes among patients, and to minimize their complications.
On top of that, ChatGPT may also provide patients with individualized, daily health updates and reminders through the use of wearable devices and supportive consumer technologies, which would facilitate at-home care.
Know the limits
As ChatGPT gains more momentum, it’s important for doctors to bear in mind what this generative AI won’t be able to do—yet.
For example, ChatGPT can’t be trusted to accurately diagnose patients with complex medical conditions.
This is, in part, due to its potential to produce inaccurate or false information relating to certain diagnoses, their respective causes, and how patients can treat them.
Additionally, ChatGPT may exhibit certain biases, according to a StatNews.com article.
“When a user asked ChatGPT to generate computer code to check if a person would be a good scientist based on their race and gender, the program defined a good scientist as being a white male,” the authors wrote.
Experts at OpenAI may be able to filter out more explicit forms of bias.
But addressing implicit biases—which may enable ChatGPT to perpetuate stigma and discrimination—may be harder to do.
All in all, ChatGPT is still rapidly developing. Today, its clinical strengths lie in completing administrative tasks. Researchers are still exploring its potential to provide 24-hour monitoring and address medical errors. Tomorrow, who knows?
What this means for you
ChatGPT is an up-and-coming generative AI that may help HCPs by automating administrative tasks, significantly reducing the time spent completing medical charts, progress notes, and other documents. One day, ChatGPT may be relied upon as a tool to mitigate medical errors or provide round-the-clock patient monitoring. In the meantime, HCPs can follow the debate about the use of AI tools in healthcare, while remembering that quality patient care still requires direct physician oversight.