ChatGPT: A pocket-sized mentor or a useless AI gadget? Doctors debate its role in medicine

By Claire Wolters
Published September 5, 2023

ChatGPT has entered modern medicine—and it’s not just sitting in the waiting room.

Physicians use it for multiple reasons, including research, patient education, synthesizing information, and assisting in disease diagnosis—and some say they work with it daily. Yet despite this frequent use, some also question its safety and ethics, particularly when it comes to privacy and accuracy.

To better understand ChatGPT’s place and popularity in modern medicine, MDLinx conducted an exploratory survey asking physicians about their thoughts in these areas.

Two sides to the debate

Wael Harb, MD, a medical oncologist at MemorialCare Cancer Institute, says that when it comes to being for or against artificial intelligence (AI) systems like ChatGPT, doctors seem to have formed “two camps.”

“We have people who are very excited about AI and healthcare because they see the potential of it, and we have a camp of people who are very skeptical or hesitant because of their concern about risks or safety issues for the patients,” Harb says.

In our survey, when asked whether they support ChatGPT use among medical trainees, about 76% of physicians answered in favor, about 21% voted against, and about 2% answered yes but with limitations.

Dr. Harb says both excitement and hesitation may be valid, suggesting that the safest and most productive place to meet is “somewhere in the middle.”

If trends are any indicator, ChatGPT and medical AI systems will continue to grow in prevalence over the next few years. And like it or not, Dr. Harb says, those who are paying attention may be best equipped to shape that future.

Privacy concerns

When it came to the challenges practitioners faced in using ChatGPT, the most commonly expressed roadblock was concern for their own privacy, such as fears of being logged or tracked.

Patient privacy issues, such as HIPAA violations or copyright problems, have also been mentioned as practitioner concerns with ChatGPT and other AI medical software. A majority of respondents said they thought ChatGPT use could increase their risk of legal entanglements or medical malpractice claims.

Concerns physicians have for themselves can differ from those they have for their patients. When asked what worries they had about using ChatGPT to treat a patient, survey responses highlighted accuracy and ethical risks.

In the survey conducted by MDLinx, 63% of respondents said they were concerned that what they type into ChatGPT is being logged or tracked.

Assessing accuracy

Skepticism over safety, ethics, and accuracy isn’t limited to ChatGPT, and it isn’t unwarranted. Released in 2022, the system is not FDA-regulated for medical use—though the FDA has created an action plan for regulating a wider realm of AI and ML products—and it’s not flawless, either.

One studied flaw with ChatGPT is the phenomenon of “hallucinations.” Much as in the medical sense of the word, hallucinating in ChatGPT refers to an artificial intelligence algorithm generating false data, fabricating information that isn’t actually there. This flaw, among other reasons, can make physicians hesitant to discuss how they use ChatGPT. In fact, 26% of respondents said the biggest challenge with ChatGPT is whether the information it returns is correct or possibly misleading.

Sometimes the system hallucinates to fill space, or because it has begun replicating a pattern rather than drawing on data. While this isn’t a major problem in some industries, false information in medicine can pose acute dangers to patient health or to doctors’ reputations. Hallucinations don’t occur every time someone uses ChatGPT, or with every AI system. When they do occur, however, they may skew reports in patient records or undermine diagnoses.

In the medical field, hallucinations are not hiccups that can be overlooked.

Because of them, Yiannis Kiachopoulos, CEO and co-founder of Causaly, a platform that helps researchers accelerate drug discovery, says ChatGPT is “not a solution that scientists can depend on, especially for drug discovery where reliability is vital.”

That said, he adds that ChatGPT serves a purpose: “to provide a brief answer to a user query, without showing extensive proof of its response.” To benefit from ChatGPT, people need to fill in the “proof” component with detailed scientific references, sources, footnotes, or other research to validate findings, he says.

Using and building upon the FDA’s action plan may be an important step in establishing “guard rails” to keep a human hand in the picture and ensure doctors are not using ChatGPT out of context and putting patients at risk, Dr. Harb says.

Harb adds that the most important question when altering products or devising new innovations is how they can be made safe.

Providers should be involved in answering this question and can play a role in fine-tuning ChatGPT and otherwise building upon AI in healthcare, Dr. Harb says. “It's not the question of whether we should use it or not, but how to optimize using it and use it correctly,” he adds.

Success story

ChatGPT and other AI systems have produced noteworthy victories for medical assessments and diagnoses, Dr. Harb says. He adds that he expects more wins to come.

“There are so many things we do in hospitals and health care systems that have errors and inefficiencies,” Dr. Harb says. “What AI can bring to the table is a huge advantage in reducing waste, improving efficiency, and ultimately improving patient care.”

Among other things, some AI systems can detect diseases that doctors overlook, or help providers use their time more efficiently so that they don’t make mistakes elsewhere in their work.

ChatGPT and other AI innovations now reach across a large portion of the medical field. From telemedicine and patient monitoring to machine learning (ML) algorithms and predictive analytics, AI technologies assist with note-taking, research gathering, disease diagnosis and identification, curating treatment plans, and drug discovery.

In the survey conducted by MDLinx, more than 40% of physicians said they use ChatGPT as a diagnostic tool, while 12% said they use it to save time on note dictation.

How is ChatGPT being used in medicine?

Doctors who responded to MDLinx’s survey say they use a variety of AI products, including ChatGPT, transcription services, Amazon’s Alexa, radiology tools, diagnostic technologies, AI-based meal plan systems, and more.

Sumair Akhtar, MD, the chief clinical officer at Strive Health, a national provider of care for patients with chronic kidney disease, says he commonly uses predictive analytics AI to track and monitor kidney disease in his patients. He says this can help him assess high-risk scenarios and stay on top of treatment.

“These details can raise important red flags that could progress a patient’s kidney disease too quickly, putting them at risk for dialysis dependence or a kidney transplant,” Dr. Akhtar says. “This use of technology helps us determine next steps and how we can best monitor each patient’s condition.”

Strive Health uses an AI system called CareMultiplier™. This predictive analytics system tracks “kidney disease indicators from disparate sources, including a patient’s long history of doctor appointments, medications, insurance claims, emergency room visits and anything else throughout their medical timeline,” Dr. Akhtar says.

The FDA has authorized “a growing number” of ML medical devices, particularly over the last decade, and the agency expects the upward trend to continue. ML is a type of AI that trains algorithms to learn from data. The systems may provide novel benefits, like the ability to “create new and important insights from the vast amount of data generated during the delivery of health care every day,” according to the FDA.

Medical AI systems may be able to reduce healthcare costs, says Kiachopoulos.

“Right now, the average time it takes from initial drug discovery to final drug approval is around twelve years, and it costs upwards of $2 billion for each new drug on the market,” Kiachopoulos explains. “While these costs are spread out throughout the drug development process, many drug programs fail because the drug discovery process is complex and antiquated.”

In the survey conducted by MDLinx, 65% of respondents said they used ChatGPT for drug information.

While Kiachopoulos does not include ChatGPT in this group, he says that some AI systems can “read millions of documents of research, enabling researchers to rapidly map diseases, biomarkers, genes, molecular targets, and identify potential treatment options,” thus reducing the risk of failure and eliminating some forms of human bias. Building upon such platforms can help researchers make new and more successful predictions, which can be critical for creating future “treatment options for rare diseases, growing antibiotic resistance, vaccines for widespread viruses and more,” Kiachopoulos adds.

As of its most recent update in October 2022, the FDA had authorized at least 520 AI and ML medical devices. These devices span several fields of medicine, including radiology, cardiology, and anesthesiology.

Keeping healthcare human

One of the biggest national—and global—dilemmas of our time is the shortage of healthcare providers. The shortage has forced many doctors and nurses to overload their schedules with new responsibilities, which can jeopardize their ability to give each case thorough dedication and care, Dr. Harb says.

“Providers have more work, and they have to analyze larger amounts of data in a short period of time and make some critical decisions for patient care,” Dr. Harb says. “Sometimes they might not be able to get to the important data [quick enough] to make a decision.”

ChatGPT systems can help lighten doctors’ workloads by conducting research and distilling information that doctors do not have time to review thoroughly, he adds. It is important to view ChatGPT as something that works with providers, not instead of them, he says.

Not all physicians think ChatGPT is here to stay, however. The MDLinx survey found that 58% of physicians believe the AI tool will eventually fall out of favor.

Dr. Akhtar voices a sentiment similar to Dr. Harb’s, adding that “although care teams still need to execute care plans and regularly interact with patients, AI can help get the wheels in motion for optimal success.”

“It’s certainly not a replacement for care, but we can expect more specialties to embrace similar tools that can act as an additional defense line for diagnoses,” Dr. Akhtar says.

Sources and References:

  1. AI algorithms for disease detection: methodological decisions for development of models validated through a clinical, analytical, and commercial lens. Pharmaceutical Management Science Association.

  2. Alkaissi H, McFarlane SI. Artificial hallucinations in ChatGPT: implications in scientific writing. Cureus. 2023;15(2):e35179.

  3. Dave T, Athaluri SA, Singh S. ChatGPT in medicine: an overview of its applications, advantages, limitations, future prospects, and ethical considerations. Front Artif Intell. 2023;6:1169595.

  4. Center for Devices and Radiological Health. Artificial intelligence and machine learning (AI/ML)-enabled medical devices. FDA. Published online October 5, 2022.

  5. Center for Devices and Radiological Health. Artificial intelligence and machine learning in software as a medical device. FDA. Published online August 4, 2023.

  6. Kumar Y, Koul A, Singla R, Ijaz MF. Artificial intelligence in disease diagnosis: a systematic literature review, synthesizing framework and future research agenda. J Ambient Intell Humaniz Comput. 2023;14(7):8459-8486.

  7. Metz C. A.I. shows promise assisting physicians. The New York Times. Published February 11, 2019. Accessed August 31, 2023.

  8. Study looks at capability of AI for detecting overlooked liver metastases on CECT. Diagnostic Imaging.
