Medical mystery: ChatGPT correctly diagnoses a boy after three years with no clear answers
Key Takeaways
ChatGPT found a diagnosis missed for three years by 17 doctors.
ChatGPT’s AI draws on vast amounts of internet-sourced data to suggest possible diagnoses.
A mother spent three years taking her young son to multiple specialists, looking for an explanation for his chronic pain. Starting in 2020, Courtney and her son, Alex, saw 17 doctors over three years, according to news sources. None of them found an answer. Eventually, Courtney turned to ChatGPT and entered Alex’s medical information. ChatGPT returned with a diagnosis no doctor had ever suggested: tethered cord syndrome.
Courtney first observed her son’s chronic pain during the COVID-19 lockdown in 2020. Alex began experiencing pain after using a recently purchased bounce house. The family’s nanny reported giving Alex Motrin each day to prevent “gigantic meltdowns.” Alex’s pain sometimes kept him from playing, and he began chewing on things.
Courtney took her son to a dentist to address his symptoms. At the time, she wondered if his molars were growing in or if he had a cavity. The dentist ruled out these and other dental issues but thought that Alex might be grinding his teeth at night. The dentist suggested that the family see an orthodontist specializing in airway obstruction.
An orthodontist examined Alex and found that his palate was too small for his mouth and teeth, causing breathing difficulties at night. The orthodontist fitted Alex with an expander, which helped temporarily. However, Courtney soon observed that Alex was no longer growing taller. By 2021, Alex had started walking with an imbalance, leading with his right foot and dragging his left. His pediatrician referred them to a physical therapist.
Around this time, Alex began experiencing severe headaches. He visited a neurologist, who diagnosed him with migraines. When Alex started physical therapy sessions, the therapist suggested he might have a Chiari malformation. Alex was also displaying signs of exhaustion. Courtney brought him to an ear, nose, and throat specialist to assess him for possible sinus problems. In the following years, they visited more physicians, including a new pediatrician, a pediatric internist, an adult internist, and a musculoskeletal doctor. Alex was seen by a total of 17 doctors over three years.
After three years without a diagnosis, Courtney turned to a new source for answers. She signed up for ChatGPT and spent an evening entering her son’s medical information into the program. That’s when ChatGPT suggested a diagnosis of tethered cord syndrome.
After researching the condition, Courtney made an appointment for Alex with a new neurosurgeon and told the neurosurgeon she suspected Alex had tethered cord syndrome. After looking at Alex’s MRI images, the neurosurgeon diagnosed him with spina bifida occulta and an associated tethered spinal cord.
Since being diagnosed, Alex has undergone surgery for his tethered spinal cord.
Physicians and ChatGPT
ChatGPT is an artificial intelligence (AI) chatbot built on a large language model. Rather than searching the web live, it generates responses based on patterns learned from vast amounts of text, much of it drawn from the internet, and takes cues from the questions users ask it. Unlike older symptom-checker tools, which only let users choose from preset symptoms and pull possible answers from a fixed, programmed dataset, ChatGPT lets users enter as much information and as many details as they want, in their own words.
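The contrast between a preset symptom checker and free-text input to a chat model can be sketched in a few lines of Python. Everything below is illustrative, not a real medical tool: the symptom list, rule table, helper names, and model name are assumptions made up for this example.

```python
# Sketch: preset symptom checker vs. free-text prompt for a chat model.
# The symptom list, rule table, and model name are illustrative assumptions.

# Old-style checker: users pick from a fixed menu, and answers come from
# a fixed lookup table. Anything outside the menu is silently dropped.
PRESET_SYMPTOMS = {"headache", "tooth grinding", "leg pain", "fatigue"}
RULE_TABLE = {
    frozenset({"headache"}): ["migraine"],
    frozenset({"headache", "fatigue"}): ["migraine", "sinus problems"],
}

def preset_checker(selected):
    """Only preset symptoms are accepted; unrecognized details are lost."""
    known = frozenset(s for s in selected if s in PRESET_SYMPTOMS)
    return RULE_TABLE.get(known, ["no match"])

# Chat-model style: the entire free-text history goes into the prompt,
# so details like "stopped growing" or "drags his left foot" survive.
def build_messages(free_text_history):
    return [
        {"role": "system",
         "content": "Suggest possible explanations for these symptoms."},
        {"role": "user", "content": free_text_history},
    ]

history = ("Chronic pain since playing in a bounce house, daily Motrin, "
           "chews on objects, stopped growing taller, drags his left "
           "foot, severe headaches.")
messages = build_messages(history)
# A real request would then go to a chat-completion API, e.g.:
# response = client.chat.completions.create(model="gpt-4", messages=messages)
```

The point of the sketch is the information loss: `preset_checker` throws away any detail not on its menu, while the chat-style prompt carries the full, unstructured history forward.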
Given this, it’s no surprise that ChatGPT can sometimes surface answers that physicians have missed, especially in difficult cases or rare diagnoses. Even the most knowledgeable physician can’t sift through that volume of medical information in seconds. However, this doesn’t mean that ChatGPT can replace a physician or their diagnostic expertise.
ChatGPT may be able to suggest diagnoses quickly, but it’s not perfect: it can produce false, misleading, or confusing results, and it is prone to user error. Additionally, as leading organizations like the American Medical Association (AMA) have stated, AI, including ChatGPT, does not have a thought process and cannot exercise judgment. As such, it cannot substitute for an actual medical diagnosis from a physician. What it can do is serve as a tool for physicians.
Speaking to the AMA Update Podcast, John Halamka, MD, president of Mayo Clinic Platform, said, “In the short term, you’ll see it [ChatGPT] used for administrative purposes, for generating text that humans then edit to correct the facts, and the result is reduction of human burden.”