5 specialties AI can’t replace (yet)

By Alpana Mohta, MD, DNB, FEADV, FIADVL, IFAAD | Fact-checked by Davi Sherman
Published June 17, 2025


Industry Buzz

  • “AI is unable to detect transference, and because it has no feelings, by definition it has no countertransference, an essential tool in psychoanalytic work.” — Steven Reidbord, MD, psychiatrist

  • “Involving both the child and the parents is crucial in pediatric visits… Some parents want their child involved from the start, while others prefer to handle the information themselves at home. AI lacks the ability to navigate these nuances.” — Ben Reinking, MD, pediatric cardiologist

It's 2025, and hospitals and clinics everywhere are now using AI to summarize charts, flag malignancies on scans, and even suggest differential diagnoses—but it still falters where medicine is most human. 

Many physicians contend that AI systems lack genuine empathy, ethical judgment, and the ability to interpret non-verbal cues—qualities intrinsic to human therapists.[]

And recent studies show that patients value empathy, contextual judgment, and moral guidance as much as technical accuracy—elements that remain stubbornly hard to code.

Here are five specialties AI can't replace—and that best illustrate why the doctor is still, in crucial ways, irreplaceable.

Psychiatry & mental health care

A recent survey published in BMC Medical Ethics explored public perceptions of AI in healthcare. While respondents acknowledged the efficiency of AI for administrative tasks, many expressed concerns about the potential loss of personal relationships and the "human touch" in patient care.[]

A 2025 study compared GPT-4o-based chatbots with human therapists and found “major gaps” in authentic empathy and bias detection: GPT-4o could sound warm, sometimes excessively so, when users described distress, yet it rarely offered solution-focused insights or genuine cognitive understanding.[] Lead author Mahnaz Roshanaei observed, “It [the chatbot] is very emotional in terms of negative feelings, it tries to be very nice. But when a person talks about very positive events happening to them, it doesn’t seem to care.”

Clinicians echo this concern. Board-certified psychiatrist Steven Reidbord, MD, explains, “AI can serve a teaching function or present cognitive exercises and suggestions to patients. But [it] only simulates empathy—a human emotion. AI is unable to detect transference, and because it has no feelings, by definition it has no countertransference, an essential tool in psychoanalytic work.” 

Fellow psychiatrist Cooper Stone, DO, adds that therapeutic empathy must be deployed deliberately to help patients “open up and provide valuable information,” noting, “At the end of the day, patients ultimately want to feel understood.” 

Primary care & family medicine

Front-line physicians spend as much time coaching and counseling as they do prescribing. Unlike specialties that lean more heavily on imaging or lab data, primary care centers on verbal communication, emotional cues, and trust-building—factors that AI currently struggles to replicate.[]

Ben Reinking, MD, a board-certified pediatrician and pediatric cardiologist, calls the physician-patient relationship foundational. “It involves understanding emotions and building connections with the entire family. These are critical elements that I believe AI will never fully replace,” he asserts.

Meanwhile, a recent University of Maine study has shown that clinicians still outperform large language models when cases become ethically or emotionally complex.[]

C. Matt Graham, author of the study and Associate Professor of Information Systems and Security Management at the Maine Business School, said, “This isn't about replacing doctors and nurses. It's about augmenting their abilities. AI can be a second set of eyes; it can help clinicians sift through mountains of data, recognize patterns and offer evidence-based recommendations in real time."

Pediatrics

Children, especially younger ones, often can’t articulate symptoms, feelings, or fears clearly.[] Pediatric care depends heavily on a physician’s observational skills, intuition, and ability to interpret non-verbal cues—something AI still cannot do reliably.

Furthermore, algorithms trained largely on adult data often miss developmental subtleties, and ethical frameworks for treating children demand heightened transparency and parental trust.

Dr. Reinking says that AI cannot replicate the nuanced approach that physicians take while interacting with children of different ages. 

He adds, “Involving both the child and the parents is crucial in pediatric visits. The approach varies with the child's age, developmental status, and the parents' preferences. Some parents want their child involved from the start, while others prefer to handle the information themselves at home. AI lacks the ability to navigate these nuances.”

Evan Nadler, MD, MBA, the founder of ProCare Consultants and ProCare TeleHealth, states, “AI won't be able to know whether a pediatric-age patient is at a developmental stage commensurate with their chronological age or not, which could impact choice of treatment for obesity or perhaps other disorders.”

Pediatricians adjust their language and behavior to fit a child’s age and developmental stage. AI lacks the flexibility to do this authentically, risking oversimplification or misreading of a child’s needs.

Palliative & end-of-life medicine

Hospice conversations blend clinical facts with existential questions that no algorithm can parse. A May 2025 systematic review of AI in palliative settings concluded that while machine learning can predict symptom trajectories, only human clinicians can understand and empathize with the moral nuance and cultural meaning at the end of life.[] As Moti Gamburd, CEO of CARE Homecare, observes, “In palliative care, silence and presence can be more powerful than words.”

Connectional silence, the deliberate, shared quiet between clinician and patient, is one of the intangible human behaviors that algorithms cannot replicate: it relies on emotional intuition, timing, and real-time sensitivity to another person’s inner state.

Gamburd recalls how caregivers would “sit beside a client in their final hours, holding their hand quietly while the family gathered. No one needed to say anything. The presence alone brought comfort.”

Surgery & interventional fields

Robotic platforms, such as the latest da Vinci Xi, improve precision, yet they remain extensions of the surgeon’s hands. Interventional radiologist Tonie Reincke, MD, observes, “AI may be able to teach a robot how to perform a procedure but falls short regarding critical thinking related to complications and acute decision-making if an abrupt change in the procedure is necessary.” Challenges like lack of haptic feedback, steep learning curves, and system cost remain barriers to full automation.

A separate systematic review on the ethics of robot-assisted surgeries highlights unresolved questions of accountability: Patients and regulators still expect a human surgeon to own every intra-operative decision.[]

Plastic surgeon Arnold Breitbart, MD, notes, “Patients come into my practice vulnerable and oftentimes anxious about their own body image… They are looking for reassurance, empathy, and sometimes tough, honest feedback on realistic outcomes.”

Across these five fields, AI excels at pattern recognition, documentation, and logistical support. What it cannot yet do is care—to notice a subtle wince, weigh a cultural value, or hold a patient’s hand. That gap is where physicians continue to add irreplaceable value.
