How do we integrate AI into mental health? These researchers have pragmatic ideas

By Joe Hannan | Fact-checked by MDLinx staff
Published October 18, 2022

Key Takeaways

  • Research on the use of artificial intelligence (AI) in healthcare has largely focused on its ethical, societal, and computer programming implications—not its applications in the clinical realm.

  • A research-based framework provides a roadmap for implementing AI on the frontline of mental healthcare.

  • Mental health clinicians can use this framework as more AI-based clinical technology becomes available for practice.

For years, the use of artificial intelligence (AI) in healthcare has been more speculative than practical. But as the technology matures, what was once hypothetical is becoming real—and useful.

The work of two researchers may usher in a new age for use of AI in mental healthcare.

Their hope is that it may help clinicians with administrative tasks, patient support, and training.

Huge administrative burden

Katherine Kellogg, PhD, and Shiri Sadeh-Sharvit, PhD, Chief Clinical Officer at Eleos Health, have developed a framework to advance widespread AI use in mental healthcare. Their research, published as a September 2022 perspective article in Frontiers in Psychiatry, outlines a roadmap for clinical AI implementation.

Dr. Kellogg is a professor of business administration at the MIT Sloan School of Management. Dr. Sadeh-Sharvit is a clinical psychologist and an associate professor at Palo Alto University.

Dr. Kellogg said that while there has been an abundance of research on AI’s ethical implications and organizational uses for quality and cost control, its clinical applications are underexplored.

“We think that the key to getting frontline providers to use AI technologies is to figure out, ‘What are their challenges in putting these technologies into practice?’” she said. “And how can clinical leaders and software developers help address these very real challenges that occur in clinicians’ day-to-day work?”

Measurement-based care is where the rubber meets the road for mental health clinicians and AI, according to Dr. Sadeh-Sharvit.

Clinicians know that data improve care, but the processes of manual collection and retrospective analysis can lead to inaccuracies. AI-based solutions, however, can collect and parse that data in real time.

“As a therapist myself, I know that I’m drowning in paperwork and am under a huge administrative burden,” Dr. Sadeh-Sharvit said. “If anyone can help with that, I think it would be highly recommended.”

How do we do it? Drs. Kellogg and Sadeh-Sharvit have some ideas.

A pragmatic framework

Their framework includes three categories of AI-based technology that can assist clinicians on the frontline of care: automation, patient engagement, and clinical decision support technologies.

Automation technologies fully or partially automate screening, diagnosis, and treatment recommendations, the researchers wrote. This might look like computer vision combined with machine learning to track eye-gaze patterns for early autism detection.

Automation technologies that incorporate AI can also recognize conversations and emotions for screening purposes—and can administer digital surveys.
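The article does not name a specific survey instrument, but the automated scoring it describes can be sketched with the PHQ-9, a standard nine-item depression screen whose items are each scored 0-3. The function below is an illustrative example, not drawn from any product mentioned here:

```python
# Illustrative sketch of automated survey scoring using standard PHQ-9 cutoffs.
# (total <= 4: minimal, <= 9: mild, <= 14: moderate, <= 19: moderately severe,
# <= 27: severe)

SEVERITY_BANDS = [
    (4, "minimal"), (9, "mild"), (14, "moderate"),
    (19, "moderately severe"), (27, "severe"),
]

def score_phq9(item_scores):
    """Sum nine 0-3 item scores and map the total to a severity band."""
    if len(item_scores) != 9 or any(s not in range(4) for s in item_scores):
        raise ValueError("PHQ-9 expects nine items scored 0-3")
    total = sum(item_scores)
    label = next(lbl for cutoff, lbl in SEVERITY_BANDS if total <= cutoff)
    return total, label
```

A digital survey tool would present each question, collect the responses, and hand the clinician the total and band rather than leaving the arithmetic to paperwork.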

“There's been a lot of research recently on how scribes can help in delivering care by freeing up clinician time to focus on face-to-face interaction with patients,” Dr. Kellogg said.

"Automation technologies are a way to act as a scribe in the clinical session."

Katherine Kellogg, PhD

Patient engagement technologies include conversational agents and chatbots. These are dialogue systems that carry out natural language processing and respond with human language. Clinicians are likely more familiar with chatbots, such as Woebot.

The researchers wrote that clinicians could use these technologies to promote behavior changes such as smoking cessation, exercise, or improved nutrition. They can also provide limited treatment functions, like cognitive behavioral therapy (CBT), dialectical behavior therapy (DBT), and motivational interviewing, where interactions are scripted.
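The "scripted interactions" behind such tools can be pictured as a simple state machine: each state has a prompt, and recognized replies route the patient to the next state. The states, prompts, and keywords below are invented for illustration; real products layer much richer natural language processing on top of flows like this:

```python
# Minimal sketch of a scripted conversational flow (a rule-based check-in).
# All prompts and routes here are hypothetical examples.

SCRIPT = {
    "start": {
        "prompt": "How did today go? (good/hard)",
        "routes": {"good": "reinforce", "hard": "support"},
    },
    "reinforce": {"prompt": "Great. What helped most?", "routes": {}},
    "support": {
        "prompt": "Thanks for sharing. Try a 2-minute breathing exercise? (yes/no)",
        "routes": {"yes": "exercise", "no": "end"},
    },
    "exercise": {"prompt": "Breathe in for 4 counts, out for 6. Repeat 5 times.", "routes": {}},
    "end": {"prompt": "Okay. I'll check in again tomorrow.", "routes": {}},
}

def respond(state, user_text):
    """Advance the scripted dialogue; re-ask the same prompt on unmatched input."""
    next_state = SCRIPT[state]["routes"].get(user_text.strip().lower(), state)
    return next_state, SCRIPT[next_state]["prompt"]
```

Because every path is authored in advance, the clinician can verify that the tool only delivers vetted, evidence-based content between sessions.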

Dr. Sadeh-Sharvit gave the example of working with a patient who has anorexia and is not sleeping well. “I can prescribe her an evidence-based, ethical app (for insomnia),” she said. “This increases engagement with the treatment and also serves as an extension of myself as a therapist between the sessions to continue supporting her.”

Decision support technologies make predictions using past data. This technology has evolved to a point where it can analyze speech as well as text. Supervised machine learning can predict ADHD, autism, schizophrenia, Tourette syndrome, and suicidal ideation.
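"Predictions using past data" is the core of supervised learning: fit a model on labeled historical examples, then score new ones. As a toy sketch of the idea (not the researchers' method), here is a two-class naive Bayes classifier over word counts, trained on a handful of invented session notes; real systems use far larger clinical datasets and richer features, including speech:

```python
# Toy supervised-learning sketch: multinomial naive Bayes on word counts.
# The training notes and labels below are invented for illustration only.
import math
from collections import Counter, defaultdict

def train(notes, labels):
    """Count words per label and label frequencies from past (note, label) pairs."""
    word_counts = defaultdict(Counter)
    label_counts = Counter(labels)
    for text, label in zip(notes, labels):
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, label_counts, vocab

def predict(model, text):
    """Return the label with the highest smoothed log-probability for the text."""
    word_counts, label_counts, vocab = model
    total_docs = sum(label_counts.values())
    scores = {}
    for label in label_counts:
        logp = math.log(label_counts[label] / total_docs)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            logp += math.log((word_counts[label][word] + 1) / denom)  # Laplace smoothing
        scores[label] = logp
    return max(scores, key=scores.get)

# Hypothetical past notes: flagged for follow-up vs. not.
notes = [
    "patient reports feeling hopeless and worthless",
    "expressed hopeless thoughts and poor sleep",
    "mood improving and sleeping well",
    "engaged in session and mood stable",
]
labels = ["flag", "flag", "no-flag", "no-flag"]
model = train(notes, labels)
```

In a decision support tool, a flagged score would surface as a prompt for the clinician to review, not as a diagnosis: the model narrows attention, and the clinician decides.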

“Clinicians are just bombarded with a million different things coming at them and a million different things they need to think about,” Dr. Kellogg said. “What decision support technologies can help with is focusing them on what’s most important, giving them suggestions to help with their evidence-based care.”

Addressing the provider shortage

These innovations won't just free up clinicians to focus more on human interaction. Drs. Kellogg and Sadeh-Sharvit said they could also help alleviate the provider shortage.

Dr. Kellogg said that the increase in demand for mental healthcare will create a wave of clinicians who are short on experience. Decision support technologies can help them develop their skills as they advance through the field.

"Where I see these technologies really being able to help with access is through training and providing more targeted supervisory support."

Katherine Kellogg, PhD

Dr. Sadeh-Sharvit predicted these technologies will improve the quality of care, making it more efficient and ultimately allowing for higher patient volumes.

“What we see right now in community-based clinics all across the US is that they rely heavily on unlicensed therapists, because the older therapists—the ones [who] are more seasoned—just scaled back their hours or just leave completely to open their own private practices,” she said. “There’s no one [who] can help refine the clinical skills in a way that is really scalable and feasible.”

Addressing the doubts

There will be doubts along the way; chief among them may be skepticism from clinicians. Can AI do the same things as a trained mental health professional?

Drs. Kellogg and Sadeh-Sharvit did not frame this as an either-or proposition. Rather, it’s a both-and: Humans and AI working in tandem.

“Millions of years of evolution have made sure that we rely on the human connection,” Dr. Sadeh-Sharvit said. “Human attachment and interpersonal bonding are crucial to make significant changes in one's life."

"I do not see a future where bots and automated platforms replace [the] interpersonal connection that’s so crucial to our well-being, and to our survival as a species."

Shiri Sadeh-Sharvit, PhD

What this means for you

Most likely, AI will not replace the human interaction that’s at the core of mental healthcare. However, it may reduce the administrative burden, support patients between sessions of talk therapy, and help train clinicians, easing the provider shortage.
