Article | December 11, 2025

Experts Discuss AI in Mental Health Care Landscape

“AI can’t distinguish between a wink and a blink,” said psychiatrist and social anthropologist Arthur Kleinman (Harvard University), when asked how a human processing an algorithm in their head differs from an AI model processing an algorithm at the time of care, during the recent discussion What are the Challenges and Opportunities of AI in Mental Health Care?

Moderated by Sanjay Gupta (CNN; Emory University Medical School), the panel brought together members of the Steering Committee of the American Academy of Arts and Sciences’ AI and Mental Health Care project to discuss the role of AI in mental healthcare, as detailed in the project’s recent publication, AI and Mental Health Care: Issues, Challenges, and Opportunities. Building on the Academy’s cross-disciplinary work in AI, the project sought to identify the benefits and challenges of incorporating this technology into mental healthcare, as well as areas where more research is needed.

Kleinman was joined on the panel by mental health innovation expert Kacie Kelley (Meadows Mental Health Institute) and physician and computer scientist Paul Dagum (Applied Cognition), who also served as a co-chair for the project. Alan Leshner (American Association for the Advancement of Science) and Sherry Glied (New York University), both co-chairs for this project, delivered opening and closing remarks for the panel discussion.

The event gave the Steering Committee’s experts an opportunity to engage with the nuances of this topic in a way that is difficult without interdisciplinarity. Dagum said there are two tracks to consider for AI in mental healthcare: the consumer track, where evidence about efficacy is mixed, and the regulated track, which examines efficacy and safety through rigorous studies similar to those required by the Food and Drug Administration. AI has tremendous potential as a therapeutic modality, but it requires regulation and rigorous study to ensure safety and efficacy.

Panelists focused the discussion on valid concerns about AI being used with high-risk individuals and those experiencing mental health crises, as chatbots have a record of missing suicidal ideation. Often, though, the emergency room or the criminal justice system is where people experiencing mental health crises are diagnosed. Many people also receive mental health services from their primary care provider, rather than a mental healthcare provider. In thinking about this, Kelley noted, “We need to detect risk earlier and give access to care before crisis.” AI, she said, excites her for its potential to aid in the primary care setting.

The conversation then turned to the therapeutic benefits of using AI in mental healthcare. This technology has the potential to support human practitioners, for example, by encouraging patients to practice skills and holding them accountable between sessions. While AI won’t solve the problem of disparities in mental healthcare, it can help address them. Rural communities without trained mental healthcare workers may have community health workers who can use AI to provide some measure of mental healthcare, Kleinman pointed out.

The Academy continues to engage in cross-disciplinary work centered on AI, convening experts across fields to consider the impact of this technology; a dedicated Daedalus volume examining questions of AI and identity is slated for release in Fall 2026.

Read the publication AI and Mental Health Care: Issues, Challenges, and Opportunities.