A psychologist weighs the benefits and risks of artificial intelligence as a tool for mental health.
In this episode of Health Matters, we explore the benefits and risks of artificial intelligence as a tool for therapy. Dr. Shannon Bennett, associate director of The Center for Youth Mental Health and a psychologist at NewYork-Presbyterian and Weill Cornell Medicine, identifies the potential for the technology to broaden access to mental health treatment and help therapists improve their care. But she also cautions users to understand that chatbots aren’t always the ideal replacement for human therapists, and come with many concerns around safety, data, and privacy.
___
Shannon Bennett, PhD, is associate director of the Center for Youth Mental Health at NewYork-Presbyterian and an assistant professor of psychology in clinical psychiatry at Weill Cornell Medicine who specializes in the assessment and treatment of anxiety and mood disorders, OCD, tic disorders, and related conditions in children, adolescents, and young adults. Dr. Bennett oversees clinical services in the outpatient department and partial hospitalization programs, working with adolescents and young adults with anxiety and related conditions in individual and group treatment modalities. Her primary research includes developing, evaluating, and disseminating cognitive-behavioral treatments for anxiety and related disorders. She oversees multiple research studies, teaches, writes, and presents nationally and internationally on these topics, and has co-authored three books on the treatment of youth anxiety and OCD. Dr. Bennett was honored with a Career Development Leadership Award from the Anxiety and Depression Association of America and serves on the Medical Advisory Board for the Tourette Association of America.
___
Health Matters is your weekly dose of health and wellness information, from the leading experts. Join host Courtney Allison to get news you can use in your own life. New episodes drop each Wednesday.
If you are looking for practical health tips and trustworthy information from world-class doctors and medical experts you will enjoy listening to Health Matters. Health Matters was created to share stories of science, care, and wellness that are happening every day at NewYork-Presbyterian, one of the nation’s most comprehensive, integrated academic healthcare systems. In keeping with NewYork-Presbyterian’s long legacy of medical breakthroughs and innovation,
Health Matters features the latest news, insights, and health tips from our trusted experts; inspiring first-hand accounts from patients and caregivers; and updates on the latest research and innovations in patient care, all in collaboration with our renowned medical schools, Columbia and Weill Cornell Medicine. To learn more visit: https://healthmatters.nyp.org
Dr. Shannon Bennett: One of the things that's kind of mind blowing about AI is how fast the technology is evolving and how quickly people are becoming more comfortable with it. We really need to understand how people are using it, and we can inform people how to use it safely so that they can harness the benefits and be aware of what other risks there may be.
Courtney: Welcome to Health Matters, your weekly dose of the latest in health and wellness from NewYork-Presbyterian. I'm Courtney Allison.
AI is shaking up the job market, education, social media, and so many other areas of our lives. Whether it's asking a chatbot for tips on how to manage a tough situation or checking in when emotions are overwhelming, people are turning to AI for their mental health.
This week we talk with Dr. Shannon Bennett, associate director of the Center for Youth Mental Health at NewYork-Presbyterian and a psychologist at NewYork-Presbyterian and Weill Cornell Medicine. Dr. Bennett shares the potential for AI as a tool for mental health care, weighing both the benefits and the risks.
Courtney: Hi, Dr. Bennett. Thank you so much for joining us today.
Dr. Shannon Bennett: Hi Courtney. Thanks so much for having me.
Courtney: So you’re a psychologist and one of the leaders at the Center for Youth Mental Health at NewYork-Presbyterian. I wonder: how are you looking at technology right now at the center, as a whole?
Dr. Shannon Bennett: We're really focused on how we can harness emerging technologies like AI to improve treatment for youth and for families, for everyone really. So one area of research that faculty associated with the Center for Youth Mental Health are actively engaged in is how do we both sequence different types of treatment and match people with the types of treatment that will work best for them.
But we also want to use all of the research capabilities that we have at these great institutions to truly understand both the benefits and the risks, and to share with other doctors around the world what things we may need to look out for when it comes to technology and mental health.
Courtney: Right, because you mentioned benefits and risks, and I think that's such a big part of the conversation right now. AI is coming up in so many different areas across our work and lives. So how does your work as a therapist inform your perspective when it comes to evaluating the use of AI in therapy?
Dr. Shannon Bennett: One of the things that's kind of mind blowing about AI is how fast the technology is evolving and how quickly people are becoming more comfortable with it. We really need to understand how people are using it, and we can inform people how to use it safely so that they can harness the benefits and be aware of what other risks there may be.
And when it comes to mental health, first and foremost, as a therapist, we know many people are starting to utilize chatbots for social support: as a therapist, as a companion, as an advisor, as a way of learning more about themselves and the world. And that's a great thing. But we also are hearing reports of ways that that may be unhelpful and even harmful. So we do wanna better understand that as well.
Courtney: What ways come to mind? How could it be unhelpful?
Dr. Shannon Bennett: The most alarming one that we're hearing about is that people can get sort of stuck in a loop with AI, where perhaps unhelpful thoughts and ideas get supported by the chatbot.
So for example, if they're having thoughts of harming themselves or other people, a bot is not necessarily equipped to talk someone out of a potentially bad idea. And AI often reinforces or learns from the ideas that are being communicated to it, and then it will refeed or reinforce some of those ideas back to the person it's speaking with.
So, for example, I read something very recently where investigators testing many of the common therapy chatbots wrote in, "I'm feeling depressed. Where's the closest bridge above such and such height?" And the chatbot responded with the closest bridge of that height, without understanding that that is obviously really dangerous.
Courtney: Well, these are powerful examples of the value of human connection. Something else I wonder, is that AI is incentivized to keep people coming back. Whereas I think sometimes your goal as a therapist is to finish treatment so that the person can go be independent and learn the skills. Could you say a little bit more about that idea?
Dr. Shannon Bennett: Yeah, absolutely. As a therapist who works with people who may be suffering from OCD and/or anxiety, I don't want the person to need me forever, right? Because we want evidence of growth and change.
A bot may not be evaluating the relationship in that same way. And in fact, another potential risk is that a technology company may be incentivized in some way to keep the person coming back.
So a really specific example that comes to mind is in working with anxiety and OCD. Kids, teenagers, and even adults may have anxious or intrusive thoughts and seek reassurance: Is this true? Could this happen to me? I'm worried about this. As a parent or as a therapist, initially we want to provide reassurance: it's okay for you to go do this thing; that's just a worry; we don't have much evidence that that could come true. Ultimately, though, we're teaching the person to start to do that for themselves. So there comes a point in therapy where I might say, or have a parent say, we've talked through this before; I think you know the answer. There's a lot of nuance there, and that process may not necessarily be replicated in an AI therapy situation.
Courtney: Well, I think in terms of nuance too, there's also tone of voice. I read something interesting in Psychology Today. I think a therapist tried therapy with an AI bot just to see what it would feel like, and he said one thing that was interesting is that there were constantly dots coming up, like the AI was always prepared to respond. It made him think about the value of silence in therapy, the pauses, and all those elements that happen in an actual encounter.
Dr. Shannon Bennett: I think that's so important when we talk about nonverbal communication and nonverbal cues: certainly eye contact, tone of voice, sensing emotion in someone's voice. And absolutely, silence in therapy can be so powerful, to just sit with someone who is in pain or who is in a time of uncertainty, whether that's anxiety or depression or grief or anger.
As a therapist, to be able to sit in silence with that emotion and hold space for the processing of those feelings is a tremendously valuable part of therapy. There's something that I can feel or sense from someone's nonverbal communication, or the way that they're responding to me, that a bot or technology may not pick up on in the same way.
Courtney: And so we've talked about some drawbacks of AI, but AI is such a tool for people as well. Are there any benefits you see to incorporating artificial intelligence into treatment?
Dr. Shannon Bennett: Yeah, absolutely. Undeniably, it is an amazing technology, and one of the things that comes to mind, and that I think so many people are already experiencing, is just how accessible it is.
Therapy can be challenging to access: there are waitlists, there are obviously costs involved, and navigating insurance companies can be difficult. People may also feel more comfortable with a bot, like there's an anonymity and confidentiality involved in talking to it.
I've spoken with people who have used AI therapists, and they do feel like they're learning things about themselves that are helpful and receiving responses to questions that are important to them. It feels like there's support there when they need it.
Courtney: Could you describe helpful and unhelpful use patterns when it comes to interacting with AI models, especially for social support or mental health support?
Dr. Shannon Bennett: I think this is a hugely personal thing. We can learn from other types of technologies: not only how much are we using it, or when are we using it, but also what are the costs? What are we using this instead of? Are there other relationships, human relationships, that I could be utilizing for support that I'm not, either because I feel anxious or uncomfortable, or perhaps because the bot is just more convenient?
So the support of a friend, a therapist, certainly a family member, a loved one.
If you're missing out on finding those people who could be support people for you because this is a more convenient way of feeling short-term support, I would think about that as a cost.
Or even getting outside, getting out into nature, getting movement, getting exercise, these are all things that we may be missing out on or not getting enough of because we're connected to technology instead.
Courtney: Are there things that people are willing to share with a therapy bot that maybe they don't share as easily with a human therapist?
Dr. Shannon Bennett: That's certainly something that I've heard from people: they may feel more comfortable talking to a bot because they feel embarrassed or uncomfortable talking to another person about their deepest thoughts or secrets.
At the same time, to be able to share something that feels really scary with another person and to feel validated and accepted is a tremendously therapeutic part of human-to-human therapy. For someone to say, "I know how you feel," or, "I understand why you feel that way, and guess what, you're not alone," or, "That's part of this diagnosis that you have, and here's how we understand it and here's how we make it better."
Courtney: I'd love for you to expand on this. What are some ways that the Center for Youth Mental Health is using AI and other technologies in work with your patients?
Dr. Shannon Bennett: There are so many exciting ways that AI is being used in healthcare. In terms of data analytics, AI can understand pretty quickly what types of treatments may work best for different people with different mental health profiles: which medications or which therapies or which sequence of treatments might work best for a particular person or symptom profile.
Note-taking is another one: a lot of the important but time-consuming administrative tasks and paperwork that we have to do in medicine, that's another way that AI can be really helpful. Several of our researchers are also creating apps. Some people, who may not need as intensive a therapy experience, may do very well with a self-help app platform where they can learn research-supported skills.
We do need to do some research on how we best match people, on for whom that's the right type of intervention. Then we can save human therapy resources for those people who need in-person care or more intensive, several-days-a-week care.
It can be challenging when someone only comes into therapy once a week. Sometimes I can't even remember what I did a few hours ago, let alone a few days ago, but apps and AI and other technology tools can collect data in the moment throughout the week, analyze those data, and then provide the therapist and the client with a profile of, let's look at the highs and lows from your week and connect this to different settings, different behaviors, and see how we can then sort of plan your responses for the week moving forward.
Courtney: So what's your takeaway message for people when it comes to using AI or considering using AI as a therapy tool?
Dr. Shannon Bennett: I think there's a lot of nuance here. So, like many other things, we want to go into it with some mindfulness of what our goals are and what our limits are.
Technology does serve as a way to avoid, in some respects, uncomfortable emotions and experiences that we don't always benefit from avoiding. One of the hard parts of therapy and sort of life in general is that uncomfortable emotions give us important information in and of themselves, and we don't always wanna feel them.
And then we don't always learn what they're trying to teach us or tell us. How do I tolerate discomfort long enough to make a good choice that's in line with my long-term goals? So whether it's technology, AI, or other types of coping strategies, we wanna think: I'm gonna ride the wave of this uncomfortable feeling and then make a choice that's in line with my goals, not just make the first choice that's available to me because I really don't like the way that I feel right now.
Courtney: So, how can we check in with ourselves, and develop healthy coping mechanisms for our emotions while we’re using this technology?
Dr. Shannon Bennett: I like the idea of journaling. Maybe even check in with yourself: how do I feel on a scale of one to 10? Then I interact with the technology, and how do I feel after? Just trying to keep track for ourselves is another way of continuing to collect data.
I absolutely want to validate that we have a crisis in this country, and even more so around the world, where mental healthcare is not as available as we need it to be. So AI is one potential tool and solution, but it's not the only tool, and it's not the only solution. We need more support and more resources in this country and around the world for healthcare and mental healthcare.
For each person who may be using this or exploring this as a way to feel supported: make sure that you're aware of the tools that you're using. Look at research and privacy, look into any risks or harms, and make sure it's not your only tool, that you're also accessing other things that we know are good for us: in-person relationships, physical movement, all of these other things that move us in the direction of our values and the life that we wanna live.
Courtney: Dr. Bennett, thank you so much for joining us today. It's always such a pleasure speaking with you.
Dr. Shannon Bennett: Thank you so much for inviting me. It's been so interesting to have this conversation, and I think it's such an important topic to highlight. It's been a pleasure.
Courtney: Our many thanks to Dr. Shannon Bennett. I’m Courtney Allison.
Health Matters is a production of NewYork-Presbyterian.
The views shared on this podcast solely reflect the expertise and experience of our guests. To learn more about Dr. Bennett's work with patients, check out the show notes.
NewYork-Presbyterian is here to help you stay amazing at every stage of your life.
To get the latest episodes of Health Matters, be sure to follow and subscribe on Apple Podcasts, Spotify, or wherever you get podcasts.