As artificial intelligence works its way into more aspects of human experience, jeopardizing certain occupations, one may wonder what AI could mean for mental health professionals.
Ireti DeBato-Cancel, a mental health coordinator at Oregon State University’s Counseling and Psychological Services, believes AI cannot replace therapists, but that its use can improve how therapy is conducted.
“Artificial intelligence is not going to replace clinicians, but clinicians using AI are going to replace those who don’t,” DeBato-Cancel said. “So it’s a matter of learning how to use the tool in a way that enhances the work that we already do.”
DeBato-Cancel emphasized a key point: AI cannot replicate the essential human connection that defines therapeutic relationships.
“At the core of therapy is a human connection,” DeBato-Cancel said. “AI can crunch numbers, recognize patterns, but there’s something really healing about sitting in a room across from somebody that sees you, hears you, and empathizes with your journey.”
DeBato-Cancel also highlighted AI’s potential and what it could mean for the mental health field.
Chatbots and virtual assistants can offer immediate support in crisis situations, providing a lifeline for individuals who may not have access to a therapist at the moment.
DeBato-Cancel used a natural disaster as an example of a situation in which human resources could be limited for a period of time.
“Say a natural disaster happens and you don’t want to bring more human bodies into that area,” DeBato-Cancel said. A bot could step in to assess a person’s emotional state, identify trauma, and provide support until human help is available.
Furthermore, DeBato-Cancel said that on a daily basis, a therapist spends a significant amount of time on paperwork and administrative tasks.
AI has the potential to assist with scheduling, tracking progress between sessions, and even analyzing data to identify patterns in client behavior. This way, mental health professionals are left with more time to focus on interpersonal aspects of care.
However, integrating AI into therapy comes with ethical challenges. DeBato-Cancel said transparency about AI use in therapeutic settings is essential, and that it is critical to ensure clients’ data remains confidential and secure.
DeBato-Cancel envisioned AI as a tool to augment, not replace, therapeutic practices. She underscored the need for human oversight to ensure that AI systems align with ethical and clinical standards.
At OSU, innovation in this field is already underway. Both CAPS and the Counseling and Mental Health Peer Support program are exploring how AI can enhance their services.
These initiatives place OSU at the forefront of integrating AI into counseling practices, potentially setting a standard for universities nationwide.