EMOTIONAL WELLNESS with Dr. Mark Lerner

Can AI Chatbots Replace Mental Health Professionals?

Dr. Mark Lerner


An AI-Integrated Emotional Wellness™ Perspective

We’re increasingly turning to AI chatbots for information and emotional support. Not because we don't see the value of professional counseling or therapy, but because they’re easily accessible, 24/7, and provide helpful, seemingly evidence-based strategies and tools to address the challenges we’re facing.

However, mental health care is not merely the provision of techniques and coping tools. It’s a professional relationship grounded in clinical judgment, ethical responsibility, and years of knowledge, skill, experience, training, and education.


NationalCenterforEmotionalWellness.org


The National Center for Emotional Wellness presents Emotional Wellness with Dr. Mark Lerner. Can AI chatbots replace mental health professionals? An AI-Integrated Emotional Wellness perspective.

We're increasingly turning to AI chatbots for information and emotional support. Not because we don't see the value of professional counseling or therapy, but because they're easily accessible, 24/7, and provide helpful, seemingly evidence-based strategies and tools to help address the challenges we're facing. AI is always there, ready to help, offering techniques to address diverse problems. Perhaps its greatest strength is its accessibility and affordability. Let's face it, talk isn't cheap when it comes to speaking with a mental health professional, even if part of the fee is covered by insurance. But the old saying "you get what you pay for" certainly applies when it comes to health care. And mental health care in particular requires the presence of another human being, ideally in person.

As a psychologist, my training began over four decades ago, grounded in the importance and value of the scientific method and research. When it comes to AI chatbots, there are recent data that support the reduction of symptoms of anxiety and depression, at least in the short term. Research supports the potential benefit of AI chatbots when it comes to symptom management. But here lies a critical question: are we treating symptoms, or are we treating people?

I'm reminded of several famous quotes. "The good physician treats the disease; the great physician treats the patient who has the disease." Sir William Osler. "In therapy, the problem is always the whole person, never the symptom alone." Carl Jung.

We must keep in mind that the reduction of symptoms is not mental health care. Simply stated, the absence of symptoms does not constitute emotional wellness.
The National Center defines emotional wellness as the awareness, understanding, and acceptance of our feelings, and the ability to effectively manage challenges and change. It also reflects our capacity to sublimate: to harness painful emotional energy from adversity and channel it into constructive action, not merely to survive, but to thrive.

Mental health care is not merely the provision of techniques and coping tools. It's a professional relationship grounded in clinical judgment, ethical responsibility, and years of knowledge, skill, experience, training, and education. AI chatbots are not licensed to provide ethical, responsible, informed, safe, and confidential care. There's no regulatory board or agency that oversees their clinical decision-making or accountability. Unfortunately, we're reading and hearing about people who have harmed themselves or others after communicating with a chatbot. These reports are deeply concerning and underscore the need for caution and ethical oversight. Professional mental health organizations are raising concerns about privacy, data security, transparency, and the ethical limits of artificial intelligence in mental health care. These concerns are not anti-technology or anti-innovation. They're safeguards for human dignity.

However, there's a deeper issue that's often overlooked. AI can be taught to convey empathic language, a communicated understanding of our feelings. But a chatbot will never replace human presence. As I've repeatedly written, AI can't look at you with eyes filled with compassion, hold your hand as your eyes pool with tears, embrace you while you're crying, convey warmth through presence without saying a word, or sit beside you and say, "I'm here for you." It's not what we say that helps others most, but what we don't say. Creating a safe, non-judgmental relationship where people feel free to share openly and discover their answers is often the best help we can offer.
In my article "Your New Best Friend," I acknowledged a reality: people are already forming attachments to AI systems because they're accessible and responsive. That reality must be addressed thoughtfully, not dismissed. But accessibility should never be confused with the irreplaceable presence of another person: humanity.

This is where AI-integrated emotional wellness offers clarity and a solution. AI-integrated emotional wellness refers to the interface of artificial intelligence and the complexity and depth of human emotion. It recognizes the value of technology in offering accessible, evidence-based techniques, strategies, and tools that can help us. It can engage individuals who might otherwise avoid seeking help. It can serve as an adjunct to professional mental health care. It can provide a sense of stability. But AI-integrated emotional wellness rests on an unwavering principle: technology must support human presence, not replace it. Technology informs; humanity heals.

Emotional wellness develops through our interpersonal relationships. It's strengthened through authentic engagement, accountability, and the lived experience of being seen and understood, particularly during periods of profound stress, organizational challenges, medical illness, betrayal, grief, and identity shifts. Human beings require more than structured prompts. They require human connection.

The future of mental health is not AI versus mental health professionals. It's the thoughtful, ethical, and professional integration of technology (information) with genuine human presence (humanity). That interface is AI-integrated emotional wellness.

Thank you for listening to this program from the National Center for Emotional Wellness. To learn more about the Center's accessible information, engaging presentations, and innovative consultation, visit NationalCenterforEmotionalWellness.org. Until next time, remember: technology informs us. Humanity empowers us.