By Kenya Godette

Students are turning to AI chatbots to help them manage the stressors of transitioning to college life, according to Joy Himmel, Psy.D., director of Counseling Services at Old Dominion University. While this can offer immediate relief and reduce feelings of isolation, Dr. Himmel cautions that they are no substitute for professional therapy.

Close to 50% of students at Old Dominion University reported loneliness as a key concern in the University's 2023 National College Health Assessment.

Dr. Himmel notes that first-year students are more likely to explore these digital tools. "The transition to college often brings uncertainty, and students accustomed to on-demand technology are more likely to seek out solutions through apps and chatbots. They want mental health support in real time. It's brief, solution-based care they're after," she said.

Still, she cautions that overreliance on AI can be risky. "AI can't account for non-verbal cues or ask clarifying questions. If students rely exclusively on these platforms, it may delay them from seeking the professional care they truly need," she said.

One solution the Office of Counseling Services adopted in spring 2025 is TalkCampus, an online global mental health community where students can access live chat groups, journaling and wellness resources 24/7. While it is not AI-based, the tool reflects the Office of Counseling Services' broader effort to provide accessible mental health options that feel more familiar and immediate.

David Spiegel, MD, professor and chairman of Psychiatry and Behavioral Sciences at Macon & Joan Brock Virginia Health Sciences Eastern Virginia Medical School at Old Dominion University, echoes Dr. Himmel's concerns about overreliance on AI for mental health issues.

While he acknowledges the promise of AI in delivering emotional support, cognitive behavioral therapy techniques and psychoeducation, he stresses its limitations. "AI language models may struggle with the nuanced, context-specific nature of psychiatric conditions," he said.

Dr. Spiegel worries about AI's ability to detect red flags like suicidal ideation. "A clinician would ask more questions, probe deeper," he said. A chatbot might take a statement like "I'm feeling depressed" at face value, without recognizing potentially dangerous intentions behind a follow-up request.

The immediacy of AI tools also introduces a new challenge for therapists: managing expectations. "Students get fast, empathetic responses from chatbots," Dr. Spiegel said. "That can set a high bar for human clinicians, who may not always be instantly available."

Both Dr. Himmel and Dr. Spiegel agree that AI could play a more integrated role in the future of behavioral healthcare, but for now, they urge caution.

"AI is not a substitute for professional judgment," Dr. Spiegel said. "It might be a starting point for non-complex concerns, but ultimately, a referral to a trained clinician is essential."

Dr. Himmel added that counseling centers must strike a careful balance between embracing technology and encouraging real-world connection. "We have to adapt. It's about meeting students where they are, both digitally and emotionally," she said.