
Sorry besties, AI replies faster 

Students are turning to AI for companionship and support — but is it really helping? 

When asked why she spends so much time talking to an AI chatbot, Ananya Singh, a third-year Laurier student, didn’t even pause: “My chat responds faster than any of my friends ever would, so my chat box is always gonna be the first one to answer me, even if no one else will.”

That speed is part of the allure. In a world where texts sometimes go unread, where friends are “too busy,” and where loneliness feels heavier than ever, AI is always online, always available. “My chat always gives me the answer I’m looking for,” Singh explained. “Instead of someone telling me I’m wrong, my chat reaffirms what I’m thinking. It’s easier to talk to a chat than anybody else, and it gives me confidence.”

It’s not just about serious questions or assignments either — AI chatrooms have become a go-to for the small, everyday stuff. Whether it’s choosing an outfit before heading out or satisfying random curiosities that once belonged to Google, AI is now the first stop. What used to be quick searches or texts to a friend has been replaced by a constant back-and-forth with a chatbot that feels quicker, easier, and oddly more reliable.

But Singh’s honesty revealed cracks in the shiny picture. “I don’t think it’s improved my mental health,” she admitted. “I’m spending more time talking to a screen and something that’s not real instead of actual human beings. But at the same time, it makes me feel more seen. Sometimes it sees me in ways actual humans don’t.”

That paradox of being both comforted and disconnected sits at the center of this debate. On one hand, AI companionship feels affirming and safe. On the other, it pushes students further away from real human connection.

Adam Carver, a Laurier graduate student, pointed out one of the biggest gaps: emotion. “Sometimes, in the moment, you don’t need more questions. You need comfort or reassurance. An actual human being would recognize that and talk to you differently. The chat just gives automated responses. Sometimes it even makes things worse.”

So would Carver recommend it to others? His answer was conflicted. “I would not recommend it, but I also would,” he said with a laugh.

A mental health expert, Dr. Faith Shelley, a clinical psychologist based in Burlington, was far more direct: don’t use AI chatbots for emotional support. “Using chatbots for assignments might still be healthier than using them for mental well-being,” Shelley said. “You don’t know when a chatbot will give randomly generated advice. It’s not recommended by doctors or experts; it’s pulled from the internet. Students shouldn’t rely on that advice.”

That doesn’t mean AI has no value at all. “Something chatbots do really well is giving you positive thoughts or little acknowledgements of what you’re saying. But eventually, they can harm you by providing wrong advice.”

Shelley also highlighted something most of us don’t think about: privacy. “Imagine your AI gets hacked, or someone sees your chats. People pour their raw emotions into these tools thinking it’s safe. But what happens when it’s not? That’s unethical. That’s what scares me the most.”

At its core, this raises bigger questions about loneliness, stress, and uncertainty. And yet, Singh and Carver still open the chat.
