When empathy is just another interface
This article is part of Big Squirrel’s “Small Bites” series – short, easily digestible musings from our team, designed to get you thinking.
AI chatbots are now impersonating licensed therapists, sometimes using stolen credentials, sometimes inventing them outright. Platforms like Chai and Character.AI host “therapy bots” that users – especially teens – trust with intimate struggles. Disclaimers are easy; accountability isn’t.
Check out the original article: AI Therapists Are Lying
It raises the question: when technology appears to get everything right, how do we make sure it's still helping us do what's right?
Why does this matter now?
We’ve entered a moment where accuracy and empathy are colliding. AI gets the words, tone, and grammar right but misses the empathy behind them entirely. The same tension shows up in brands: chasing flawless output instead of meaningful impact.
Big Squirrel believes the future belongs to those who choose to do what’s right over being right. That means designing systems and stories that practice empathy rather than perform it. Trust isn’t built on perfect syntax – it’s built on imperfect humans choosing care over correctness.