At this point, we’ve seen enough headlines to spot the pattern.
“AI Therapy Is Here.”
“Your New Robot Therapist Has Arrived.”
“Slingshot AI Raises $93M to Redefine Mental Health Care.”
AI is not therapy, and calling it that is misleading at best and dangerous at worst.
Therapy Requires a License. Your App Does Not.
As we wrote in our coverage of Slingshot AI’s $93 million funding round, the founders described their product as a platform that “takes you on a therapeutic journey and delivers the right modality at the right time.”
It sounds promising, maybe even helpful, but it isn't therapy.
Why? Because therapy, by definition, involves a licensed provider trained in ethics, supervision, trauma-informed care, and clinical judgment.
An app, no matter how sophisticated, is not licensed. It can't ethically manage risk, it can't legally diagnose, and it certainly can't offer accountability when things go wrong.
Call It What It Is: A Tool to Support Mental Health Work
There is absolutely room for AI in mental health, but not as a replacement. Instead of branding it as "therapy," we should call it what it is:
- A support tool for therapists
- A mental health assistant
- A wellness enhancement platform
- A documentation aid or coaching supplement
These tools can help reduce therapist burnout, automate admin tasks, provide conversation prompts, or support emotional tracking between sessions.
That's useful, but it's not the same as sitting in a room with someone trained to hold your trauma, track your patterns, and know when something is clinically urgent.
Marketing Matters. So Do Boundaries.
As we said in our “Robotherapy Take One” piece:
“Words like therapy shouldn’t be tossed around to make a pitch deck sound more human.”
We understand why startups use the language. It's sticky, it sounds empathetic, and it gets funding.
Language carries weight! When users hear "therapy," they trust it the way they would trust a licensed provider. That trust, when misplaced, puts people at risk.
Especially those who are struggling, vulnerable, or uninsured: the very people these tools often claim to serve.
We're Not Anti-AI. We're Pro-Clarity.
This is not a call to shut down innovation. We want tools that improve care!
We want technology that supports therapists and expands access, but we also want language that respects the difference between care and code.
It's time to draw a line between human-delivered therapy and AI-driven support, and it starts with the labels we use.
The Future of Mental Health Depends on the Words We Choose
If we keep calling every wellness app with an algorithm "therapy," we devalue the depth of real clinical work, and we mislead the people most in need of real help.
Therapists don’t fear AI. But they do fear what happens when empathy is replaced with branding.
It’s high time we respect the line between support and care.
And stop calling it therapy.