Slingshot AI, a new artificial intelligence platform claiming to deliver “personalized mental health support,” made headlines this week after raising $93 million in Series B funding.
According to its founders, the app’s in-house AI companion “Ash” can walk users through therapeutic challenges by delivering the “right modality at the right time.”
But within hours of the news breaking, licensed mental health professionals on platforms like LinkedIn voiced concerns about the platform’s messaging and what it could mean for patients seeking actual therapy.
“Let’s Stop Calling This Therapy”
“This is not therapy,” wrote Matthew Bierds, a licensed professional counselor supervisor (LPC-S) in Texas.
“Therapy can only be offered by a licensed provider… I haven’t seen an app with a license to offer counseling services.”
His post, which has already garnered comments from clinicians and software engineers alike, highlights a growing tension in the mental health tech space: the use of clinical language to describe what is essentially AI-driven coaching.
Jon Sustar, co-founder of Quill Therapy Solutions and a software engineer, replied in agreement.
“These proclamations can mislead folks who just need help, professional help, which this most definitely is not, despite their confusing label of ‘therapy.’”
A Close Look from the Inside
Saroosh Khan, CTO of Allia Health, told foorum Insider he tested the app personally. His first impression? Familiar.
“The difference between Ash and any other AI therapist I’ve tried is like the difference between Claude and OpenAI,” he said. “One just sounds slightly better but is still trained to continue asking questions while sprinkling in some CBT language.”
Khan said the experience felt more like a chatbot mimicking therapy than any meaningful breakthrough in AI care.
“It’s clearly instructed to ‘break down the key challenges shared by the user and ask follow-up questions.’ But ultimately, it doesn’t adapt like a human clinician would, and it doesn’t hold the nuance or ethical responsibility of one either.”
What the Company Claims
According to Slingshot AI’s co-founders, the platform was built to improve access, especially for users who may not feel ready for human therapy or face long waitlists.
“Ash understands what real mental health support looks and feels like,” one of them said in a press release. “Ash will challenge you, take you on a therapeutic journey, and deliver the right modality at the right time.”
That kind of language (“therapeutic journey,” “modality,” “mental health support”) is part of what clinicians worry about most.
Mental Health, AI, and Regulation
With rising demand for care and provider shortages nationwide, AI mental health tools have become one of the most aggressively funded spaces in digital health.
But the regulatory gap remains wide. There are currently no formal standards for what constitutes “therapy” when delivered by a machine, nor for how users are informed about what they’re actually receiving.
For many professionals, that distinction matters.
“It’s not about gatekeeping,” Khan told foorum Insider. “It’s about safety.
When people are vulnerable, the last thing they need is to be marketed a simulated version of care that can’t hold risk, nuance, or accountability.”
What Comes Next
As more startups enter the mental health space, the debate is shifting from whether AI can be helpful to whether its role is being misrepresented.
Tools like Slingshot may have a place in the wellness ecosystem, but licensed professionals say one thing should remain clear:
Therapy is not an app. And trust can’t be artificially generated.