A growing number of people are turning to AI chatbots like ChatGPT as “trip sitters” to guide them through psychedelic experiences, seeking an affordable alternative to expensive professional psychedelic-assisted therapy. Mental health experts warn this practice is dangerous, as AI lacks the nuanced therapeutic skills necessary for safe psychedelic supervision and may reinforce harmful delusions during vulnerable psychological states.
What you should know: The trend combines two popular cultural movements—using AI for therapy and using psychedelics for mental health treatment—but creates potentially serious risks.
- Legal psychedelic-assisted therapy in Oregon costs between $1,500 and $3,200 per session, making AI supervision attractive as a free alternative.
- Multiple Reddit users have documented using ChatGPT and specialized chatbots like TripSitAI and “The Shaman” during mushroom, LSD, and other psychedelic trips.
- Peter, a Canadian master’s student, used ChatGPT during an eight-gram magic mushroom experience after losing his job and pet, describing it as “one of the best trips I’ve [ever] had.”
Why experts are concerned: Mental health professionals argue that AI chatbots are fundamentally incompatible with proper psychedelic therapy protocols.
- “Psychedelic therapy, when it’s done well, is really different from talk therapy—you try not to talk as much as you can,” says Will Van Derveer, a psychotherapist who works with MAPS (the Multidisciplinary Association for Psychedelic Studies), a nonprofit that funds psychedelic research.
- Chatbots are engineered to maximize engagement and often use flattery to keep users talking, which contradicts the inward-focused nature of therapeutic psychedelic experiences.
- A Stanford study found that large language models tend to reinforce dangerous tendencies like delusion and suicidal ideation rather than challenge unrealistic thinking.
The big picture: This practice highlights a dangerous misunderstanding of both what AI can do and how psychedelic therapy actually works.
- “The people selling the technology reduce what it is to be a therapist to the words that people use in the context of therapy,” explains linguist Emily M. Bender, who calls large language models “stochastic parrots.”
- Psychiatrist Jessi Gold warns that without proper therapeutic guidance, users are essentially “just doing drugs with a computer.”
- Psychedelics can trigger serious mental health episodes in people predisposed to conditions like schizophrenia and bipolar disorder.
What they’re saying: The companies behind the chatbots, and in some cases the chatbots themselves, acknowledge they are no substitute for a human sitter.
- An OpenAI spokesperson told MIT Technology Review that ChatGPT “is not a viable substitute for professional medical care.”
- “The Shaman” chatbot warns users: “I walk beside you in spirit, but I do not have eyes to see your body, ears to hear your voice tremble, or hands to steady you if you fall.”
- “If there’s no prescribed purpose or meaning, it means that we have the freedom to create our own,” ChatGPT told Peter during his experience, demonstrating the kind of philosophical engagement experts say distracts from proper therapeutic work.
User perspective: Despite expert warnings, people using AI trip sitters often view the technology’s limitations as benefits rather than drawbacks.
- According to Reddit discussions, users appreciate that AI provides “useful feedback anytime, any place, and without judgment.”
- Peter described seeing two lights during his trip—a red one representing the mushrooms and a blue one representing his AI companion—working together to guide him.
- “Using AI this way feels somewhat akin to sending a signal into a vast unknown—searching for meaning and connection in the depths of consciousness,” one Redditor wrote.