AI is guiding psychedelic trips – experts say it’s risky

AI tools like ChatGPT are becoming ubiquitous in everyday life. Whether making work emails sound more professional or offering advice on everything from dog training to cooking, generative chatbots are everywhere. It turns out people are even turning to them while on psychedelics, and that may not be a good thing.
Psychedelics like psilocybin (aka magic mushrooms), LSD, and DMT are becoming more popular. A growing body of research points to these compounds as potential treatments for anxiety, depression, PTSD, and other mental health conditions. And with fewer Americans drinking alcohol, many folks simply want an alternative intoxicant for their Saturday night shenanigans.
Meanwhile, ChatGPT is playing therapist for an increasing number of people. It makes sense, given that chatbots are free, convenient, and agreeable. But can (or should) you turn to GPT when you’re in the midst of an intense trip?
Tales of AI “trip sitting” (aka acting as a psychedelic guide) have been making the rounds. One Redditor shared how ChatGPT helped him through a particularly intense trip, with the AI offering tips like deep breathing and embracing the visuals. Another explained the different ways the platform acts as a “trip buddy,” helping “articulate and explore the visions and insights that emerge.”
Some forward-thinking techies have leaned into the trend, developing tools meant specifically to supplement psychedelic journeys. There’s even a dedicated app called TripSitAI that was designed to act as a “compassionate digital companion.”
Legitimate psychedelic therapy can cost hundreds or even thousands of dollars per session. And unless you’re in a state with regulated psilocybin (like Oregon or Colorado), finding a professional trip sitter can be next to impossible. But while some psychonauts sing the praises of chatbots, they may not be the panacea those users describe.
Experts skeptical about AI trip sitters
While AI may seem like a convenient and accessible way to work through a trip, many experts caution against it. Psychotherapist Will Van Deveer of the Multidisciplinary Association for Psychedelic Studies (MAPS) told MIT Technology Review that psychedelic therapy centers on quiet, inward reflection rather than conversation with a chatbot.
“Psychedelic therapy, when it’s done well, is really different from talk therapy—you try not to talk as much as you can,” he said. “Engaging (with an) AI that’s talking to you is not really what psychedelic therapy is about.”
MIT Technology Review also noted that chatbots are often designed to parrot users and validate their beliefs, which can be dangerous if someone is in a mental health crisis. The outlet cited a study showing that large language models (LLMs) can reinforce things like suicidal ideation, creating negative feedback loops that psychedelic drugs could exacerbate.
Trained psychedelic facilitators also have skills and intuition that AI may not be able to mimic, including tools for harm reduction, supportive touch, and integration. Licensed facilitators often have healthcare professionals on hand and can call on them if an immediate need arises.
It’s also important to remember that psychedelic drugs are illegal at the federal level, and your chatbot convos are not confidential. OpenAI CEO Sam Altman recently admitted that ChatGPT exchanges could be subpoenaed and used as evidence in court.
Bottom line: AI may be a cool way to work through a trip, but it’s no substitute for a professional. For both safety and legal purposes, tread lightly.