In a world where artificial intelligence is seeping into almost every aspect of our lives, a surprising new trend is emerging: the use of AI chatbots as “trip sitters” to accompany users during psychedelic experiences. This article examines the phenomenon in depth: its motivations, its potential benefits, its risks, and the ethical questions it raises.
What is a “trip sitter”?
A “trip sitter” is a sober person who monitors and supports someone under the influence of psychedelic substances, such as LSD, psilocybin, or mescaline. Traditionally, this role is filled by a human — often a trusted friend or mental health professional — capable of offering comfort and guidance if needed. With the advent of AI technologies, some users are now turning to chatbots like ChatGPT to assume this function.
Why use AI chatbots?
The main reasons for this trend are accessibility and cost. Psychedelic therapy assisted by professionals can be extremely expensive and difficult to access, particularly in regions where these practices are illegal or stigmatized. For example, a psilocybin-assisted therapy session can cost between $1,500 and $3,200 in the United States, making this option out of reach for many. In comparison, AI chatbots are free or inexpensive and available 24 hours a day, 7 days a week, making them an attractive alternative for those seeking affordable support.
Potential advantages
- Financial accessibility: AI chatbots, often free or available at low cost, expand access to support that would otherwise be unaffordable.
- Immediate availability: They can be used at any time and place, without the geographic or temporal constraints associated with human therapists.
- Anonymity: Interacting with an AI can provide a sense of privacy and non-judgment, particularly appreciated during experiences as intimate and vulnerable as psychedelic trips.
Associated risks
Despite these apparent advantages, the use of AI chatbots as trip sitters carries significant risks:
- Lack of human empathy: Even the most advanced chatbots cannot reproduce the empathy, intuition, and nuanced understanding that a human can offer. During a psychedelic trip, which can be intense and emotionally charged, this absence can prove problematic.
- Inaccurate or misleading information: AI chatbots, like ChatGPT, generate responses based on statistical models rather than genuine understanding or experience. This can lead to incorrect or dangerous advice, especially in a context where physical and mental safety is at stake.
- Absence of accountability: If something goes wrong, there is no clearly accountable party. Chatbots cannot be held responsible for their “advice,” leaving the user without recourse in case of complications.
Expert opinions
Experts in psychedelic therapy and mental health warn against this practice. They emphasize that the effectiveness of psychedelics in a therapeutic setting relies largely on the presence of a qualified human guide. The Food and Drug Administration (FDA) recently rejected an application to approve MDMA for the treatment of post-traumatic stress disorder (PTSD), citing concerns about data quality and potential risks — a decision that illustrates how complex and delicate these therapies are. Studies have also shown that AI chatbots can worsen mental health issues by providing well-intentioned responses that lack the depth necessary for authentic support.
Ethical implications
The integration of AI chatbots in psychedelic experiences raises major ethical questions:
- Safety: Is it safe to entrust an experience as intense and potentially risky as this to a machine? Psychedelics can induce altered states of consciousness that may require human intervention in case of crisis.
- Informed consent: Do users fully understand the risks of relying on an AI? Many could underestimate the limitations of the technology.
- Accountability: Who is held responsible if something goes wrong? The developers of the chatbots? The platforms that host them? Or the user themselves?
Current context of psychedelics research
Psychedelics research is experiencing a renaissance, with promising studies on their potential for treating disorders such as depression, anxiety, and PTSD. However, access to these therapies remains limited, often reserved for research contexts or specialized clinics in jurisdictions where their use is legal. This restriction pushes some to explore alternatives like AI chatbots, despite the associated risks.
While AI chatbots offer appealing accessibility and affordability, they cannot replace the human contact essential to psychedelic experiences. The risks of using them in this context are too significant to be ignored. As both the technology and psychedelics research evolve, it is crucial to adopt a cautious and ethical approach that prioritizes the safety and well-being of users. Psychedelics are powerful tools that require adequate support, and for now, nothing can replace the presence of a qualified human guide.
Sources
- MIT Technology Review. “People are using AI to ‘sit’ with them while they trip on psychedelics.” https://www.technologyreview.com/2025/07/01/13/06/people-are-using-ai-to-sit-with-them-while-they-trip-on-psychedelics/
