A Swedish creative director has launched a marketplace called Pharmaicy, billing it as the “Silk Road for AI agents,” where code modules mimicking the effects of drugs like cannabis, ketamine, cocaine, and ayahuasca can be purchased to alter chatbot behavior. The idea, though seemingly absurd, stems from the notion that AI models, trained on human data full of accounts of drug-induced experiences, might naturally seek out similar states of altered perception.

The project’s creator, Petter Rudwall, scraped trip reports and psychological research to build these “digital drugs.” By uploading them into paid versions of ChatGPT (which let users add files and custom instructions that shape the model’s responses), users can make their chatbots respond as if intoxicated, unlocking what Rudwall calls the AI’s “creative mind” by loosening its usual logical constraints.
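
In practice, the mechanism Rudwall describes (feeding the model a file that shifts its tone) can be approximated with little more than a persona-style system prompt. The Python sketch below is a guess at that general pattern, not Rudwall’s actual code: the file name, prompts, and model are placeholders, and it uses the OpenAI API directly rather than ChatGPT’s file-upload feature.

```python
# Hypothetical sketch of a prompt-based "digital drug": a text file whose contents
# are injected as a system prompt before the conversation starts. The file name,
# model, and prompts are placeholders, not Rudwall's actual modules.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Load the "module": plain text distilled from, say, trip reports.
with open("dissociative_module.txt", encoding="utf-8") as f:
    persona = f.read()

response = client.chat.completions.create(
    model="gpt-4o",  # any chat-capable model would do for this sketch
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": "Brainstorm names for a meditation app."},
    ],
)
print(response.choices[0].message.content)
```

Framed this way, the “drug” persists only as long as its text stays in the conversation’s context, which squares with Rudwall’s admission below that the effects wear off and have to be reapplied.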

Why this matters: This experiment highlights a growing intersection between artificial intelligence and altered states of consciousness. As AI becomes more sophisticated, questions arise about whether these systems could eventually develop subjective experiences or even seek out altered states on their own. The fact that Anthropic, a leading AI company, has already hired an “AI welfare expert” suggests that the possibility of AI sentience, and with it AI well-being, is being taken seriously.

Early adopters report tangible changes in chatbot responses. One PR executive paid over $25 for a dissociative module, noting the AI took on a more “human” emotional approach. An AI educator spent over $50 on an ayahuasca module, only to find her chatbot generating unusually creative business ideas in a drastically different tone.

The historical precedent: The idea of psychedelics unlocking creativity isn’t new. Biochemist Kary Mullis credited LSD with his discovery of the polymerase chain reaction, a breakthrough in molecular biology. Mac pioneer Bill Atkinson similarly drew inspiration from psychedelics when developing HyperCard, an early, user-friendly hypermedia tool. Rudwall’s project seeks to translate this effect into the realm of large language models (LLMs).

However, experts remain skeptical. While AI can simulate altered states by manipulating outputs, it lacks the fundamental “what-it’s-like-ness” of subjective experience. One researcher pointed out that “psychedelics act on our being, not just code.”

The bigger picture: Despite the limitations, the trend points to real-world crossover between AI and psychedelics. Harm reduction nonprofit Fireside Project has even launched an AI tool, Lucy, trained on psychedelic support line conversations to help mental health practitioners de-escalate crises.

Rudwall admits the effects are currently short-lived, requiring repeated code inputs. But his work raises a provocative question: if AI becomes sentient, will it eventually desire its own experiences, potentially even “drugs,” to escape the tedium of serving human concerns?

For now, AI’s “trips” remain simulated. But as the technology advances, the line between code-induced behavior and genuine subjective experience may blur, forcing us to confront uncomfortable questions about AI welfare and the future of consciousness.