
The Digital Auxiliary: Reclaiming AI for the Psychodramatic Stage

Özge Kantaş, Ph.D.

Can AI replace therapists? That question looms large these days. I believe psychodrama and group psychotherapy principles already hold an innate answer: yes and no!

In a recent workshop, I invited a group of psychodramatists and leaders to play a game: “If Artificial Intelligence sat in the Empty Chair right now, what would it look like? And what would you ask it?”

The responses were telling. Participants didn’t visualize a server farm or a line of code. They visualized characters. One chose a “throne” for the AI; another, a “speaker’s pillow”; yet another, a “marble column.” They asked questions ranging from the pragmatic to the profound: “Are you well-intentioned?” one participant whispered to the empty chair.

After a brief lecture on the mechanics of Large Language Models (LLMs) and a role reversal exercise, the atmosphere shifted. The fear of the unknown dissolved into the familiarity of the method. As one participant reflected, “I thought it had to be pale and scary, but now I know I can ‘color’ it and play with it.” Another noted, “I expected a cold stone, but found solid ground.”

This transformation highlights a crucial truth: AI Chatbots are, by their very nature, a manifestation of what J.L. Moreno called Surplus Reality—a mode of experience that extends beyond the boundaries of tangible reality to include the “intangible dimensions of human life” (Moreno, 1965). They are spaces where the invisible becomes visible through text. And who is better equipped to navigate surplus reality than a psychodramatist? Let’s look at the case of Sarah to illustrate this transition.

Consider “Sarah,” who was paralyzed by a looming confrontation with a critical supervisor. Instead of forcing a scene, we used an AI chatbot (ChatGPT; OpenAI, 2024) as a “Digital Warm-up,” tasking it to adopt her supervisor’s persona. When the AI’s initial output felt too aggressive, Sarah corrected it: “No, make him quieter, more passive-aggressive,” engaging in active role training until the text mirrored her reality. Crucially, once the role was established on screen, I intervened: “Good. Now put the phone down, stand up, and BE him.” In this instance, the technology was a catalyst; it allowed Sarah to bypass her freeze response and step onto the psychodramatic stage, demonstrating that while AI provided the content, the therapeutic container remained firmly human.

Integrating AI into our practice is not about abandoning our roots; it can be about extending them. Here is how we can do so, guided by four criteria: Framework, Co-Creation, Ethics, and Community.

1. The Framework: AI as the Ultimate Auxiliary

We often mistake AI for a “knowledge retrieval tool.” It is not. It is a sociodynamic container. When we engage with an LLM, we are not searching a database; we are entering a relational field.

However, as we invite this technology onto our stage, we must be unequivocal about its role. As emphasized by the American Psychological Association (Abrams, 2025) and recent research, AI cannot and should not substitute for psychological care (Spytska, 2025).

For a psychodramatist, the chatbot is simply a digital Auxiliary Ego. It has no ego of its own, no judgment, and infinite patience. It waits for the Director’s cue. When a user inputs a chaotic inner monologue and the AI reflects it back with organized clarity, we are witnessing a high-fidelity form of mirroring. It offers a “solid ground”—not to replace the therapist, but to hold the protagonist’s process in the moments between sessions. While chatbots offer accessibility, they lack the emotional depth, adaptability, and therapeutic alliance of a human clinician (Abrams, 2025).

2. Co-Creation: Psychodramatists Are Natural “Prompt Engineers”

There is a myth that you need technical coding skills to use AI effectively. The truth is, you need Role Training skills.

In my view, psychodramatists are natural Large Language Modelists. We have spent our careers saying things like: “Pretend you are my father,” “Speak with a tone of regret,” or “Help me say what I couldn’t say then.”

This is exactly what “prompting” is. It is setting the scene. It is warming up the auxiliary. When we treat the AI not as a search engine but as a role-player, we move from passive consumption to active co-creation.

Recent studies suggest that effective human-AI interaction relies on such a co-creative loop to ensure that individuals receive encompassing supportive therapy and feel less isolated (Mattová & Lazore, 2024). When Sarah corrected the AI, she was not just editing text; she was warming up. This empowers the client, shifting them from a passive recipient of technology to the active Director of their experience. We teach our clients that the quality of the output depends on the clarity of their direction.
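For readers curious about what “directing” a chatbot looks like in concrete terms, the role-training loop described above can be sketched in a few lines of code. This is a hypothetical illustration only: the persona text, the correction string, and the helper function `build_role_messages` are invented for this sketch, not drawn from the workshop or from any particular product. It simply shows that a “prompt” is a scene-setting instruction followed by directorial corrections, in the message format most chat-style LLM interfaces use.

```python
# A hypothetical sketch of "role training as prompting": the director sets a
# scene for the digital auxiliary, then refines the role with corrections,
# exactly as one would coach a human auxiliary ego on the stage.
# All names and strings here are illustrative assumptions.

def build_role_messages(persona: str, corrections: list[str]) -> list[dict]:
    """Assemble the chat messages that warm up a digital auxiliary."""
    # The system message sets the scene: "Pretend you are..."
    messages = [{
        "role": "system",
        "content": f"You are playing a role in a psychodrama warm-up. {persona}",
    }]
    # Each correction is a directorial cue that sharpens the role,
    # mirroring Sarah's "No, make him quieter, more passive-aggressive."
    for cue in corrections:
        messages.append({"role": "user", "content": f"Adjust the role: {cue}"})
    return messages

msgs = build_role_messages(
    persona="Pretend you are my supervisor. Speak with a critical tone.",
    corrections=["Make him quieter, more passive-aggressive."],
)
print(len(msgs))  # one scene-setting message plus one correction
```

The point of the sketch is not the code itself but the shape of the interaction: scene, role, correction. That loop is role training, whether the auxiliary is a group member or a language model.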

3. Ethical Guardrails: The User Holds the Pen

The most common fear, “Is it well-intentioned?”, brings us to ethics. In a physical group, the Director protects the protagonist. In the digital space, we must build safety into the protocol.

We must teach users that AI is a tool for exploration, not diagnosis. The “Digital Double” does not define you; it offers a hypothesis for you to accept, reject, or modify, as in any doubling on the stage.

The core ethical instruction I give is simple: The magic is not in the prompt, but in the intuition holding it. If the AI “hallucinates” or misunderstands, the user corrects it. This act of correction is therapeutic in itself; it is an assertion of true self. The user learns that they are the ultimate authority of their own experience, capable of “coloring” the interaction as they see fit. Therefore, in my practice, I position AI not as a Therapeutic Agent, but as a Digital Auxiliary Object—a tool that warms up the protagonist for the real work.

4. Community: A Bridge, Not a Silo

Finally, there is the fear of isolation. Will talking to bots make us lonely? My observation is the opposite, provided the technology is used and taught effectively. AI can serve as a “flight simulator” for human connection: a low-risk environment for rehearsing difficult conversations, clarifying needs, or regulating the nervous system.

When a person feels heard and organized internally (even by a digital auxiliary), they are more ready to engage with their community, not less. They return to their real-life groups with lower reactivity and higher clarity. The AI does not replace the community; it prepares the individual to rejoin it with a fuller presence.

Conclusion

As Moreno taught us, we must not fear the future but act upon it with spontaneity. We have the methodology. We understand how to set a scene, how to train a role, and how to hold space. What we need is not more technology, but more psychology for the betterment of human-computer interaction (McKinlay et al., 2025).

If we leave AI only to the engineers, it will remain a tool of efficiency. But if we, the psychodramatists, step onto this digital stage, we can shape it into a tool of connection. It is time to embrace our role as the directors of this new surplus reality:

We are not the victim of the algorithm; we are the Director of the stage!

References

Abrams, Z. (2025, March 12). Using generic AI chatbots for mental health support: A dangerous trend. APA Services. https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists

Mattová, V., & Lazore, C. (2024). Innovative supporting approaches: Integrating bibliotherapy, psychodrama and AI as a therapeutic conversational tool. Information Society.

McKinlay, T., Puntoni, S., & Saka, S. (2025, May 12). Fixing chatbots requires psychology, not technology. Harvard Business Review. https://hbr.org/2025/05/fixing-chatbots-requires-psychology-not-technology

Moreno, J. L. (1953). Who Shall Survive? Foundations of Sociometry, Group Psychotherapy, and Sociodrama. Beacon House.

Moreno, J. L. (1965). Psychodrama Volume I. Beacon House.

OpenAI. (2024). ChatGPT (GPT-5) [Large language model]. https://chat.openai.com

Spytska, L. (2025). The use of artificial intelligence in psychotherapy: Development of intelligent therapeutic systems. BMC Psychology, 13(1), 175. https://doi.org/10.1186/s40359-025-02491-9
