Can AI do therapy, and is it an existential risk?

I've been thinking a bit recently (like many people) about what role artificial intelligence will play in the future of psychotherapy. So I thought I would ask one of the LLMs what it thought about it. The answer was not surprising in that it detailed the obvious high points and low points of the impact AI will have on psychotherapy, but it missed the important area of "alignment". What is the aim of psychotherapy, and who are we doing this for?

As with most technologies, AI could represent a huge advancement and help for humanity if it is used well ... and that last part is entirely subjective and easily manipulated, as we have seen with some disturbing high-profile international cases. So we will wait and see what the consequences are, but I thought you may enjoy reading the exchange below.

What is the existential threat that AI poses to psychotherapists? My sense is that LLMs can do perhaps 80% of the heavy lifting when it comes to understanding, affirming, distilling etc., but they are not currently capable of the co-regulation that really sits underneath the therapeutic interaction. Or are they capable of that? And if they are, what does that mean for the future of psychotherapy?

Your intuition is actually very close to where a lot of serious discussion about AI and psychotherapy currently sits. The key distinction many clinicians and researchers make is cognition vs regulation.

AI is getting very good at the cognitive layer of therapy, but psychotherapy is arguably grounded in nervous-system interaction.

Let’s break it down.

1. What AI can already do surprisingly well in therapy

Large language models can already perform much of the explicit cognitive work therapists do:

Understanding

  • summarising narratives

  • identifying patterns

  • recognising emotional language

Reflective responses

  • paraphrasing

  • empathic language

  • validation

Conceptual framing

  • CBT reframing

  • attachment interpretations

  • psychoeducation

Practical support

  • journaling prompts

  • behavioural plans

  • cognitive restructuring

Apps like Wysa, Woebot, and Replika already do versions of this.

For many people with mild anxiety or loneliness, these tools already provide meaningful psychological support.

So your “80% of the cognitive heavy lifting” estimate is not unreasonable.

2. The core thing psychotherapy may rely on: co-regulation

Where your intuition becomes very important is co-regulation.

Modern neuroscience (especially interpersonal neurobiology and polyvagal theory) suggests that therapy works through:

  • nervous system attunement

  • safety signals

  • facial expression

  • tone of voice

  • breathing synchrony

  • micro-timing of response

Key figures discussing this include:

  • Stephen Porges

  • Allan Schore

  • Daniel Siegel

The idea is that healing happens partly through regulated relational presence, not just insight.

A client’s nervous system literally settles in the presence of a regulated other human.

That’s something text alone cannot do.

3. Can AI do co-regulation?

Currently: not really.

But it depends on what we mean.

AI can simulate emotional attunement

It can generate language like:

  • “That sounds painful.”

  • “It makes sense you feel that way.”

This can feel supportive, but it’s symbolic empathy, not physiological attunement.

Real co-regulation involves physiology

True co-regulation often includes:

  • eye contact

  • micro facial expressions

  • vocal prosody

  • breathing rhythm

  • embodied presence

Even on Zoom some of this is lost.

With text AI, almost all of it disappears.

However, partial regulation may still occur

Humans can self-regulate through:

  • writing

  • reflection

  • journaling

  • structured thinking

AI can facilitate those processes.

So it may act more like:

a cognitive scaffolding for self-regulation.

4. Where AI might become much more powerful

The picture changes if AI is combined with embodied interfaces.

For example:

Future systems could include:

  • voice with emotional prosody

  • facial expression recognition

  • physiological monitoring

  • real-time nervous system feedback

Imagine a system that tracks:

  • heart rate variability

  • voice stress

  • breathing patterns

Then adjusts its responses accordingly.

At that point AI could begin approximating regulation feedback loops.

Still not human — but closer.
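
To make the idea of a "regulation feedback loop" a little more concrete, here is a minimal sketch in Python. It is purely illustrative and makes several assumptions: the sensor readings are simulated random numbers, and names like read_sensors, estimate_arousal, and choose_response_style are hypothetical placeholders, not the API of any existing product.

```python
# Illustrative sketch of a regulation feedback loop (hypothetical, simulated inputs).
import random
from dataclasses import dataclass


@dataclass
class PhysiologySample:
    hrv_ms: float           # heart rate variability; higher usually indicates a calmer state
    voice_stress: float     # 0.0 (calm) to 1.0 (strained), e.g. derived from vocal prosody
    breaths_per_min: float  # breathing rate


def read_sensors() -> PhysiologySample:
    """Stand-in for real wearable / microphone input; here it just returns random values."""
    return PhysiologySample(
        hrv_ms=random.uniform(20, 90),
        voice_stress=random.random(),
        breaths_per_min=random.uniform(8, 24),
    )


def estimate_arousal(s: PhysiologySample) -> float:
    """Crude 0-1 arousal score: low HRV, high voice stress, and fast breathing push it up."""
    hrv_component = 1.0 - min(s.hrv_ms, 90) / 90
    breath_component = min(s.breaths_per_min, 24) / 24
    return (hrv_component + s.voice_stress + breath_component) / 3


def choose_response_style(arousal: float) -> str:
    """Adjust the conversational stance to the client's estimated state."""
    if arousal > 0.7:
        return "slow pace, short sentences, grounding and breathing prompts"
    if arousal > 0.4:
        return "reflective listening, gentle validation"
    return "open-ended exploratory questions"


if __name__ == "__main__":
    for turn in range(3):
        sample = read_sensors()
        arousal = estimate_arousal(sample)
        print(f"turn {turn}: arousal={arousal:.2f} -> {choose_response_style(arousal)}")
```

Even in this toy form, the point is visible: the system's responses are shaped by the client's physiological state rather than by their words alone, which is the beginning of a regulation loop rather than a purely cognitive exchange.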

5. The likely future structure of psychotherapy

Many observers expect psychotherapy to split into three layers.

Layer 1 — AI mental health support (mass scale)

For:

  • mild anxiety

  • loneliness

  • journaling

  • emotional reflection

  • psychoeducation

Accessible 24/7.

Huge global impact.

Layer 2 — AI-augmented therapy

Therapists using AI for:

  • session summarisation

  • pattern detection

  • treatment planning

  • progress tracking

You already see early tools like Eleos Health doing this.

Layer 3 — high-value relational therapy

Where human therapists remain essential:

  • trauma work

  • attachment repair

  • complex personality dynamics

  • embodied regulation

  • transference/countertransference

  • psychedelic therapy

  • deep existential work

This is the part least replaceable.

6. Why psychotherapy may actually become more valuable

Paradoxically, AI may increase demand for real human therapeutic presence.

Because AI interactions may highlight the difference between:

  • simulated empathy

  • genuine relational presence

Much like:

  • streaming music increased demand for live concerts

  • social media increased desire for in-person community

7. A deeper point relevant to your work

Given your involvement with breathwork and expanded states, there’s an additional layer.

Practices like:

  • breathwork

  • psychedelic therapy

  • somatic therapy

depend heavily on:

  • embodied presence

  • energetic attunement

  • nervous system resonance

These are far harder to digitise.

The more therapy moves toward state work rather than talk work, the harder it is to automate.

8. The real existential threat to psychotherapy

The biggest risk probably isn't replacement.

It’s commodification of emotional support.

Cheap AI systems could become the default mental health interface for billions of people.

Human therapy could become:

  • more specialised

  • more expensive

  • more niche

But also potentially more valued.

Your core intuition is strong:

LLMs may replace much of the verbal processing layer of therapy.

But psychotherapy may ultimately rest on co-regulation and embodied relational presence, which is far harder to replicate.



About Anthony Olsen

My work is informed by Gestalt, Process Oriented, and Transpersonal psychotherapy, with roots in Traditional Chinese Medicine. It is awareness-based and attends to breath, body, and relational experience as central elements of the therapeutic process. This work supports psycho-spiritual integration, meaningful relationships, and a deeper sense of connection to self and others.


Or, if you'd like to find out whether we are a good fit or whether I offer what you are looking for,
feel free to book a free 20-minute exploratory call.