AI in Coaching: Presence, Permission, and Professional Responsibility

Artificial intelligence is no longer knocking at the door of coaching—it’s already sitting in the room.

Clients are using it between sessions to process thoughts, regulate emotions, and seek guidance. Coaches are experimenting with it for reflection prompts, session summaries, and idea generation. The question is no longer whether AI belongs in coaching, but how we engage it responsibly.

Because with all its promise, AI introduces something subtle and easily overlooked: a shift in the relational field.

AI as Tool… and as Mirror

AI can be remarkably useful. It can help clients articulate what feels vague, generate possibilities, and even support insight. But it also functions as a kind of psychological mirror—reflecting language, patterns, and meaning back to the user.

That mirror is not neutral.

Clients may project authority, wisdom, or even emotional presence onto AI tools. They may experience it as supportive, attuned, or “knowing.” From a psychospiritual or depth-oriented perspective, this isn’t surprising—it echoes the same projective processes we see in dreams, symbols, and even the coaching relationship itself.

But here’s the difference: AI is not bound by ethical codes. You are.

The Coaching Relationship Still Leads

No matter how sophisticated AI becomes, it does not replace the core of coaching: presence, attunement, discernment, and ethical responsibility.

AI can support the process, but it cannot hold the container.

Which means if AI is brought into coaching work—by you or your client—it must be done intentionally, transparently, and with care.

Using AI in Session: Start with Permission

If you are considering using AI during a coaching session—perhaps to generate reflective questions, explore reframes, co-create language, or even record for notetaking purposes—this should never be done casually or without the client’s awareness.

At minimum, this involves:

  • Explicit informed consent: The client understands what tool is being used and why
  • Clarity of role: AI is a support tool, not a decision-maker or authority
  • Collaborative use: The client is part of the process, not subjected to it

A simple check-in can go a long way:
"Would you be open to us using a tool to generate a few additional perspectives here?"

That moment preserves agency—and keeps the coaching relationship primary.

Not All AI Platforms Are Equal

Here’s where things get more serious.

If you are entering any client-related information into an AI platform, the type of platform matters.

  • Clinician- or coach-specific AI tools (designed with privacy safeguards, data protection, and professional use in mind) are the appropriate environment for anything resembling client material.
  • General-use platforms (like ChatGPT or similar tools) are not designed for confidential client work unless explicitly configured for that level of security.

Even then, caution is warranted.

The Golden Rule: De-Identify Everything

If you choose to use a general AI tool for support—brainstorming, language refinement, or conceptual exploration—never enter identifying information.

That includes:

  • Names (obvious, but worth saying)
  • Specific locations
  • Unique personal details
  • Anything that could reasonably trace back to a real individual

Think in terms of archetype and pattern, not person.

Instead of:

“My client Sarah, a 42-year-old teacher in Denver…”

Shift to:

“A mid-career professional navigating burnout and identity transition…”

You’re still working with the essence of the material—without compromising confidentiality.

The Ethical Thread Running Through It All

This isn’t about fear of technology. It’s about alignment with the foundational principles of coaching:

  • Do no harm
  • Protect client privacy
  • Maintain clear boundaries
  • Stay within scope

AI doesn’t remove these responsibilities—it amplifies them.

Used well, AI can deepen reflection, expand language, and support both coach and client in meaningful ways. Used carelessly, it can blur boundaries, erode trust, and introduce risk where none is needed.

A Simple Way to Stay Grounded

Before using AI in your coaching work, ask yourself:

  • Does this support or dilute the coaching relationship?
  • Is my client fully aware and consenting?
  • Am I protecting their identity and confidentiality?
  • Would I be comfortable explaining this choice to a licensing board or ethics committee?

If the answer to any of those feels shaky, pause.

Because at the end of the day, AI may be intelligent—but it is not accountable.

You are.

View ILCT's course on AI and coaching here.