
AI Note Takers in Mental Health: Helpful Tool or Ethical Minefield?

  • Writer: Mara B. Edmunds, LMFT
  • Jul 13
  • 6 min read

By Mara B. Edmunds, Licensed Marriage & Family Therapist

© 2025 Hope Harbor Counseling & Family Therapy, PLLC. All rights reserved.


Let’s talk about something that’s been gaining traction recently in the field of mental health: AI-assisted note taking. Sounds like a dream, right? A tool that helps therapists save time and take more accurate notes? What could go wrong?


Well, here’s the catch: it involves recording therapy sessions. With client consent, of course. But even that opens up a complicated ethical discussion we can’t afford to ignore.



Why Mental Health Therapy Notes Matter

(and What Clients Might Not Know)


Most therapy sessions last around 50 minutes. Those last 10 minutes of the hour? They’re reserved for the therapist to write a session note, a clinical summary that includes treatment goals and how the client is progressing. These notes help us do our jobs well as therapists. They’re part of what sets therapy apart from talking to your best friend or your grandmother.


But here’s the uncomfortable question we need to ask:


When did the convenience of clinical documentation become more important than the psychological safety of our clients?



Trust Is Always Earned, Not Given


Therapy is one of the most emotionally vulnerable spaces a person can enter. Unlike other medical fields, we’re not just treating symptoms. We're often helping people process trauma, shame, grief, and deeply personal struggles. That’s why our ethical code around confidentiality is so rigorous.


In fact, many of us hear variations of this phrase weekly: 

“I’ve never told anyone this before, but…”

So it’s jarring to think that we might respond with: 

"That’s brave! By the way, this is being recorded. But don’t worry, it’ll be deleted later.”


No matter how carefully we phrase it, that kind of disclosure changes the dynamic in the room. Even if the software company claims it deletes the recording later, the reality is that neither therapists nor clients can ever be fully sure what happens to that transcript data once it leaves the therapy space for processing.


By opting to record a session, we have essentially invited other entities into a very private setting, entities whose ethics, motivations, and standards we cannot fully verify, oversee, or vouch for to our clients.



Can We Really Promise Safety With AI Notes In Therapy?


Software developers will say, “The audio is deleted after the transcript is created.” Sounds great. Then, in the same breath, they’ll tout how their AI becomes smarter over time by learning from previous audio transcripts. 


You don’t need a computer science degree from MIT to know: If a system can refer back to something, it hasn’t been truly deleted.


Are we truly trying to comfort our clients by claiming their voice recordings aren’t stored anywhere, even though we can’t confirm this ourselves? Are we assuming our clients don’t mind that the transcript’s content is being saved? Make that make sense.


And this raises a legal concern too: when mental health records are subpoenaed, courts often request everything. That includes notes, emails, texts, and yes, recordings or transcripts. If an entire session is on file somewhere, it could end up in a courtroom. That is a far cry from the sacred space our clients signed up for.


A widely used and costly electronic records platform for therapists recently acknowledged that its audio transcripts are retained for a seven-day window during which they remain subject to subpoena. Despite this, thousands of therapists are enrolling in the platform’s AI recording feature. That’s a lot of therapists crossing their fingers and hoping to avoid legal trouble.



Consent Isn’t Always as Free as It Seems

(The REAL Dynamic Beneath)


Let's be honest: requesting a client's consent to record their session places them in a challenging situation. Declining their therapist might feel like they're hindering the therapist's process or being uncooperative. Even with having the "choice" to opt out, the situation is inherently pressured. A client might feel that the "right" response to their therapist is "yes, you may record my session to ease the burden of your note-taking."


For some clients, this request can do more than just feel awkward. It may actively retraumatize them. Many individuals in therapy carry experiences of betrayal trauma, including being recorded without their knowledge or consent, sometimes in situations involving abuse, manipulation, or exploitation. For these clients, even the suggestion of recording their voice can trigger deep emotional responses. Suddenly, they’re not just in a therapy session. They’re back in a moment where their voice, privacy, or dignity was taken from them.


Here’s the part that’s especially hard to swallow: we may end up spending their session time processing how we just triggered them, while also charging them for it. All for the convenience of a note-taking shortcut the client did not ask for and likely sees no benefit in.

We must ask ourselves: is that really ethical care?


When I recently took a casual poll of my friends on this, they each adamantly insisted that they would prefer no note, or a low-quality one, over having any part of their session recorded, even temporarily.



Imagine Being in the Client's Seat


I’d like to ask therapists considering this technology to flip the script for a moment. Imagine you’re in therapy, finally opening up about a deeply painful or embarrassing experience, and your therapist slides over a digital consent form and says, “We’ll delete the recording right after, I promise!”

Would that feel safe? Would it help you open up more?


If you’re a therapist, you probably have areas of your own life you wouldn’t want recorded, even temporarily. That internal reaction? That’s empathy. That’s the same emotional radar we rely on to be effective clinicians.



Where Do We Go From Here?


To clarify: I’m not against all technology. I don’t oppose using AI as an auxiliary tool to help organize generic treatment plans (ones created by a human therapist) or to better manage non-identifiable data. These are genuinely beneficial, low-risk tools when used wisely and in a manner that safeguards clients. But recording the most vulnerable moments of our clients’ sessions, even for a “few minutes,” crosses a line in my opinion.


Recording a client's voice is a fundamental shift in the therapeutic relationship, one that risks doing real harm to the trust we work so hard to build. And for what? To save the therapist 5 minutes on their session note?


As therapists, we are also bound by our code of ethics to first do no harm. If using a recording device or AI tool (even with the best intentions) risks compromising our client’s sense of psychological safety, then it’s not a tool we should be using.



A Final Thought

Therapists, let’s remember what we’re here for: to create a safe, supportive, nonjudgmental space where people feel free to be exactly who they are. No software update should ever come at the cost of that sacred trust.


For clients seeking therapy for challenges they already vent to their neighbors about, recording audio may not be an issue. But for many clients, therapy represents the only space where they can be fully vulnerable. Let’s not ruin that for them simply to shorten our note-taking process.


Tech can be an amazing aid, but it must stay in its lane. And we must continue to lead, not follow, when it comes to protecting the psychological safety of those we serve. Let’s advocate for our clients, not just with our words, but with our choices. Our clients are more than worth it!


© 2025 Mara B. Edmunds. All rights reserved.


***Disclaimer: This article is intended for educational purposes only and is not a substitute for professional consultation.



Hope Harbor Counseling & Family Therapy, PLLC  

4917 Golden Triangle Blvd.

Suite 411

Fort Worth, TX 76244

(817) 201-2444 Call or Text

FREE Resources at www.HopefulHarbor.com 

_______________________________________________________________________________________________

About the Author

Mara B. Edmunds, LMFT, is a licensed psychotherapist in Texas with extensive experience in identity, trauma, and relational health. She is dedicated to guiding individuals and couples toward intentional living and aligned relationships with a warm, grounded, and curious approach.

Mara employs a holistic approach that considers not just the symptoms but also the underlying issues that contribute to her clients' struggles. She has dedicated her professional life to helping individuals navigate the complexities of their mental health and emotional well-being.

