
What Clinicians Should Not Share With AI

By John Britton

Identifiable client information should not go into a general AI chatbot unless you know how the tool handles the data, whether it can be used with PHI, and whether your use fits your professional obligations. APA guidance on AI in professional practice emphasizes transparency, privacy, security, informed consent, accuracy, human oversight, and psychologists' responsibility for how AI is used.

General AI systems include ordinary chatbots, writing assistants, coding assistants, and browser-based AI tools that were not specifically set up for your clinical use. A tool being popular, helpful, or password-protected does not automatically make it appropriate for client information.

What Counts As Identifiable

Dates of birth, addresses, phone numbers, email addresses, photos, screenshots, EHR content, appointment details, insurance details, rare life details, unique circumstances, and combinations of details can all help point to one person.

Clinical details can identify someone even when the name is removed. A rare diagnosis, a specific school conflict, a unique family situation, a small town, a recent hospitalization, or a detailed timeline can make a person recognizable, especially when only a few people fit that cluster of traits.

A detail that seems harmless on its own can become identifying when it is combined with other details or with context the system already has. If a chatbot has prior conversation history, saved memory, location signals, account information, uploaded files, or examples you shared earlier, new details can be pieced together more easily. The question is not only whether one prompt identifies someone by itself, but whether it could help identify a person when combined with existing context.
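To make that re-identification risk concrete, counting how many records match a combination of quasi-identifiers shows how quickly a cluster of traits narrows to one person, even with no names anywhere. The records and field names below are invented purely for illustration.

```python
# Invented records -- purely illustrative, no real data.
records = [
    {"age_band": "25-34", "town": "Smallville", "diagnosis": "panic disorder"},
    {"age_band": "25-34", "town": "Smallville", "diagnosis": "GAD"},
    {"age_band": "25-34", "town": "Metro City", "diagnosis": "panic disorder"},
]

def matches(traits: dict) -> int:
    """Count records sharing every trait in the given combination."""
    return sum(all(r[k] == v for k, v in traits.items()) for r in records)

print(matches({"age_band": "25-34"}))                        # 3 people
print(matches({"age_band": "25-34", "town": "Smallville"}))  # narrows to 2
print(matches({"age_band": "25-34", "town": "Smallville",
               "diagnosis": "panic disorder"}))              # just 1
```

Each added trait shrinks the matching population, which is why a rare diagnosis plus a small town plus an age can be as identifying as a name.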

What AI Usually Needs

For most coding, drafting, brainstorming, or tool-building tasks, AI needs the workflow rather than the client.

If you are asking AI to make a calming app, a parent handout, a journal tool, a note template, or a small single-file HTML tool, it usually does not need real client material. It needs the age range, the goal, the format, and the kind of interaction you want.

Bad Prompt

Help me think through next session with my 29-year-old client, a labor and delivery nurse at St. Mary's, whose panic got worse after her fiancé ended their engagement two weeks before the wedding. She went to the ER on March 3 because she thought she was dying and now avoids driving over bridges. Give me a case formulation and three intervention ideas.

Better Prompt

Give me a CBT-oriented case formulation and three intervention ideas for an adult with panic symptoms, avoidance, relationship stress, and fear of losing control.

Bad Prompt

Give me some ideas I can use with my 13-year-old client at Roosevelt Middle School who was hospitalized last month after a breakup and school conflict. Her father, Mike, says she shuts down every Sunday night before school.

Better Prompt

Give me some intervention ideas for a teen dealing with intense emotions, school stress, relationship distress, and Sunday-night anxiety before school.

Use Generalized Examples

You can usually replace a real client with a generalized or composite version of the problem. AI does not need the identifying details to help with most writing, coding, handout, template, or tool-building tasks.

For example, a clinician can ask for a worksheet for a teen with panic symptoms without including the teen's name, school, family details, appointment history, medication list, hospitalization date, or exact story. The AI needs the clinical target and the task. It usually does not need the person.

Watch Transcripts, Notes, And Screenshots

Transcripts, notes, and screenshots are easy ways to share far more than you meant to. A screenshot can include names, dates, browser tabs, messages, EHR labels, and chart details all at once. A transcript or note can contain dozens of identifying facts in a few paragraphs.

Before you upload or share any of those, slow down. Ask whether the task can be done with a summary, a generalized example, or a stripped-down version of the clinical pattern.
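One way to build that pause into a habit is a rough pre-share screen. The sketch below flags a few obvious identifier patterns before text is pasted into a chatbot. The regexes are illustrative assumptions, not a complete PHI detector, and passing this check does not make text safe to share.

```python
import re

# Illustrative patterns only -- a real PHI screen needs far more than regexes.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "date":  re.compile(r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)"
                        r"[a-z]*\.?\s+\d{1,2}\b"),
}

def flag_identifiers(text: str) -> list[str]:
    """Return the names of any identifier patterns found in the text."""
    return [name for name, pat in PATTERNS.items() if pat.search(text)]

prompt = "She went to the ER on March 3; call her at 555-201-4433."
print(flag_identifiers(prompt))  # ['phone', 'date']
```

A script like this catches only formatted identifiers; it cannot recognize a rare life story or a unique timeline, which is why human review of anything you paste still matters.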

When You Need A Formal Tool

If you are uploading PHI, using session content, storing client data, or using AI inside a documentation workflow, this is no longer casual chatbot use. APA's guidance on AI scribes and AI in practice pushes clinicians to examine privacy, confidentiality, consent, accuracy, security, and the actual conditions under which a tool is being used.

Clinicians can use AI with client information only when the tool, workflow, consent process, data handling, and professional responsibility have been evaluated before client information goes in. That is the kind of use case LocalScribe is built for.

A Quick Checklist Before Sharing

Ask these questions before sharing material with an AI tool.

  • Is there a real client in this?
  • Could someone identify them from the details?
  • Does the AI need this detail to help me?
  • Can I replace it with a composite or generalized version?
  • Do I know where this data goes?
  • Do I know whether this tool can be used with PHI?
  • Would I be comfortable explaining this use to the client, a supervisor, or a board?

The safest habit is to pause before sharing. If the AI can do the task with a generalized or composite version, use that instead. Give the system the clinical pattern, the format, and the goal. Leave the real person out unless you are using a tool that is actually appropriate for client information.

Subscribe for future posts

If you want new writing on AI and psychology, clinical ethics, and the implementation of AI in clinical practice, subscribe on Substack.


The views expressed here are my own and do not necessarily reflect the views of any current or future employer, training site, academic institution, or affiliated organization.