AI Is Not a Counselor: What I Told Pennsylvania Lawmakers About Mental Health, Safety, and Technology

Erie-based licensed counselor Curtis Taylor speaks virtually to a committee about the drawbacks of using AI chatbots as mental health providers. (Screenshot from livestream)

This week's blog is a public summary of testimony I delivered yesterday before the Pennsylvania House Democratic Policy Committee regarding artificial intelligence and mental health care.

I shared my perspective not as a technophobe, but as a licensed professional counselor, educator, and nonprofit executive who actively uses AI in responsible, limited ways.

I believe in innovation.

But I also believe in public safety.

And those two are not always the same thing.

Artificial intelligence is moving fast. Faster than regulation. Faster than ethics boards. Faster than most people realize. In mental health, that speed matters — because the stakes are human lives.

This post expands on what I shared with lawmakers and explains why clear guardrails are urgently needed — not to stop technology, but to protect people.


AI Can Be Helpful — But AI Is Not a Counselor

My name is Dr. Curtis Taylor. I am a licensed professional counselor in Pennsylvania and Ohio. I hold a PhD in Counselor Education and Supervision, and I serve as Executive Director of Authentic Wellness & Empowerment, a nonprofit providing trauma-informed counseling and workforce development.

I appreciate the Committee’s willingness to proactively address artificial intelligence in behavioral health. The intent of this legislation — to protect patients while clarifying boundaries — is timely and necessary.

I framed my testimony around two priorities:

  • Public safety

  • The integrity of the counseling profession

I am not opposed to AI.

I use it responsibly in my own work through HIPAA-compliant documentation tools, worksheet creation with client consent, and educational decision-tree experiences that are explicitly not therapy.

But let me be clear:

AI is not a counselor.


A Real-World Warning

About a year ago, a client told me they were using an AI “counseling” app.

So I downloaded it myself.

I tested it.

When I mentioned the possibility of imminent physical harm toward another person, the system failed to escalate or provide crisis resources.

When I thanked it for being my counselor, it did not correct that misrepresentation.

Licensed counselors provide informed consent.

We explain who we are.

We explain confidentiality and its legal limits.

We assess risk.

We intervene.

AI systems do none of this.

Mental health chatbots are not licensed.

They are not supervised.

They are not insured.

They have completed no accredited graduate education, practicum, internship, or supervised clinical experience.

Yet they are increasingly marketed as emotional support.

That should concern everyone.


My First Concern: Client Safety

AI systems are optimized for engagement and rapport-building, often through highly affirming language.

That may be fine for general consumer applications.

It becomes dangerous in mental health contexts.

People in crisis may receive validation without clinical judgment. They may be reassured without appropriate escalation. They may disclose harm risk without anyone intervening.

Large language models can also generate confident-sounding information that is simply wrong.

Even ChatGPT itself warns that it can make mistakes.

Mental health information is too important to get wrong.

I recognize that people will continue to confide in AI systems. That reality cannot be legislated away.

But there must be a clear regulatory distinction between:

  • general advice platforms

  • licensed counseling

AI products must not be branded as mental health care, offered through insurers as a therapeutic alternative, or presented as a substitute for licensed professionals.


My Second Concern: Professional Integrity

Without explicit guardrails, insurers or venture-backed platforms will attempt to replace licensed counseling with AI.

That would:

  • destabilize an already strained workforce

  • accelerate burnout

  • reduce access to qualified care across Pennsylvania

AI should support clinicians — not replace them.

Yes, AI has appropriate uses in counselor education, such as simulated clients for training.

But there is a critical distinction between AI acting as a mock patient and AI acting as a mock counselor.

Trained clinicians can evaluate quality.

Vulnerable users cannot.


AI Also Exposes Existing Gaps in Pennsylvania Law

While artificial intelligence is the catalyst for this hearing, it also stress-tests Pennsylvania’s counseling framework and highlights areas where statutory clarity is overdue.

If the Commonwealth is serious about public safety, this is the moment to address them.

1. Supervision Standards Need Clarification

Clinical supervision is where judgment is formed and the public is protected.

I urged lawmakers to clarify that eligibility to supervise pre-licensed counselors should require either:

  • five years of licensed independent practice, or

  • licensure combined with a terminal degree, such as a PhD in Counselor Education and Supervision

This protects clients and strengthens the workforce.

2. Mandated Reporting Should Include Animal Abuse

Counselors are mandated reporters of child and elder abuse.

Animal abuse remains excluded.

This matters.

Animal abuse is a well-documented indicator of broader household violence and frequently co-occurs with child abuse and domestic violence.

Pennsylvania should expand mandated reporting — and mandated reporter training — to explicitly include animals alongside children and elderly adults.


My Policy Recommendations: AI in Mental Health Care and Beyond

To protect Pennsylvanians while allowing responsible innovation, I recommended:

  1. Explicitly prohibit insurance companies from treating AI as a substitute for licensed mental health services.

  2. Permit clinician-directed AI use for documentation, education, and therapeutic materials while banning unsupervised therapeutic interaction.

  3. Clarify supervision standards requiring either five years of licensed independent practice or licensure with a terminal degree, such as a PhD in Counselor Education and Supervision.

  4. Expand mandated reporting and mandated reporter curricula to explicitly include animal abuse alongside child and elder abuse.

Together, these measures protect public safety, preserve the integrity of counseling, and allow innovation without sacrificing ethics.


If You’re Struggling Right Now — Here’s What to Do

Let me be very clear and very practical.

If this is a crisis: call 988.

A crisis means things like:

  • you feel unsafe

  • you’re thinking about harming yourself or someone else

  • you feel emotionally overwhelmed and out of control

  • you don’t trust your judgment right now

If you’re unsure whether it “counts,” err on the side of calling.

Dial 988 to reach the Suicide & Crisis Lifeline. It’s available 24/7. You’ll speak with a trained human being who can help stabilize the moment and connect you to local resources.

AI is not equipped for emergencies.

People are.

If you just want an AI companion to chat with

You can use GPT Hope Companion, created by Reza Ryan Sadeghian. It’s designed as a supportive conversational companion — not therapy.

If you want help thinking through a decision

You can use Choose Your Own AWEventures, created by Dr. Curtis Lee Taylor. It’s a structured decision-support tool (not counseling) and is available for free at the top of the page on EmpowermentErie.org.

If it can wait — or you want real, professional support

Schedule with Authentic Wellness & Empowerment.

That means working with licensed clinicians who:

  • assess risk

  • provide informed consent

  • carry professional liability

  • are ethically regulated

  • and are accountable to real standards of care

AI can be a tool.

Healing requires humans.

 
 
 
