What is it about?

Ezra Lockhart critiques Zhang and Wang's (2024) article on the potential for AI to replace human psychotherapists. While acknowledging the promise of AI in addressing gaps in mental health care, Lockhart argues that the article presents an overly simplistic view of the issue. Zhang and Wang focus heavily on AI as a solution to therapist shortages and access problems but fail to consider deeper, systemic issues such as insurance policies, regulatory barriers, and cultural obstacles within the mental health care system. Lockhart also challenges the article's optimistic portrayal of AI's ability to personalize care, pointing out that its potential to perpetuate biases is not sufficiently addressed and could exacerbate existing inequalities in mental health treatment.

Lockhart further critiques the article for overlooking the ethical and relational dimensions of psychotherapy, emphasizing that AI cannot replicate the trust, empathy, and human connection essential to effective therapy. He concludes that the article fails to fully engage with the ethical, relational, and systemic complexities of integrating AI into mental health care: AI can support therapy, but it cannot replace the human therapist.

Why is it important?

Lockhart's work is important because it provides a critical perspective on the growing role of AI in mental health care. As AI is adopted across sectors, including therapy, it is essential to evaluate its limitations and risks. Lockhart's critique highlights the gaps in Zhang and Wang's article, especially its oversimplification of AI's capabilities and its failure to address the ethical, relational, and systemic challenges AI may introduce to mental health care. His work helps ensure that technological advances in therapy are not pursued at the cost of human connection, ethical standards, and fairness, particularly for vulnerable populations. By drawing attention to these deeper issues, Lockhart contributes to a more nuanced, responsible conversation about the future of mental health care.

Perspectives

Lockhart's background in both teletherapy and IT/network systems gives him a distinctive perspective on the integration of AI into mental health care. With more than a decade of professional experience in IT, he has a deep understanding of technology, data, and digital infrastructure, which allows him to evaluate AI in psychotherapy from a systems-oriented as well as a therapeutic standpoint and to recognize both the potential and the limits of bringing AI into clinical practice.

His IT expertise makes him acutely aware of the technical challenges and risks of using AI in mental health care, such as algorithmic bias, data security, and the potential for AI systems to reinforce existing disparities. He can scrutinize claims about AI's ability to personalize care and improve outcomes through a technical lens, questioning overly optimistic assumptions while emphasizing the need for caution and ethical consideration.

At the same time, Lockhart's work as a teletherapist grounds his critique in the realities of clinical practice. He understands firsthand the core human aspects of therapy, such as empathy, trust, and relational depth, that AI cannot replicate; this experience informs his view that technology can support and enhance therapy but should never replace the human connection at its foundation. This dual expertise, combining technical knowledge with therapeutic insight, makes Lockhart's critique particularly valuable, ensuring that the conversation around AI in therapy addresses the technical, ethical, and relational dimensions of this emerging field in an informed and balanced way.

Assoc. Prof. Ezra N. S. Lockhart
National University

Read the Original

This page is a summary of: Review of "Can AI replace psychotherapists? Exploring the future of mental health care", May 2025, ScienceOpen, DOI: 10.14293/s2199-1006.1.sor-med.a11560757.v1.rjcjha.
