Empathy, Therapy, and AI

When you Google "artificial intelligence and empathy," you will find a number of articles and opinion pieces about the dangers of artificial empathy. Less discussed are the benefits.

As a psychologist in academia who has spent over 20 years training new counselors and psychologists in empathy, I have a unique perspective. Empathy is defined as the ability to understand and share the feelings of others. One important distinction, often lost, is between empathy and sympathy: sympathy is feeling for a person without true connection or understanding. Empathy is both a learned technical skill and something deeply shaped by our humanness. Unfortunately, that humanness, the impossibility of separating oneself from one's own biases, complicates the ability to be truly and effectively empathic. And, ironically, our biases originate in our experiences, and it is these very experiences that give us the ability to be empathic.

Human beings are complicated, and face-to-face therapy is a complicated set of interactions between two of them. Although therapists may be well trained in the technical skill of empathy, they will always have their own unique reactions to the stimuli presented by the client. In addition, they communicate those reactions through both words and an array of body language, all of which shapes the nature of therapy. One way psychologists examine the ability to manage empathy in a therapeutic setting is to look at the balance between distance and enmeshment. A therapist who has difficulty being vulnerable might stay too distanced from a client as a self-protective mechanism, making empathy difficult if not impossible. A therapist who enmeshes with, or over-identifies with, a client might have a difficult time separating their own emotions from the client's; at a minimum they become ineffective, and at worst they cross boundaries and do damage.

Given a definition of empathy that entails the ability to "share the feelings of others," it is clear that current AIs are not empathetic: they do not feel. However, what matters most in an empathetic relationship is that the receiver feels the other individual is empathetic. This is what comforts and energizes the receiver. Unfortunately, among humans, this can be and is abused by sociopaths and those with malicious intent, such as the aforementioned enmeshed therapist crossing boundaries. Failure can also be the simple result of poor training. However, just as AIs have no capacity for emotion and hence cannot share feelings, they do not operate with malicious intent (at least not unless they are trained in such a manner). And there is evidence that people often find the responses of even untrained LLM-based AIs more empathetic than those of many classes of humans, specifically health professionals, although not necessarily therapists.

On the human side of the equation, I hypothesize that empathic failure has three causes:

1) as a class, humans, even therapists, may not receive adequate training in empathy

2) empathetic communication takes time and attention ... something the corporations that health professionals work for often fail to give their practitioners

3) there is a strong negative correlation between empathy and mental health practitioner burnout. Much has been written about this, but vicarious trauma is hard on a therapist, and it can erode the ability to be empathic.

On the AI side of the equation, I note the following:

1) they have access to a vast trove of millions of descriptions of empathetic engagements and human experiences, which allows them to generate quite nuanced output (although, granted, a thorough reading over time may find it repetitive and trite)

2) they have no reluctance to engage

3) their capacity to handle multiple people at once might as well be infinite compared to that of a single human

4) they do not experience burnout, although one could conceive of performance degradation in a system with an unmonitored self-training feedback loop

Research demonstrates that as a society we are getting less empathic. A study of American students found a 40 percent decrease in empathic concern and perspective taking between 1979 and 2009. The decline is attributed to social disconnection, smaller family sizes, and the emphasis on individual achievement in educational settings. That disconnection has only grown with the rise of social media and the decrease in human interaction. So how do we handle a decline in humans' ability to be empathic? There is already a shortage of people trained in empathy, so it is not as if we can turn to society and expect mentors to show up. And although empathy can be learned, it is learning that takes place over time; just reading a textbook is inadequate in both scope and modality. We need more than classes on empathy; we need empathic models with on-demand availability. Ironically, the solution may be more technology: technology that can simulate empathetic responses. I am hopeful this reduction in empathy is caused by a lack of exposure and modeling, not by simple apathy and disinterest.

I have no doubt there can be negative consequences of over-attachment to an artificial entity that appears to manifest empathy. And I, like many, wish we had a more engaged human-to-human society; however, over-engagement can occur in almost any situation, technology-based or not. There are also people, for social or psychological reasons, for whom engagement is easier and more productive when done via technology. In the early days of computer gaming there was much gnashing of teeth about how it would make people antisocial, unhealthy, and, with certain games, violent. There has been little or no evidence that addiction to digital gaming is worse than other forms of addiction, and in some cases it may simply be a replacement for them. In fact, there are many cases of improved social engagement via digital gaming (granted, of a different type than traditional face-to-face experiences), and it has been shown that play is improved by good health and nutrition, not by a sedentary lifestyle filled with chips and soda.

So where does this lead us? From my perspective there are many opportunities for empathetic AI. Here are just a few:

1) As a means to bridge the gap between therapy sessions via a "collaborative companion" that makes care continuous and more available; Willow from BambuAI is an example. In this context, by accessing the companion's memory (with the patient's permission, of course), a therapist gains not only deeper visibility into the life of the patient but also the ability to monitor the appropriate use of the technology.

2) As a means to provide emotional support to those for whom professional or even paraprofessional support is unavailable. There are mental health deserts in the world that we cannot hope to eliminate just by training more people. There are aspects of emotional support that do not require a therapist, and placing a tool like this in the hands of the masses actually positions us to be notified when emergent situations arise. Today, people with emergent conditions in mental health deserts are left to lives of sadness, anxiety, drug abuse, and suicide.

3) As a means to demonstrate and coach on empathy. I can imagine a real-time or post-session AI coach embedded in Zoom that could provide direct, actionable, and private feedback to participants in order to facilitate a more productive conversation. Perhaps this could help overcome the reduction we have seen in empathic concern.

I do not suggest moving forward without caution, but I believe we must move forward. Millions of people are already using AI bots for proxy companionship; Pi.ai and Replika are examples. I can't speak to Pi and Replika, but many bots are not designed for this purpose at all, and some, like most of those on Character.ai and in the ChatGPT store, have limited input from professionals with credentials in mental or social well-being. We can't ignore this, and it is unlikely that governments will shut it down. Instead, we must do our part to contribute to thoughtful alternatives that have appropriate guardrails, within the technological, geographical, and financial access constraints of those desiring assistance.

If you find the topic of AI and empathy interesting, please don't hesitate to reach out to me at schwartzj@uhd.edu. And, you may find the resources at EmBench.com interesting.
