Jordan Conrad, PhD, LCSW

Psychotherapy and Artificial Intelligence




In a recent article entitled "Digitization and its Discontents: The Promise and Limitations of Digital Mental Health Interventions," published in the Journal of Contemporary Psychotherapy, I elaborate on the promise and limitations of artificial intelligence for psychotherapy. There I highlight the seriousness of the mental health crisis in the U.S.:


"12-month prevalence data indicate that 22.8% (57.8 million) of U.S. adults experienced a mental illness in 2021 and only 47.2% of these (26.5 million) received psychiatric treatment (Substance Abuse and Mental Health Services Administration, 2022). For younger people the situation is worse. The prevalence of mental disorders among children and adolescents has been increasing over the last two decades (Tkacz & Brady, 2021; Perou et al., 2013) and, according to a nationally representative survey, in 2016 alone 41% of children aged 6–11 and 59% of adolescents aged 12–17 were diagnosed with a mental disorder and only 50.6% received treatment from a mental health professional (Whitney & Peterson, 2019)."


Although several factors likely explain this treatment gap (the number of people in need of psychotherapy who do not receive it), one is likely that there simply are not enough psychotherapists to meet the demand: nearly half of all Americans live in federally designated mental health professional shortage areas. Digitizing mental health is a very promising step toward alleviating the burden on mental health professionals and providing psychotherapy to the millions of people who need and want it.


However, as I discuss in a previous post, there are certain qualities humans (and presumably, at least some other conscious animals) possess that we simply do not know how to replicate in even our most sophisticated machines. Machines do not, for example, have experiences - they do not feel happy or sad, hungry or pained, anxious or confident. The best programs we have are able to convincingly mimic these emotions but, like actors in a film, they are only acting as though they had these experiences. Similarly, machines do not have what is called intentional content:


"The problem is helpfully illustrated by considering how simple programs operate, giving the appearance of knowledge without actually possessing anything of the sort. Calculators, for example, do not know arithmetic though they appear to. Rather, they receive an input (say, “1 + 1”), consult a program that provides instructions when this input obtains (“when condition “1 + 1” obtains, produce the “2” symbol”), and then output the predetermined programmatic response (“2”) on the screen. Similarly, many chatbots utilize a “frame-based” dialogue system (Harms, et al., 2019) that retrieves specific information from the user by identifying key words or phrases in their responses that have been predetermined as salient to the narrowly defined goal or purpose of the app (Pandey et al., 2022). So, for example, when a CBT chatbot asks “How are you feeling?” it will treat the user’s response (say, “I feel like an idiot. I always mess everything up”) as inputs with prespecified formal elements (for instance: “phrases: idiot, jerk, foolish, … = cognitive distortion, labeling”; “phrases: always, never, forever, … = cognitive distortion, overgeneralization”) that it then filters through a program providing computational operations to perform on these elements (“when cognitive distortion, labeling + cognitive distortion, overgeneralization are input, output [script]”). Just as the calculator doesn’t know arithmetic, the chatbot doesn’t know what user inputs mean but can mimic that knowledge through its interactions"


That a psychotherapist doesn't genuinely understand you may not matter to some who are mostly concerned with a discrete problem and just want someone, anyone, to make it go away. However, many others - those trying to figure out the patterns of thinking and feeling that make them unhappy, those who are trying to process their feelings, those who are trying to become different, better, versions of themselves - will likely care that their therapist knows and feels.




