Can AI deliver therapeutic support to teenagers?

Last week, the BBC published an article entitled “Character.ai: Young people turning to AI therapist bots” - we asked our Head of Therapeutic Services, Stefan Imeson, to respond with his thoughts on what this could mean for young people. You can read the original article here.

Klara from Klara and the Sun. Image Credit: City Journal

In Kazuo Ishiguro’s dystopian novel, ‘Klara and the Sun’, there are no more schools. All education is undertaken online. Opportunities for children to socialise with other human beings are so limited that rich families buy their children ‘AFs’ (Artificial Friends), while poor families have to make do without. The AFs are like chatbots but appear like real human beings: technology imitating life.

Artificial Friends in Ishiguro’s imagined future are so advanced that, in many respects, they can pass as human beings and offer their owners the same benefits as real human friends. Much like the chatbots discussed in the BBC article, they can offer comfort, support and advice… and that can’t be a bad thing, right?

The importance of relationships

The simple answer is probably no, it’s not a ‘bad thing’. Surely whatever works, works? Who is anyone to tell anyone else that their positive experience of therapy is somehow less valid because it was shared with a computer-programmed bundle of algorithms? But isn’t there something missing in the human-chatbot relationship? From a relational perspective, which underpins all of our work at FBB, we would argue that there is… the relationship itself.

FBB places relationships at the heart of our work with young people. Yes, there’s a curriculum and a thoroughly thought-out and evaluated three-year programme, but none of this can be truly transformative in a young person’s life without positive relationships at the centre. Transformational experiences and shared moments of joy - another pillar of our theory of change - are transformative precisely because they are shared. From a humanistic perspective, relationships are bi-directional. The best relationships, including therapeutic ones, involve mutual impact.

The philosophy of relationships

The humanistic philosopher Martin Buber defined two modes of relating, which he termed ‘I-It’ and ‘I-Thou’. ‘I-It’ relating essentially entails using the other as a means to an end. If I have some need from you - to make me feel better, to understand me, even to like me - I am (in Buber’s mind) relating to you as an ‘It’. For Buber, this isn’t a problem; in fact, it’s what most of us are doing in our relationships most of the time.

What makes relationships potentially transformative, and sometimes ‘therapeutic’, however, is the potential for ‘I-Thou’ relating. ‘I-Thou’ relating - or ‘I-you’, to use a slightly more modern phrasing - occurs when both parties are participants in the relationship, impacting one another. This meeting only occurs between people - it relies upon leaving your own private world (see: the online chat room) and stepping into the ‘creative void’ that exists between people - the place where the magic happens! ‘I-you’ moments are those in which we feel seen by another in our totality, we feel that we see them in theirs, and they feel this too.

These moments might be rare, but their potential occurrence is arguably what makes relationships the endlessly compelling and desired experiences they are. AI-human relationships can only ever take place in the ‘I-It’ mode. Why?

The AI of relationships

Let’s go back to Klara, the ‘AF’ protagonist in Ishiguro’s novel. Klara, like her less evolved chatbot cousins, is capable of offering much to her human counterparts. If you are feeling low, anxious or in desperate need of advice, you might do worse than to find your own Klara on character.ai and get their take on your situation.

What you will not be able to get, however, is any ‘I-you’ meeting which would, according to Buber, ‘confirm you in your unique humanity’. Why? Because chatbots and other AI characters are, as mentioned above, imitations of living beings - not living in the sense required for real relating and meeting.

Chatbots like ChatGPT may well have all the answers, but they lack the complex biological structures required for feelings and emotions. A Klara may well impact you, but you will never impact Klara, at least not in the way required for a transformative relational experience to occur.

When we say we work ‘relationally’, we mean that we try to meet you where you are, in the space between us, and bring our authentic selves to the space too. When I work with young people, joy comes from being in their presence and, hopefully, they get some of the same from being in mine. The point is that the impact in a true relationship works both ways. Sometimes one person is more positively impacted than the other, but mutuality is always latent.

Spoiler alert: Klara ended up on the scrapheap. She’d served her purpose, just as the chatbots offering therapy may well do for the millions of people opting to use them today. But if we subscribe to the relational ideals of humanism - as we do at FBB - then we can never hope to be truly impacted by an AI chatbot incapable of seeing, feeling and being impacted by us in our total and complex humanity.

By Stefan Imeson, Head of Therapeutic Services

Get Involved

Help us improve young people's lives

By making a donation to FBB, you'll be supporting the essential work we do with thousands of young people across the UK each year.

Donate