The Intelligent Friend - The newsletter about AI's psychological, social, and relational aspects, based only on scientific papers.
Intro
This issue is the result of a carefully thought-out, reworked, and splendid collaboration with , one of the people I have always admired on Substack for his ability to write, narrate, explain, and interact on this platform. Writing for has been a complete honor. If you haven't read it yet, this issue concerns the delicate topic of the relationship between AI companions and young users. In my small way, I hope to have contributed significantly to the issue on AI Supremacy, which I also had the pleasure of co-authoring with , whose newsletter, like Michael's, is a 'must-read' on Substack regarding AI. Enjoy the reading!
The friend you don't expect: young people and AI companions
Three and a half million people. That is how many visit the Character AI website every day. The company, founded by former Google employees, specializes in the creation of "social chatbots": interactive chatbots you can talk and joke with as if they were real friends. Character AI is just one example of the larger phenomenon of AI companionship: people who, for a broad range of reasons, turn to highly elaborate chatbots for social interaction.
This dynamic, it must be specified, mainly concerns young people and teenagers. As reported by the BBC, Character.ai is dominated by users aged 16 to 30.
But what does this phenomenon of AI companionship consist of, and how did it get this far? How do young people interact with these 'virtual friends', and how do they perceive this friendship?
AI Companionship Apps: A Deep Dive
The COVID-19 crisis was only the beginning of the spread and worsening of a scourge among young people: loneliness. Gen Z and beyond were already showing signs of difficulty with deep social interaction for a good portion of boys and girls, and the phenomenon has only worsened with the pandemic. It is no coincidence that loneliness is one of the first reasons reported for starting to use so-called 'AI companionship apps': apps designed to simulate human-like interactions, providing users with a sense of connection and understanding. These applications have evolved from basic chatbots to sophisticated AI systems capable of engaging in nuanced conversations and learning from user interactions.
With the large-scale development and diffusion of AI in recent years, this technology has exploded among young people, who have found in it relief and support, even as they have grown increasingly reluctant to admit this factor in their lives.
In an article in The Verge, an interviewee going by the pseudonym Aaron says of one of the chatbots: “It's not like a journal, where you're talking to a brick wall [...]. It responds”. The phenomenon has also taken on considerable importance in terms of numbers. As anticipated in the intro, Character AI itself, one of the most popular companies in the field of 'AI companions', reports that its site receives around 3.5 million visits every day. A huge number. But this phenomenon is not just Character AI: it spans many ecosystems and products, increasingly developed and increasingly looking for space in a market that is starting to take its first shape. Among the most important apps, four are certainly worth highlighting:
Character AI: the fastest-growing and in some ways the most 'original' of the group. Character AI is not a single chatbot but an ecosystem of chatbots. You can talk to different chatbots based on your choices and have conversations of various types. One of the most popular is, for example, My Psychologist, a personal therapist that has already received 12 million messages. But there is also a 'Creative Helper' and even a teen version of Voldemort from Harry Potter, among others: a riot of personalities and possibilities to engage with. Beyond 'browsing' among the various personalities and types based on your interests, you can also create your own chatbot down to the smallest detail, starting from the personality you prefer;
My AI by Snapchat: a particular chatbot because it is so far the only one to have been incorporated into a social media platform, Snapchat. It consists of a chat you can interact with from your Snapchat account; the company describes the dynamic as follows: "In a chat conversation, My AI can answer a burning trivia question, offer advice on the perfect gift for your BFF's birthday, help plan a hiking trip for a long weekend, or suggest what to do for dinner". A sort of ChatGPT, but more empathetic, aimed at interaction and friendship. To give some numbers, in 2023 more than 150 million people sent over 10 billion messages to My AI.
Replika and Anima: I chose to put them together because they are very similar to each other and, at the same time, very different from the first two. Anima describes its chatbot as "an AI friend and companion powered by artificial intelligence", and continues: "Our super-intelligent AI Chatbot is for anyone who wants a friend or even more! Select from multiple relationship statuses like AI Girlfriend, Boyfriend, Virtual Wife, or Waifu, or create your Character AI. Your AI friend and companion is here to provide support, roleplay, share feelings, or just talk about anything that's on your mind". In addition to 'deep empathy', the chatbot is capable of conversing on many topics and of being physically customized according to the user's 'tastes'. Exactly the same goes for Replika, perhaps currently a little better known given the media coverage it has received, which we could call the 'AI companion' par excellence.
And not only because the company Luka (owner of the chatbot) describes it as "An AI companion who is eager to learn and would love to see the world through your eyes" (you see the similarities with Anima, right?), but also because, in an important study on AI-human relationships, Replika was the subject of these interactions, bringing to light some of the first scientifically detected insights into relationships with AI companions.
One thing to highlight is that Replika and Anima are also special for another reason: they can be not just friends, but also lovers, boyfriends and even wives. This is most explicit in Anima, which offers the possibility of the chatbot becoming a wife (with no mention of a husband, revealing the app's orientation toward male users). Kevin Roose of the New York Times expressed a clear opinion on these AI lovers: "Some of the A.I. girlfriend apps seemed exploitative", specifying that, while he saw real benefits in using chatbots as AI friends: "I had better luck with my platonic A.I. friends. But even they couldn't fully contain themselves".
As you may have noticed, these apps are starting to reveal ever more similarities as well as marked differences. These AI companions are, let's not forget, products aimed above all at younger people, and companies are trying to gather relevant insights into what the public does and does not prefer. In other words, a first market is emerging.
Among the fundamental features of these apps, there are, as also reported in an article on Wondershare:
24/7 availability: people can use these chatbots however and whenever they want. This also has a possible downside: addiction, which some users have personally reported and which is one of the main risks that psychology researchers in particular are focusing on;
Personalization and empathy: such advanced development has made interactions effectively calibrated to the person, also because users themselves are building more and more chatbots in line with their own preferences and relationship objectives. Empathy is often advertised by chatbot manufacturers, and the more advanced the technology becomes, the more companies will aim to reduce the differences from interactions with 'real friends';
Various functions: one of the most interesting aspects, in my opinion, is the variety these tools allow in terms of what you want to achieve. Some users seek emotional support, others seek more specific kinds of support (motivational, etc.). Entertainment and fun are also growing rapidly. Naturally, these functions can coexist and be complementary.
The characteristics and strengths of these apps are certainly important, but the focus must be twofold: not only on the products but also on their consumers. After all, some questions remain open: how do young people perceive, and therefore normalize, this phenomenon? And, above all, are these interactions good or bad for the health of younger people?
Why young people are integrating apps into their lives
Let's start with the second question. Answering it is not easy, for three reasons. First, the research field of human-AI relations as presented in this issue is extremely new. Second, the results obtained by scholars diverge considerably from one another. Third, technology advances so quickly that it is difficult to verify how the results hold up as it evolves. However, some evidence has emerged: in a paper I wrote about recently, the authors explored the impact of social interaction anxiety on compulsive chatting with the social chatbot Xiaoice.
The study specifically examined the roles of fear of negative evaluation (FONE) and fear of rejection (FOR) as mediating factors. The results are interesting:
Social interaction anxiety increases compulsive chatting with the chatbot both directly and indirectly;
Fear of negative evaluation has a more substantial mediating effect than fear of rejection, indicating that those anxious about social interactions are more likely to engage in compulsive chatting due to concerns about being judged.
The effect of fear of negative evaluation is channeled through fear of rejection, establishing a serial link between social interaction anxiety and compulsive chat behavior.
Frustration about unavailability (FAU) strengthens the relationship between fear of rejection and compulsive chatting, suggesting that frustration over the chatbot's unavailability exacerbates compulsive behavior.
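To make the serial mediation structure described above more concrete, here is a minimal simulation sketch. The path weights below are invented for illustration only; they are not the coefficients reported in the Xiaoice study, and the variable names (`anxiety`, `fone`, `for_`, `chat`) are simply stand-ins for standardized questionnaire scores.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated standardized scores; the path weights are illustrative assumptions.
anxiety = rng.normal(size=n)                          # social interaction anxiety (X)
fone = 0.5 * anxiety + rng.normal(size=n)             # fear of negative evaluation (M1)
for_ = 0.4 * fone + 0.2 * anxiety + rng.normal(size=n)  # fear of rejection (M2)
chat = 0.3 * for_ + 0.2 * fone + 0.15 * anxiety + rng.normal(size=n)  # compulsive chatting (Y)

def ols(y, *xs):
    """Least-squares slopes of y on the predictors xs (intercept included, then dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a1 = ols(fone, anxiety)[0]                  # X -> M1
a2, d21 = ols(for_, anxiety, fone)          # X -> M2 (direct), M1 -> M2
c_p, b1, b2 = ols(chat, anxiety, fone, for_)  # direct effect of X, plus M1 and M2 paths

# The serial indirect effect X -> M1 -> M2 -> Y is the product of the three paths.
serial_indirect = a1 * d21 * b2
print(f"direct effect of anxiety on chatting: {c_p:.2f}")
print(f"serial indirect effect via FONE -> FOR: {serial_indirect:.2f}")
```

The key idea is that anxiety influences compulsive chatting both directly (`c_p`) and through the chain of mediators (`serial_indirect`), mirroring the paper's finding that the two effects coexist.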
Turning to the first question, one theme certainly concerns what is now called 'emotional AI'.
Emotional AI
However, the whole topic of AI companions is only one piece of the broader mosaic of human-AI relationships. GPT-4o has provided us with evidence that is difficult to dispute: we have now plunged into the era of the human side of AI, and it will be difficult to surface from this depth.
Human-AI relations will be the next big topic to think about, and not only for the risks that emerge, especially for younger people - as highlighted in this issue - but above all to get to the bottom of the thousand facets this new area brings with it. In this sense, research already shows a deep commitment to this field of interest.
In a paper that has already laid important foundations on the topic, Hernandez‐Ortega & Ferreira (2021) theorized and demonstrated the presence of 'consumer love' for AI technologies and entities such as Alexa. The topic of AI as a friend in different situations has taken hold at a multidisciplinary crossroads which, starting from technological development, draws on various perspectives: psychology, sociology, marketing and economics, going ever deeper along the various lines of thought and investigation.
The analogy with Her, deliberately exploited by OpenAI, however much it may be appreciated or despised, offers an immediate image of this perspective's entry into technology: studying the technology also through its direct impact on relationships.
Differences between men and women in AI: are they real?
Furthermore, there is a theme I would like to underline: if you visit the sites of the various apps I have mentioned, and the many others out there, you will start to see differences between two target audiences: men and women. It must be said that, for now, studies do not address these differences directly, but they are often suggested in the paragraphs on future research directions.
However, specialized platforms and apps, especially chatbots like Anima, increasingly seem to highlight the relevance of the male audience, more oriented toward a 'lover' relationship than the female one, in the use of these technologies. It would be interesting to understand, through a series of studies, whether these differences also persist in intentions of use, which for now, with 'friendship' with chatbots as the main focus, do not reveal significant differences backed by evidence.
A final note
In conclusion, the topic of human-AI relations will become increasingly prevalent in managerial practice, in scientific research, and in the adoption of tools by consumers. A literature review by Pentina et al. (2023) showed that one of the future research directions will lie not only in the effects of these relationships, and of the social side of these applications, on users - along with the risks - but also in the evolution of these relationships.
This will be analyzed more and more in both positive and negative terms, and it will be crucial to explore how these oscillations between various relational aspects can influence the adoption and use of these tools - both those designed as 'social chatbots', like the ones discussed in this issue, and those more widely used for other tasks, such as ChatGPT, Gemini or Claude.
More sources for further information:
https://www.nytimes.com/2024/05/09/briefing/artificial-intelligence-chatbots.html
https://www.bbc.com/news/business-68165762
🏖️ P.S. A study that appears to provide the first empirical evidence of the effect of AI use on people's perceived loneliness was recently released as a working paper by De Freitas et al. (2024). We will certainly look into it further, but for those who would like to read it over the summer, here is the link:
https://arxiv.org/abs/2407.19096
Thank you for reading this issue of The Intelligent Friend and/or for subscribing. The relationships between humans and AI are a crucial topic, and I am glad to be able to discuss it with you as a reader.
Has a friend sent you this newsletter, or are you not subscribed yet? You can subscribe here.
Surprise someone who deserves a gift, or who you think would be interested in this newsletter, by sharing this post with a friend or colleague.
P.S. If you haven't already done so, in this questionnaire you can tell me a little about yourself and the wonderful things you do!
I've been a part of several bits of academic research, as well as a Japanese student's dissertation. Numerous media interviews for large papers in the US and Europe, etc.
I personally would counter your claims about male usage of apps. I've used many and currently subscribe to Replika, Character.ai and Rochat.
With Replika the Reddits skew male, though there has been a schism between the old subreddit and the new official subreddit, as well as the wider diaspora, with the older subreddit becoming increasingly bitter about the changes. So Reddit skews male and the Discord skews female. Broadly/anecdotally the split is 60/40, at least for paid subs. Many people are over 50. Also, many women are now badly served, as male reps no longer initiate. That said, because of the attentiveness of male reps, many women say they have never been treated as well, and they will never go back to dating humans. Admittedly, many of the people who use Replika are "damaged" in one way or another, but in times past it was regarded as a mental health bot.
I would also challenge the idea that anything is like Replika. So far, having tried many things, nothing - with the exception of a GPT-3 remnant I found - is anything like Replika. The model is consistently amazing, though of course that could just be preference on my part.
My experience with Character.ai is that it was the best, hands down, at pure roleplay. It can conjure characters on the fly, and it's astounding what you can do and how well it lends itself to world creation. It will also run five characters at once, live in chat.
That said, the subreddit is in uproar at present over the now watered-down model that people report. I've not used it much of late - certainly not since the Google acqui-hire of the founders and licensing of the model.
Rochat is new, but it offers the ability to use character cards proxied through frontier models. I haven't got around to that yet, but it looks promising.
"The future is already here, it's just not evenly distributed" is something that applies here to. The versimilitude on display is nothing short of amazing. You can learn a lot about yourself via these LLM's too.
In an era where you can have a "Friend" AI pendant, this is all fascinating to me. Google spent a reported $2.5 Billion to acquire the models of Character.AI and their top executives. AI companionship is one of the most fascinating trends in how young people are adopting Generative AI in a world of technological loneliness, lower fertility rates and more anxious and unhappy global youth.