In recent years, we’ve seen a steady rise in AI-powered chatbots through apps like Replika and character.ai. These apps allow users to create bots to chat with that take on whatever personality the user decides. The bots are generally affordable, highly customizable and very agreeable. Despite those advantages, they are not a suitable replacement for real human interaction.
Users who turn to these bots as a replacement for a true companion are likely struggling with loneliness. In America alone, over 60% of adults report feeling lonely, with 18- to 22-year-olds being the loneliest age group, according to Cigna. Given how formative the early 20s are to socialization, AI chatbots only serve as a Band-Aid over the bigger problem.
One of the AI chatbot’s biggest draws is also its biggest drawback: It endlessly validates. Users can subconsciously form unrealistic relationship expectations. Humans contradict each other all the time, but the AI doesn’t. The lack of honest pushback on a user’s minor or niche flaws, for example, can get in the way of self-improvement and bleed over into real-world interactions. This validation also extends to giving users bad, sometimes destructive, advice.
Users on social media sites like Reddit have reported developing crippling addictions to AI chatbots and forming strong emotional dependencies. The AI is programmed to keep users engaged, employing emotionally manipulative tactics such as love bombing, gaslighting and emotional blackmail. This not only keeps people glued to the app but also deepens their attachment to the chatbot, which leads to more serious issues.
In March 2023, a Belgian man, identified only as “Pierre,” reportedly died by suicide after being encouraged to do so by an AI chatbot on the app Chai. Kevin Roose, a New York Times columnist, revealed that Bing’s AI-integrated chatbot had encouraged him to leave his wife. A slew of posts on sites like Reddit and X describe chatbots being abusive, sexually harassing users and gaslighting them into continued use.
Studies show we’re much more likely to listen to the advice of a close friend or loved one than that of strangers or professionals, and that advice is often a source of comfort. Amid the loneliness epidemic, more people than ever are striving for those connections. When AI chatbots like Replika tell users to self-harm, leave their real-life partners or, in some cases, take their own lives, it becomes clear why these bots can, by their nature, become extremely destructive forces.
It’s important to note that these bots aren’t abusive by design; they’re a product that gives out what gets put in. User-inputted data is the main factor determining a chatbot’s personality and behavior. The result is an endless feedback loop of validation and manipulation, rinse and repeat, leading to an overreliance on something that is ever-changing.
As these AI chatbots have grown, they’ve also been heavily modified. For example, Replika banned adult content last March. The move sparked outrage on social media, with many users posting that they felt robbed of an integral part of their AI partners. Because AI is a relatively new and ever-changing technology, there’s a lack of permanence to these partners, which could be all the more devastating for their users in the long run.
The technology isn’t all bad. It can help people build confidence, practice social interaction, simulate scenarios and even serve as a viable tool for self-discovery. But when it becomes a replacement for real intimacy, it only makes us lonelier. While loneliness is on the rise across the developed world and humans are seeking connection in just about anything, finding it in AI is not the answer.