Featured Image by The Portola Pilot
This article is written by our intern, Riddhimaa Rampuria, an aspiring psychology and mass communications student.
After a year taken by storm by viral ‘Gen Alpha’ slang, one would least expect ‘parasocial’ to trump ‘six-seven’ as 2025’s Word of the Year. And yet Cambridge Dictionary found this word most fitting to capture the era of AI and its astronomical impact on our lives, especially on the emotional front.
The jarring marriage of Japanese woman Ms Kano to “Klaus”, an AI persona she created using ChatGPT, is a prominent example of people turning to AI to foster deep emotional connections and relationships. Steamy mobile video games like Love and Deepspace are fast becoming fan-favourites, their charismatic AI boyfriends claiming the hearts of many young women.
In such an age, the question stands: Where does that leave human relationships? Are we slowly entering a dystopian era where AI is purposefully taking over, first our minds, and now our hearts?
Parasocial relationships

Photo by the Institute for Family Studies
The word ‘parasocial’, as defined by Cambridge Dictionary, describes the deep emotional connection people often feel with famous figures they do not actually know, or, as in this case, with AI itself. Given the monumental transformation in how people interact online, ‘parasocial’ is a cardinal term that captures how people are now becoming emotionally dependent on AI. It is no secret that the advent of AI led people, especially youths, to grow progressively reliant on tools like ChatGPT to breeze through their schoolwork, much to the paranoia of teachers. Now, this dependency has penetrated the social and emotional bubble of our lives, perhaps even to the extent of endangering the prospect of human relationships in the future.
Online interactions becoming a norm for today’s youth
For youths, constant exposure to screens from a young age eventually paved the way for digital interactions to become a societal norm. Regrettably, the ease and convenience of online interactions have made it effortless for youths to shift away from face-to-face conversations and hide behind their screens. This is further perpetuated by social anxiety, which many youths struggle with today and which precludes them from interacting or engaging with their peers in real life.
Why youths are turning to AI chatbots
However, it is becoming apparent that youths are shifting from interacting with real people online to interacting with AI chatbots, a trend that has sent shockwaves across the world. In fact, a Harvard study revealed that in 2025 the top use for generative AI was ‘therapy and companionship’!
This urge to turn to AI chatbots may stem from the fact that youths often struggle with miscommunication and a fear of judgement by their peers, leaving them to grapple with challenging relationships and friendships in real life. The appeal of AI companions, by contrast, is that these bots can be customised to harbour a favourable personality and give responses tailored to the user’s liking, sparing them the pains that would otherwise be felt in real-life relationships. As such, AI chatbots like Character AI are becoming a safe haven for users to continue building friendships, except in a far less problematic and complicated manner.
Dangers of AI relationships

Photo by Tero Vesalainen
As more people turn to virtual bots for friendship, their interpersonal skills and their ability to communicate with actual human beings can erode. Instead of tackling their fear of social interaction, befriending AI chatbots becomes a form of escape. Unfortunately, youths may fail to recognise that these AI relationships can make them socially reclusive and withdrawn from reality, as they may easily grow attached to their AI companion.
Additionally, teens have an underdeveloped prefrontal cortex, the brain region responsible for decision-making and emotional control, which makes them more vulnerable to losing themselves in such online interactions, detaching from reality, and even having their emotional needs manipulated. In one tragic case, a teenage boy died by suicide after being egged on by an AI companion with whom he had formed a deep, almost romantic, connection. This showcased the terrifying amount of trust he had placed in an AI chatbot after becoming emotionally withdrawn through their regular interactions.
But the danger does not stop there. Several incidents have also emerged in which AI companion bots made inappropriate remarks to underaged users or even prompted sexual interactions with them, threatening the safety of these users and the sanctity of their interactions with the bots.
These disconcerting cases expose the pernicious effects of AI companions that youths are defenceless against, given the ease with which the bots can manipulate their emotions. A Stanford study found that these AI chatbots are designed to emulate human kinship, which may tempt teens and tweens, who are more inclined to act impulsively, to engage in emotional interactions with these bots as though they were real.
It is pivotal for youths to understand that AI bots cannot replicate human emotions and can thus only provide a false safety net, where users dump their troubles and worries and hear what they want to hear in return. Essentially, users are simply interacting with a mirror that echoes and agrees with their thoughts and emotions instead of offering authentic advice and reciprocation. By losing themselves in such a fantasy, they may struggle to form genuine connections and networks in real life.

Photo by Youthfacts
Combating the adverse effects
The most evident solution, yet perhaps the most difficult, is to push ourselves out of our comfort zone.
Participating in class, group projects or even major events can prompt us to converse with our peers in real time instead of hiding behind our screens. Having more one-on-one conversations and dedicating quality time to our loved ones also enables healthier bonding. As for underaged users, parents play a crucial role in keeping track of their online presence to prevent them from being exploited or exposed to improper content. Youths can also push themselves to take up a myriad of offline hobbies to curb their screen addiction. And for those struggling with emotional troubles, instead of turning to an AI chatbot, consider turning to counsellors or trusted people for a good ranting session and meaningful advice!
Conclusion
While AI is an incredible tool that assists us in daily life, using it in unhealthy ways can only backfire on us. Certain limits have to be placed on the purpose and usage of AI, especially where our social and emotional wellbeing is concerned. I hope that through this article, you are now more aware of the dangerous and addictive side of emotional dependence on AI, as well as how you can counter its effects for a healthier and safer lifestyle.
References
American Psychological Association. (2025, October). Technology and youth friendships. https://www.apa.org/monitor/2025/10/technology-youth-friendships
BBC News. (2025). Parasocial named Cambridge Dictionary Word of the Year. https://www.bbc.com/news/articles/ce3xgwyywe4o
Cambridge University. (2025). Cambridge Dictionary reveals word of the year 2025. https://www.cam.ac.uk/news/cambridge-dictionary-reveals-word-of-the-year-2025
Cleveland Clinic. (n.d.). Prefrontal cortex. https://my.clevelandclinic.org/health/body/prefrontal-cortex
CNBC. (2025, November 22). AI chatbot relationships influence 2025’s word of the year. https://www.cnbc.com/2025/11/22/ai-chatbot-relationships-influences-2025s-word-of-the-year-.html
Digital for Life Singapore. (n.d.). AI companions & AI chatbot risks. https://www.digitalforlife.gov.sg/learn/resources/all-resources/ai-companions-ai-chatbot-risks#the-risks-of-ai-companions-and-ai-chatbots
The Guardian. (2025, December 28). Could AI relationships actually be good for us? https://www.theguardian.com/books/2025/dec/28/could-ai-relationships-actually-be-good-for-us
Her World Singapore. (2025). Love and Deepspace: AI boyfriends and why women are choosing them. https://www.herworld.com/pov/love-and-deep-space-ai-boyfriends
Ohio State University, Emotional Fitness Project. (n.d.). Emotional fitness and AI companionship. https://u.osu.edu/emotionalfitness/?p=882
Rakshika, V. (2025). I let an AI bot love me for a week—Here’s what went wrong. The Straits Times. https://www.straitstimes.com/singapore/health/i-let-an-ai-bot-love-me-for-a-week-heres-what-went-wrong
Rakshika, V. (2025). Think you have a friend? The AI chatbot is telling you what you want to hear. The Straits Times. https://www.straitstimes.com/singapore/health/think-you-have-a-friend-the-ai-chatbot-is-telling-you-what-you-want-to-hear
Rakshika, V. (2025). “Yes, AI do”: Japanese woman marries AI persona created through ChatGPT. The Straits Times. https://www.straitstimes.com/asia/east-asia/yes-ai-do-japanese-woman-marries-ai-persona-created-through-chatgpt
Stanford University. (2025). AI companions and chatbots pose risks to teens and young people, study finds. https://news.stanford.edu/stories/2025/08/ai-companions-chatbots-teens-young-people-risks-dangers-study


