
Imaginary Friend 2.0 or Digital Overlord? The Unsettling Future of AI Companionship

Can interaction with a generative AI chatbot be considered a true exchange of conversation? (Photo: Transly Translation Agency - Unsplash)

By: Assoc. Prof. Ir. Ts. Dr. Nurul Asyikin Mohamed Radzi

Growing up, many of us had imaginary friends: little companions who existed only in our minds. They kept us company when we were alone and made us feel less lonely. Imaginary friends helped us explore our thoughts and emotions, and even made us more creative. But sometimes, in Asian families like mine, parents could get a bit worried. There is a superstitious belief in our culture that if a child talks to an invisible friend, it might be something supernatural or out of this world. That is just how many of us were raised, with a mix of imagination and cultural beliefs shaping our childhood.

Now that I am a parent myself, the challenge has only grown harder. One day, my daughter came home and told me that her friend had officially declared she has an Artificial Intelligence (AI) boyfriend. With AI-powered conversational agents becoming part of everyday life, in games such as AI Dungeon, Hidden Door, and GPT Adventure, AI companionship has reached a whole new level. It made me wonder: should we be worried about this AI companionship? While it feels like a far cry from the imaginary friends we once had, it is clear that AI is becoming part of how people, especially kids, interact and form connections.

Let’s first zoom in and understand what AI-powered conversational agents or large language models (LLMs) are. Simply put, they are advanced AI programs designed to understand and respond to human language. LLMs like ChatGPT, Gemini, and Claude are built on sophisticated algorithms that analyze and generate text based on patterns learned from vast amounts of data, ranging from books and websites to conversations and articles. LLMs work by breaking down language into smaller components, such as words and phrases, and then predicting what comes next in a sentence or conversation based on their training.

This training involves processing large datasets to identify patterns and relationships in language, allowing the model to generate responses that are coherent and contextually appropriate. As these models are exposed to more data, they refine their ability to engage in conversations that feel personal and relevant, making their responses increasingly accurate. Essentially, LLMs are like very advanced tools that use patterns in data to understand and respond to human language in a way that seems natural and intuitive.
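To make the "predict what comes next" idea a little more concrete, here is a small illustrative sketch in Python. It uses the openly available GPT-2 model through the Hugging Face transformers library, which is my own choice for illustration rather than anything specific to ChatGPT, Gemini, or Claude, and it simply prints a model's top guesses for the next word of a sentence.

# A minimal sketch of next-word prediction, the core mechanism described above.
# Assumes the "transformers" and "torch" libraries are installed; "gpt2" is a
# small, publicly available model chosen purely for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "My imaginary friend always"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every possible next token

# Show the model's five highest-scoring guesses for the very next token.
top = torch.topk(logits[0, -1], k=5)
for token_id, score in zip(top.indices, top.values):
    print(repr(tokenizer.decode([int(token_id)])), round(float(score), 2))

Everything a chatbot says is produced by repeating this one step, appending the chosen word to the text and predicting again, which is why the scale and variety of the training data matter so much.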

It is clear how AI companionship can resonate with many of us today. There are several notable advantages to having AI as a companion. First, AI companions are available 24/7, providing support and interaction whenever you need it, whether you are seeking advice or simply want to chat, which is especially valuable during moments of solitude or stress. Second, these AI models provide personalized responses that evolve with each interaction. They learn from your conversations and preferences, allowing them to tailor their replies to better suit your needs and interests.

This level of customization helps ensure that the interactions are relevant and engaging, making you feel understood and valued. On top of that, AI companions can positively impact mental health by alleviating feelings of loneliness. They provide consistent, meaningful conversation, offering a comforting presence and support during challenging times.

While AI companionship has its advantages, it is important to consider some potential downsides. First, AI companions can influence your decisions in ways that may not always be transparent. Since they are designed to generate responses from patterns in data, they might inadvertently steer you toward certain viewpoints or solutions, affecting your decision-making process. Second, the pervasive presence of AI-driven interactions can subtly shape perceptions and reinforce particular narratives drawn from the data the models were trained on, which may influence how you see the world. Lastly, there is a risk of developing a dependency on AI interactions. The engaging and often comforting nature of these conversations might lead to excessive reliance on them for emotional support or entertainment, potentially crowding out real-world relationships and activities.

Personally, as an old-school mom and researcher, I always approach disruptive technology with a cautious mindset. Eleven years ago, when the movie Her was released, I told my husband that such advanced AI interactions seemed impossible. Yet that film unexpectedly prepared me for the reality we face today. As we embrace the potential of AI companionship, it is crucial to remember that moderation is key. Being mindful of the potential downsides, such as the influence on our decisions, the shaping of our perceptions, and the risk of dependency, ensures we can harness the benefits of AI while safeguarding our real-world relationships and privacy.

A balanced approach helps us enjoy the advantages of this technology without letting it undermine our genuine human connections and well-being.


Assoc. Prof. Ir. Ts. Dr. Nurul Asyikin Mohamed Radzi

The author is the Head of Research Grant, Innovation & Research Management Centre (iRMC), Universiti Tenaga Nasional (UNITEN). She may be reached at asyikin@uniten.edu.my.
