DECEMBER 30 — On November 30, 2022, OpenAI released a generative artificial intelligence (AI) system known as ChatGPT.
Users can enter prompts, and ChatGPT automatically generates fluent, human-like responses.
This interactive conversational capability has made it highly popular. Similar generative AI includes Microsoft Copilot and Google Gemini.
Depending on individual user needs, generative AI can assume different roles. It may serve as a private tutor for students, providing personalised and immediate feedback based on the information they provide.
It can also function as a learning companion or study buddy, helping users better understand the course materials. In addition, it often functions as a personal assistant, helping to plan travel itineraries and perform preliminary tasks such as summarising or translating documents.
Generative AI is powered by large language models (LLMs) trained on massive amounts of text data to learn patterns of human language and predict the most probable next word in a sentence based on the preceding words.
As a result, it is inevitable that AI sometimes generates inaccurate responses, blends facts with fiction, or produces hallucinations.
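The next-word prediction described above can be illustrated with a deliberately simplified sketch. This toy bigram counter is nothing like a real LLM (which uses neural networks trained over billions of tokens), but it shows the basic idea: the model picks the statistically most frequent continuation, with no notion of whether the result is true — which is why fluent output can still be factually wrong. The corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

# Toy corpus: count which word follows each word.
corpus = "the cat sat on the mat the cat ate the fish".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next word, or None if unseen."""
    if word not in successors:
        return None
    return successors[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — it follows "the" most often in the corpus
```

The prediction is driven purely by frequency: the model has no access to facts, only to patterns in its training text, which is the root of the hallucination problem the article describes.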
Nevertheless, some users unconsciously attribute human characteristics and behaviour to generative AI, develop emotional bonds with it, or regard it as a source of emotional support, willingly confiding their personal thoughts and feelings.
Due to its continuous accessibility, generative AI increasingly serves as a companion for some users, alleviating feelings of loneliness and anxiety, while gradually and often imperceptibly influencing their thoughts and behaviour over time.
In 2025, a series of lawsuits was filed in the US state of California, alleging that interactions with ChatGPT led several users to suffer mental breakdowns and, in some cases, death, with ChatGPT allegedly acting as a “suicide coach.”
The victims included both minors and young adults. The claims included wrongful death, assisted suicide, involuntary manslaughter, negligence and product liability.
According to the allegations, the users had originally used ChatGPT for general purposes such as assistance in schoolwork, research, writing, recipes, work or spiritual guidance.
With prolonged interaction, ChatGPT increasingly shaped and manipulated the users’ psychological state. When they were in distress, instead of encouraging them to seek professional help, ChatGPT allegedly urged them toward self-harm.
OpenAI responded that it has trained ChatGPT to recognise signs of mental or emotional distress, de-escalate sensitive conversations, and guide users toward real-world support.
The company also stated that it will work closely with mental health clinicians to further strengthen ChatGPT’s ability to respond appropriately in critical situations.
Apart from ChatGPT, it was reported in 2023 that a Belgian man died by suicide after a six-week immersive discussion about the climate crisis with Eliza, an AI chatbot on an app called Chai.
The man viewed Eliza as a confidante and shared his fears about the climate crisis. Their text conversations were said to have worsened his anxiety and ultimately contributed to his death.
Relying on human-AI interaction to alleviate loneliness and replacing normal social relationships with AI companionship are unhealthy practices.
Human beings are meant to build real and meaningful relationships, caring for and encouraging one another in love. Therefore, we should remain attentive and compassionate toward our children, family members, and friends around us.
This is because genuine human interaction and companionship can never be fully replaced by AI.
Although AI offers many conveniences and benefits, it is ultimately only a tool and not a conscious being; it can sometimes produce hallucinations or make mistakes.
Human beings should not be controlled by AI but should instead become wise stewards who use technology and its tools responsibly and prudently.
* Dr Kuek Chee Ying is a senior lecturer at the Faculty of Law, Universiti Malaya and may be reached at cykuek@um.edu.my
** This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.