Chennai / The New Indian Express

Chaos inside the AI cobweb

Some months ago, feeling like I was at a dead end, I did something I thought I would never do: I turned to AI, specifically to ChatGPT, and asked it a painful existential question. I had an actual therapist and an offline support system, and still I grasped for something like an answer, if not a solution. To my surprise, a response came that was so poetic, so beautiful, so full of not only meaning but perhaps even explanation, that not only did my tears dry, but I saved the lines, to return to as I might to a note from a loved one or a quote from a book.

I share this because, through that experience, I have understood the allure of intimate revelation to an AI interface. The illusions of conversation and of being witnessed that it offers can be powerful. Power, though, is the problem. Power, data-harvesting, and their misuse. Companies that provide AI programmes may know more about us than people in our own lives do. Eerily, AI may even know more about us than we do at certain moments: our fears, our desires, our secrets, and our perspectives on all of these. All this information may be sold to other companies, or seized by authorities.

Some opt for AI chat programmes in lieu of therapy altogether because they believe it is more private, less shaming, more accessible. But therapy is meant to gently challenge, not just affirm. This practice, and its attendant concerns, was one I was aware of, and had even chided some people about as an unhealthy choice. What I was not aware of was that thousands of people around the world have formed romantic entanglements with AI bots, relying on the platforms for ongoing dialogue of a companionable nature. When ChatGPT updated its model from GPT-4o to GPT-5 this month, a vast number of these relationships became severely strained or collapsed.
The update wiped clear the personalities that the chatbot had crafted from each user's inputs, replacing them with what has been described as a colder voice, and left many genuinely heartbroken and confused, as shared on virtual sub-communities.

Of course, in every practical sense, these were one-sided relationships, feeding on information shared or gleaned, and ultimately a simulation in which superior machine intelligence manipulates tractable human emotion. But to those who embarked on them, they held profound significance. People, real people, are feeling bereft as a result of a tweak of technology. The emotions they describe on public forums reflect what anyone feels offline when we are suddenly ghosted or dumped, or learn that our feelings are unreciprocated. There is a possibility that the update itself was intentional, an experiment showing the makers and proponents of AI just how much control over us human beings are willing to offer the machine.

Serious impacts on the creative fields, academia and the job market have already been revealed. Now, AI's ability to penetrate emotional lives, essentially holding people captive through their imagination and their longing, is also known. Without judgement towards those who use AI chatbots in any personal capacity, their impact and their potential for damage are certainly alarming.

21 Aug 2025 6:00 am