India is rapidly digitising. There are good things and bad, speed-bumps on the way and caveats to be mindful of. The weekly column Terminal focuses on all that is connected and is not – on digital issues, policy, ideas and themes dominating the conversation in India and the world.
The proliferation of AI tools has already started affecting how the Indian population perceives reality. For example, a photo of the detained Indian wrestlers Vinesh and Sangeeta Phogat was edited to make it look like they were smiling in police custody. While this is not the first time people have been manipulated with edited images, the line between a real image and an edited or simulated one is being blurred by new AI tools.
For decades, concepts such as ‘post-humans’ and virtual reality prompted science-fiction communities to imagine simulated worlds and philosophers to propose that we live inside a simulation.
These debates have always flowed into pop culture through films like Star Trek and The Matrix, which brought questions of reality to the masses. While they may seem silly, and have a cult following among sections who worship these ideas without rational debate, such debate is needed to understand technologically advanced cybernetic societies.
The debates around reality are no longer hypothetical scenarios. Now, many of us are unable to perceive what is real and what is AI-simulated. French sociologist and philosopher Jean Baudrillard’s work Simulacra and Simulation offers a postmodernist perspective on this debate.
Baudrillard’s work on electronic media culture, dating back to the 1980s, arms us with sociological theory to understand these effects. He points out the difference between pretending and simulating. Pretending leaves reality unchanged. Simulation, however, threatens the very difference between what is true, false, real and imaginary.
The Indian wrestlers’ arrest during their protest was a real event. But in the minds of BJP followers and activists, there is an imaginary version of reality where the wrestlers are happy about being detained, or are pursuing an imaginary goal of defaming the government. Baudrillard theorises that “the real can be produced from miniaturised cells, matrices, and memory banks, models of control – and it can be reproduced an indefinite number of times from these.”
In this case, the AI models are producing an imaginary reality that is accepted as true by the millions of consumers of the BJP IT cell’s simulated image.
The BJP IT cell has perfected the art of creating imaginary realities and distributing these distortions to millions of people over electronic media networks. Its ability to distort and promote new realities was scaled up by the monopoly networks of Facebook and WhatsApp.
Now, AI fuels this ability and makes it even more challenging for us to differentiate between what is real and imaginary. The proliferation and scale of these imaginary narratives will only increase as AI tools become better and more accessible.
Beyond the question of what is true or fake, people believe in the simulations produced by the BJP IT cell, and this belief leads them to distort reality further.
After the recent Odisha train crash, many began promoting alternate versions of the accident in which Muslims, Pakistan or the opposition are responsible. Not all of these versions come from the IT cell; many come from ordinary people who have already been primed to act this way.
The use of AI tools to distort reality is only going to increase over the next few months, as many Indian states go through assembly elections. This flood of manipulated content will be hard for fact-checkers and the general population to detect, and will affect the way we perceive reality. Several web-based media outlets promoted by political parties are already using large language model-based AI tools to automate parts of their news reporting.
With these tools in use and no regulation in sight, society’s ability to see reality will be hindered. To quote the Hyderabad MP Asaduddin Owaisi, “minds” are being “hacked”, and these ‘simulated’ minds are dangerous both to themselves and to society.
In The Matrix, the people living inside the simulation are not entirely innocent either, as they have the ability to suddenly transform into agents who will not allow others to see reality.
Srinivas Kodali is a researcher on digitisation and a hacktivist.