A UK House of Commons report on disinformation and fake news has mentioned India and how the activities of two firms – SCL and Cambridge Analytica – impacted Indian elections.
The sixth chapter of the report deals with the interference of SCL and Cambridge Analytica in elections around the world, with five sections dealing in detail with the impact of the scandal on elections in five countries: St Kitts and Nevis, Trinidad and Tobago, Argentina, Malta and Nigeria. While there is no section dedicated to India, it is mentioned in the same light.
Mark Turnbull, former managing director of SCL Elections, and Alexander Nix, who was the CEO of Cambridge Analytica, had themselves spoken about their work in India in undercover conversations taped by Channel 4 in the UK. Material from these journalistic investigations was taken note of by the House of Commons report.
“The following election and referenda campaigns were mentioned by Mr. Turnbull and Mr. Nix, over the course of the Channel 4 meetings: Kenya, Kenyatta campaign 2013; Kenya, Kenyatta campaign 2017; Ghana 2013; Mexico; Brazil; Australia; Thailand; Malaysia; Indonesia; India; Nigeria; Pakistan; Philippines; Germany; England; Slovakia; Czech Republic; and Kosovo.”
The report quotes Paul Oliver Dehaye, who described SCL employees working as double agents, both aiding Indian political campaigns and sabotaging them. Dehaye described the work of Dan Muresan, who was an employee at SCL: “He was working for Congress, according to reports from India, but he was really paid for by an Indian billionaire who wanted Congress to lose. He was pretending to work for one party, but was really paid underhand by someone else.”
The report says, “The work of SCL and its associates in foreign countries involved unethical and dangerous work.”
Cambridge Analytica in India
Earlier this year, media in the UK and the US reported on a major scandal which showed how the Facebook user data of millions of people was harvested and channelised to manipulate the 2016 US presidential election.
In March this year, news surfaced that Indians and Indian politics were also affected by the Cambridge Analytica scandal.
Christopher Wylie, the main whistleblower in the scandal, testified to a UK parliamentary committee saying, “I believe their [Cambridge Analytica] client was Congress.” This kicked off a big controversy in India between the Congress and the BJP.
A Facebook spokesperson confirmed to The Wire’s Anuj Srivas that over 550,000 Indians could potentially have been affected by the controversy, with their personal information being harvested. This is because 335 people in India installed an app called ‘thisisyourdigitallife’, which collected not just the data of those who installed it but also that of their social networks. Through these 335 people, 562,120 others were affected – just 0.6% of those affected globally.
The Wire also reported that Cambridge Analytica was contracted for the Bihar assembly elections in 2010 by Chief Minister Nitish Kumar’s campaign. “The core challenge was to identify the floating/swing voters for each of the parties and to measure their levels of electoral apathy, a result of the poor and unchanging condition of the state after 15 years of incumbent rule,” said the firm’s section on India. In that election, the JD(U) went from 88 seats in the assembly to 115. The firm’s early work in India was handled by Amrish Tyagi, son of senior JD(U) leader K.C. Tyagi.
UK committee recommendations on disinformation and fake news
The 90-page report has been released by the House of Commons Digital, Culture, Media and Sport Committee. It is titled “Disinformation and ‘fake news’”.
It describes the role and legal responsibilities of tech companies and issues around the targeting of users. It discusses political campaigning in detail, including the impact of the scandal on the EU referendum and Russian influence in political campaigns. It also discusses digital literacy, how people use social media, and the need to make digital literacy a part of the school curriculum and of work with young people.
It also makes a number of recommendations and invites feedback not just from governments and stakeholders, but from all readers of this report.
The report says governments should reject the term ‘fake news’ and instead use the terms ‘misinformation’ and ‘disinformation’. They caution against “blunt, reactive and outmoded legislative instruments” which cannot adapt to fast-moving technology.
They say that companies should be audited and scrutinised not just for their finances, but also for their non-financial aspects such as their security mechanisms and algorithms, to ensure they are “operating responsibly”.
“If companies like Facebook and Twitter fail to act against fake accounts, and properly account for the estimated total of fake accounts on their sites at any one time, this could not only damage the user experience, but potentially defraud advertisers who could be buying target audiences on the basis that the user profiles are connected to real people. We ask the Competition and Markets Authority to consider conducting an audit of the operation of the advertising market on social media.”
The report makes particular mention of Facebook’s impact on the ethnic cleansing of Rohingya Muslims in its section on recommendations:
“The United Nations has named Facebook as being responsible for inciting hatred against the Rohingya Muslim minority in Burma, through its ‘Free Basics’ service. It provides people free mobile phone access without data charges, but is also responsible for the spread of disinformation and propaganda. The CTO of Facebook, Mike Schroepfer, described the situation in Burma as ‘awful’, yet Facebook cannot show us that it has done anything to stop the spread of disinformation against the Rohingya minority.”
On the role of tech companies and data firms in political campaigns and in influencing outcomes, the report recommends not just regulation but also bans:
“There should be a public register for political advertising, requiring all political advertising work to be listed for public display so that, even if work is not requiring regulation, it is accountable, clear and transparent for all to see. There should be a ban on micro-targeted political advertising to lookalikes online and a minimum limit for the number of voters sent individual political messages should be agreed, at a national level.”