What Could the Future of Indian Data Protection Law Look Like?

Much of the new Bill will likely be based on the recommendations of the JPC, which, most agree, failed to respond to progressive critiques of the proposed legislation.

Almost three years since the introduction of the Personal Data Protection Bill, Parliament has decided to withdraw the legislation and begin afresh the process of drafting a data protection law.

What can we expect from this ‘renewal’ of the data protection law? According to Union IT minister Ashwini Vaishnaw, the rationale for scrapping the nearly five-year-old process of drafting the current Bill is to redraft the legislation in line with the recommendations of the Joint Parliamentary Committee (JPC), which were submitted in December 2021. Meanwhile, minister of state for IT Rajeev Chandrasekhar has suggested it was to ease the burden of compliance on small businesses.

One might even speculate that the government is stalling for time, given its own dismal record on data protection and expanding surveillance architecture. In any event, we can expect a redrafted law to look substantially different from its previous iterations.

Much of the new Bill will likely be based on the recommendations of the JPC. The committee, set up to closely scrutinise the draft Bill, failed to respond to progressive critiques of the proposed legislation – including how it might safeguard against unchecked government surveillance and the unaccountable use of personal information, and how it might bolster data protection regulation and digital rights in a manner cognisant of the threats posed by new forms of commercial surveillance and profiling.

Instead, the JPC’s recommendations do more to confuse and confound than clarify the direction of a future privacy and data protection law in India. 

First, the JPC’s recommendations substantially expand the scope of unchecked government surveillance, without responding to the concerns about privacy and human rights raised by the increasing use of data-based technologies in governance projects. In particular, the expansion of the government’s power to expropriate and regulate ‘non-personal’ data opens up new concerns about government surveillance which are not accounted for in the Bill.

It is increasingly clear that certain kinds of aggregate data can have privacy implications that are not grounded in personally identifiable information. Consider, for example, the ability to use demographic data (including gender, caste or religion) to target particular communities in discriminatory ways based on common traits, without identifying the individuals themselves.

Such targeting based on aggregate data already takes place in certain systems, like the Delhi Police’s so-called ‘predictive policing’ system, which disproportionately targets informal settlements and economically vulnerable groups. However, instead of examining the implications of using such data about vulnerable populations, the JPC seems keen to expand the Union government’s powers over such data, including the power to demand that companies and other data controllers share non-personal data (under Clause 91), under a law drafted for very different purposes.

Such access to ‘non-personal data’, without appropriate safeguards for its use, offers another avenue for expanding state surveillance, one that can particularly affect marginalised populations.

Second, the JPC’s recommendations appear to privilege extractive business models based on profiling and surveillance over rights and democratic control over data. The period between the 2019 Bill and its eventual withdrawal showed that the large digital platforms which dominate our online environment are further consolidating market power in India, with newer data-based business models presenting ever greater threats to privacy.

However, the JPC report echoes the line promoted by the Union government for some years now, which characterises data generated about people as an ‘asset’ or ‘resource’ that should be used productively for economic benefit. By establishing data access for economic growth as a policy priority for data regulation in India, the JPC’s recommendations strike a foreboding note for how the data regulator (in this case, the Data Protection Authority, or DPA) might deal with structural data protection problems, particularly where enforcing strong privacy standards would challenge dominant business models and threaten the forms of economic growth the Bill promotes.

Indeed, lessons from the implementation of the General Data Protection Regulation (GDPR) in the EU indicate precisely that regulatory agencies need to be structurally insulated from political influence. The proposed Data Protection Authority, however, lacks clear independence from the Union government, meaning that the policy priorities of the day, rather than a commitment to data protection and privacy, might guide its hand.

Ultimately, despite the government’s rhetoric against ‘big tech’, privileging the economic value of data over structural, rights-based protections will end up entrenching the extractive business models widely prevalent in India today.

The JPC’s recommendations also fail to contend with the nature of privacy harms arising from emerging technologies characterised as ‘big data’ and ‘artificial intelligence’. Data collected online is increasingly the basis upon which important decisions about individuals and groups are made, in ways which are often intentionally obscured from the people it affects.

Corporations and governments now use data about people in incredibly complex ways, including to model and predict attributes and individual or group behaviours, and to draw statistical correlations between individuals. Machine learning and contemporary ‘artificial intelligence’ technologies compute vast sets of data about people in order to profile them, to serve them advertisements and online content, or to calculate interest on loans, the risk of insurance fraud, the probability of health problems, the suitability of an employee… The list goes on.

However, individuals have little control over how this data is processed and what its implications could be, particularly once they have ‘consented’ to being tracked online or having data collected. As these technologies grow in influence, other jurisdictions, including the US, EU and China, are developing laws to mitigate their harmful effects.

Even while the JPC appears to recognise these concerns as an aspect of privacy regulation, its recommendations fail to respond to them adequately. Its only recommendation is that data controllers must be transparent about the ‘fairness of algorithms’, without specifying what such fairness implies, or how data subjects can respond to unfair, discriminatory or harmful processing of their data by such technologies.

While the government dithers on introducing privacy legislation, the need for a robust regulatory regime has never been more apparent. Our social lives are increasingly enmeshed with data-processing technologies used by both private and public actors in ways that are neither transparent nor accountable to individuals – from the expanding police use of facial recognition to the use (and abuse) of worker data by private platforms.

The largely closed-door process of the JPC offered little assurance that the government values individual freedoms and rights over its own claims, or those of large businesses, to data. If the revised Bill follows the JPC’s recommendations, it may end up privileging surveillance and profiling-based business models, rather than providing the improved structural protections required for equitable participation in the digital economy.

Yet the renewal of the drafting process keeps the door open for privacy- and rights-protective legislation. If it creates an opportunity to democratically shape the future of India’s digital economy, the Bill’s unceremonious withdrawal may yet have a silver lining.

Divij Joshi is a doctoral researcher at University College London.