
India’s Data Protection Law: Undermining Labour Rights in the Digital Economy

-Shobhit S.*

 

(A version of this article was originally published as a two-part blog post on the CCG Blog. Parts I and II of the post are accessible here and here, respectively.)


Abstract


Digital data is becoming increasingly central to the organisation of work, profoundly impacting workers’ autonomy, working conditions, and capacities for negotiation. In this context, this article discusses the role of data protection law in securing informational autonomy for workers. It proceeds to analyse the treatment of personal data in the context of workplace relations under India’s Digital Personal Data Protection Act, 2023 (“DPDP Act”). It highlights how the Act facilitates opaque employee surveillance, depriving employees of basic informational rights typically guaranteed to all individuals. It also discusses the Act’s failure to guarantee certain other data rights that hold significance in the occupational context – rights in connection with automated decision-making and data portability. Given the rapid platformisation of work, and the emergence of state-level regulations on platform-based work, the article pays special attention to the protection of platform workers’ interests over data. Finally, it offers suggestions for building a fairer data governance ecosystem, which accounts for the individual and collective implications of datafication in the workplace.


 

I. Datafication of work


For centuries, the collection and processing of information about workers have been vital for employers and their agents to exercise control over the labour process. In the industrial economy, the mass production of goods relied upon “scientific management”, entailing meticulous measurement of work and supervision of workers on assembly lines. Although economic activity has shifted away from factories, the logic of supervision, measurement, and control has persisted, taking novel technological forms as work arrangements and relations have evolved.


In contemporary platform capitalism, work is increasingly characterised by extremely short-term engagements (or “gigs”), often mediated via digital platforms that match demand and supply. Today, a growing range of location-based services are organised via labour platforms, such as Uber for taxi services and Swiggy for food delivery. In parallel, there has been a sharp rise in web-based services, like transcription and data entry, where work is both organised and carried out online through platforms such as Crowdworker. Propelled by a combination of social, technological and political-economic factors (see here: page 27), the platform economy has mushroomed in the last decade. Its growth has been especially remarkable in developing nations, historically associated with large informal workforces. Even as platforms continue to withhold figures, platform workers were roughly estimated to comprise 1.3% of India’s workforce in 2022, a share projected to rise to more than 4% by 2030. Further, recent surveys indicate that an overwhelming majority of them work “full-time” on such platforms, relying on them for subsistence.


The digitalisation and platformisation of work rely on and involve the relentless generation of digital data, including workers’ personal data, through which platforms exercise control over work and workers. Fed into algorithmic systems, this data is used to automate key managerial decisions conventionally made by humans – to allocate (and withhold) opportunities for work, to assess workers’ behaviour and performance, and to determine their compensation. Despite profoundly shaping the process of work, data-driven decision-making predominantly occurs in a “black box”. Workers are offered neither reasons for specific decisions (say, why a certain gig was offered to a worker) nor explanations of the processes that undergird those decisions (say, how gigs are generally allocated to workers). While opaque algorithmic management of this nature is most noticeable in platform work, similar strategies are also pervading other workplaces, prompting apprehensions of “Uberisation” across industries. Besides its use as a tool for control, workers’ data is often appropriated for speculative commercial purposes, including training AI systems that can render their labour obsolete. Self-driving cars and delivery drones are concrete examples of such systems under development.


Against this background, data protection (“DP”) regulation can be an important avenue to address challenges surrounding the digital transformation of work. DP laws can guarantee workers a degree of control over their personal data, provide them transparency over its use, and limit its processing to reasonable purposes. However, India’s Digital Personal Data Protection Act, 2023 (the “DPDP Act”) undermines workers’ interests in the digital economy and creates conditions for further precarisation of digitally mediated work. Part A of Section II discusses how the DPDP Act facilitates unrestrained access to employees’ personal data and prima facie deprives them of basic data rights otherwise guaranteed to all individuals. Thereafter, Parts B and C discuss the implications of the DPDP Act’s failure to offer certain other protections that are critical in the context of digitally mediated work.


II. Workplace data under the DPDP Act


A. Treatment of data in the context of employment


It is worth noting that platform workers in India are not presently recognised as ‘employees’ under labour law, even though their unions continue to strive for such recognition (see here and here). Section 7(i) of the DPDP Act, which governs the treatment of data in the context of employment and is analysed in this Part, therefore does not prima facie apply to platform workers. Yet, it is important to consider how this provision may relate to platform work, alongside how it relates to employment. This is because, even as platform workers struggle to secure employee status, the DPDP Act threatens to undermine their informational autonomy if and when such status is secured.


(a) Allowing non-consensual access on broad grounds 


Consent is at the core of DP laws, which typically require entities to obtain an individual’s consent before processing their personal data. Nevertheless, in certain contexts, individuals are not positioned to provide free and informed consent. This holds true for employees, given the asymmetry of bargaining power between an employer and an individual employee. Thus, policymakers agree that employers should not generally be allowed to rely on employees’ consent as the basis to process their data. Instead, DP laws usually allow non-consensual processing of such data on the grounds of “legitimate use” or “contractual necessity” (see here: Section 2.3). Simultaneously, they place guardrails on such processing – for instance, by requiring that it be necessary to achieve specific purposes, like disbursing employees’ wages or verifying their attendance.


The DPDP Act, while allowing non-consensual processing of personal data “for the purposes of employment” [Section 7(i)], does not include any such guardrails. The sweeping generality of this language allows employers to process any personal data relating to an employee, so long as such processing can be explained, however loosely, with reference to employment. In addition, by allowing data processing towards “safeguarding the employer from loss or liability” [Section 7(i)], the Act patently subordinates employees’ privacy to employers’ commercial interests. This marks a continuation of the incremental expansion of the grounds to process employees’ personal data under successive drafts of the law (see here, here, and here). Moreover, the Act neither sets any limitation on the sources from which personal data can be collected nor requires that such processing meet the judicially endorsed standards of necessity and proportionality.

 

(b) Depriving employees of basic data rights


Paradoxically, the DPDP Act also makes certain data rights, which are guaranteed unconditionally to individuals under other DP laws, conditional on the collection of such data being consensual. As discussed above, Section 7(i) allows the non-consensual collection of data in the context of employment towards vague ends. Thus, where their data is processed under Section 7(i), employees would additionally be deprived of the rights of notification [Section 5], access [Section 11], correction, and erasure [Section 12] over such data.


First, employees would not be notified at the time of collection or use of such data. Moreover, even where they actively seek access to such information at a subsequent stage, they would not be entitled to receive it. They would have the right to access neither its contents nor the sources from which it was procured. The resulting information asymmetry would enable the collection of employees’ data from any source and by any means, without any accountability whatsoever towards them. This would expose employees to pervasive and surreptitious surveillance, potentially extending beyond their performance of work to their conduct and their communication channels. Such exposure would also produce chilling effects on their rights of speech and association. The growing deployment of “bossware” since the pandemic – including screen recorders, keyloggers, and tools for facial and emotion recognition – intensifies these concerns.


Second, employees would have no right to demand the erasure of such data, even after the underlying purpose of collecting it stands served. In fact, they would not be able to demand erasure even after their employment comes to an end. Employers would thus be empowered to aggregate and retain their employees’ personal data indefinitely, potentially using it to train the bias-prone algorithms increasingly deployed for hiring and for optimising other managerial processes.


Third, unlike other data principals, employees would not have the right to demand correction or updation of such data, even where they know or suspect factual inaccuracies in it. This would leave them without any recourse in situations where, for instance, their wages or social security benefits are withheld due to inaccurate records held by employers.


The expansive grounds for non-consensual processing of employees’ data under Section 7(i), and the consequent denial of elementary rights over such data, deprive employees of basic informational autonomy at the workplace.


If platform workers were also to be recognised as employees, the implications of this provision would be particularly severe for them. As discussed earlier, platform work is fundamentally characterised by datafication, and digital surveillance is the primary mode through which platform workers are controlled and disciplined. In this context, the application of Section 7(i) would further diminish platform workers’ informational control, leaving their data even more vulnerable to speculative exploitation and their labour to data-driven devaluation.


B. Failure to protect against automated decision-making


Considering the threats posed by automated decision-making to privacy, fairness, and autonomy, there is growing consensus that decisions that significantly affect individuals should involve human oversight. Accordingly, many DP frameworks provide individuals the right to challenge and obtain a human review of automated decisions (see here). Further, they prohibit the use of special categories of data – relating to an individual’s race, ethnicity, biometric information, and sexual orientation – to arrive at such decisions (see here: Section 5.6). In the EU, workers have utilised similar rights under the General Data Protection Regulation (“GDPR”) to secure significant gains against opaque and discriminatory algorithmic management (see here: Chapter 3; here: page 30). In recent years, taxi drivers successfully sued Uber in Amsterdam for “robo-firing” drivers without adequate explanation. Similarly, action by delivery riders against discriminatory algorithms resulted in the Italian DP authority imposing a fine on Foodinho, a food-delivery platform. Building on the GDPR, the EU has introduced specialised rights against algorithmic management under its recent directive on platform work (“EU Directive”). The DPDP Act, however, does not contemplate any such avenue for workers. This is despite workers’ persistent demands for protection against the unfair use of algorithmic systems to allocate gigs and block accounts.


DP frameworks typically also require entities processing personal data in bulk, including via automated systems, to evaluate the associated risks through DP impact assessments (“DPIAs”) (see here: Section 4.7). In the context of work, DPIAs can prompt broader socio-political dialogue on how data-driven management systems impact the labour process. More concretely, they can inform collective action by workers and pave the way for their participation in consultations surrounding the use of automated systems. Notably, the DPDP Act does envisage DPIAs by ‘significant data fiduciaries’ [Section 10(2)(c)]. However, delegated legislation is awaited to clarify the scope of such assessments, whether they will be publicly reported, and whether labour platforms would be classified as significant data fiduciaries.


C. Failure to secure data portability


Certain DP frameworks guarantee individuals the right to obtain personal data held by one digital intermediary and transmit it to another. This ‘right to data portability’ can act as a mechanism to counter platforms’ structural tendency to consolidate market power by locking in users. For instance, the GDPR (see here: Article 20) and Brazil’s General Data Protection Law (see here: Article 18) allow individuals to port their data from one digital intermediary to another, subject to certain conditions.


In the context of work, this right promises greater occupational mobility for workers, including platform workers (see here: Section 3.8(e)). Labour platforms usually offer workers the flexibility to log on and off, take indefinite breaks, and even stop using the platform altogether at any time. However, when workers deactivate their accounts on a platform, they often lose access to all records of their work – including the number of gigs completed, their earnings, and their reputational ratings. Faced with such switching costs, workers are effectively compelled to continue working on the platform, even if more favourable working conditions are available elsewhere. If operationalised, the right to data portability would enable workers to move from one platform to another along with their hard-earned work histories, which are instrumental to accessing work in the platform economy. By failing to offer any such right, the DPDP Act misses an opportunity to mitigate employees’ dependency on their employer and platform workers’ dependency on the platform they operate on.


III. Suggested regulatory interventions


As India’s overarching DP law, the DPDP Act should provide a principled baseline for the collection and use of individuals’ personal data across sectors. In its current form, the Act falls significantly short of this expectation in the context of work and employment. Its wide affordances for data collection for employment-related purposes, along with the deprivation of employees’ rights of notification, access, correction, and erasure over such data, create conditions for untrammelled worker surveillance. Thus, Section 7(i) of the Act must be amended or clarified to limit the non-consensual processing of data in the context of employment to narrowly defined ends. Given potential challenges in encoding such ends, employment-related processing must at least satisfy the contextual requirements of necessity and proportionality. Further, regardless of whether the processing is consent-based, employees must be granted rights of notification, access, correction, and erasure over their personal data. Lastly, whether through the Act or through other frameworks, lawmakers must consider guaranteeing the right to data portability for workers across traditional and non-traditional work settings.

 

In the face of the widespread digitalisation of work arrangements, a DP law offering individual workers control over their personal data would indeed be useful. At the same time, given the asymmetry of power in workplace relationships, consent-driven, individual-centric DP laws cannot adequately account for the collective and societal implications of algorithmic management. Moreover, beyond privacy and data protection, workers’ interests lie in meaningfully negotiating control over workplace data for greater agency at work and a fairer distribution of digital intelligence. Such demands have fed into discussions on data governance mechanisms that enable workers to assert their interests collectively, through instrumentalities like data collectives and platform cooperatives. Driver’s Seat, a California-based cooperative comprising drivers and delivery workers, offers a compelling example of worker-led collective data governance in action. Through a mobile application, the cooperative allows workers to access and analyse data across platforms to understand and negotiate their working conditions. It also allows workers to make mobility data available to local authorities for better transportation planning, and to share in the revenue generated from this. Comparable experiments are also taking shape in India. For instance, the Self-Employed Women’s Association Federation (SEWA) is attempting to leverage data-driven intelligence through cooperatives to assist women farmers in Gujarat in accessing financial credit and securing better returns on produce. As these experiments illustrate, collective data governance structures need enabling regulatory frameworks that recognise workers’ collective interests over data generated at the workplace.

Pertinently, the DPDP Act is envisioned by the Indian government as one component of a larger set of regulations for the digital economy, termed the Digital India Act (“DIA”) package. In the pre-draft consultations on the DIA, the government has indicated its intention to introduce a national policy on data governance. It is imperative that a regulatory framework for data governance, which recognises various groups’ collective interests over data, be enshrined in law. This would lay the groundwork for sectoral frameworks that endorse workers’ specific interests over data generated at the workplace. Further, the DIA package must include a framework governing the development and use of automated decision-making systems, accounting for their implications at the individual and societal levels. This framework must specifically address the use of automated systems for the algorithmic management of labour. It must offer workers a right to challenge algorithmic decisions, as well as a right to participate in system-level decision-making on the deployment of algorithmic systems at the workplace. These steps can be critical starting points towards the protection of workers’ interests in the digital economy.

 

 * Shobhit is a researcher studying the law and political economy of information & communication technologies, focussing on digital data and platforms. He currently works as a Programme Officer at the Centre for Communication Governance, NLU Delhi. The opinions expressed in the article are personal to the author.
