Beyond Consent: Enhancing India's Digital Personal Data Protection Framework
- Deependra Kumar Kushwaha*
[This post is part of the Data Protection Special Blog series: "Beyond Encryption: Tech & Data Protection". This series will feature blogs, such as the present one, which explore and analyse the reshaping of data security and privacy in an era of evolving technology, legal frameworks and regulations.]
Abstract
The Indian DPDP framework, which seeks to create a strong data protection regime, is plagued by several challenges because of its over-reliance on consent, lax compliance requirements for Significant Data Fiduciaries (SDFs), and the absence of provisions for regulatory intervention. The lack of proactive regulation and enforcement further weakens the framework. This article presents a critical examination of these loopholes and suggests a model that integrates stronger regulatory oversight, accountability measures, and public awareness components. It also proposes measures to overcome these challenges, advocating greater transparency in data processing and the promotion of digital literacy to create a more effective and participative data protection regime in India.
Introduction
From the very start of the discussion around privacy, ‘consent’ has been the buzzword, for consent is the very foundation upon which our personal sanctuary is built. The Indian Digital Personal Data Protection (hereinafter ‘DPDP’) Act, 2023 recognizes two bases for processing personal data: consent and certain legitimate uses. Section 6(1) of the Act requires consent to be free, specific, informed, unconditional, and unambiguous. Notably, Section 6 falls under Chapter II, which outlines the obligations of data fiduciaries (DFs). This placement suggests that it is the DF’s obligation to ensure that the consent obtained from the data principal (DP) meets the criteria set by the Act.
The draft rules also prescribe measures to ensure that consent notices are easily understandable for a DP. But regardless of the measures a DF takes to obtain consent fulfilling the criteria of Section 6(1), the question remains whether the data principal understands the importance of such consent and the implications of giving it. In a country like India, where digital literacy is very low and a privacy culture barely exists, obtaining genuinely informed consent will be tremendously challenging, if not nearly impossible, for the data fiduciary.
A PwC India survey reveals that only 16% of consumers are aware of the DPDP Act across diverse geographies, age groups, occupational backgrounds, and urban-rural divides. While privacy consciousness as a whole is low in India, the small percentage of people who are aware of its importance still engage in activities that compromise their data privacy. This phenomenon, known as the ‘privacy paradox,’ describes how individuals claim to value privacy yet readily relinquish their data for convenience or fail to take protective measures. Thus, the implication is that the Indian population requires a heightened awareness of data privacy and its importance, similar to the educational initiatives undertaken against cyber-financial frauds.
It is apparent that India's consent-based law fails to sufficiently address the peculiar challenges the country faces, particularly its prevalent low levels of digital literacy and the existing digital divide. Although the law aspires to establish a comprehensive framework for data protection, its over-reliance on informed consent renders it ineffective for a substantial segment of the populace that remains uninformed about data privacy. Considering that cultivating a privacy-conscious culture will take years, the immediate priority must be strengthening the current legal framework to bridge this gap. Rather than relying exclusively on individuals to protect their own privacy, the legal framework must integrate more robust accountability measures and oversight mechanisms to ensure that data fiduciaries maintain elevated standards of data protection, even in interactions with individuals who may not fully comprehend the ramifications of their consent.
This article presents a critical analysis of the shortcomings and gaps in the DPDP framework, with particular emphasis on its excessive dependence on consent, and highlights the need for a more balanced approach to data protection. It argues that the current framework could be enhanced by incorporating essential aspects of both accountability-driven and risk-oriented models. The discussion critiques the inadequacy of current consent mechanisms, underscores the importance of proactive regulatory supervision, and calls for improved regulation of data processing through AI and more stringent compliance obligations to safeguard individuals who may lack the capacity to navigate intricate data privacy choices. Lastly, it highlights the need for timely grievance redressal and envisages a voluntary certification scheme. Through these proposals, the article aims to contribute to a more efficacious and inclusive data protection framework that upholds the privacy rights of all individuals, regardless of their level of digital literacy.
Mandatory Display of Consent Notice
The current practice of obtaining a DP’s consent typically involves a clickwrap agreement. However, as already discussed, the majority of Indians do not read privacy policies, which is problematic at a time when understanding the consequences of data processing is essential to avoiding its risks. The enforceability of clickwrap agreements has long been debated; scholars have labelled them adhesion contracts that users accept without genuine consent. Studies suggest that a majority of participants accepted a privacy policy via a clickwrap agreement without reading any part of it. Owing to low levels of digital literacy, this phenomenon is even more common in India, where most users agree to processing by clicking ‘I agree to the privacy policy and terms of service’ without even glancing at those documents.
The draft rules can incorporate a provision for the privacy notice to be mandatorily shown to the DP, unlike the clickwrap agreement. However, to prevent consent fatigue and to make the privacy policy easier for the DP to understand, fiduciaries should be mandated to display a concise version of the privacy notice and terms of service, outlining the most important clauses, along with links to the full documents. Examples of such clauses include the data collected, the safeguards in place, and, most importantly, the implications of the processing for the data principal. This would ensure that DPs at least get a gist of the privacy policy, leading to comparatively more informed consent than blindly consenting to processing without reading the notice.
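A minimal sketch of how such a layered notice might be assembled in practice is given below. All field names, clause summaries, and the rendering helper are hypothetical illustrations of the idea; nothing here is prescribed by the Act or the draft rules.

```python
# Hypothetical sketch of a "layered" consent notice: a short summary of the
# key clauses shown up front, with links to the full documents underneath.
# Field names and contents are illustrative only.

KEY_CLAUSES = {
    "Data collected": "Name, phone number, order history",
    "Safeguards": "Encryption at rest; access limited to support staff",
    "Implications": "Data shared with delivery partners to fulfil orders",
}

FULL_DOCUMENTS = {
    "Full privacy notice": "https://example.com/privacy",
    "Terms of service": "https://example.com/terms",
}

def render_concise_notice(clauses: dict, links: dict) -> str:
    """Render the short notice a DP would see before the 'I agree' button."""
    lines = ["Before you agree, please note:"]
    for heading, summary in clauses.items():
        lines.append(f"- {heading}: {summary}")
    lines.append("Read the full documents:")
    for label, url in links.items():
        lines.append(f"- {label}: {url}")
    return "\n".join(lines)

print(render_concise_notice(KEY_CLAUSES, FULL_DOCUMENTS))
```

The point of the sketch is only that the summary and the links are generated together, so a fiduciary cannot show the ‘I agree’ button without the key clauses being on screen.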
Since the legislation follows a consent-based model, with consent as the primary legal basis for processing personal data, it becomes even more important that such consent at least be informed.
AI and Personal Data Processing
AI presents a new challenge for processing personal data legitimately, as it strains core data protection principles. For instance, AI conflicts with the principle of data minimization, since it needs large amounts of data for training. The same is true of purpose limitation: AI may use data in ways consumers never originally agreed to. AI models often operate as "black boxes," meaning users do not know how decisions affecting them are made. It thus becomes essential that when a DF relies on automated decision-making for processing personal data, it notify the DP of the use of AI, the DP's rights concerning such automated processing, and the procedure for challenging an automated decision. Such additions to the consent notice would be a step towards the informed consent of the DP.
Article 13(2)(f) of the GDPR mandates that controllers inform data subjects about the existence of automated decision-making, including profiling, and provide meaningful information about the logic involved, as well as the potential consequences of such processing for the data subject. Various other laws similarly mandate disclosure of the processing of personal data through AI. The Indian framework contains no such disclosure requirement.
The only provision in the law that governs processing through automated means or algorithmic software is Rule 12(3), which requires a Significant DF to exercise due diligence to verify that algorithmic software used for various operations, including processing personal data, does not pose risks to the rights of DPs. The law is silent on how the Board or any other authority will verify whether such processing harms DPs' rights. There is no explicit requirement for algorithmic impact assessments, AI bias audits, fairness checks, or independent assessments, nor any right for the DP to know about AI-driven decisions, the logic involved, and their potential consequences.
Other laws go further. California's CPRA provides the right to opt out of automated decision-making and the right to access the logic behind AI-driven processing, as well as a description of the likely outcome of the process for the consumer. Article 22 of the GDPR provides the right not to be subject to decisions based solely on automated processing and the right to obtain human intervention in such decisions. Additionally, the EU AI Act takes a risk-based approach, prohibiting AI practices that pose unacceptable risks and imposing stricter rules on high-risk AI systems, including those making decisions that affect fundamental rights.
Similarly, in Brazil, Article 20 of the LGPD gives the data subject the right to request a review of decisions taken solely on the basis of automated processing of personal data affecting his or her interests, as well as the right to clear and adequate information regarding the criteria and procedures used for the automated decision. In addition, Brazil's national authority may perform audits to verify discriminatory aspects in the automated processing of personal data.
While the DPDP Act doesn’t provide the right to opt out of automated decision-making, the least the government can do is to provide the right to obtain human intervention and the right to know the logic involved and the potential consequences of automated decisions. This information shall be made available at the time of data collection or upon request.
The government should also consider mandating a risk assessment or an algorithmic impact assessment before the deployment of AI or algorithmic systems that process personal data in a manner likely to significantly impact DPs. Such an assessment should evaluate the accuracy and fairness of the AI system, the risk of bias or unintended consequences, and the potential for harmful automated decisions. Based on its results, the Board may either allow such processing through AI or mandate human oversight of critical AI decisions.
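The logic of such an assessment can be sketched as follows. The risk dimensions, the scoring scale, and the threshold that triggers human oversight are all illustrative assumptions; the DPDP Act and draft rules prescribe no such scheme.

```python
# Hypothetical sketch of the algorithmic impact assessment described above:
# score each risk dimension, take the worst score as the overall risk, and
# decide whether human oversight is required. Thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class ImpactAssessment:
    accuracy_risk: int   # 1 (low) .. 5 (high): inaccurate or unfair outputs
    bias_risk: int       # risk of unfair bias or discrimination against DPs
    harm_risk: int       # potential for harmful automated decisions

    def overall_risk(self) -> int:
        # The weakest dimension dominates: one severe risk is enough.
        return max(self.accuracy_risk, self.bias_risk, self.harm_risk)

    def decision(self) -> str:
        """Possible Board responses under the scheme sketched in the text."""
        if self.overall_risk() >= 4:
            return "mandate human oversight of critical AI decisions"
        return "allow automated processing"

# A system with a high bias risk, even if otherwise accurate, would be
# routed to human oversight rather than cleared outright.
assessment = ImpactAssessment(accuracy_risk=2, bias_risk=4, harm_risk=3)
print(assessment.decision())
```

Taking the maximum across dimensions, rather than an average, reflects the idea in the text that a single severe risk (say, caste- or gender-based bias) should be enough to trigger intervention.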
Obligations must be levied on the DFs to ensure that any algorithmic processing of personal data does not result in unfair bias or discrimination against any DP based on caste, gender, religion, or other protected characteristics and provisions must be made to make the individuals involved accountable for any such violation.
Leaving the Significant Impact of ‘Non-Significant’ Data Fiduciaries Out of Oversight
Notably, the law does not require even due diligence for algorithmic software when it is used by a non-significant DF. Yet a non-significant DF may well be processing sensitive data, which further complicates the issue of processing data through AI. This is another area where the DPDP framework imposes compliance only on Significant DFs, leaving non-significant DFs outside its obligations.
Section 10 of the Act mandates Significant DFs to conduct a Data Protection Impact Assessment (DPIA). However, it exempts non-significant DFs that process sensitive data. The concern is not merely the type of entity or the volume of personal data it processes, but the sensitivity of the data itself. For instance, where a non-significant DF runs an application for delivering medicines and requires DPs to upload their prescriptions, the entity will hold a large amount of sensitive health data.
Now that the distinction between personal data and sensitive personal data, as proposed in the draft bills, has vanished, it becomes even more important that the legislation at least prescribe a proactive approach to oversight of high-risk processing.
This can be done by expanding the scope of Section 10 to cover non-significant DFs that engage in high-risk data processing or use new technologies with privacy implications, and by providing detailed guidelines for conducting DPIAs. Despite arguments that such broadening of Section 10 would unduly burden smaller entities and hurt the ease of doing business, the government must prioritize the risks associated with sensitive data processing. If the government nonetheless hesitates to extend Section 10 to non-significant data fiduciaries processing sensitive data, it should, at a minimum, lower the volume threshold for SDF designation for entities processing such data.
The Data Protection Board: A Passive Stakeholder
The Indian Data Protection Board (DPB) has been envisaged as a passive stakeholder in the regulatory landscape. Its structure and functions as outlined in the Act suggest that the DPB is merely an adjudicatory body. The Act does not provide for proactive enforcement or a stronger regulatory oversight mechanism, thereby limiting the DPB's scope. The Board lacks several essential functions that are usually assigned to a national data protection authority.
A look at the national data protection authorities of other nations shows that they are tasked with many functions beyond adjudication. Japan's data protection authority, for example, issues guidelines on the handling of personal information and cross-border data transfers, and has the power to perform audits. South Korea's authority, apart from its adjudicatory functions, provides policy suggestions and cooperates with international organizations and foreign data protection authorities.
Generally, national data protection authorities are tasked with proactive enforcement, international cooperation, advising the government, technical standard-setting, issuing binding guidelines, standards, and sector-specific rules for data protection compliance, and public awareness functions. The DPB's role, however, is currently focused on enforcement, compliance, and consumer protection rather than advisory functions, international cooperation, and, most importantly, proactive regulatory oversight.
Need for a Stronger Oversight Mechanism
At this point, it is important to note the difference between enforcement powers and proactive enforcement. The Indian DPB enforces the law against, and oversees, businesses using the adjudicatory powers provided in Section 27 of the Act.
The Board can look into matters only on a complaint made by a DP against a fiduciary or consent manager, on a reference made by the central or a state government, or in compliance with the directions of a court. It has no power to enquire suo motu into breaches or non-compliance on the basis of public information (media reports, data breaches, whistleblowers) or of suspicions arising from regular compliance checks, nor any power to conduct random audits. In the UK, by contrast, the Information Commissioner's Office (ICO) has the power to conduct "dawn raids" under Section 146 of the Data Protection Act 2018. Similarly, in the EU, national data protection authorities have powers to initiate investigations on their own.
With regard to oversight of businesses, the only provisions in the law concern Significant DFs, which must submit reports of a DPIA and a data audit to the Board once every twelve months. The DPIA requirements given in the rules are quite relaxed, on top of applying only to SDFs. The GDPR, by contrast, mandates a DPIA for any controller whose processing is likely to result in a high risk to individuals' rights, and requires it to be carried out before the processing begins; where the residual risk remains high, it further mandates prior consultation with the supervisory authority before such processing may proceed.
The law is silent on the Board's powers in relation to those reports, leaving several questions unanswered. What actions will the Board take on the reports? Will it assess the risks and communicate with the fiduciary to implement stronger safeguards for risk prevention? Does the Board have the authority to prohibit a fiduciary from proceeding with a project that involves high-risk processing? And how will the Board evaluate the authenticity of the reports and ensure they accurately reflect the processing actually taking place?
These unanswered questions weaken the Board's authority to take appropriate action and give the impression of compliance that is mandated only on paper.
Ideally, the DPIA should be integrated into the planning phase of any new project or processing operation of a significant data fiduciary, with the reports required to be submitted to the Board. For high-risk processing such as profiling, prior consultation with, and permission from, the Board must be mandated. The Board must be empowered to order appropriate measures, including restraining fiduciaries from proceeding with such processing, based on its assessment of the reports. That assessment must weigh the risks, the safeguards in place, and whether a less intrusive means could achieve the same objective. The Indian Computer Emergency Response Team (CERT-IN) or another body must be empowered to conduct unannounced inspections of an organization's data protection practices, especially where there is reasonable suspicion of a violation or a public report of a data breach. To prevent misuse, clear safeguards must be put in place, such as pre-defined triggers for inspections, independent oversight mechanisms, and due-process protections. This would balance proactive enforcement with accountability, ensuring these powers serve their purpose without overreach.
Provisions must be made either to expand the scope of the DPB or to establish a new body (or notify existing bodies like CERT-IN) to perform the above functions, actively monitor businesses' compliance on the DPB's behalf, and assist it in investigations and other connected matters. A whistleblower-protection system should also be considered to encourage reporting of non-compliance by employees or third parties, along with mandatory external, certified audits of compliance with privacy standards, which would help the Board verify the authenticity of reports, especially for high-risk processing.
Strengthening the Data Principal through a Robust Framework for Enforcing Her Rights
Section 13(1) provides the right of grievance redressal. However, the draft rules do not prescribe any specific timeframe for responding to a data principal's grievance, leaving it to the DF or Consent Manager to prescribe their own. This may result in DFs prescribing elongated timelines for grievance redressal, which is even more concerning where sensitive data is involved.
To ensure timely redressal of DP grievances, the government must prescribe a cap on the number of days a DF can take to respond to and resolve a grievance, as was the case under the SPDI Rules, which gave the grievance officer one month to redress a grievance. This limit may vary across classes of DFs, with a shorter period for fiduciaries dealing with sensitive data.
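The tiered limits proposed above can be sketched as a simple lookup. The classes and day counts are illustrative assumptions only; the draft rules currently prescribe no such limits, and only the one-month SPDI figure comes from the text.

```python
# Hypothetical sketch of tiered grievance-redressal timelines. The classes
# and day limits are illustrative; only the 30-day figure mirrors the old
# SPDI Rules' one-month limit mentioned in the text.

GRIEVANCE_SLA_DAYS = {
    "sensitive-data fiduciary": 7,     # e.g. health or financial data
    "significant data fiduciary": 15,
    "other data fiduciary": 30,        # mirrors the SPDI one-month limit
}

def max_response_days(fiduciary_class: str) -> int:
    """Return the maximum days allowed to resolve a grievance."""
    # Default to the most protective (shortest) window for unknown classes.
    return GRIEVANCE_SLA_DAYS.get(fiduciary_class, min(GRIEVANCE_SLA_DAYS.values()))

print(max_response_days("sensitive-data fiduciary"))  # 7
```

Defaulting unknown classes to the shortest window follows the article's logic that ambiguity should not let a fiduciary claim the most relaxed timeline.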
Promoting Data Stewardship
The government can establish a voluntary certification scheme for DFs that meet high accountability standards. Certified organizations could be listed publicly to encourage compliance, which would also improve brand reputation and customer trust.
CERT-IN already runs a comparable scheme: after a successful Vulnerability Assessment & Penetration Test (VAPT), a CERT-IN-empanelled cybersecurity organization awards a certificate demonstrating adherence to established cybersecurity standards and a commitment to best practices. A data protection certification can similarly be promoted by partnering with industry associations to certify organizations that meet high accountability standards. DSCI has been working on a similar project called the ‘Data Protection Seal’, which aims to inform users about organizations that adhere to basic data privacy standards and use their data securely. The seal represents DSCI's attestation that the application, product, or platform has been scrutinized against the expected standards of data security and data privacy. A similar concept appears in Article 42 of the GDPR, which encourages member states to set up data protection certification bodies.
Conclusion
While the DPDP framework is a remarkable step towards data protection in India, it is plagued by various issues, particularly uninformed consent, inadequate oversight, and weak enforcement mechanisms. This article emphasizes the need for more transparent consent mechanisms, transparency and accountability in AI-driven data processing, and increased regulatory oversight, especially of non-significant data fiduciaries engaged in high-risk processing. It also calls for empowering the DPB with proactive enforcement powers, ensuring timely grievance redressal, and voluntarily certifying responsible data fiduciaries.
With the release of the Draft DPDP Rules, 2025, India is at a vital phase in the development of its legal framework, one that will define the state of privacy in the country. However, without a fundamental shift in privacy culture, expecting DPs to engage with complex privacy policies remains unrealistic. The measures suggested above aim to strengthen the Indian data protection framework, especially for the population the DPDPA leaves uncatered for. Yet there is no substitute for aware and informed DPs, who are the most important stakeholders in the privacy landscape; the weak acceptance of privacy in Indian culture, and the lack of education and awareness about privacy rights, complicate the situation.
A change in attitude towards privacy is essential for effective data protection. Beyond legislative measures, the government must proactively promote digital literacy through country-wide campaigns, ensuring data principals know their rights under the DPDP Act and how to exercise them. Without widespread awareness, even the strongest legal frameworks will be insufficient to safeguard personal data.
*The Author is a Fourth year B.A. LLB. (Hons.) student at the Dharmashastra National Law University, Jabalpur.