On 15 March 2021, the California legislature banned the use of ‘dark patterns’ through an amendment intended to strengthen the enforcement of the California Consumer Privacy Act (“CCPA”). Through this ban, California is attempting to “ensure that consumers will not be confused or misled when seeking to exercise their data privacy rights”, as stated by the California Attorney General. In essence, dark patterns are user interfaces in websites or applications designed to mislead users into giving away their rights and personal data, or to trick them into making choices they would otherwise not make. With legislators across the USA and the EU attempting to devise solutions to effectively regulate dark patterns, it is crucial for regulators in India to take note. This is especially so given the ongoing deliberations of the Joint Parliamentary Committee (“JPC”) on the Personal Data Protection Bill, 2019 (“PDP Bill”), and the additional extension it has received to present its report in the monsoon session of Parliament in 2021.
This post is divided into two parts. Part I examines the need for regulation of dark patterns, the existing regulation of dark patterns in other jurisdictions, and the challenges faced in doing so. Part II extrapolates the lessons learned from such regulation in other jurisdictions for regulation in India. It also argues that consumer protection laws must be used in tandem with data protection laws, and can be an effective avenue for regulating dark patterns where data protection laws fall short.
Part II: Lessons and Opportunities for Regulation in India
In Part I, I examined the need for regulation of dark patterns and the approaches to such regulation in the USA and the EU, with particular focus on the CCPA, the action by the FTC, and the GDPR. In this part, I evaluate the strengths and shortcomings of global regulation and the lessons they hold for Indian regulators. I assess the possibilities for regulation through the PDP Bill based on the proposed draft, making a case for the explicit prohibition of dark patterns, stronger provisions on privacy by design, and robust regulatory oversight. I also argue for adopting a rights-based approach to data protection rather than a consent-based framework, to more effectively regulate dark patterns, which rely on exploiting the loopholes in a framework based purely on informed consent. Lastly, I argue that consumer protection laws provide an avenue for regulatory pluralism and must be used in tandem with data protection laws.
Similar to the approaches in the USA and the EU, the current and proposed privacy laws in India rely heavily on user consent and transparency as parameters for protection, with the proposed Personal Data Protection Bill, 2019 placing heavy emphasis on the several ingredients that constitute ‘informed’ consent, as provided under Section 7 of the Bill. The current law in this regard, the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (“SPDI Rules”), also requires the informed consent of the user before their personal or sensitive personal data is collected.
Currently, reports suggest that the JPC proposes to overhaul the ambit of the PDP Bill to focus more on data localisation and digitisation than on personal data. What the final draft of the PDP Bill will look like after these revisions remains to be seen. For the purposes of this post, the 2019 draft is analysed.
Possibilities for Regulation through the PDP Bill
Drawing from the GDPR, the PDP Bill, 2019 requires data fiduciaries (entities which determine the purpose and means of processing personal data) to publish detailed privacy notices, which ensure that user consent is not only informed but free, specific, clear, and capable of being withdrawn. Theoretically, the requirements for consent under the proposed Section 11 of the PDP Bill would automatically render illegal dark patterns such as confusing privacy policies, the lack of an opt-out, or pre-checked acceptance of terms and conditions.
However, as observed from the studies conducted in the EU, strong regulatory action is required to effectively enforce the prohibition of dark patterns. In addition, a framework that rests heavily on user consent and transparency may not be sufficiently robust. This is because dark patterns exploit the loopholes surrounding the laws on consent and do not integrate transparency into design, treating it as sufficient to simply inform the user no matter how convoluted the design may be. Moreover, a study in the USA found that dark patterns depend on manipulating cognitive biases, essentially manufacturing an illusion of free consent that is then justified within a consent-based framework in privacy law.
Therefore, the explicit prohibition of dark patterns is a necessary impetus to ensure compliance. This is the approach adopted in the CCPA and CPRA, as well as by the FTC. Given that the trend of privacy legislation in India has been to strike a balance between industry interests and user rights, the approach of the new regulation under the CCPA, which limits the scope of regulation to certain types of dark patterns depending on their effect on the subversion of user rights, may be an effective middle ground to tackle the lack of compliance by several entities and websites. In this regard, the proposed Section 22 in the draft PDP Bill has the potential to include a robust framework for the prohibition of dark patterns. Section 22 proposes that all data fiduciaries must prepare a ‘privacy by design’ policy, which would need to be certified by the proposed Data Protection Authority (“DPA”). This policy would theoretically put user privacy at the forefront, with data fiduciaries obligated to specify the technology used in processing personal data, the measures adopted to avoid harm to the user, and the protection of privacy throughout processing from the point of collection to the point of deletion, among other requirements.
However, the framework for privacy by design in the PDP Bill comes with significant limitations. Section 22 only requires that the data fiduciary make positive declarations regarding measures implemented to protect user privacy. As a consequence, these disclosures would not indicate the flaws in the data fiduciary's design that could adversely impact the user. Additionally, the DPA is not obligated to look beyond the declarations submitted by the data fiduciary in the privacy by design policy. Were the scope of Section 22 amended to include obligations to this effect for both the data fiduciary and the DPA, it would become mandatory for entities to overhaul their approach to the UI experience and design processes that are inherently oriented toward the protection of privacy. If complemented by a strong message from a regulatory authority, such as the statements made by the FTC in relation to ABC Mouse, the PDP Bill, and Section 22 in particular, could become an effective tool through which to monitor and enforce the prohibition of dark patterns. Regulators could also consider encouraging entities to incorporate a standardised privacy options icon, as done in the CCPA.
To tackle the loopholes in a consent-based model of data protection law, scholarship in India has also suggested adopting a rights-based approach to data protection, as opposed to a purely consent-based approach. The consent-based approach places the onus on the individual to be aware of the terms under which they are sharing their data. This is impracticable given the sheer number of documents they must go through, as well as the rise of algorithmic inferences and automated data collection. Providing individuals with inalienable rights over their data and placing accountability on the data controller, rather than the user, is paramount to ensuring structural change in the user experience, which cannot be subverted by simply claiming that the user has been notified. I agree with the approach suggested in the scholarship to make the right of the user a right in rem, as opposed to a right only against entities with whom a contractual relationship exists. I would additionally propose that such an approach lends itself to the regulation of dark patterns, given that they fundamentally rely on using consent as a defence to their operation, regardless of how coerced or uninformed this consent may be. Were the right of the user in rem, such a defence would not be available.
It remains to be seen how regulations such as the CCPA will be enforced in practice. Nevertheless, an explicit acknowledgment of dark patterns within the regulatory framework is a crucial starting point for Indian regulators to ensure that informed consent is not subverted through clever interface design, as observed in the EU. It is important for the JPC to consider dark patterns in the latest iteration of the PDP Bill. To do so, it will be crucial to retain the framework on privacy by design and address its limitations, as discussed above.
Consumer Protection Law: An Avenue for Regulatory Pluralism
When examining the experience with the CCPA thus far, it is interesting to note that in October 2020, a consortium of seven leading advertising and marketing trade associations issued objections to one of the new regulations. In particular, the objections were aimed at a proposal which prevents users from having to read or listen to a list of reasons not to opt out before they are able to exercise that choice. The associations argued that “as a result of the proposed modifications, consumers’ receipt of factual, critical information about the nature of the ad-supported Internet would be unduly hindered, thereby undermining a consumer’s ability to make an informed decision.” While this objection did not succeed in amending the regulation, it indicates where privacy laws could fall short due to differing understandings of free consent and transparency. Consumer protection laws may offer a way forward. Scholarship has suggested this path of “regulatory pluralism”, which harnesses underutilised consumer protection laws to address the loophole in privacy laws that do not adequately address manufactured consent. This manufactured consent has been termed in the scholarship as “the privacy paradox”: consumers who do care about their privacy are easily manipulated by nudges against their own interests. The same study found that the paradox is more apparent where consumer protection laws are weaker.
In March 2021, the Hyderabad District Consumer Redressal Commission (“Commission”) fined the online aggregator BookMyShow and PVR Cinemas for levying unauthorised service fees, which the aggregator had termed ‘internet handling fees’. These service charges were not authorised by the appropriate regulatory authority, as required under Section 6A of the Information Technology Act, 2000. While this matter was oriented towards consumer protection, the Commission inadvertently regulated a dark pattern. ‘Hidden costs’ are a recognised form of dark pattern, in which a user discovers unexpected charges only at the final stage of checkout.
The Consumer Protection Act, 2019 and the Consumer Protection (E-Commerce) Rules, 2020 have sought to regulate emerging technologies by bringing e-commerce entities within their ambit. They widen the definition of a ‘consumer’ to include those who purchase through online transactions, and hold online platforms accountable for misleading advertisements and unfair trade practices. The inclusive definition of unfair trade practices provides the potential to address the particular challenge posed by dark patterns that are used to advertise and sell products and services on online platforms, or are connected with the “ad supported internet.” For instance, many online marketplaces automatically add items to a consumer’s cart or force a consumer into buying an additional product to avail of a discount coupon, both recognised forms of dark patterns. The Parliamentary Committee Report on the Consumer Protection (E-Commerce) Rules, 2020 has also recognised that consumer data privacy is often not respected on e-commerce platforms, recommending a more robust system that categorises data according to sensitivity. It also recognises practices such as click farming, the deceptive inflation of web traffic to a product in order to influence consumer decisions, which is itself a dark pattern. While once again not using the term ‘dark pattern’ explicitly, the Report demonstrates the potential for consumer protection law in India to regulate dark patterns.
While privacy laws are crucial to ensuring informed consent throughout the user experience and protecting the substantive rights of users through opt-outs, consumer protection laws may serve as a parallel method to target the particular dark patterns often found on e-commerce platforms.
This post has attempted to evaluate the possibilities for regulating dark patterns in India, drawing from the experiences in the United States and the EU. Dark patterns have exploited the loopholes in privacy laws and transparency principles. As of now, the direction of personal data protection law in India hangs in the balance. However, the global experience shows that an explicit definition and targeted regulatory action are required to bring about structural changes in the user experience, ensuring privacy by design and not merely free consent. As the PDP Bill is being debated, it is crucial for regulators to take cognisance of the issue and ensure that provisions that could give India greater potential to specifically regulate dark patterns are not removed in the next iteration. In addition, this post has argued that a rights-based approach, as contemplated in the scholarship, lends itself to the regulation of dark patterns and to ensuring privacy by design. It has also discussed how ‘regulatory pluralism’ through the use of consumer protection laws is a promising way forward to target dark patterns that find their way into everyday online transactions, regulating them in conjunction with privacy laws.