With the dawn of an era intent on preserving the data privacy rights bestowed upon individuals, it would be unbecoming to leave behind the rights that accrue in favor of children. The fast-paced digital ecosystem is now moving toward targeted, child-centric business activities and social platforms, and it is time to move toward a conforming, morally encompassing privacy framework. Given how little control a child exercises over their interests and the excesses they wish to indulge in, it falls to their guardians, their “digital custodians”, to ensure that nothing goes awry.
The proliferation of smart toys, wearables, and mobile devices has led to the unchecked generation, collection, and processing of personal and sensitive personal data. A child’s uninhibited desire to partake in online activities, and willingness to try “new and trending devices”, only adds to the woes of their guardians. As these devices blend into the background of their users’ daily lives, information on the children who carry them, or who access the digital platforms they serve, is seamlessly integrated into the systems of systematic collectors.
The onset of augmented reality, virtual reality, and artificial intelligence enabled tools, which rely intrusively on the incessant churning, processing, and analysis of user data, necessitates an overhaul of the current data privacy framework to keep children safe. Blatant and excessive exposure on a profane digital space leaves children vulnerable to unfiltered and possibly irresponsible content. Conventionally, the approach parents take in controlling the sphere of their children’s activities is parochial, affording children no opportunity to explore beyond what is deemed “right” for them. That approach has been upended by the proliferation of automated digital systems, which for the first time empowered children to decide for themselves. While as a society we are progressing toward allowing people (children included) to decide for themselves, it is not unfounded that children belong to a demographic bracket which is vulnerable, incapable and, for lack of a better word, uninitiated. It is with this premise that we proceed to examine the consequences of exposing children to pervasive, ubiquitous digital access, against the backdrop of the applicable data privacy and protection framework.
Recently, Instagram, the photo and video sharing service of technology giant Facebook, attempted to build a child-centric platform that would have allowed children to create their own profiles without having to deal with adults per se. The company reportedly admitted that its intent was to let children in the 10-12 age range become part of this closed and protected ecosystem while being online. The company contended that kids of this age group are already participating in the ecosystem in the guise of older individuals; so it might as well allow them access under stricter terms and parental guidance.
Without getting into the merits of this, it does seem like a safer opportunity to have kids in a closed-door environment, without “peeping toms” in the vicinity. However, much like kids holding false accounts in the names of older people, there is nothing stopping an offending adult from doing the same, bending the rules to their own advantage. Coupled with issues like the early onset of social aggression, low self-esteem, and a lack of physical activity and real-world interaction, affected kids may gravitate toward greater psychological problems in the long run.
There have been reports of highly disturbing incidents involving AR/VR experiences, wherein users have been subjected to unwanted sexual advances in the metaverse. The experience left adult users traumatized and scarred, because VR gives a heightened real-world sense of stimulus for any action made in the virtual space. Exposure of children to such debased and coarse virtual experiences may create an indelible impression on their minds and cause lasting psychological trauma in the long run. This behavior outstrips the existing severity of online bullying and cyber harassment; when it spills into the domains children frequent, it carries consequences far beyond what a child can comprehend.
Entities like Oculus disclaim liability for any inconvenience caused to users from a variety of sources when using Oculus products, and seek an acknowledgement that content may be inaccurate, offensive, indecent, or otherwise objectionable. While the company allows only children above the age of 13 to participate in the virtual worlds on its systems, the possibility of younger kids becoming part of this vile experience is real. Worse still, even children above 13 may not be well prepared to encounter the misgivings this ecosystem has to offer, absent appropriate remediation mechanisms put in place by these digital platforms.
In the realm of data privacy, consent plays a major role; it is inherently absent where a minor avails the services of any service provider, for a child’s consent will not qualify as free and/or valid consent.
In view of the aforementioned, and the differential levels of psychological capacity demonstrated by children under external influences, regulators across jurisdictions have set different age limits for qualifying consent. Accordingly, whereas the draft Data Protection Bill, 2021 (DP Bill) in India identifies any person under the age of 18 as a child, the European Union’s General Data Protection Regulation (GDPR) and the Children’s Online Privacy Protection Act [USA] (COPPA) benchmark this threshold at 16 years and 13 years respectively.
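The divergent thresholds above can be sketched as a simple consent-qualification check. This is an illustrative simplification, not legal advice: the jurisdiction codes and function name are assumptions, and the GDPR figure is its default (Article 8 permits member states to lower it to anywhere between 13 and 16).

```python
# Illustrative age thresholds below which verifiable parental consent
# is required, per the instruments discussed above (simplified).
CHILD_AGE_THRESHOLDS = {
    "IN": 18,  # draft Data Protection Bill, 2021 (India)
    "EU": 16,  # GDPR default; member states may set 13-16
    "US": 13,  # COPPA
}

def requires_parental_consent(age: int, jurisdiction: str) -> bool:
    """Return True if a user of this age needs parental consent
    in the given jurisdiction (per the simplified table above)."""
    return age < CHILD_AGE_THRESHOLDS[jurisdiction]

# A 15-year-old is a "child" under the DP Bill and the GDPR,
# but not under COPPA.
print(requires_parental_consent(15, "IN"))  # True
print(requires_parental_consent(15, "US"))  # False
```

The point the sketch makes concrete is that a single global platform cannot apply one age gate: the same 15-year-old user triggers parental-consent obligations in one jurisdiction and none in another.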
Further, entities must provide direct notice to parents at an identifiable location on the website, obtain verifiable parental consent before collecting children’s data, and provide parents with the right to access and review their children’s information. Outside of these specific protocols, COPPA requires entities to abide by the general privacy principles of data integrity, storage minimization, and data retention.
Internet of Toys
Unlike smart toys, which merely enable interactions with their users, connected toys facilitate connectivity between web-based servers and the device in the hands of the child, allowing information to be sought from end users and pushed into the servers to build a profile of the kid using the device. Where there is mere use of persistent identifiers, such devices would ideally fall outside the scope of some applicable laws, the identifiers not necessarily qualifying as personal information. A slight overstep, however, into storing or processing even a fragment or snippet of a conversation may qualify the data as personal information, attracting the same level of compliance required for all other processing activities.
Verifiable parental/guardian consent must remain the primary ground for data collection and processing, to ensure oversight of children’s activities and keep their legal guardians informed of their wards’ activities in the online sphere. To combat the vulnerabilities children may be subjected to, digital platforms have been adding riders to block entry by underage or unqualified persons. Age gating has been viewed unfavorably by netizens, but it must be regarded as a necessary tool to deter unsavory elements from making their way into the accesses and privileges that children become privy to.
The earlier iteration of India’s data privacy legislation created a new segment of data fiduciary, the “guardian data fiduciary”, to take up the responsibility of affording minors a semblance of control over information pertaining to them. This category of custodians has not made its way into the present iteration, possibly because such a relationship between a data fiduciary and a data principal (minor) would not only be onerous, but also fraught with unnecessary challenges, inaccuracies, and uncoordinated implementation.
In the case of adults, a class of data fiduciaries functioning as consent managers has been introduced; these fiduciaries are entrusted with managing the consent of data subjects, and in practice some consent managers have also taken on the task of sifting through the best industry practices of the service providers with whom a user might want to engage. Similarly, parents and legal guardians must take up the role of consent managers for their wards, managing consent across platforms for their kids and keeping the children’s best interests at the core of their role as guardians.
To ensure that no inherent bias seeps into the information fed as base datasets to artificial intelligence/machine learning algorithms, European regulators have taken a strict view toward the regulation of artificial intelligence. Given the need to assess the impact on children who participate in this digital ecosystem, an equivalent set-up is required in the context of today’s all-pervasive AR/VR solutions.
In the context of the foregoing, in an economy like India where there is disparity among socio-economic classes, it is imperative that consideration be given to the awareness imparted to the people who participate in this digital ecosystem. For entities which facilitate gaming, gambling, or social interactions between people from distinct classes (where the level of awareness and understanding stands as an inherent disparity), requirements must be in place prior to the onboarding of such facilities, or before allowances are made to such participants.
As these data subjects belong to an age group which cannot be treated as responsible for its own actions, and may also be unable to understand the consequences of participating in this ecosystem, it is pertinent that an additional layer of comfort be provisioned for such categories of data subjects. While data privacy focuses on preserving the rights of a data subject, the qualifications and observations made by a data subject of questionable age would be susceptible to scrutiny in terms of admissibility and enforceability.
It is with the intent of serving a larger number that privacy norms must account for a larger populace, including those not capable of making prudent choices in even the simplest of circumstances. There was a time when the choices ranged from cultivating plots of land for a single crop (Farmville); we have since moved to children building territories to defend their own interests in an AR/VR world (Age of Empires).
To this end, it is imperative that distinctions be drawn between what is to be consumed by a child and what is meant for general public consumption.
Sapna Chaurasia is a Partner at TMT Law Practice with 16 years of experience handling both litigation and transactions. Her areas of expertise are media and entertainment, dispute resolution, corporate advisory, employee issues, and ethics and integrity compliance.
Siddhant Gupta is an Associate with TMT Law Practice. He is a graduate of the 2015-2020 batch of Symbiosis Law School, Pune, and his core areas of interest lie in Intellectual Property Laws and Media and Entertainment Laws. Siddhant has previous internship experience in the intellectual property and litigation fields and interned with TMT Law Practice in 2020.
Andrew Grove, co-founder and former CEO of Intel Corporation, said in a 2000 interview:
“Privacy is one of the biggest problems in this new electronic age. At the heart of the internet culture is a force that wants to find out everything about you. And once it has found out everything about you and two hundred million others, that’s a very valuable asset, and people will be tempted to trade and do commerce with that asset. This wasn’t the information that people were thinking of when they called this the information age.”
This leaves people living in this age to either choose to be left alone and preserve their own privacy, or be connected across a series of globally interconnected networks. Following trends of, inter alia, data breaches, the free flow of information across group companies, and the review of decisions enabling commercial data transfers, this past year witnessed significant developments in data privacy and protection laws across the globe. Much focus was put on the rights of consumers in this ever “data”-consuming world, where each technology platform offering a service or product “online” churns out large volumes of personal and sensitive personal data. Parenthetically, the web world, which has witnessed steady growth in children-specific service delivery platforms, also prompted reconsideration of the rights and controls afforded to their guardians, and of the roles and obligations of their “data custodians”.
This past year was driven by concerns around data privacy, with people becoming acutely aware of the information they generate and the rights they possess. With work from home becoming the norm over the past couple of years, 20% of organizations experienced a breach attributable to a remote worker; consequently, companies became proactive in revamping their IT strategies, and cybersecurity best practices became the focus.
The early half of the past year was also embroiled in apprehensions around contact tracing and vaccine passports, bringing health privacy to the fore. It became important to strike a balance between containing the spread of the virus and preserving the privacy of the individuals concerned. The UK government had to withdraw the original version of its contact-tracing app and moved toward a decentralized model.
Apple drove home the renewed demand for data privacy by introducing privacy labels to the App Store, giving users a glimpse of an app’s privacy practices before they download it. Closer home, we now have a draft Data Protection Bill, for good or ill, yet to be seen.
Let’s look at the year, for what it was.
The European Data Protection Board adopted Guidelines on the interplay between Art. 3 and Chapter V GDPR – November 2021.
Data Protection and the European Union’s anti-money laundering regulations – July 2021
The European Commission adopted a package of legislative proposals to strengthen the anti-money laundering regime in Europe by preventing the use of financial systems for money laundering and terrorism financing. The proposals create a framework for coordination among national and financial authorities and establish an EU anti-money laundering authority. The regulations stress the data protection compliance obligations of private entities and EU bodies to conduct necessary risk assessment audits and to take reasonable steps to prevent such financial crimes when pursuing outsourcing relationships. Link: https://ec.europa.eu/info/publications/210720-anti-money-laundering-countering-financing-terrorism_en
Amsterdam District Court recognizes a GDPR right to an explanation for algorithmic decision-making – March 2021.
The Court required Ola to explain the logic behind a fully automated decision within the meaning of Article 22 of the GDPR. It held that Ola must communicate the main assessment criteria, and their role in the automated decision, to the drivers, so that they can understand the criteria on the basis of which the decisions were taken and can check the correctness and lawfulness of the data processing.
The European Commission (EC) proposed the framework for the creation of a European Digital Identity Wallet, to permit European citizens to store payment details, passwords, official documents at one secure location. The Wallet will enable Europeans to access government services online without the use of private identification methods, thereby restricting the mirroring of similar data units across services.
The European Union approved the revised ePrivacy Regulation, which aims to oversee all forms of electronic communications services within the Union. The Regulation imposes compliance obligations regarding communication content and metadata, restricts the monitoring/processing of user data without prior consent, simplifies cookie rules to allow user-friendly browser settings, and bans unsolicited electronic communications by email, SMS, or automated calling machines.
Germany’s Telecommunications and Telemedia Data Protection Act (TTDSG) – December 2021
The TTDSG seeks to collate data protection provisions relating to telecommunications and telemedia, previously spread across separate legislations, into a single statute. It aims to protect the confidentiality and privacy of users accessing internet-ready infrastructure such as websites, messaging services, or smart home devices. Key highlights include the processing of personal and non-personal data, broadened application to IoT (Internet of Things) devices, consolidation of data protection rules in one place, introduction of rights for heirs of telecommunications users, and regulation of cookies and cookie banners.
US ban on Chinese telecom subsidiaries over national security concerns – October 2021
The Federal Communications Commission (FCC) revoked the authorizations of Chinese telecommunication companies operating in the country. It argued that such companies could be subject to exploitation, influence, and control by the Chinese government, and to compliance demands backed by neither sufficient legal procedure nor judicial oversight. Subject to such control, the FCC argued, these companies could pose substantial security and law enforcement risks while processing information generated in the country.
Colorado became the third state in the USA to enact comprehensive data privacy legislation. The Colorado Privacy Act (CPA) applies to companies that conduct business in Colorado, or sell products or services intentionally targeted to Colorado residents, and meet either of the following thresholds: (i) control or process the personal data of 100,000 or more consumers during a calendar year; or (ii) derive revenue or receive discounts from the sale of personal data and control or process the data of at least 25,000 consumers.
China’s Regulation for Industrial and Telecom Data Security – September 2021
The Ministry of Industry and Information Technology (MIIT) recently introduced the Measures for the Administration of Data Security in the Field of Industry and Information Technology (Trial) (Draft) (Measures). The Measures were drafted in furtherance of China’s Data Security Law and can be enforced against the industrial and telecommunications sectors. Briefly, the Measures categorize data by risk level into three classes – ordinary, important, and core data – while mandating localization of core data along with other filing compliances. The Measures also provide guidelines on data management, data inspection, and related legal responsibilities.
Virginia’s Consumer Data Protection Act – August 2021
The CDPA expands consumer rights to access, correct, delete, and obtain a copy of personal data provided to or collected by a company, and to opt out of the processing of personal data for purposes of targeted advertising, sale, or profiling. It applies to all persons that conduct business in the Commonwealth and either (i) control or process the personal data of at least 100,000 consumers or (ii) derive over 50 percent of gross revenue from the sale of personal data and control or process the personal data of at least 25,000 consumers.
On 4 June 2021, the Commission issued modernized standard contractual clauses (SCCs) under the GDPR for data transfers from controllers or processors in the EU/EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EU/EEA (and not subject to the GDPR).
These modernized SCCs replace the three sets of SCCs that were adopted under the previous Data Protection Directive 95/46. Since 27 September 2021, it is no longer possible to conclude contracts incorporating these earlier sets of SCCs.
Until 27 December 2022, controllers and processors can continue to rely on those earlier SCCs for contracts that were concluded before 27 September 2021, provided that the processing operations that are the subject matter of the contract remain unchanged.
Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Judgment in Case C-311/18 (Schrems II); https://curia.europa.eu/jcms/upload/docs/application/pdf/2020-07/cp200091en.pdf, last accessed on January 24, 2022, at 2147 hrs. In the Schrems II judgment, the Court of Justice of the European Union declared the European Commission’s Privacy Shield Decision invalid on account of invasive US surveillance programs, thereby making transfers of personal data on the basis of the Privacy Shield decision illegal. Furthermore, the Court stipulated stricter requirements for transfers of personal data based on standard contractual clauses (SCCs).
This post seeks to analyze the changes brought about by the Policy, with a focus on the ramifications of using shared hosting services with WhatsApp’s group/parent entity, Facebook. In the latter segment, the post analyzes the conformity of WhatsApp’s Policy with the existing legal regime and the legal implications of the Policy.
What does the Policy Entail?
The WhatsApp Policy focuses on increasing data transparency by keeping users informed of: (a) the information collected by WhatsApp, (b) the processing and handling of user’s data, (c) the manner of use of user’s data by businesses who opt for Facebook hosting services to manage user information and (d) payments on WhatsApp.
Ever since the introduction of the Policy, WhatsApp has issued multiple notifications and FAQs to assuage user concerns. A feature unique to WhatsApp, and the reason it has garnered manifold users across the globe, is the End-to-End Encryption (ETE) of user conversations. Simply put, the ETE feature masks the sender’s messages and communications using unique keys/codes, so that they are decipherable only by the intended receiver, who holds the corresponding decryption key. ETE does not allow any third party, not even WhatsApp or its related group entities, to intercept or decipher the content shared between sender and receiver. While fake news and rumours circulating early this year spoke to the contrary, the Policy does not alter the ETE Signal protocol that WhatsApp observes when messages are exchanged between users. The only exception to the mandatory ETE feature is where WhatsApp, under direction of a competent authority, is required to intercept, monitor, or decrypt the communications exchanged between users.
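The key-based masking described above can be illustrated with a deliberately simplified sketch. This toy XOR one-time pad is an assumption made for illustration only; it stands in for the actual Signal protocol (which uses authenticated key exchange and ratcheting session keys) purely to show that an interceptor without the shared key sees only scrambled bytes, while the key holder recovers the message exactly:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy cipher: XOR each byte of data with the matching key byte.
    XOR is its own inverse, so one function both encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"see you at five"
key = secrets.token_bytes(len(message))  # shared only by sender and receiver

ciphertext = xor_cipher(key, message)   # what any interceptor would see
recovered = xor_cipher(key, ciphertext) # only the key holder gets this back

assert recovered == message
```

The design point mirrored here is that security rests entirely on who holds the key: because neither WhatsApp nor any intermediary holds it, the platform itself cannot read the traffic it relays.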
Collection and Processing of Data:
As per the Policy, while WhatsApp is not privy to user conversations, it collects manifold information and personal data, including identifiers, for its day-to-day operations. Unlike the previous policy, the new Policy provides more clarity on data points, including the usage and log information WhatsApp collects. It also collects device-related information such as hardware model and connection information, including phone numbers, time zones, IP addresses, and identifiers, including identifiers unique to Facebook associated with the same device or account. However, the Policy is concerning in that it lacks clarity on the extent to which identified user information collected by WhatsApp is consequently shared with Facebook group companies or other third-party service providers.
Sharing of Data with Third Party Service Providers:
Unlike the ambiguities and non-disclosures of WhatsApp’s older policy, the Policy clarifies that WhatsApp shares user login information, device locations, unique identifiers relevant to Facebook, etc. with Facebook group entities. It is clear that WhatsApp engages third-party service providers and other Facebook companies to operate, provide, improve, understand, customize, support, and market its services. Interestingly, this feature is not a new addition; it existed even in the older version of the policy, though not in as granular a form as the present one.
Communication with Business Accounts:
The newest feature of the Policy is the change in privacy settings for user-business communications. The Policy allows businesses to communicate and interact with each other and with users, who can browse products and services and place orders. While WhatsApp has assured users that the ETE protocol is retained for user-business communications, a deeper analysis of the Policy suggests otherwise. As long as communication is exchanged within WhatsApp, ETE protection is guaranteed; however, once a business entity opts for Facebook or other third-party hosting services, this principle is compromised. WhatsApp’s clever disclaimers caution users that information shared with such business contacts may be used for the business entities’ own marketing purposes, which may include advertisement on Facebook, and may even be accessed by several employees in the business organisation. Interestingly, to safeguard itself and avoid liability, WhatsApp also ensures that every user conversing with such business accounts is informed, by way of labels appearing at the top of such conversations, whether the business entity has opted for hosting services from Facebook.
Legality of the Policy:
The Information Technology Act, 2000 (Act) read with the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (Rules) only provide limited protection to sensitive personal data without an effective enforcement mechanism. Therefore, the nuances in WhatsApp’s Policy, particularly, the sharing of information with third party service providers will go unnoticed. No doubt, in such a loosely worded legislative scheme, the Policy will effectively sail through.
However, from a constitutional perspective, where horizontal enforcement of rights against a private entity is recognized and informational privacy is considered a fundamental right, it is important to analyze whether the Policy meets adequate data protection standards. The Supreme Court has recognized that, as an important aspect of informational privacy, one must have complete control over the dissemination of information that is personal. WhatsApp’s mandatory consent requirement for sharing information with third parties, including Facebook Companies, is in the teeth of this principle. This specific aspect formed the subject matter of challenge in a writ petition before the Delhi High Court and is pending adjudication before the Supreme Court.
In view of users’ outrage, government demands, and the pending court proceedings, WhatsApp extended the deadline for accepting the Policy to May 15, 2021. It is only a matter of time before users learn the fate of their information on WhatsApp.
Atmaja Tripathy, Senior Associate, TMT Law Practice
Atmaja practices litigation before different courts and tribunals in Delhi, including the Supreme Court of India, the High Court of Delhi, the Telecom Disputes Settlement and Appellate Tribunal, the Competition Commission of India, and the National Company Law Appellate Tribunal. Atmaja is enrolled with the Delhi Bar Council. At law school, she won multiple scholarships for academic excellence, including the Nanhi Palkhiwala Scholarship for Constitutional Law, the Ram Jethmalani Scholarship, and the Director’s Gold Medal for Outstanding Excellence in the graduating batch. Her interests in technology, media and telecommunication laws, competition law, and constitutional law have led her to pursue prestigious moots and essay competitions. Atmaja has also published articles on contemporary legal issues in reputed international and national journals such as the European Competition Law Review, Kluwer Business Law Journal, BRICS Law Journal, All India Reporter, and Company Law Journal.
India currently has a ban imposed on some 800 pornographic websites. In a recent development, the police of an Indian state (State Police) devised a tactic to keep women safe: keeping tabs on people’s porn search histories. Without diving into the legality of the porn ban, this article discusses the legality of monitoring search history in general, and the need for a law governing data protection in India.
Violation of Fundamental Rights Pursuant to Actions of the State Police:
In 2017, the Supreme Court in its decision of K.S. Puttaswamy v Union of India held that the right to privacy under Article 21 of the Constitution constituted a fundamental right as a part of the right to “life” and “personal liberty”. The judgment also recognized the aspect of “informational privacy” and held that information about a person and the right to access that information also needs to be given the protection of privacy.
The Information Technology Rules (“IT Rules”) define “personal information” to include any information that relates to an identified or identifiable natural person. Accordingly, a person’s search history amounts to “personal information”, as it is unique to a person, much like a fingerprint. A browser’s search history can convey information about a person, to the point of allowing extraction of their psychometric or demographic preferences, which technically constitutes personal information. Therefore, monitoring search history is privacy sensitive, and a person’s browsing history should ideally be given the protection of privacy. While the right to privacy is not absolute, any restriction on it must adhere to the three-prong test of legality, legitimacy, and necessity.
In this backdrop, let us evaluate if the measure adopted by the State Police satisfies the aforesaid three prongs.
Legality: An act or measure can be said to be legal if it is provided for in law. Section 69 of the Information Technology Act (“IT Act”) empowers the State or Central Government, or any of its officers, to adopt monitoring measures if considered necessary or expedient in view of the legitimate aims listed in the section. Since ‘police’ is listed under List II of the Seventh Schedule of the Indian Constitution, the police is a State subject with the power to take actions it deems necessary under Section 69 of the Act. Therefore, the State Police’s measure of tracking the browser history of persons who search for pornographic content can be said to have a basis under the IT Act. However, whether a person’s entire browser history may be searched on the basis of a single criterion remains questionable.
Legitimacy: A restrictive measure may be taken if it is in pursuance of a legitimate aim. Keeping in view Section 69 of the Act, a bare reading of the section suggests that measures for interception or monitoring of information may be taken only in pursuance of the legitimate aims such as maintaining public order, investigating an offence, amongst others.
Browsing pornographic content is not a criminal offence per se; only the viewing of child pornographic content is illegal in India. Accordingly, the State Police’s monitoring of an individual’s browsing history, where the underlying activity is not an offence, cannot be said to be a legitimate act on the part of the police. While crimes against women are punishable offences, browsing pornographic content is not; police surveillance of such data with a view to preventing those offences is therefore dubious.
Public disorder is caused when there is a disruption of peace and tranquillity with the aim of undermining the security of the State and overthrowing it. Borrowing this understanding of public order into the law governing privacy, it is difficult to see how this measure by the State Police would help uphold public order. The private browsing of pornography does not amount to any breach of public order, and the measures taken by the State Police were thus not backed by any legitimate aim.
Necessity: Finally, a restriction may be imposed only if it is necessary in a democratic society. A pressing social need must exist, and the measure in question must be proportionate to the legitimate aim sought. Multiple studies suggest that the private consumption of pornography does not affect the overall treatment of women in society; there is no evidence to support the notion that pornography consumption encourages crimes against women. While other fundamental rights may at times be preferred over an individual’s right to privacy, this can only be done where private activities like the one in question cause significant harm to others.
In view of the non-fulfilment of the three prongs, the State Police’s action appears to curtail an individual’s right to freely explore and indulge in their own personal tastes and convictions. As J.S. Mill expounds, a person’s own good, either physical or moral, is not a ground for State interference with that person’s liberty. Further, the action creates a chilling effect on the freedom of speech and expression, impinging upon an invaluable fundamental right which includes within its ambit the right to receive information and access content on the internet.
India presently lacks a law governing the protection of personal data, which allows measures like the one taken by the State Police to escape legal scrutiny. Such instances further highlight the need for a personal data protection law in India.
 Shreya Sundararaman, Penultimate Year Law Student. Shreya interned at TMT Law Practice in March 2021.
News 18, ‘UP Police Will Monitor Your Porn Searches in Internet History. Will it Reduce Crimes Against Women?’ (16 February 2021) URL <https://www.news18.com/news/buzz/porn-ban-tracking-up-police-womens-safety-3440225.html>.
K.S. Puttaswamy v Union Of India (2017) 10 SCC 1 para 77 (“Puttaswamy”).
Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011, 11 April 2011, Rule 2(i).
Lukasz Olejnik, Claude Castelluccia and Artur Janc, ‘Why Johnny Can’t Browse in Peace: On the Uniqueness of Web Browsing History Patterns’ (5th Workshop on Hot Topics in Privacy Enhancing Technologies, HotPETs 2012, July 2012, Vigo, Spain); Sarah Bird, Ilana Segall and Martin Lopatka, ‘Replication: Why We Still Can’t Browse in Peace: On the Uniqueness and Identifiability of Web Browsing Histories’ URL <https://www.usenix.org/system/files/soups2020-bird.pdf>.
K.S. Puttaswamy v Union of India (2019) 1 SCC 1 paras 835-36 (“Puttaswamy 2”).
Romesh Thappar v State of Madras AIR 1950 SC 124 para 10.
Suresh Bada Math, Biju Viswanath, Ami Sebastian Maroky, Naveen C. Kumar, Anish V. Cherian and Maria Christine Nirmala, ‘Sexual Crime in India: Is it Influenced by Pornography?’ (2014) Indian J Psychol Med 147-152; Jessica Brown, ‘Pornography is now only an internet search away, and is becoming ever more immersive. How is it changing people’s behaviour, relationships and desires?’ (BBC, 26 September 2017) URL <https://www.bbc.com/future/article/20170926-is-porn-harmful-the-evidence-the-myths-and-the-unknowns>.
Caroline West, ‘Pornography and Censorship’, The Stanford Encyclopedia of Philosophy (Fall 2018 Edition), Edward N. Zalta (ed.) URL <https://plato.stanford.edu/archives/fall2018/entries/pornography-censorship/>.
Gautam Bhatia, Offend, Shock or Disturb: Free Speech under the Indian Constitution (OUP) 49-50.