Children in the Online World: Are They Safe

With the dawn of an era that seeks to preserve the rights of individuals with respect to data privacy, it would be unbecoming to leave behind the rights accruing in favor of children. Given a fast-paced digital ecosystem that is increasingly focused on targeted, child-centric business activities and social platforms, it is time to move towards a conforming and morally encompassing privacy framework. With the very little control that a child exercises over their interests, and the excesses they wish to indulge in, it falls to their guardians, the "digital custodians", to ensure that nothing goes awry.

The proliferation of smart toys, wearables, and mobile devices has led to the unchecked generation, collection, and processing of personal and sensitive personal data. A child's uninhibited desire to partake in online activities, and willingness to try "new and trending" devices, only adds to the woes of their guardians. As these devices blend into the background of their users' daily lives, information on the children who carry them, or who access the digital platforms, gets seamlessly integrated into the systems of the systematic collectors.

The onset of augmented reality, virtual reality, and artificial intelligence-enabled tools, which rely intrusively on the incessant churning, processing, and analysis of user data, necessitates an overhaul of the current data privacy framework for the safekeeping of children. Blatant and excessive exposure in the profane digital space makes children vulnerable to unfiltered and possibly irresponsible content. Conventionally, the approach taken by parents in controlling the sphere of their children's activities has been parochial, denying children the opportunity to explore beyond what is deemed "right" for them. This has been upended by the proliferation of automated digital systems, which for the first time empower children to decide for themselves. While as a society we are progressing towards allowing people (children included) to decide for themselves, it remains true that children belong to a demographic bracket which is vulnerable, incapable and, for lack of a better word, uninitiated. It is with this premise that we proceed to examine the consequences of exposing children to pervasive, ubiquitous digital access, against the backdrop of the applicable data privacy and protection framework.

Online Excesses

Recently, Instagram, the photo and video sharing service of the technology giant Facebook, attempted to launch a child-centric platform that would have allowed children to build their own profiles without having to deal with adults per se. The company reportedly admitted that its intent was to allow children in the 10-12 year range to become part of this closed and protected ecosystem while being online[1]. The company contended that kids of this age group are already participating in the ecosystem in the guise of older individuals; so, it might as well allow them access under stricter terms and parental guidance.

Without getting into the merits of this, it does seem a safer opportunity to have kids in a closed-door environment, without "peeping toms" in the vicinity. However, much as kids create false accounts in the names of older people, there is nothing stopping an offending adult from doing the same, bending the rules to their own advantage. Coupled with issues like the early onset of social aggression, low self-esteem, and a lack of physical activity and real-world interaction, affected kids could gravitate towards greater psychological problems in the long run.

There have been reports of highly disturbing incidents involving AR/VR experiences, wherein users have been subjected to unwanted sexual advances in the metaverse. The experiences left even older users traumatized and scarred, because VR gives a heightened sense[2] of real-world stimulus for any action made in the virtual space. Exposure of children to such debased and coarse virtual experiences may create an indelible impression on their minds and lasting psychological trauma in the long run. This behavior outstrips the existing severity of online bullying and cyber harassment, and its translation into domains that children frequent has consequences far beyond what a child can comprehend.

Entities like Oculus disclaim liability for any inconvenience caused to users from a variety of sources when using Oculus Products, and seek an acknowledgement to the effect that content may be inaccurate, offensive, indecent, or otherwise objectionable[3]. While the company allows only children above the age of 13 years to participate in the virtual worlds on its systems, the possibility of younger kids becoming part of this vile experience is real. Worse, even children above the age of 13 years may not be well prepared to encounter what this ecosystem has to offer, absent appropriate remediation mechanisms put in place by these digital platforms.

Regulatory Landscape

In the realm of data privacy, consent plays a major role; this is inherently absent in the case of a minor availing the services of any service provider, for a child's consent will not qualify as free and/or valid consent.

In view of the aforementioned, and the differential levels of psychological capacity demonstrated by children based on external influences, regulators across jurisdictions have set different age limits at which consent qualifies as valid. Accordingly, whereas the draft Data Protection Bill, 2021 (DP Bill) in India identifies any person under the age of 18 as a child, the European Union's General Data Protection Regulation (GDPR) and the Children's Online Privacy Protection Act [USA] (COPPA) benchmark this threshold at 16 years and 13 years respectively.
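As a quick sketch, the three thresholds cited above can be captured in a simple lookup. The jurisdiction keys and the function name below are illustrative labels of my own, not statutory terms:

```python
# Age-of-consent thresholds cited in the text; labels are illustrative.
CHILD_AGE_THRESHOLD = {
    "IN_DP_BILL_2021": 18,  # draft Data Protection Bill, 2021 (India)
    "EU_GDPR": 16,          # GDPR default; member states may lower it to 13
    "US_COPPA": 13,         # Children's Online Privacy Protection Act (USA)
}

def parental_consent_required(jurisdiction: str, age: int) -> bool:
    """True where the user is a 'child' whose own consent is not valid."""
    return age < CHILD_AGE_THRESHOLD[jurisdiction]
```

A platform operating across jurisdictions would, under this sketch, gate its sign-up flow on the strictest threshold that applies to the user.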

COPPA stands as the highest standard in place to monitor, regulate and afford children their right to privacy as a core and inalienable construct in the online world. Enacted in 1998, with its implementing rule substantially amended in 2013, COPPA was the culmination of a series of actions and investigations conducted by the Federal Trade Commission (FTC) to ascertain the data collection and processing practices of e-commerce websites in the USA. COPPA places the responsibility on parents to authorize and verify any data collection or processing activities conducted by websites, ensuring a degree of control over children's data being shared, transmitted or collected by online portals. Accordingly, entities regulated under COPPA are mandated to post a clear, readable privacy policy describing their processing activities concerning children.

Further, entities must provide direct notice to parents at an identifiable location on the website, obtain verifiable consent from parents before collecting children's data, and provide parents with the right to access their children's information for review. Outside of these specific protocols, COPPA requires entities to abide by the general principles of privacy in relation to data integrity, storage minimization and data retention.
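The workflow described above (no collection from a young child's account until verifiable parental consent is on file, plus a parental right of review) can be sketched minimally as follows. All class, field and function names here are my own illustration, not drawn from the statute or any real API:

```python
from dataclasses import dataclass, field

@dataclass
class ChildAccount:
    age: int
    parental_consent_verified: bool = False
    collected: list = field(default_factory=list)

COPPA_AGE = 13  # COPPA's threshold for "child"

def collect(account: ChildAccount, data_point: str) -> bool:
    """Record a data point only if collection is permitted; report success."""
    if account.age < COPPA_AGE and not account.parental_consent_verified:
        return False  # direct notice + verifiable consent must come first
    account.collected.append(data_point)
    return True

def parental_review(account: ChildAccount) -> list:
    """The parental right of access: review everything collected so far."""
    return list(account.collected)
```

The point of the sketch is that consent verification is a precondition checked at every collection event, not a one-time sign-up box.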

Internet of Toys

Unlike smart toys, which merely enable interactions with their users, connected toys facilitate connectivity between web-based servers and the device in the hands of the child, allowing information to be sought from end users and pushed into the servers to build a profile of the kid using the device. Where there is mere use of persistent identifiers, such devices would ideally fall outside the scope of some applicable laws, the identifiers not necessarily qualifying as personal information. A slight overstep, into storing or processing even a fragment or snippet of a conversation, may qualify the data as personal information, attracting the same level of compliance required for all other processing activities.

Verifiable parental/guardian consent must remain the primary ground for data collection and processing, to ensure oversight over children's activities and keep their legal guardians informed of their wards' conduct in the online sphere. To combat the vulnerabilities that children may be subjected to, digital platforms have been adding riders to block entry by underage or unqualified persons. Age gating has been viewed unfavorably by netizens, but it must be regarded as a necessary tool to deter unsavory elements from obtaining the accesses and privileges that children become privy to.

The earlier iteration of the data privacy legislation in India created a new segment of data fiduciary, the "guardian data fiduciary", to take up the responsibility of affording minors a semblance of control over the information pertaining to them. This category of custodians has not made its way into the present iteration, possibly because such a relationship between a data fiduciary and a data principal (a minor) would be not only onerous but also fraught with unnecessary challenges, inaccuracies, and uncoordinated implementation.

In the case of adults, a class of data fiduciaries functioning as consent managers has been introduced; such fiduciaries are entrusted with managing the consent of data subjects and, in practice, some have also taken on the task of sifting through the industry practices of the service providers with whom a user might want to engage. Similarly, parents and legal guardians must take up the role of consent managers for their wards, managing consent across platforms while keeping the children's best interests at the core of their role as guardians.

In order to ensure that no inherent bias seeps into the information fed as base datasets for artificial intelligence/machine learning algorithms, European regulators have taken a strict view towards the regulation of artificial intelligence[4]. Given the need to assess the impact on children who participate in this digital ecosystem, an equivalent set-up is required for the all-pervasive AR/VR solutions of today.


In this context, in an economy like India where there is disparity amongst socio-economic classes, it is imperative to consider the awareness imparted to the people who participate in this digital ecosystem. For entities which facilitate gaming, gambling or social interactions between people from distinct classes (where the level of awareness and understanding is an inherent disparity), it is imperative that requirements be put in place prior to the onboarding of such facilities, or before allowances are made to such participants.

As these data subjects belong to an age group that cannot be treated as responsible for its own actions, and may be unable to understand the consequences of participation in this ecosystem, it is pertinent that an additional layer of comfort be provisioned for such categories of data subjects. While data privacy focuses on the preservation of a data subject's rights, the qualifications and observations made by a data subject of questionable age would be susceptible to scrutiny in terms of admissibility and enforceability.

It is with the intent of serving the larger number that privacy norms must account for a broader populace, including those not capable of making prudent choices even in the simplest of circumstances. There was a time when the choices varied between having plots of land cultivated for a singular crop (Farmville); we have since moved to children building territories to defend their own interests in AR/VR worlds (Age of Empires).

To this end, it is imperative to delineate what is fit for consumption by a child from what is meant for general public consumption.

[1]; last accessed on February 23, 2022, at 1040 hrs.

[2]; last accessed on February 23, 2022 at 1130 hrs.

[3]; last accessed on February 22, 2022 at 1037 hrs.

[4] Draft Artificial Intelligence Act, 2021

Sapna Chaurasia is a Partner at TMT Law Practice with 16 years of experience handling both litigation and transactions. Her expertise spans media and entertainment, dispute resolution, corporate advisory, employment issues, and ethics and integrity compliance.
Siddhant Gupta is an Associate with TMT Law Practice. He is a graduate of the 2015-2020 batch of Symbiosis Law School, Pune, and his core interests lie in intellectual property and media and entertainment laws. Siddhant has previous internship experience in the intellectual property and litigation fields and interned with TMT Law Practice in 2020.
Data Privacy, 2021: The Year That It Was

Andrew Grove, co-founder and former CEO of Intel Corporation, said in an interview in 2000:

Privacy is one of the biggest problems in this new electronic age. At the heart of the internet culture, is a force that wants to find out everything about you. And once it has found out everything about you and two hundred million others, that’s a very valuable asset, and people will be tempted to trade and do commerce with that asset. This wasn’t the information that people were thinking of when they called this the information age.[1]

This allows people living in this age either to choose to be left alone and preserve their privacy, or to be connected across a series of globally interconnected networks. Following the trends of, inter alia, data breaches[2], free flow of information across group companies[3], and review of decisions enabling commercial data transfers[4], this past year witnessed significant developments in data privacy and protection laws across the globe. Much focus was placed on the rights of consumers in this ever "data"-consuming world, where every technology platform offering a service or product delivered "online" churns out large volumes of personal and sensitive personal data. Meanwhile, the web-world, which has been witnessing steady growth in children-specific service delivery platforms, also prompted reconsideration of the rights and controls afforded to their guardians, and of the roles and obligations of their "data custodians".

This past year was driven by concerns around data privacy, with people becoming acutely aware of the information they generate and the rights they possess. With work from home becoming the norm over the past couple of years, 20% of organizations experienced a breach caused by a remote worker[5]; consequently, companies became proactive in revamping their IT strategies, and cybersecurity best practices became the focus.

The early half of the year was also embroiled in apprehensions around contact tracing and vaccine passports[6], bringing health privacy to the fore. It became important to strike a balance between containing the spread of the virus and preserving the privacy of the individuals concerned. The UK government had to withdraw the original version of its contact tracing app and move towards a decentralized model[7].

Apple drove home the renewed demand for data privacy by introducing privacy labels in the App Store, giving users a glimpse of an app's privacy practices before they download it[8]. Closer home, we now have a draft Data Protection Bill, for good or ill, yet to be seen.

Let’s look at the year, for what it was.

The European Data Protection Board adopted Guidelines on the interplay between Art. 3 and Chapter V GDPR – November 2021.

The Guidelines clarify the interplay between the territorial scope of the GDPR (Art. 3) and the provisions on international transfers in Chapter V. They aim to assist controllers and processors in the EU in identifying whether a processing operation constitutes an international transfer, and to provide a common understanding of the concept of international transfers.

Data Protection and European Union’s anti-laundering regulations – July 2021

The European Commission adopted a package of legislative proposals to strengthen the anti-money laundering regime in Europe by preventing the use of financial systems for money laundering and terrorism financing. The proposals create a framework for coordination among national financial authorities and establish an EU anti-money laundering authority. They also stress the data protection compliance of private entities and EU bodies, requiring them to conduct the necessary risk assessment audits and to take reasonable steps to prevent such financial crimes while pursuing outsourcing relationships.

Amsterdam District Court recognizes a GDPR right to an explanation for algorithmic decision-making – March 2021.

The Court required Ola to explain the logic behind a fully automated decision within the meaning of Article 22 of the GDPR. It held that Ola must communicate the main assessment criteria, and their role in the automated decision, to the drivers, so that they can understand the criteria on the basis of which decisions were taken and check the correctness and lawfulness of the data processing.


European Digital Identity Wallets – June 2021

The European Commission (EC) proposed a framework for the creation of a European Digital Identity Wallet, to permit European citizens to store payment details, passwords, and official documents in one secure location. The Wallet will enable Europeans to access government services online without the use of private identification methods, thereby restricting the mirroring of similar data units across services.


The Joint Parliamentary Committee on the Personal Data Protection Bill, 2019 (JPC), tabled its report in both houses of Parliament – December 2021.

The JPC report, which contains a list of policy recommendations based on an analysis of various provisions of the PDP Bill, 2019, also contained a draft bill titled the Data Protection Bill, 2021. The report's focus was to address public policy concerns that have arisen of late, taking into consideration the judgment of the Hon’ble Supreme Court in Justice K.S. Puttaswamy (Retd.) v. Union of India. Link:,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf

DPA Guidance on Contact Tracing Applications

In view of the prevalence of contact tracing applications (public and private) during the pandemic, several data protection authorities (DPAs) published guidance documents on the safe maintenance of customer records and personal data. The guidance documents rely upon the privacy principles of data minimization, transparency, safe and limited storage, and deletion upon exhaustion of the collection purpose to safeguard user privacy.

European Union’s ePrivacy Regulation

The Council of the European Union agreed its negotiating mandate on the revised ePrivacy Regulation, which aims to oversee all forms of electronic communications services within the Union. The Regulation would impose compliance obligations concerning communication content and metadata, restrict the monitoring or processing of user data without prior consent, simplify cookie rules to allow user-friendly browser settings, and ban unsolicited electronic communications by email, SMS or automated calling machines.


Germany’s Telecommunications and Telemedia Data Protection Act (TTDSG) – December 2021

The TTDSG collates data protection provisions relating to telecommunications and telemedia, previously spread across separate legislations, into a single statute. It aims to protect the confidentiality and privacy of users accessing internet-ready infrastructure such as websites, messaging services and smart home devices. Key highlights include: the processing of personal and non-personal data; broadened application to IoT (Internet of Things) devices; consolidation of data in one location; introduction of rights for the heirs of telecommunication users; and regulation of cookies and cookie banners.


US ban on Chinese telecom subsidiaries over national security concerns – October 2021

The Federal Communications Commission (FCC) revoked the authorization of Chinese telecommunication companies to operate in the country. It argued that such companies could be subject to exploitation, influence and control by the Chinese government, without sufficient legal procedures or judicial oversight to check that control. Subject to such control, the FCC argued, the companies could pose substantial security and law enforcement risks while processing information generated in the country.


Colorado Privacy Act – July 2021

Colorado became the third state in the USA to implement a comprehensive data privacy legislation. The CPA applies to companies that conduct business in Colorado, or sell products or services intentionally targeted to Colorado residents, and meet either of the following thresholds: (i) control or process the personal data of 100,000 or more consumers during a calendar year; or (ii) derive revenue, or receive a discount on the price of goods or services, from the sale of personal data and control or process the data of at least 25,000 consumers.


China’s Regulation for Industrial and Telecom Data Security – September 2021

The Ministry of Industry and Information Technology (MIIT) recently introduced the Measures for the Administration of Data Security in the Field of Industrial and Information Technology Sectors (Trial) (Draft) (Measures). The Measures were drafted in furtherance of China's Data Security Law and can be enforced against the industrial and telecommunication sectors. Briefly, they categorize data by risk level into three tiers, ordinary, important and core data, while mandating localization of core data along with other filing compliances. The Measures also provide guidelines on data management, data inspection and other legal responsibilities.


Virginia’s Consumer Data Protection Act – August 2021

The CDPA expands consumer rights to access, correct, delete, and obtain a copy of personal data provided to or collected by a company, and to opt out of the processing of personal data for purposes of targeted advertising, sale, or profiling. It applies to all persons that conduct business in the Commonwealth and either (i) control or process the personal data of at least 100,000 consumers or (ii) derive over 50 percent of gross revenue from the sale of personal data and control or process the personal data of at least 25,000 consumers.
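The two-pronged applicability test above reduces to a short predicate. The parameter names below are my own, and the actual statute carries definitions and exemptions this toy check ignores:

```python
# Sketch of the CDPA applicability thresholds described in the text.
def cdpa_applies(does_business_in_va: bool,
                 consumers_with_data_processed: int,
                 pct_gross_revenue_from_data_sales: float) -> bool:
    if not does_business_in_va:
        return False
    if consumers_with_data_processed >= 100_000:
        return True  # prong (i): volume alone
    return (pct_gross_revenue_from_data_sales > 50.0
            and consumers_with_data_processed >= 25_000)  # prong (ii)
```

Note that prong (ii) is conjunctive: a data broker deriving most revenue from sales still escapes if it processes data of fewer than 25,000 consumers.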


Standard Contractual Clauses – June 2021

On 4 June 2021, the Commission issued modernized standard contractual clauses under the GDPR for data transfers from controllers or processors in the EU/EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EU/EEA (and not subject to the GDPR).

These modernized SCCs replace the three sets of SCCs that were adopted under the previous Data Protection Directive 95/46. Since 27 September 2021, it is no longer possible to conclude contracts incorporating these earlier sets of SCCs.

Until 27 December 2022, controllers and processors can continue to rely on those earlier SCCs for contracts that were concluded before 27 September 2021, provided that the processing operations that are the subject matter of the contract remain unchanged.
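The transition timeline above can be sketched as a simple date check (dates from the June 2021 Commission decision; the function and variable names are illustrative, and this simplification is not legal advice):

```python
from datetime import date

OLD_SCC_CONCLUSION_CUTOFF = date(2021, 9, 27)  # no new old-SCC contracts from this date
GRACE_PERIOD_END = date(2022, 12, 27)          # old SCCs relied upon until this date

def old_sccs_usable(concluded_on: date, relied_on: date,
                    processing_unchanged: bool) -> bool:
    if concluded_on >= OLD_SCC_CONCLUSION_CUTOFF:
        return False  # contract could not validly incorporate the old SCCs
    if relied_on > GRACE_PERIOD_END:
        return False  # grace period has expired
    return processing_unchanged  # any change to processing forfeits the grace
```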


[1]; originally published in the May 2000 issue; last accessed on January 15, 2022, at 0915 hrs.

[2] last accessed on January 15, 2022, at 1055 hrs; last accessed on January 15, 2022, at 1055 hrs. The major breaches were across several consumer facing sectors, where the efficient delivery of the service is dependent on the accuracy of the information which is provided by the data subject.

[3] last accessed on January 24, 2022, at 2137 hrs. The technology giant, WhatsApp, made data sharing mandatory for the business accounts, with Facebook. This has drawn ire of several regulators (privacy, anti-trust) across the globe. The awareness and concerns raised by the users, forced the platform, to give more time for making its own policy mandatorily applicable across the existing users. Also see:, last accessed on January 24, 2022, at 2140 hrs.

[4] Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems, Judgment in Case C-311/18 (Schrems II);, last access on January 24, 2022, at 2147 hrs. In the Schrems II judgment, the Court of Justice of the European Union declared the European Commission’s Privacy Shield Decision invalid on account of invasive US surveillance programs, thereby making transfers of personal data on the basis of the Privacy Shield decision illegal. Furthermore, the Court stipulated stricter requirements for the transfer of personal data based on standard contract clauses (SCCs).

[5], last accessed on January 24, 2022, at 0940 hrs.

[6], last accessed on January 26, 2022, at 1140 hrs.

[7], last accessed on January 26, 2022, at 1138 hrs.

[8], last accessed on January 26, 2022, at 1129 hrs.




The year 2021 began with a lot of skepticism. WhatsApp’s in-app notification of a change in its privacy policy[1] (Policy) triggered a series of discussions with the government and amongst social thinkers, as a result of the unyielding confusion in the minds of the general public. The update left users only a narrow choice: comply with the Policy before February 08, 2021 or exit the platform, and most people complied. While many aware users opted for alternative internet-based messaging platforms like Telegram and Signal, the altered Policy has ever since triggered prolonged discussion. As a result of the criticism and disconcerted opinion amongst users, the deadline for mandatory compliance with the modified Policy was extended to May 15, 2021. However, the open-ended issue requiring consideration hovers around the adequacy of the Policy in protecting users’ privacy.

This post seeks to analyze the changes brought about in the Policy, with a focus on the ramifications of the use of hosting services shared with WhatsApp’s parent entity, Facebook. In the latter segment, the post analyzes the Policy’s conformity with the existing legal regime and its legal implications.

What does the Policy Entail?

The WhatsApp Policy focuses on increasing data transparency by keeping users informed of: (a) the information collected by WhatsApp, (b) the processing and handling of users’ data, (c) the manner of use of users’ data by businesses who opt for Facebook hosting services to manage user information, and (d) payments on WhatsApp.

  1. End-to-End Encryption:

Ever since the introduction of the Policy, WhatsApp has issued multiple notifications and FAQs[2] to assuage user concerns. A signature feature of WhatsApp, and a reason it has garnered manifold users across the globe, is the End-to-End Encryption (ETE) of user conversations. Simply put, the ETE feature masks the messages and communications of the sender using unique keys/codes so that they are decipherable only by the intended receiver, who holds the corresponding decrypting key.[3] An ETE feature does not allow any third party, not even WhatsApp or its related group entities, to intercept or decipher the content shared between the sender and the receiver. While fake news and rumours doing the rounds early this year spoke to the contrary,[4] the Policy does not alter the ETE Signal protocol observed by WhatsApp when messages are exchanged between users. The only exception to the mandatory ETE feature is an instance where WhatsApp, under the direction of a competent authority, is required to intercept, monitor or decrypt the communications exchanged between users.[5]
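The end-to-end idea can be illustrated with a toy sketch: the two ends agree a shared key (here via textbook Diffie-Hellman over a small Mersenne prime), and the relay in the middle only ever sees public values and ciphertext. This is emphatically not the Signal protocol WhatsApp actually uses; the parameters and the XOR "cipher" below are for illustration only:

```python
import hashlib
import secrets

# Toy parameters -- real systems use standardized groups (e.g. Curve25519).
P = 2**127 - 1  # a Mersenne prime
G = 3

def keypair():
    private = secrets.randbelow(P - 2) + 1
    return private, pow(G, private, P)  # (private key, shareable public value)

def shared_key(my_private: int, their_public: int) -> bytes:
    # Both ends compute G^(a*b) mod P and hash it into a symmetric key.
    return hashlib.sha256(str(pow(their_public, my_private, P)).encode()).digest()

def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, message: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(message, keystream(key, len(message))))

decrypt = encrypt  # XOR with the same keystream reverses itself
```

Two users who exchange only their public values derive the same key; a server relaying the ciphertext cannot recover the message without one of the private values, which is the property the Policy's ETE assurance rests on.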

  2. Collection and Processing of Data:

As per the Policy, while WhatsApp is not privy to user conversations, it collects manifold information and personal data, including identifiers, for its day-to-day operational activity. Unlike the previous policy, the new Policy provides more clarity on data points, including the usage and log information collected by WhatsApp. It also collects device-related information such as hardware model and connection information, including phone numbers, time zones, IP addresses and identifiers, including identifiers unique to Facebook associated with the same device or account. However, the Policy is concerning in that it lacks clarity on the extent to which identified user information collected by WhatsApp is consequently shared with Facebook group companies or other third-party service providers.

  3. Sharing of Data with Third Party Service Providers:

The Policy clarifies that WhatsApp shares user login information, device locations, unique identifiers relevant to Facebook, etc. with Facebook group entities, unlike the ambiguities and non-disclosures of WhatsApp’s older policy.[6] It is clear that WhatsApp engages third-party service providers and other Facebook companies to operate, provide, improve, understand, customize, support, and market its services. Interestingly, this feature is not a new addition; it existed even in the older version of the policy, though not in as granular a form as the present one.

In instances where users opt only for WhatsApp’s service and opt out of the services of other Facebook group entities, mandating the collection and sharing of their data with Facebook does not conform to data protection standards, including the principle of a concise purpose limitation for the processing and use of data. Lastly, in case of a conflict or a compromise of user data, it is unclear from the present Policy whether the terms of service and privacy policy of WhatsApp will prevail, or those of the other Facebook group entity.

The different standards adopted by WhatsApp are evident from a perusal of WhatsApp’s privacy policy in the European Union (EU) region[7] and in other parts of the world. Interestingly, in order to be General Data Protection Regulation (GDPR) compliant, in the EU region WhatsApp does not voluntarily share user information with third-party service providers for its infrastructure, service or program development, unlike its policy in the rest of the world. It is only when the user opts for third-party services, such as a cloud service integrated with the system for backup, that the user’s information is shared with third parties. In such cases, the users will naturally be bound by the terms and privacy policies of those services, having opted for them of their own volition. This stark distinction between the EU-region policy and that for the rest of the world demonstrates that WhatsApp is more than willing to comply with stricter norms; the absence of norms, however, at times facilitates its commercial intent, making it conducive for the platform to process large volumes of data.

  4. Communication with Business Accounts:

The newest feature of the Policy is the change in user privacy settings for user-business communication. The Policy allows businesses to communicate and interact with each other and with users, who can browse through products and services as well as place orders. While WhatsApp has assured users that the ETE protocol is retained in user-business communications, a deeper analysis of the Policy speaks to the contrary. As long as communication is exchanged within WhatsApp, ETE protection is guaranteed. However, once the business entity opts for Facebook or other third-party hosting services, this principle is compromised.[8] WhatsApp’s clever disclaimers caution users that information shared with such business contacts may be used for the business entities’ own marketing purposes, which may include advertisement on Facebook, or may be accessed by several employees in the business organisation. Interestingly, to safeguard itself and avoid liability, WhatsApp also ensures that every user conversing with such business accounts is informed, by way of labels appearing at the top of such conversations, whether the business entity has opted for hosting services from Facebook.

The aforesaid aspect of the Policy also raises important questions: who will be liable for a breach of privacy in such an arrangement? Can WhatsApp compel its users to tacitly consent to the processing of their data by third-party entities? Further, can a user be expected to be aware of the privacy policy of not just WhatsApp but also of third-party hosting services, including the Facebook companies? WhatsApp’s Policy does not provide clear answers to these questions. Therefore, a user concerned about data protection is left with only two options: (a) refrain from communicating with business accounts on WhatsApp, or (b) switch to an alternative internet messaging platform.

Legality of the Policy:

The Information Technology Act, 2000 (Act), read with the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (Rules), provides only limited protection to sensitive personal data, without an effective enforcement mechanism. Therefore, the nuances of WhatsApp’s Policy, particularly the sharing of information with third-party service providers, will go unnoticed. In such a loosely worded legislative scheme, the Policy will, no doubt, effectively sail through.

However, from a constitutional perspective, where horizontal enforcement of rights against private entities is recognized and informational privacy is considered a fundamental right,[9] it is important to analyze whether the Policy meets adequate data protection standards. The Supreme Court has recognized that, as an important aspect of informational privacy, one must have complete control over the dissemination of one’s personal information.[10] WhatsApp’s mandatory consent requirement for sharing information with third parties, including the Facebook companies, is in the teeth of this principle. This specific aspect formed the subject matter of challenge in a writ petition before the Delhi High Court[11] and is pending adjudication before the Supreme Court.[12]

Recently, a public interest litigation was filed before the Delhi High Court challenging the revised Policy.[13] The absence of a user’s choice to actively consent to the processing and sharing of his data is challenged as being violative of the right to privacy. In fact, the petition also highlights the disparity between the Policy and WhatsApp’s privacy policy in the EU region. The matter is pending adjudication before the High Court.


WhatsApp’s privacy policy, particularly its mandatory scheme requiring user consent to the sharing of data with the Facebook companies, has created a stir in India. The Ministry of Electronics and Information Technology has sought clarifications from WhatsApp on issues pertaining to its privacy, data transfer and sharing regime and its general business practices.[14] As per the Indian government, the present Policy will have a disproportionate impact on Indian citizens. However, so long as India lacks a regulatory regime for personal data protection, large data-generating entities such as WhatsApp/Facebook will slither away from any strict compliance conditions.

In view of users’ outrage, government demands and the pending court proceedings, WhatsApp has extended the deadline for accepting the Policy to May 15, 2021. It is only a matter of time before users learn the fate of their information on WhatsApp.

[1] WhatsApp Privacy Policy, as of January 04, 2021

[2] (last accessed on 26.02.2021, at 19:00); Answering Your Questions about WhatsApp’s Privacy Policy, (last accessed on 26.02.2021, at 19:00); About New Business Features and WhatsApp’s Privacy Policy Update, (last accessed on 26.02.2021, at 19:10); We’re updating our Terms of Service and Privacy Policy, (last accessed on 26.02.2021, at 19:00)

[3] Technical White Paper on WhatsApp Encryption Overview, (last accessed on 26.02.2021, at 19:20)

[4] Fact Check: No, WhatsApp is not recording your calls but privacy concerns can’t be ruled out yet, India Today, January 23, 2021 (last accessed on 26.02.2021, at 19:30)

[5] The Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules 2009, Rule 3

[6] WhatsApp Privacy Policy, as of December 19, 2019 (last accessed on 26.02.2021, at 16:45)

[7] WhatsApp’s Privacy Policy in the EU Region, as of April 2018, (last accessed on 26.02.2021, at 16:20)

[8] Answering your Questions about WhatsApp’s Privacy Policy, (last accessed on 26.02.2021, at 19:15)

[9] Justice K.S. Puttaswamy (Retired) v. Union of India, WP (C) 494 of 2012

[10] Ibid, ¶81

[11] Karmanya Singh Sareen & Anr. v. Union of India & Ors., WP (C) 7663 of 2016, Delhi High Court

[12] Karmanya Singh Sareen & Anr. v. Union of India & Ors. SLP No. 804 of 2017

[13] Chaitanya Rohilla v. Union of India and Ors., WP (C) 677/ 2021, Delhi High Court

[14] Questions That India Asked WhatsApp on Privacy and Data Security, NDTV, January 19, 2021, (last accessed on 26.02.2021, at 19:40)


Atmaja Tripathy, Senior Associate, TMT Law Practice

Atmaja practises litigation before various courts and tribunals in Delhi, including the Supreme Court of India, the High Court of Delhi, the Telecom Disputes Settlement and Appellate Tribunal, the Competition Commission of India and the National Company Law Appellate Tribunal. Atmaja is enrolled with the Delhi Bar Council. At law school, she won multiple scholarships for academic excellence, including the Nani Palkhivala Scholarship for Constitutional Law, the Ram Jethmalani Scholarship and the Director’s Gold Medal for Outstanding Excellence in the graduating batch. Her interests in technology, media and telecommunication laws, competition law and constitutional law have led her to pursue prestigious moots and essay competitions. Atmaja has also published articles on contemporary legal issues in reputed international and national journals such as the European Competition Law Review, Kluwer Business Law Journal, BRICS Law Journal, All India Reporter and Company Law Journal.

Monitoring Search History: Implications on Privacy

India currently has a ban imposed on some 800-odd pornographic websites. In a recent development, the police of an Indian state (State Police) devised a tactic to keep women safe: keeping tabs on people’s porn search history.[2] Without diving into the legality of the porn ban, this article discusses the legality of monitoring search history in general, and the need for a law governing data protection in India.

Violation of Fundamental Rights Pursuant to Actions of State Police:

In 2017, the Supreme Court in K.S. Puttaswamy v Union of India held that the right to privacy under Article 21 of the Constitution is a fundamental right, as part of the right to “life” and “personal liberty”.[3] The judgment also recognized the aspect of “informational privacy” and held that information about a person, and the right to access that information, also need the protection of privacy.[4]

The Information Technology Rules (“IT Rules”) define “personal information” to include any information that relates to an identified or identifiable natural person.[5] Accordingly, a person’s search history amounts to “personal information”, as it is unique to a person, much like his/her fingerprints. A browser’s search history is capable of conveying information about a person, even to the extent of revealing his/her psychometric or demographic preferences, which technically constitutes personal information.[6] Therefore, monitoring search history is privacy-sensitive, and a person’s browsing history should ideally be given the protection of privacy. While the right to privacy is not absolute, any restriction on the said right must adhere to the three-prong test of legality, legitimacy and necessity.[7]
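The fingerprint-like uniqueness of browsing histories, documented in the study cited above, can be illustrated with a minimal sketch (the data, domain names and matching method here are hypothetical illustrations, not drawn from the study itself): treating each history as a set of visited domains, even a short observed history can often be matched back to a single known profile.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two history sets: 1.0 = identical, 0.0 = disjoint."""
    union = a | b
    return len(a & b) / len(union) if union else 1.0

# Hypothetical stored browsing histories, one set of domains per user.
histories = {
    "user_a": {"news.example", "mail.example", "forum.example"},
    "user_b": {"news.example", "shop.example", "video.example"},
    "user_c": {"mail.example", "forum.example", "blog.example"},
}

# A freshly observed history is matched to the closest known profile,
# re-identifying the individual despite no name being attached to it.
observed = {"forum.example", "news.example", "mail.example"}
best_match = max(histories, key=lambda u: jaccard(histories[u], observed))
print(best_match)  # → user_a
```

Because real histories span hundreds of domains, such overlaps are far more distinguishing in practice than in this toy example, which is precisely why a browsing history can operate as personal information.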

In this backdrop, let us evaluate if the measure adopted by the State Police satisfies the aforesaid three prongs.

Legality: An act or a measure is legal if it has been provided for in law.[8] Section 69 of the Information Technology Act (“IT Act”) empowers the State or Central Government, or any of its officers, to adopt measures for monitoring if considered necessary or expedient in view of the legitimate aims listed in the section.[9] Since ‘police’ is listed under List II of the Seventh Schedule of the Indian Constitution, the police is a State subject with the power to take actions it deems necessary under Section 69 of the IT Act.[10] Therefore, the measure taken by the State Police to track the browsing history of persons who search for pornographic content can be said to have a basis under the IT Act. However, whether an individual’s entire browser history may be searched on the basis of a single criterion remains questionable.

Legitimacy: A restrictive measure may be taken only in pursuance of a legitimate aim.[11] A bare reading of Section 69 of the IT Act suggests that measures for interception or monitoring of information may be taken only in pursuance of legitimate aims such as maintaining public order or investigating an offence, amongst others.[12]

  • Browsing pornographic content is not a criminal offence per se; only viewing child pornographic content is illegal in India. Accordingly, the State Police’s monitoring of browsing history, where the underlying activity is not an offence, cannot be said to be a legitimate act on the part of the police. While crimes against women are punishable offences, browsing pornographic content is not; police surveillance of such data with a view to preventing those offences is therefore dubious.
  • A public disorder is caused when there is a disruption of peace and tranquility with the aim of undermining the security of the State and overthrowing it.[13] Borrowing this understanding of public order into the law governing privacy, it is difficult to see how this measure by the State Police would help uphold public order. Private browsing of porn does not amount to any breach of public order, and the measures taken by the State Police are thus not backed by any legitimate aim.

Necessity: Finally, a restriction may be imposed only if it is necessary in a democratic society: a pressing social need must exist, and the measure in question must be proportionate to the legitimate aim sought. Multiple studies suggest that the private consumption of pornography does not affect the overall treatment of women in society.[14] There is no evidence to support the notion that porn consumption encourages crimes against women. While other fundamental rights may, at times, be preferred over an individual’s right to privacy, this can only be done where private activities like the one in question cause significant harm to others.


In view of the non-fulfillment of the three prongs, the State Police’s action appears to curtail an individual’s freedom to explore and indulge in their own personal tastes and convictions.[15] As J.S. Mill expounds, a person’s own good, whether physical or moral, is not a ground for state interference with that person’s liberty.[16] Further, the action creates a chilling effect on the freedom of speech and expression, impinging on an invaluable fundamental right which includes within its ambit the right to receive information and access content on the internet.

India presently lacks a law governing the protection of personal data, and in its absence, measures like the one taken by the State Police escape legal scrutiny. Such instances further highlight the need for a personal data protection law in India.

[1] Shreya Sundararaman, Penultimate Year Law Student. Shreya interned at TMT Law Practice in March 2021.

[2] News 18, ‘UP Police Will Monitor Your Porn Searches in Internet History. Will it Reduce Crimes Against Women?’ (16th February 2021) URL <>.

[3] K.S. Puttaswamy v Union Of India (2017) 10 SCC 1 para 77 (“Puttaswamy”).

[4] Puttaswamy (n 2), para 54.

[5] Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules 2011, 11th April 2011, Rule 2(i).

[6] Lukasz Olejnik, Claude Castelluccia and Artur Janc, ‘Why Johnny Can’t Browse in Peace: On the Uniqueness of Web Browsing History Patterns’, 5th Workshop on Hot Topics in Privacy Enhancing Technologies (HotPETs 2012), July 2012, Vigo, Spain; Sarah Bird, Ilana Segall and Martin Lopatka, ‘Replication: Why We Still Can’t Browse in Peace: On the Uniqueness and Identifiability of Web Browsing Histories’ URL <>.

[7] K.S. Puttaswamy v Union Of India (2019 ) 1 SCC 1 para 835-36 (“Puttaswamy 2”).

[8] id.

[9] The Information Technology Act 2000, 09 June 2000, Section 69 (“IT Act”).

[10] Constitution of India, Seventh Schedule, List II, Entries 1 and 2.

[11] id.

[12] IT Act (n 8), Section 69.

[13] Romesh Thappar v State of Madras  AIR 1950 SC 124 para 10.

[14] Suresh Bada Math, Biju Viswanath, Ami Sebastian Maroky, Naveen C. Kumar, Anish V. Cherian, and Maria Christine Nirmala, “Sexual Crime in India: Is it Influenced by Pornography?” (2014)  Indian J of Psychol Med. 147-152; Jessica Brown, ‘Pornography is now only an internet search away, and is becoming ever more immersive. How is it changing people’s behaviour, relationships and desires?’ (BBC, 26th September 2017) URL <>.

[15] West, Caroline, “Pornography and Censorship”, The Stanford Encyclopedia of Philosophy (Fall 2018 Edition), Edward N. Zalta (ed.), URL <>.

[16] Gautam Bhatia, Offend, shock or disturb: Free speech under the Indian Constitution (OUP) 49-50.

Author: Ms. Shreya Sundararaman
