By Anuradha Gandhi and Isha Sharma
In recent years, TikTok, the Chinese-owned short-video platform, has witnessed a meteoric rise in popularity, particularly among teenagers. This surge has made TikTok a global sensation, shaping the social media landscape and redefining how the younger generation interacts with content, and with each other, in the digital arena.
However, the rise of TikTok has not been without its share of controversies and concerns. Issues related to data privacy and content moderation have raised questions about the platform’s safety and security, leading to increased scrutiny by regulators and policymakers in various countries.
TikTok has become a frequent target of parents, policymakers and regulators who are wary of the company’s data-collection practices and the platform’s effect on the mental health of young people. In a 2022 survey by the Pew Research Center, 67 percent of American teens said they use TikTok, with 16 percent saying they use it “almost constantly.”
Fine on TikTok
Recently, the European Union’s lead regulator for TikTok imposed a substantial fine of 345 million euros on the platform for violating the EU’s strict data privacy rules, particularly those pertaining to the processing of children’s personal data.
The 345 million euro fine, equivalent to roughly 369 million dollars, represents the culmination of a two-year inquiry conducted by Ireland’s Data Protection Commission (DPC).
The Irish regulatory authority, which holds a pivotal role in enforcing the EU’s General Data Protection Regulation (GDPR), granted TikTok a three-month period to align its data processing practices with the prescribed rules.
In September 2021, the DPC initiated an assessment of TikTok’s adherence to GDPR standards concerning platform settings and the processing of personal data of users below the age of 18. The DPC also scrutinized the age verification measures adopted by TikTok, concluding that there were no infringements in this regard. However, the investigation did reveal that the platform had not conducted a thorough assessment of the potential risks associated with younger individuals registering for its services.
In its ruling dated September 15, 2023, the regulator highlighted that the investigation found that children who signed up for TikTok had their accounts configured as “public” by default, allowing anyone to view and comment on their content.
The DPC also raised concerns regarding TikTok’s “family pairing” mode, which is intended to link parents’ accounts with those of their teenage children. Its investigation found that the company did not adequately verify the parental or guardian status of users of this feature.
TikTok respectfully disagrees
However, TikTok respectfully disagrees with the decision to impose the fine, saying it had made the relevant changes months before the investigation even began, toughening the parental controls in family pairing in November 2020 and making all accounts for teens under 16 private by default in January 2021.
TikTok also contended that it closely monitors the age of its users and takes action as and when required. Furthermore, TikTok claims to have removed approximately 17 million accounts worldwide in the first three months of this year on suspicion that they belonged to people under 13 years old.
TikTok has announced its intentions to enhance its privacy documentation, with a specific focus on providing greater clarity on the distinction between public and private accounts. As part of these efforts, the platform will automatically preselect a private account option for users aged 16 or 17 when they register for the app, starting later this month.
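The age-based defaults described above can be sketched roughly as follows. This is purely an illustration of the policy as reported; the class and function names are hypothetical and do not reflect TikTok’s actual code.

```python
from dataclasses import dataclass


@dataclass
class AccountDefaults:
    is_private: bool          # account visibility at sign-up
    preselected_only: bool    # True if the user can opt out during registration


def defaults_for_age(age: int) -> AccountDefaults:
    """Return hypothetical sign-up privacy defaults based on declared age."""
    if age < 16:
        # Under-16 accounts are private by default (TikTok's stated policy
        # since January 2021).
        return AccountDefaults(is_private=True, preselected_only=False)
    if age < 18:
        # For 16- and 17-year-olds, the private option is preselected at
        # registration but can be changed by the user.
        return AccountDefaults(is_private=True, preselected_only=True)
    # Adult accounts are not defaulted to private.
    return AccountDefaults(is_private=False, preselected_only=False)
```

The key design point the regulator focused on is the branch taken for minors: a safe default that the child does not have to discover and enable themselves.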
The DPC granted TikTok a three-month grace period to rectify the processing practices found to be non-compliant with the regulations.
The regulator is still carrying out a second investigation into whether TikTok complied with the EU’s General Data Protection Regulation when it transferred users’ personal information to China, where its owner, ByteDance, is based. 
TikTok has faced accusations that it poses a security risk over fears that users’ sensitive information could end up in China. To address those concerns, it has embarked on a project to localize European user data, opening a data centre in Dublin this month, the first of three planned on the continent.
This is not the first time TikTok has been punished for its handling of children’s data. In April, British regulators fined the company 12.7 million pounds, worth about $15.8 million today, for not preventing children under the age of 13 from signing up for the service. In 2019, Musical.ly, the service that would later become TikTok, agreed to pay $5.7 million to settle charges by the Federal Trade Commission for violating U.S. data protection rules for children.
In addition to TikTok, several other major tech industry players, including Instagram, WhatsApp and their parent company, Meta, have faced substantial fines imposed by the Irish regulator over the past year. These fines underscore the heightened scrutiny and enforcement measures being applied to big tech companies regarding data protection, privacy and compliance with relevant regulations.
The Irish regulator’s actions reflect the growing emphasis on holding tech giants accountable for their data handling practices and ensuring that they adhere to the stringent standards set forth by data protection laws such as the GDPR and, in India, the newly enacted Digital Personal Data Protection Act, 2023 (the DPDP Act).
As the digital landscape continues to evolve, children now make up a substantial portion of active internet users, and their digital activities are changing and expanding at an unprecedented rate. This shift has given rise to significant concerns about safeguarding both the safety and the privacy of these young users. Accordingly, children’s data should be subject to greater protection.
Under the DPDP Act, a “child” means an individual who has not attained the age of 18 years. Section 9 of the Act places an additional obligation on Data Fiduciaries to obtain “verifiable consent” from the parent or legal guardian, as the case may be, before processing the personal data of a child. Failure to adhere to this data protection obligation can result in substantial penalties, which may extend to INR 200 crore.
Further, the implementation of the DPDP law means that social media platforms will now have to obtain parental consent for users between 13 and 18 years of age, confirming the identity of both the child and the parent. If a parent agrees to share the data, they will have to enter a one-time password (OTP) to provide consent, which will then be recorded in the parent’s consent ledger. The mechanism will require parents to declare all their children, consequently allowing children to be mapped to their parents. For more information, please refer to our article titled “Parental Consent to be stored in Digilockers”.
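The consent mechanism described above, OTP verification followed by an entry in the parent’s consent ledger, can be illustrated with a minimal sketch. All names here (ConsentLedger, send_otp, grant_consent) are hypothetical: the DPDP Act prescribes the obligation, not any particular implementation.

```python
import secrets


class ConsentLedger:
    """Records which children a parent has granted verifiable consent for."""

    def __init__(self):
        self._records = {}  # parent_id -> set of child_ids

    def record(self, parent_id: str, child_id: str) -> None:
        self._records.setdefault(parent_id, set()).add(child_id)

    def has_consent(self, parent_id: str, child_id: str) -> bool:
        return child_id in self._records.get(parent_id, set())


def send_otp(parent_contact: str) -> str:
    """Generate a 6-digit OTP; a real system would deliver it via SMS or email."""
    return f"{secrets.randbelow(10**6):06d}"


def grant_consent(ledger: ConsentLedger, parent_id: str, child_id: str,
                  sent_otp: str, entered_otp: str) -> bool:
    """Record consent only if the parent entered the OTP that was sent."""
    if entered_otp != sent_otp:
        return False  # failed verification: no consent is recorded
    ledger.record(parent_id, child_id)
    return True
```

The ledger is the piece regulators care about: it gives the platform a durable, auditable record of who consented, for which child, and only after identity verification succeeded.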
In today’s interconnected world, where data breaches and privacy violations are widely publicized, the consequences of non-compliance extend beyond monetary penalties: they can tarnish an entity’s image and erode the trust that stakeholders, clients and the public place in it. It is therefore imperative for all entities handling personal data to prioritize compliance with data protection regulations, safeguarding both their financial stability and their hard-earned reputation.