Following on the heels of major developments coming out of the Senate last week to advance privacy protections for children online, the Department of Justice (“DOJ”) officially filed a lawsuit on Friday against TikTok, Inc., its parent company, ByteDance, and certain affiliates (collectively, “TikTok”), over alleged violations of the Children’s Online Privacy Protection Act (“COPPA”) and its implementing rule (the “COPPA Rule”), as well as an existing 2019 FTC consent order (the “2019 Order”) that resolved earlier allegations of the same.[1]

After an investigation by the Federal Trade Commission (“FTC”) into TikTok’s compliance with the 2019 Order allegedly revealed a flagrant, continued disregard for children’s privacy protections, the FTC took the rare step of releasing a public statement referring the complaint to the DOJ, which subsequently filed suit in the Central District of California last week.  “TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” said FTC Chair Lina M. Khan.  “The FTC will continue to use the full scope of its authorities to protect children online—especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”

According to the complaint, TikTok violated not only COPPA and the COPPA Rule but also the 2019 Order by:

  1. Knowingly allowing millions of children under thirteen to create and use TikTok accounts that are not reserved for children, giving them full access to the TikTok platform to view, make and share content without verifiable parental consent;
  2. Collecting extensive data, including personal information, from children without justification and sharing it with third parties without verifiable parental consent;
  3. Failing to comply with parents’ requests to delete their children’s accounts or personal information; and
  4. Failing to delete the accounts and information of users TikTok knows are children, in direct violation of the 2019 Order.

In highlighting a number of actions by TikTok that allegedly led to “unlawful, massive-scale invasions of children’s privacy,” the DOJ’s complaint contains several allegations that TikTok knowingly disregarded its obligations under applicable law and under the 2019 Order, which required TikTok to prevent child users from accessing its platform without verifiable parental consent and to take measures to protect and safeguard the privacy of its child users’ information once obtained.  Among others, the DOJ alleged the following unlawful practices:

  • Insufficient Age Identification Practices.  Although TikTok has implemented age gates on its platform since March 2019 in an effort to direct users under thirteen to TikTok Kids Mode (a version of the app designed for younger users that allows users to view videos but not create or upload videos, post information publicly or message other users), the complaint alleges that TikTok continued to knowingly create accounts outside of Kids Mode for child users, without requesting parental consent, by allowing those users to evade the age gate.  Specifically, upon entering their birthdates and being directed to Kids Mode, underage users could simply restart the account creation process and provide a new birthdate to gain unrestricted access to the general TikTok platform (even though TikTok knew it was the same person); alternatively, users could avoid the age gate entirely by logging in via third-party online services, in which case TikTok did not verify the user’s age at all.
  • Unlawful and Overinclusive Data Collection from Child Users.  Even where child users were directed to Kids Mode, the complaint alleges that personal information was collected from children, such as usernames, passwords and birthdays, as well as persistent identifiers such as IP addresses and unique device IDs, without providing notice to parents and obtaining consent as required under COPPA.  TikTok also collected voluminous account activity data, which it combined with persistent identifiers to amass profiles on child users and widely shared with third parties without justification.  For example, until at least mid-2020, TikTok is alleged to have shared information collected via Kids Mode accounts with Facebook and AppsFlyer, a third-party marketing analytics firm, to increase user engagement; the collection and sharing of persistent identifiers without parental consent was unlawful under the COPPA Rule because use of such data was not limited to providing “support” for TikTok’s “internal operations.”
  • Failures to Honor Deletion Requests.  Though the COPPA Rule and the 2019 Order required TikTok to delete personal information collected from children at their parents’ request, TikTok failed to inform parents of this right and, separately, failed to act upon such requests.  Under TikTok’s policies, the process for requesting deletion was allegedly unreasonable and burdensome, oftentimes requiring parents to undertake a series of convoluted administrative steps before TikTok would take action, including scrolling through multiple webpages to find and click on a series of links and menu options that gave no clear indication that they applied to such a request.  Even where parents successfully navigated this process, their requests were infrequently honored due to rigid policies maintained by TikTok related to account deletion.[2]  The complaint also suggests that even where such accounts were deleted, TikTok retained certain personal information related to such users, such as application activity log data, for up to eighteen months without justification.
  • Failures to Delete Accounts Independently Identified by TikTok as Children’s Accounts. In clear violation of the 2019 Order, TikTok is also alleged to have employed deficient technologies, processes and procedures to identify children’s accounts for deletion, and even appears to have ignored accounts flagged by its own human content moderators as belonging to a child and ripe for deletion.  Instead, despite strict mandates to delete such accounts, TikTok’s internal policies permitted account deletion only if rigid criteria were satisfied—such as explicit admissions by the user of their age—and provided human reviewers with insufficient resources or time to conduct even the limited review permitted under such policies.[3]

In addition to a permanent injunction to halt the alleged conduct and prevent further violations of COPPA, the complaint requests that the court impose civil penalties against TikTok under the FTC Act, which allows civil penalties of up to $51,744 per violation, per day.  Given the uptick in recent enforcement related to children’s privacy and the potential for material fines, entities should carefully consider the scope of COPPA’s application to their existing products and services, as well as their existing policies, practices and product functionality, to ensure compliance and avoid regulatory scrutiny.


[1] Specifically, the 2019 Order (i) imposed a $5.7 million civil penalty, (ii) required TikTok to destroy personal information of users under the age of thirteen and, by May 2019, remove accounts of users whose age could not be identified, (iii) enjoined TikTok from violating the COPPA Rule and (iv) required TikTok to retain certain records related to compliance with the COPPA Rule and the 2019 Order.

[2] According to the complaint, in a sample of approximately 1,700 children’s TikTok accounts about which TikTok received complaints and deletion requests between March 21, 2019, and December 14, 2020, approximately 500 (30%) remained active as of November 1, 2021, and several hundred were still active in March 2023.

[3] For example, despite having tens of millions of monthly active users at times since the entry of the 2019 Order, TikTok’s content moderation team included fewer than two dozen full-time human moderators responsible for identifying and removing material that violated all of its content-related policies, including identifying and deleting accounts of unauthorized users under thirteen.  Further, during at least some periods since 2019, TikTok human moderators spent an average of only five to seven seconds reviewing each flagged account to determine whether it belonged to a child.