On December 19, 2022, the United States Federal Trade Commission (“FTC”) announced two separate record-breaking settlements with Epic Games, Inc. (“Epic”), the video game publisher behind the popular online multiplayer game “Fortnite,” totaling over $520 million for alleged violations of the Children’s Online Privacy Protection Act (“COPPA”) and use of “dark patterns” to deceive players into making unwanted in-game purchases.

The settlements come against a backdrop of increasing scrutiny by privacy regulators of the collection and processing of children’s data online, as parents and privacy advocates around the world call for greater protections for children who engage in online activities.  Despite growing concerns and an uptick in enforcement, lawmakers in the United States have yet to advance federal children’s privacy legislation: both the Kids Online Safety Act[1] and the Children and Teens’ Online Privacy Protection Act[2] were excluded from Congress’ end-of-year spending bill released on Tuesday, December 20, 2022. Federal inaction has spurred states to take matters into their own hands, with California again leading the charge through enactment of its Age Appropriate Design Code (as previously discussed here), which becomes effective on July 1, 2024 and applies broadly to all businesses covered by the California Privacy Rights Act that develop and provide online services, products or features “likely to be accessed by children” under the age of 18.[3]

Below, we summarize each of the recent FTC consent orders and settlements, including some of the evidence the FTC relied upon to determine COPPA’s applicability; how Epic’s actions, or lack thereof, violated its obligations under COPPA and Section 5 of the FTC Act, which prohibits unfair or deceptive trade practices; and the monetary penalties and remediation Epic must undertake going forward.  We then provide a few key takeaways and best practices for businesses that operate websites or other online services likely to be accessed and used by children and teens, to help them avoid regulatory scrutiny and penalties.

FTC Alleges Fortnite is Subject to COPPA

In its children’s privacy-related complaint against Epic,[4] the FTC alleged that Epic violated COPPA by collecting children’s personal information—including their full names, email addresses and usernames—in connection with their use of Fortnite without notifying parents or obtaining parents’ verifiable consent. 

As a threshold matter, whether COPPA applies depends upon whether (i) a website or online service is directed to children under the age of 13 or (ii) the operator of a website or online service has “actual knowledge” that a particular visitor is a child.  Where COPPA applies, the operator must comply with specific requirements, including obtaining verifiable parental consent prior to collecting personal information from children; providing parents a reasonable means to review personal information collected from their children online; and permitting parents to request deletion of that information.

Epic maintained that Fortnite was not required to be COPPA-compliant because the game was not “directed at children” (stating as much in its privacy policy) and Epic lacked “actual knowledge” that children under the age of 13 were accessing or playing it. The FTC disagreed, relying on both internal communications between Epic employees and public studies to argue that Fortnite is an online service directed to children under the age of 13 and thus must comply with COPPA. Specifically, the FTC cited (i) the game’s cartoon graphics, (ii) its emphasis on building forts and non-violent, laser-tag-like mechanics, (iii) its inclusion of music events from artists popular with children, (iv) internal Epic correspondence and (v) evidence from licensing deals Epic has entered into for child-themed toys and apparel.  The FTC also argued that Epic had actual knowledge that children were using Fortnite, based in part on internal communications between Epic employees.

When Fortnite first launched in 2017, players were merely required to provide their name and email to create an Epic account and access the game, with minimal privacy settings. From then until September 2019, the FTC claimed, Epic took no steps to (a) provide notice to parents describing Epic’s practices regarding the collection of children’s personal information; (b) explain what information Epic collected from children; or (c) seek verifiable parental consent before collecting children’s personal information.  After implementing an age gate in September 2019, which required new users under the age of 13 to self-identify as such, Epic undertook efforts to collect those users’ parents’ email addresses, provide parents with notice of Epic’s data collection and processing practices and request parents’ consent.  Simultaneously, Epic attempted to use information in support tickets to identify additional users from its existing player base who appeared to be under the age of 13, and required them to provide their birthdate at their next login (sending parents a notice and requesting consent if a user self-identified as under 13).  In spite of these efforts, the FTC noted that, apart from the limited actions taken using support tickets, Epic did not retroactively apply the age gate to the hundreds of millions of existing Fortnite accounts, nor did it remove the “extraordinary hoops” parents had to jump through to verify their parental status (parents were often required to provide all the IP addresses their child used to play Fortnite, the date the child’s Epic account was created, the date of the child’s last Fortnite login, the child’s original Epic account display name and other obscure data points).
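
For operators weighing similar controls, the following is a minimal sketch of what a neutral age gate with a parental notice-and-consent step might look like. It is written in Python with hypothetical function and field names (Epic’s actual implementation is not public), and, unlike the approach the FTC criticized, it applies to existing accounts as well as new sign-ups:

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under the age of 13


def age_on(birthdate: date, today: date) -> int:
    """Whole years elapsed between a self-reported birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def gate_account(account: dict, birthdate: date, parental_consent: bool) -> dict:
    """Apply a neutral age gate at login, including to existing accounts.

    If the user self-identifies as under 13, data collection is suspended
    until verifiable parental consent is recorded; absent consent, any
    personal information already collected is flagged for deletion.
    """
    if age_on(birthdate, date.today()) >= COPPA_AGE_THRESHOLD:
        account["status"] = "active"
    elif parental_consent:
        # A direct notice of data practices would be sent to the parent
        # before consent is recorded here.
        account["status"] = "active_with_parental_consent"
    else:
        account["status"] = "suspended"
        account["delete_personal_info"] = True  # purge data collected pre-consent
    return account


# Example: an existing account whose holder self-identifies as under 13,
# with no parental consent on file.
print(gate_account({"id": 42}, date(2014, 5, 1), parental_consent=False))
```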

In addition to the COPPA violations, the FTC alleged that Epic violated Section 5 of the FTC Act, which prohibits unfair or deceptive trade practices, by enabling real-time voice and text chat for all players (including children) by default, which, together with the public broadcasting of children’s display names, the FTC alleged resulted in harm to children.  Epic did subsequently offer an option to disable the chat function, but it buried this functionality deep within the “settings” page and kept the default setting “on” for all players.
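
By way of illustration only (the structure and names below are hypothetical, not Epic’s), a privacy-by-default configuration inverts this posture: chat features ship disabled for child accounts and can be enabled only through an affirmative, parent-controlled opt-in:

```python
from dataclasses import dataclass


@dataclass
class ChatSettings:
    """Privacy by default: voice and text chat start disabled, and a
    child's display name is not broadcast publicly."""
    voice_chat: bool = False
    text_chat: bool = False
    display_name_public: bool = False


def enable_chat(settings: ChatSettings, is_under_13: bool,
                parental_opt_in: bool) -> ChatSettings:
    """Turn chat on for an under-13 account only when a parent has opted in."""
    if is_under_13 and not parental_opt_in:
        return settings  # remains off; no buried toggle needed to protect the child
    settings.voice_chat = True
    settings.text_chat = True
    return settings
```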

As part of the settlement and in addition to the $275 million penalty to be paid to the U.S. Treasury, Epic further agreed to:

  1. Ensure that parents receive direct notice of Epic’s privacy practices relating to children’s personal information;
  2. Post a prominent and clearly labeled link to an online notice of such privacy practices;
  3. Obtain verifiable parental consent before any collection of children’s personal information;
  4. Delete a child’s personal information at the request of a parent;
  5. Confirm the age of all current Fortnite users through an age gate and delete all information of children under age 13 (unless Epic can demonstrate evidence of having provided direct notice and obtained verifiable parental consent);
  6. Disable voice and text communications by default, unless parents of users under 13 give their affirmative consent through a privacy setting; and
  7. Establish and maintain a comprehensive privacy program to protect users’ personal information, which must be assessed biennially by a third party.

Alleged Use of “Dark Patterns” to Mislead Consumers into Unwanted Purchases[5]

In a separate settlement also announced on December 19,[6] the FTC alleged that Epic violated Section 5 of the FTC Act through its use of “dark patterns” (deceptive designs that trick users into unintentionally consenting to terms or charges, or that make it difficult for consumers to cancel memberships or other recurring payments once set up), which caused players, including children, to rack up hundreds of dollars in unintended in-game purchases with the press of a button.

According to the FTC, Epic violated the FTC Act’s prohibition on unfair or deceptive acts or practices by employing a “myriad of design tricks” that resulted in consumers being charged for in-game enhancements, such as costumes, dance moves and item-filled piñatas, without first obtaining their, or their parents’, express informed consent.  Specifically, the complaint alleged that when players first made a purchase at the time of Fortnite’s launch, Epic saved players’ payment information by default, seemingly without notice, and used it to bill future purchases without players realizing that their payment card information had been saved.  When players sought to make subsequent purchases, Fortnite did not require any further action before incurring charges, such as re-entering a credit card CVV code or even confirming the purchase, allowing charges to occur automatically with the click of a button and without the player’s or cardholder’s consent.  Compounding the problem, Epic placed the preview and purchase buttons very close together and mapped the same console controller button to purchases as to activities outside the game’s store.  As a result, players could incur charges simply while navigating between outfit styles or dance moves, or even while trying to “wake” the game from a sleep state, leading to hundreds of millions of dollars in unauthorized charges.
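
As a counterpoint, a charge flow built around express informed consent inserts a deliberate confirmation step between browsing and billing. The sketch below is hypothetical (the function and return states are ours, not Epic’s or any payment platform’s API) and simply illustrates re-authenticating the cardholder before a stored payment method is charged:

```python
from typing import Optional


def charge_saved_card(cart_total_cents: int, confirmed: bool,
                      reentered_cvv: Optional[str]) -> str:
    """Require an explicit confirmation step, plus CVV re-entry, before
    billing a stored payment method, so a stray button press while
    browsing cannot trigger a charge."""
    if not confirmed:
        return "show_confirmation_dialog"  # a step distinct from previewing items
    if reentered_cvv is None:
        return "prompt_for_cvv"            # re-authenticate the cardholder
    return f"charged_{cart_total_cents}_cents"


# A single button press while browsing only surfaces the confirmation dialog:
print(charge_saved_card(999, confirmed=False, reentered_cvv=None))
```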

Additionally, the FTC alleged that Epic used dark patterns to make it difficult to cancel or request refunds for unauthorized charges, and subsequently banned consumers from accessing previously paid-for content when they disputed unauthorized charges with their credit card providers.  Originally, players were unable to cancel or undo charges for any in-game enhancements.  In 2019, after receiving a number of complaints from players, along with internal correspondence from employees suggesting changes to these practices, Epic allowed users to cancel certain charges using an undo button, but only for a limited time and only if they remained on the purchase screen.  After numerous players cancelled purchases using this function, Epic allegedly changed the size and location of the undo button to make it far less prominent and required users to hold down a button to cancel a purchase (even though making the purchase required no more than a single button press).  Not only did Epic employ tactics to deter players from undoing unwanted charges, but according to the complaint, beginning around February 2018 Epic began to deactivate or block the accounts of players who disputed unauthorized charges, regardless of the reason for the dispute or whether the charge was upheld, causing them to lose access to previously purchased in-game content that for some players totaled hundreds or even thousands of dollars.  Even when Epic eventually agreed to unlock certain accounts, it warned consumers that continued payment-related disputes could lead to lifetime bans.

As part of the settlement for these charges and in addition to the $245 million required to be refunded to impacted consumers, Epic agreed to:

  1. Obtain express informed consent prior to charging any Fortnite account holder; and
  2. Not deny any user access to their account (or any previously paid-for goods) for disputing unauthorized charges.

Key Takeaways

  • Processing of children’s and teens’ data continues to be a major focus of regulators and lawmakers, both in the US and abroad.[7]  With increasing legislation aimed at protecting individuals under the age of 18 from misuse and exploitation of their personal information, and recent enforcement actions signaling that regulators are committed to defending children’s and teens’ privacy rights, businesses that process kids’ data must stay apprised of their obligations in this space.
  • As enforcement of COPPA appears to be on the rise, covered businesses must assess whether any of their websites or online services are “directed at children” by considering a number of factors, including: the website or online service’s subject matter, music, audio and visual content; whether the website or service uses animated characters or child-oriented activities and incentives; the presence of younger or child-aged models, child celebrities or celebrities who appeal to children; whether advertising promoting or appearing on the website or online service is directed to children; and any other evidence regarding the intended audience of the site or service. Should COPPA apply, businesses should ensure proper privacy notices are provided to, and verifiable consent is sought from, parents of children under the age of 13 by implementing age verification systems and mechanisms that allow parents to control the collection and use of their children’s personal information.
  • Operators of websites or other online services that may be considered to be directed at, or actually accessed by, children must consider the ways in which these services are designed and developed, including by (i) conducting tailored risk assessments relating to the processing of children’s data and (ii) adopting a “privacy by design” approach to ensure an intuitive and frictionless user experience that allows consumers to control the collection and use of their personal information, and avoids the need to retroactively address privacy concerns.

[1] A copy of the Kids Online Safety Act bill proposal is available here.

[2] A copy of the Children and Teens’ Online Privacy Protection Act, the federal bill seeking to amend COPPA to cover children and minors through the age of 16, is available here.

[3] Following California’s lead, both New York and New Jersey have since introduced their own bills geared toward increasing privacy protections for children’s data: a copy of New York’s proposed “Child Data Privacy and Protection Act” can be found here and New Jersey’s proposal to create the New Jersey “Children’s Data Protection Commission” can be found here.

[4] A copy of the COPPA-related complaint can be found here.  The settlement can be found here.

[5] While this portion of the FTC settlement does not concern privacy law per se, note that newly amended or enacted privacy laws in California, Colorado and Connecticut each prohibit the use of “dark patterns” as a means by which to seek and receive consumer consent for the collection or processing of consumer personal information, thereby making the use of “dark patterns” without consent a privacy violation.

[6] A copy of the dark patterns-related settlement can be found here. The complaint can be found here.

[7] Note, in particular, that the UK Age Appropriate Design Code came into force in September 2020, with a 12-month transition period for companies to conform (ending September 2, 2021).