Determined to maintain its position as a pioneer for consumer privacy rights, California is again among the first to take action to tackle issues of children’s safety and privacy online with the enactment of the California Age-Appropriate Design Code (the “Code”), which was signed into law by Governor Gavin Newsom on September 15, 2022.  Once effective on July 1, 2024, the Code will, among other things, prescribe rules that require businesses to design their online products and services with children’s privacy in mind and to identify and mitigate any risks of material detriment to children that arise from businesses’ online data practices.

The Code’s enactment comes after years of inaction at the federal level on children’s online safety and privacy despite concerns from parents and privacy advocates alike and bipartisan support for a new federal children’s privacy bill (or amendments to address loopholes under the existing Children’s Online Privacy Protection Act (“COPPA”)).  Certain comprehensive state privacy laws, such as those recently enacted in Connecticut, Colorado and Virginia, already impose heightened obligations when processing children’s data (defined as sensitive personal information under such laws), but these laws do not specifically address businesses that provide online services, products, or features likely to be accessed by children.  However, with increased focus on the protection of children’s personal information both stateside and around the world[1], we can expect additional states to impose targeted requirements to protect children who engage in online activities.

Below, we examine the Code’s applicability and key obligations and restrictions, as well as the potential for enforcement and related liabilities. Upon request, we are happy to provide clients a “cheat sheet” outlining the indicators to consider when determining whether the Code applies to an online product, service, or feature, as well as the actions businesses are required to take, or are restricted from taking, with respect to the processing of children’s personal information in such online product, service, or feature.

The Code’s Applicability

The Code applies to all businesses that are subject to the California Privacy Rights Act (the “CPRA”)[2] and that develop and provide online services, products, or features that are “likely to be accessed by children.”[3]

Children include anyone under 18.  Importantly, the Code defines “children” as consumers under the age of eighteen, expanding coverage to teenagers in a departure from COPPA, which only applies to children under the age of thirteen. As discussed below, this age expansion, together with the Code’s broad scope, will complicate businesses’ compliance obligations.

Broad “likely to be accessed” test.  The Code sets out an exhaustive list of “indicators” to determine if it is reasonable to expect that an online service, product, or feature is likely to be accessed by children.  That would be the case when the online service, product, or feature in question (i) is “directed to children” as defined under COPPA (which similarly requires consideration of a number of factors including: the website or online service’s subject matter, music, audio and visual content; whether the website or service uses animated characters or child-oriented activities and incentives; the presence of younger or child-aged models, child celebrities or celebrities who appeal to children; whether advertising promoting or appearing on the website or online service is directed to children; or any other evidence regarding audience composition or the intended audience of the site or service)[4], (ii) is determined to be routinely accessed by a significant number of children, based on competent and reliable evidence regarding audience composition, (iii) contains advertisements marketed to children, (iv) is an online service, product, or feature that is substantially similar to those covered under indicator (ii), (v) has design elements that are known to be of interest to children, including games, cartoons, music and celebrities who appeal to children, and (vi) has an audience that, based on internal company research, is determined to be composed of a significant number of children.

Though it appears that the California legislature intended for the Code’s “likely to be accessed” test to reach beyond COPPA’s existing scope, as evidenced by the inclusion by reference of COPPA’s “directed to children” test as one of six indicators under the Code, the Code’s indicators (ii)-(vi) also appear to substantially overlap with the factors to be considered under COPPA’s “directed to children” and “actual knowledge” standards.  Specifically, both the “directed to children” test and indicators (ii), (iii), (v) and (vi) take into account objective factors such as advertising marketed to children, evidence of audience composition, and the presence of child-oriented content such as children’s music, animated characters or cartoons, or celebrities who appeal to children.  Furthermore, the Code’s references in indicators (ii) and (vi) to evidence that the online service, product, or feature is “routinely accessed by a significant number of children,” or that a company’s internal research determined that children comprise “a significant amount of the audience” appear to functionally mirror COPPA’s overall application to operators of general audience websites and online services who have actual knowledge that they are collecting children’s information.

A key difference between COPPA and the Code is the role of the website or service’s intent in determining applicability.  Notably, COPPA and its implementing regulations consider “evidence regarding the intended audience” of a website or online service, and even contain an exception, which is not available under the Code, meant to specifically exempt websites and online services that actively seek to exclude children from their primary audience through the use of age verification and parental consent mechanisms prior to the collection of personal information.  A business that takes measures to fall within COPPA’s exemption would still need to take additional measures under the Code due to the Code’s wide-ranging application.  In contrast to COPPA, aspects of the Code appear specifically designed to sweep in online services, products and features aimed at general audiences, regardless of intent, so long as a significant number of children access or use them, or even without such access or use if the services, products, or features, are “substantially similar” to those “routinely accessed by a significant number of children.”  The Code’s application to children under eighteen (as opposed to COPPA’s application to children under thirteen) compounds this expansive scope, as it may be particularly challenging to distinguish between products and services targeting or used by teenage “children” versus those used by young adults.

The Code’s standard has been met with initial pushback from many industry groups and businesses, which argue not only that the “likely to be accessed by children” threshold is unduly expansive (e.g., many websites have at least some design elements that are known to be of interest to children), but also that it potentially extends the Code’s applicability beyond that of COPPA and similar laws to cover virtually any online service, product, or feature, affecting even businesses that do not intentionally or knowingly target, or even reach, children.

Obligations and Restrictions Imposed by the Code

The Code requires businesses that provide an online service, product, or feature “likely to be accessed by children” to consider the best interests of children, taking into account the unique needs of different age ranges, and to prioritize those interests over commercial interests. The Code prescribes a detailed list of obligations and simultaneously imposes a variety of restrictions on businesses, many of which are largely adopted from and expand upon the standards set forth under the UK’s Age-Appropriate Design Code.  Specifically, the Code requires that businesses take the following actions:

  1. Conduct, Document and Review Data Protection Impact Assessments. Building on similar requirements imposed initially under the European GDPR and subsequently under the CPRA, the Code requires businesses to conduct, document and biennially review data protection impact assessments (“DPIAs”) for any online service, product, or feature that is “likely to be accessed by children” and that has been offered to the public before the Code’s effective date, as well as before launching any new online service, product, or feature “likely to be accessed by children.”  DPIAs must address a prescribed list of questions, including whether the design of the online product, service, or feature could harm children, such as by exposing them to harmful, or potentially harmful, content, contact or conduct, and whether algorithms or targeted advertising systems are employed in ways that could harm children. Upon identifying any risk of material detriment to children, businesses are required to create a timed plan to mitigate or eliminate the risk before the online service, product, or feature may be accessed by children. DPIAs must be made available to the Attorney General within three business days of a request, but are not discoverable by the public (and the disclosure to the Attorney General does not constitute a waiver of any privilege).
  2. Apply Protections to All Users Unless the Business Can Reasonably Estimate the Age of Child Users. Unless businesses can estimate the age of their child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business, they must apply those privacy and data protections afforded to children to all consumers.
  3. Implement Privacy by Design. Online services, products, or features that are likely to be accessed by children should offer strong privacy protections by design and by “default,”[5] such as by disabling features that profile children, based on their previous behavior, browsing history, or assumptions of their similarity to other children, in order to offer them detrimental material. Moreover, businesses must configure all default privacy settings provided to children to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.
  4. Notice Appropriate for Kids, Transparency and Enforcement of Privacy Rights and Standards. Businesses must provide privacy information, terms of service, policies and/or community standards prominently, using language suited to the age of the child likely to access the online product, service, or feature, and must enforce all such policies and standards.  Prominent, accessible, and responsive tools must be provided to children and their parents to exercise their privacy rights and report concerns.
  5. Online Activity & Geolocation Tracking Disclosure (Even When Monitored by Parents). If an online service, product, or feature allows the child’s parent, guardian or any other consumer to monitor the child’s online activity or track the child’s location, the business must provide an obvious signal to the child when the child is being monitored or tracked.

In addition to the obligations listed above, the Code prohibits businesses from:

  1. Using personal information (i) in a way the business knows, or has reason to know, is materially detrimental to the physical health, mental health or well-being of a child, (ii) beyond the purpose for which it was collected unless the business can demonstrate a compelling reason that such use is in the best interests of children or (iii) collected to estimate age or age range for any other purpose, or retaining that personal information longer than necessary to estimate age.
  2. Profiling a child by default unless (i) the business can demonstrate it has appropriate safeguards in place to protect children and (ii) either (a) the profiling is necessary to provide the online service, product, or feature requested (but only with respect to the aspects of the online service, product, or feature with which the child is actively and knowingly engaged) or (b) the business can demonstrate a compelling reason that profiling is in the best interests of children.
  3. Collecting, selling, sharing, or retaining (i) any personal information beyond what is necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, or for certain purposes exempted under the CPRA such as where required to comply with federal, state or local law, or to cooperate and coordinate with law enforcement, unless the business can demonstrate a compelling reason that the collecting, selling, sharing, or retaining of the personal information is in the best interests of children likely to access the online service, product, or feature or (ii) any precise geolocation information of children by default unless such collection is strictly necessary to provide the service, product, or feature requested, and then only for the limited time that the collection of precise geolocation information is necessary to provide the service, product, or feature.
  4. Using “dark patterns,” as that term is defined under the CPRA, to lead or encourage children to provide personal information beyond what is reasonably expected to provide the online service, product, or feature, to forgo privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health or well-being.

Penalties, Enforcement and the California Children’s Data Protection Working Group

In a departure from prior versions of the Code, which established joint enforcement authority with the California Privacy Protection Agency, the Code vests exclusive enforcement and rulemaking authority in the state Attorney General.  However, before any enforcement action can be brought for alleged violations of the Code, the Attorney General must provide written notice to the business, identifying the specific provisions that the Attorney General alleges have been or are being violated.  Upon notice, businesses have 90 days (up from 45 days in prior versions) to remediate the alleged violation and, to avoid civil liability, provide the Attorney General with a written statement certifying that the business cured the noticed violation and that sufficient measures have been taken to prevent future violations. Where a business is unable to cure a noticed violation, penalties may include an injunction or a civil penalty of not more than $2,500 per affected child for each negligent violation and not more than $7,500 per affected child for each intentional violation.

Finally, the Code also establishes the California Children’s Data Protection Working Group (the “Working Group”), which is to consist of ten members appointed by various state regulators, provided that such members have pertinent expertise in areas such as children’s mental and physical health and well-being, computer science, and children’s rights and privacy. The Working Group is tasked with delivering, on or before January 1, 2024 and every two years thereafter[6], a report containing recommendations and “best practices” to implement the Code, including by identifying online services, products, and features likely to be accessed by children.

Conclusion

While the Code’s passage has been heralded by privacy advocates as a historic triumph in the effort to protect children’s online privacy, critics have questioned the Code’s practicability, including whether its mandates are technologically and operationally feasible.  Another criticism relates to the risk that the Code’s requirements will, in practice, increase the collection of personal information, contrary to the data minimization principles required under the CPRA (e.g., costly and questionably effective age verification mechanisms may require all users to submit personal information, such as birthdates, in order to access services, products, or features).

Questions remain as to whether the Code will be preempted by the current draft of the American Data Privacy and Protection Act or other federal children’s privacy bills like the Children and Teens’ Online Privacy Protection Act and the Kids Online Safety Act currently making their way through the U.S. Congress. Despite such uncertainty, companies offering online services, products, or features that are accessed by children should begin planning for compliance with the Code’s requirements.


[1] The UK recently began enforcement of its own Age-Appropriate Design Code in September of 2021. Unlike the Code, the UK Code details 15 standards businesses should adopt when processing children’s data in order to satisfy obligations set forth under the UK’s General Data Protection Regulation (the “GDPR”).

[2] Unless otherwise defined therein, the Code adopts definitions from §1798.140 of the CPRA, including the definition of a “business” and “personal information.”

[3] Carve-outs exist for online services, products, and features including broadband internet access services, telecommunications services and delivery or use of a physical product, such as connected devices.

[4] Note that COPPA also considers a website or online service to be “directed to children” where it has actual knowledge that it is collecting personal information directly from users of another website or online service that is directed to children.

[5] Default is defined as “a preselected option adopted by the business for the online service, product, or feature.”

[6] The provisions that create the Working Group remain in effect until January 1, 2030 and are thereafter repealed.