On September 11, Delaware’s governor signed into law the Delaware Personal Data Privacy Act (the “DPDPA” or “Act”),[1] establishing Delaware as the 12th state in the U.S. to enact its own comprehensive data protection law and contributing to the patchwork of U.S. data protection regimes that continue to proliferate in the absence of federal regulation. 

The Act, which will take effect on January 1, 2025, largely tracks previously enacted data protection laws such as those passed in Colorado, Connecticut, Oregon and Virginia; however, the DPDPA diverges from these laws in certain key respects.  For example, likely reflecting Delaware’s relatively small population, the DPDPA has a much lower processing threshold than other states’ laws, applying to organizations that control or process the personal data of (i) 35,000 or more Delaware residents in a given year (excluding (a) individuals acting in an employment or commercial context and (b) personal data processed for the purpose of completing a payment transaction) or (ii) 10,000 or more Delaware residents, where the organization derives more than 20% of its gross revenue from the “sale” of personal data for monetary or other valuable consideration.  Also in a departure from the majority approach, though similar to the laws in Colorado and Oregon, the DPDPA does not categorically exempt nonprofits.[2]  That said, like other state regimes, the Act provides consumers with a similar suite of privacy rights, e.g., rights to access,[3] correct and delete their personal data, a right to data portability, and a right to opt out, including through universal opt-out mechanisms, of personal data sales and certain other uses.  The Act also imposes transparency, data minimization, purpose limitation, data security, data protection impact assessment, and other commonly seen data protection obligations on covered entities. 

Most notably, the DPDPA also adopts, and in some ways expands, two emerging trends in state data protection regulation, namely (1) the broadening of the definition of sensitive data and (2) enhanced protections for the processing of children’s and teenagers’ data.  In this post, we discuss the DPDPA’s treatment of these topics and situate it in the rapidly changing context of other relevant state laws.  To the extent organizations process sensitive data or offer products or services that may attract children under the age of eighteen (18), it is increasingly important to be aware of the legislative and judicial developments in these areas.

Expanding the Definition of Sensitive Data

Compared to previously enacted state data protection laws, the DPDPA has a broader definition of sensitive data, which includes: (i) data that reveals racial or ethnic origin, religious beliefs, mental or physical health conditions or diagnoses (including pregnancy), sex life, sexual orientation, status as transgender or nonbinary, national origin, citizenship status or immigration status; (ii) genetic or biometric data; (iii) the personal data of a known child (i.e., an individual under the age of thirteen (13)); and (iv) precise geolocation data.  Notably, the DPDPA follows the lead of the Oregon law in expanding the definition of sensitive data to include transgender or nonbinary status, and it goes a step further by explicitly mentioning pregnancy status and providing a definition of “genetic data”.[4] 

Like the laws in Connecticut, Colorado, Virginia, Indiana, Montana, Oregon and Tennessee, the DPDPA prohibits data controllers from processing a consumer’s sensitive data without first obtaining the consumer’s consent or, in the case of a child under the age of thirteen (13), without first obtaining the verifiable consent of a parent or legal guardian in accordance with the federal Children’s Online Privacy Protection Act (“COPPA”).  This is in contrast to the newly amended California Consumer Privacy Act (“CCPA”), which does not require opt-in consent and instead requires covered entities to (i) disclose to consumers the categories of sensitive data to be collected, the purposes for the collection and use of such data and whether the data is sold or shared and (ii) with respect to sensitive data that is collected or processed for the purpose of inferring characteristics about consumers, provide consumers with a right to limit the entity’s use of the consumer’s sensitive data to the use necessary to perform the services or provide the goods reasonably expected by an average consumer.  Yet another approach has been taken by the Iowa and Utah laws, which prohibit data controllers from processing sensitive data without (i) first presenting the consumer with clear notice and an opportunity to opt out of such processing or (ii) in the case of the personal data of a known child under the age of thirteen (13), processing the data in accordance with COPPA.

Enhanced Focus on Children’s and Teenagers’ Data

In line with growing trends to protect children and teens online, the DPDPA includes heightened protections for such consumers’ data, prohibiting a controller from (i) processing personal data for purposes of targeted advertising or (ii) selling personal data, in each case without consent, where the controller has actual knowledge or willfully disregards that the relevant consumer is at least thirteen (13) years of age but younger than eighteen (18) years of age.  This expands coverage to teenagers in a departure from COPPA, which applies only to children under the age of thirteen (13).  The DPDPA follows the recently amended privacy law in Connecticut, which similarly outlines new obligations for controllers that provide an online service, product or feature to minors who are at least thirteen (13) years of age but younger than eighteen (18).  Specifically, under Connecticut’s amended data privacy law, such controllers who have actual knowledge, or willfully disregard, that consumers are minors are prohibited, without the minor’s consent (or the consent of the child’s parent, if the child is under the age of thirteen (13)), from processing a minor’s personal data for purposes of targeted advertising, personal data sales or profiling in furtherance of any fully automated decisions that produce legal or similarly significant effects, among other restrictions.
A recently enacted bill in Florida, Senate Bill 262, effective July 1, 2024, takes a slightly different approach, targeting online platforms that operate “an online service, product, game, or feature likely to be predominantly accessed by children and accessible by Florida children”.[5]  Under Florida’s law, covered online platforms are prohibited from (a) processing the personal data of any child (defined as a person under the age of eighteen (18)) if the online platform has actual knowledge, or willfully disregards, that the processing may result in substantial harm or privacy risk to children, (b) collecting, selling, sharing or retaining any personal data that is not necessary to provide such online service, product or feature without demonstrating a compelling reason why such collection does not pose a substantial harm or privacy risk to children and (c) using dark patterns to lead or encourage children to provide personal data beyond what would reasonably be expected to be provided for the online feature, service, game or product, among other restrictions.

Interestingly, this enhanced focus comes at a time when a California federal judge recently granted a preliminary injunction blocking enforcement of the California Age-Appropriate Design Code (the “CAADCA” or “Code”, discussed in greater detail here),[6] marking a win for NetChoice, LLC (“NetChoice”), a trade association of online businesses.  In December, NetChoice sued the California Attorney General over the enactment of the CAADCA, claiming that the CAADCA violates the First Amendment by compelling companies to serve as “roving censors of speech” and that the law is “unconstitutionally vague” and preempted by COPPA, among other claims.[7]  The CAADCA, which was modeled on the United Kingdom’s Age-Appropriate Design Code and was set to go into effect July 1, 2024, imposes obligations on CCPA-covered entities that develop and provide online services, products or features that are “likely to be accessed by children”, requiring such providers to offer a high level of privacy by design and by default to children (defined, as in the DPDPA and Connecticut law, as consumers under the age of eighteen (18)).  In the order issued on September 18 granting NetChoice a preliminary injunction, Judge Beth Freeman of the U.S. District Court for the Northern District of California (the “Court”), applying an intermediate scrutiny standard, agreed that NetChoice was likely to succeed on the merits of its First Amendment argument, stating that the provisions of the CAADCA “do not pass constitutional muster” because many of the Code’s provisions were insufficiently tailored to advance the government’s interest in protecting minors’ wellbeing online.  Specifically, the Court found that the Code’s provisions would unlawfully target protected speech, including by forcing websites to impose barriers for children that would also likely impact adults, given the difficulty of accurately estimating the age of a business’s users as required by the CAADCA.  
Furthermore, the Court noted that the CAADCA’s age estimation requirements appear counter to the government’s interest in increasing privacy protection for children, and may in fact exacerbate the problem by inducing businesses to require their users to divulge additional personal information, including through facial scan technology, in order to estimate their age.[8]

These enhanced state-level protections are in line with the continued focus on children’s and minors’ privacy in state legislatures, which we have previously highlighted and which we expect will continue, especially at a time when bills at the federal level have stalled, including the Kids Online Safety Act, which would impose a duty of care requiring tech companies to act in minors’ best interests,[9] and the Children and Teens’ Online Privacy Protection Act, which would expand the age of protection under COPPA from thirteen (13) to sixteen (16) and ban targeted advertising to children and teens.[10]  Such lack of progress at the federal level has left state legislatures to fill in the gaps, with varying levels of success, particularly with respect to the data of teenagers. 
The enactment of the DPDPA marks another milestone in the ever-changing patchwork of data protection laws, with its broad definition of sensitive data and its focus on children’s and teenagers’ personal data reflecting two recent trends across state legislatures.  It is imperative that businesses that process sensitive data or offer products or services that may attract children under the age of eighteen (18) be aware of the nuances of each new law as interest in regulating these areas continues to increase.

[1] The full text of the Delaware Personal Data Privacy Act is available here.

[2] Note, however, that the DPDPA does not apply (i) to nonprofits that are “dedicated exclusively to preventing and addressing insurance crime” or (ii) to the personal data of victims or witnesses of child abuse, domestic violence, human trafficking, sexual assault, violent felony and stalking that is collected or processed by nonprofits that provide services to such victims or witnesses.

[3] Uniquely, the DPDPA permits consumers to obtain from the controller a list of the categories of third parties to whom their personal data was disclosed.

[4] The DPDPA defines “genetic data” as “any data, regardless of its format, that results from the analysis of a biological sample of an individual, or from another source enabling equivalent information to be obtained, and concerns genetic material”, including DNA, RNA, genes, chromosomes, alleles, genomes, alterations or modifications to DNA or RNA, single nucleotide polymorphisms, uninterpreted data that results from analysis of the biological sample or other source, and any information extrapolated, derived or inferred therefrom.

[5] The full text of Florida Senate Bill 262 is available here.

[6] The full text of the order granting the motion for preliminary injunction is available here.

[7] The full text of the complaint is available here.

[8] Interestingly, the Court took specific issue with the Code’s requirements to perform data protection impact assessments and disclose them to the government, finding that such provisions provide “only ineffective or remote support for the government’s purpose” and do not “directly advance” the government’s substantial interest in promoting a proactive approach to the design of digital products, services, and features, and therefore fail to survive intermediate scrutiny.  Such conclusions call into question the viability of data protection impact assessment requirements in other comprehensive state privacy laws passed in recent years, against which similar First Amendment arguments could be brought.

[9] This bill has been controversial, drawing criticism from civil rights organizations such as the Electronic Frontier Foundation regarding the bill’s proposed safeguards and the risk of increased online surveillance and censorship.

[10] Both such bills passed out of committee but have stalled on the Senate floor, and neither bill has been introduced in the House.