The Biden administration recently issued Executive Order 14117 (the “Order”) on “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern.”  Building upon earlier Executive Orders[1], the Order was motivated by growing fears that “countries of concern” may use artificial intelligence and other advanced technologies to analyze and manipulate bulk sensitive personal data for nefarious purposes.  In particular, the Order notes that unfettered access to Americans’ bulk sensitive personal data and United States government-related data by countries of concern, whether via data brokers, third-party vendor agreements or otherwise, may pose heightened national security risks.  To address these risks, the Order directs the Attorney General to issue regulations prohibiting or restricting U.S. persons from entering into certain transactions that pose an unacceptable risk to the national security of the United States.  Last week, the Department of Justice (“DOJ”) issued an Advance Notice of Proposed Rulemaking, outlining its preliminary approach to the rulemaking and seeking comments on dozens of issues, ranging from the definition of bulk U.S. sensitive personal data to mitigation of compliance costs.

Continue Reading Biden Administration Executive Order Targets Bulk Data Transactions

On January 16, 2024, New Jersey officially became one of a growing number of states with comprehensive privacy laws, as Governor Phil Murphy signed Senate Bill 332 (the “New Jersey Privacy Act”) into law.[1]  New Hampshire followed closely behind with its own comprehensive privacy law, Senate Bill 255 (the “New Hampshire Privacy Act” and, together with the New Jersey Privacy Act, the “Acts”), signed into law by Governor Chris Sununu on March 6, 2024.[2]

Continue Reading New Privacy Laws Enacted in New Jersey and New Hampshire

On January 29, 2024, the U.S. Department of Commerce (“Commerce”) published a notice of proposed rulemaking (the “Notice”) seeking comments on proposed rules promulgated by Commerce’s Bureau of Industry and Security (“BIS”) and its newly created Office of Information and Communications Technology and Services to implement Executive Order 14110, the Biden Administration’s October 2023 executive order on artificial intelligence (“AI”) (“E.O. 14110”; see our prior alert here)[1].  The Notice also implements Executive Order 13984, a 2021 executive order relating to malicious cyber-enabled activities (“E.O. 13984”), with respect to which Commerce had already issued an advance notice of proposed rulemaking.[2]

Continue Reading Proposed Rulemaking by U.S. Department of Commerce Introduces New Obligations on U.S. IaaS Providers and Foreign Resellers to Curb Malicious Cyber-Enabled Activities

The rapid development of AI is introducing new opportunities and challenges to dispute resolution. AI is already impacting the document review and production process, legal research, and the drafting of court submissions. Its use is expected to expand into other areas, including predicting case outcomes and adjudicating disputes. However, the use of AI in litigation also carries risks, as highlighted by a recent First-tier Tribunal (Tax) decision in which an appellant had sought to rely on precedent authorities that were, in fact, fabricated by AI (a known risk with AI tools based on large language models, referred to as “hallucination”).[1] While, in this particular case, no further consequences appeared to follow (in light of the fact that the appellant, a litigant in person, “had been unaware that the AI cases were not genuine and that she did not know how to check their validity”[2]), the Tribunal did highlight that “providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue”,[3] suggesting that litigants may incur certain risks by relying on authorities suggested by AI unless these are independently verified. On 12 December 2023, a group of senior judges, including the Master of the Rolls and the Lady Chief Justice, issued guidance on AI for judicial office holders, which, amongst other things, discourages the use of AI for legal research and analysis and highlights the risk of AI being relied on by litigants to provide legal advice and/or to produce evidence.[4]

Continue Reading Nexus of AI, AI Regulation and Dispute Resolution

The following post was originally included as part of our recently published memorandum “Selected Issues for Boards of Directors in 2024”.

2023 continued global trends toward protecting consumer privacy and reining in the exploitation of personal data by organizations, with an explosion of new comprehensive privacy laws, amendments to existing laws and a proliferation of targeted regulations around the world.

Continue Reading Privacy and Data Protection Compliance Will Become More Fragmented in 2024

Quantum technology is seen as having the potential to revolutionize many aspects of technology, the economy and society, including the financial sector. At the same time, this technology represents a significant threat to cybersecurity, especially due to its potential to render most current encryption schemes obsolete.

Continue Reading Quantum Computing and the Financial Sector: World Economic Forum Lays Out Roadmap Towards Quantum Security

The following post was originally included as part of our recently published memorandum “Selected Issues for Boards of Directors in 2024”.

In July 2023, the U.S. Securities and Exchange Commission (SEC) adopted final rules to enhance and standardize disclosure requirements related to cybersecurity.  To comply with the rules’ new reporting requirements, companies will need to make ongoing materiality determinations with respect to individual cybersecurity incidents and series of related incidents.  The inherent nature of cybersecurity incidents, which are often initially characterized by a high degree of uncertainty around scope and impact, and an SEC that is laser-focused on cybersecurity from both a disclosure and enforcement perspective, combine to present registrants and their boards of directors with a novel set of challenges heading into 2024.

Continue Reading Crossing a New Threshold for Material Cybersecurity Incident Reporting

On 15 January 2024, the UK Information Commissioner’s Office (“ICO”)[1] launched a series of public consultations on the applicability of data protection laws to the development and use of generative artificial intelligence (“GenAI”). The ICO is seeking comments from “all stakeholders with an interest in GenAI”, including developers, users, legal advisors and consultants.[2]

This first public consultation (which closes on 1 March 2024) focuses on the lawful basis for training GenAI models on web-scraped data.[3]

Continue Reading The UK ICO launches consultation series on GenAI

Saudi Arabia has in the past few years taken strides to update its legislative frameworks to reflect technological advancements, and its data protection laws are the latest iteration of such reform. Data protection issues were historically not codified in a standalone law in the country and were instead dealt with under what is broadly known as the “sharia” judicial system, which includes the principle of individuals’ right to privacy and safety from encroachment into one’s personal affairs.[1] The spirit of this principle, along with modern interpretations of privacy as applied to personal data, carried over into the Kingdom’s Personal Data Protection Law (the “PDPL”), implemented by Royal Decree M/19 of 17 September 2021 and amended on 21 March 2023.[2] The amended PDPL was published in the official gazette on, and formally took effect as of, September 14, 2023, and entities have an extended grace period of one year (i.e., until September 2024) to comply.[3] In conjunction with the PDPL, two sets of related regulations were published on the same date: the PDPL Implementing Regulations (the “Implementing Regulations”) and the regulations on personal data transfer (the “Transfer Regulations” and, together with the Implementing Regulations, the “Regulations”).[4]

Continue Reading Saudi Arabia’s Data Protection Law and Regulations Come Into Effect

Nearly five years after a landmark Supreme Court ruling, which reiterated that information privacy is a fundamental right enshrined in the Constitution, India finally enacted its Digital Personal Data Protection Act, 2023 (the “DPDPA” or “Act”), on August 11, 2023.

Continue Reading Comparing Global Privacy Regimes Under GDPR, DPDPA and US Data Protection Laws