Last week, the New York Department of Financial Services (“DFS”) issued guidance addressed to executives and information security personnel of entities regulated by DFS to assist them in understanding and assessing cybersecurity risks associated with the use of artificial intelligence (“AI”), and in implementing appropriate controls to mitigate such risks (the “Guidance”).[1] In particular, and to address inquiries received by DFS regarding AI’s impact on cyber risk, the Guidance is intended to explain how the framework set forth in DFS’ Cybersecurity Regulation (23 NYCRR Part 500) should be used to assess and address such risks.
Cybersecurity Law Enters Into Force
On July 17, 2024, Law No. 90/2024 containing provisions for strengthening national cybersecurity and addressing cybercrime (the “Cybersecurity Law”) entered into force.
Biden Administration Executive Order Targets Bulk Data Transactions
The Biden administration recently issued Executive Order 14117 (the “Order”) on “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern.” Building upon earlier Executive Orders[1], the Order was motivated by growing fears that “countries of concern” may use artificial intelligence and other advanced technologies to analyze and manipulate bulk sensitive personal data for nefarious purposes. In particular, the Order notes that unfettered access to Americans’ bulk sensitive personal data and United States government-related data by countries of concern, whether via data brokers, third-party vendor agreements or otherwise, may pose heightened national security risks. To address these possibilities, the Order directs the Attorney General to issue regulations prohibiting or restricting U.S. persons from entering into certain transactions that pose an unacceptable risk to the national security of the United States. Last week, the Department of Justice (“DOJ”) issued an Advance Notice of Proposed Rulemaking, outlining its preliminary approach to the rulemaking and seeking comments on dozens of issues ranging from the definition of bulk U.S. sensitive personal data to mitigation of compliance costs.
Proposed Rulemaking by U.S. Department of Commerce Introduces New Obligations on U.S. IaaS Providers and Foreign Resellers to Curb Malicious Cyber-Enabled Activities
On January 29, 2024, the U.S. Department of Commerce (“Commerce”) published a notice of proposed rulemaking (the “Notice”) seeking comments on proposed rules promulgated by Commerce’s Bureau of Industry and Security (“BIS”) and newly created Office of Information and Communications Technology and Services to implement Executive Order 14110, the Biden Administration’s October 2023 executive order on artificial intelligence (“AI”) (“E.O. 14110”; see our prior alert here)[1]. The Notice also implements Executive Order 13984, a 2021 executive order relating to malicious cyber-enabled activities (“E.O. 13984”), with respect to which Commerce had already issued an advance notice of proposed rulemaking[2].
Quantum Computing and the Financial Sector: World Economic Forum Lays Out Roadmap Towards Quantum Security
Quantum technology is seen as having the potential to revolutionize many aspects of technology, the economy and society, including the financial sector. At the same time, this technology represents a significant threat to cybersecurity, especially due to its potential to render most current encryption schemes obsolete.
Crossing a New Threshold for Material Cybersecurity Incident Reporting
The following post was originally included as part of our recently published memorandum “Selected Issues for Boards of Directors in 2024”.
In July 2023, the U.S. Securities and Exchange Commission (SEC) adopted final rules to enhance and standardize disclosure requirements related to cybersecurity. To comply with the rules’ new reporting requirements, companies will need to make ongoing materiality determinations with respect to cybersecurity incidents and series of related incidents. The inherent nature of cybersecurity incidents, which are often initially characterized by a high degree of uncertainty around scope and impact, and an SEC that is laser-focused on cybersecurity from both a disclosure and enforcement perspective, combine to present registrants and their boards of directors with a novel set of challenges heading into 2024.
The UK ICO launches consultation series on GenAI
On 15 January 2024, the UK Information Commissioner’s Office (“ICO”)[1] launched a series of public consultations on the applicability of data protection laws to the development and use of generative artificial intelligence (“GenAI”). The ICO is seeking comments from “all stakeholders with an interest in GenAI”, including developers, users, legal advisors and consultants.[2]
This first public consultation (which closes on 1 March 2024) focuses on the lawful basis for training GenAI models on web-scraped data.[3]
Saudi Arabia’s Data Protection Law and Regulations Come Into Effect
Saudi Arabia has in the past few years taken strides to update its legislative frameworks to reflect technological advancements, and its data protection laws are the latest iteration of such reform. Data protection issues were historically not codified in a standalone law in the country and were instead dealt with under what is broadly known as the “sharia” judicial system, which includes the principle of individuals’ right to privacy and safety from encroachment into one’s personal affairs.[1] The spirit of this principle, along with modern interpretations of privacy as applied to personal data, carried over into the Kingdom’s Personal Data Protection Law (the “PDPL”), implemented by Royal Decree M/19 of 17 September 2021 and amended on 21 March 2023.[2] The amended PDPL was published in the official gazette on, and formally effective as of, September 14, 2023, and entities have an extended grace period of one year (i.e., until September 2024) to comply.[3] In conjunction with the PDPL, two sets of related regulations were published on the same date – the PDPL Implementing Regulations (the “Implementing Regulations”) and the regulations on personal data transfer (the “Transfer Regulations” and, together with the Implementing Regulations, the “Regulations”).[4]
Comparing Global Privacy Regimes Under GDPR, DPDPA and US Data Protection Laws
Nearly five years after a landmark Supreme Court ruling, which reiterated that information privacy is a fundamental right enshrined in the Constitution, India finally enacted its Digital Personal Data Protection Act, 2023 (the “DPDPA” or “Act”), on August 11, 2023.
FTC Proposes COPPA Rule Revisions Detailing Enhanced Online Privacy Protections for Children
The Federal Trade Commission (“FTC”) on December 20, 2023[1] proposed a set of revisions to its rules implementing the Children’s Online Privacy Protection Act (“COPPA Rule”).[2] The COPPA Rule, which became effective in 2000 and was amended in 2013, serves as the FTC’s primary means of enforcing the Children’s Online Privacy Protection Act of 1998 (“COPPA”), the principal law protecting children (and their personal information) online. At a high level, the COPPA Rule requires operators of websites and online services that are (i) directed to children[3] or (ii) not directed to children but have actual knowledge that they are collecting personal information online from a child, to provide notice to parents and obtain verifiable parental consent before collecting, using or disclosing personal information from their children, as well as to provide parents with opportunities to review, delete and prevent further use or future collection of such information.