California Governor Gavin Newsom signed a new executive order (EO) on September 6 regarding the use, procurement, and development of generative artificial intelligence (GenAI) by California state agencies and employees. The EO announces the initiation of various studies, reports, and guidelines in the coming year, which will address opportunities and potential risks related to the use of GenAI.
While the EO is currently only applicable to the state government’s use of GenAI, California companies should expect the guidelines and studies produced as a result of the EO to be used as a basis for future rules and regulations impacting the use of AI in the private sector.
KEY TAKEAWAYS
- Within 60 days of the issuance of the EO, the California Government Operations Agency, Department of Technology, Office of Data and Innovation, and Governor’s Office of Business and Economic Development will prepare a report examining (1) the most important and potentially beneficial uses of GenAI by the state and (2) the potential risks of such uses, with a focus on “high-risk use cases.”
- Within 60 days of the issuance of the EO, all state agencies and departments must submit an inventory of their current high-risk uses of GenAI. The EO does not define “high-risk” use cases but highlights those areas “where GenAI is used to make a consequential decision affecting access to essential goods and services.” The EO also references risks stemming from bad actors and insufficiently guarded governmental systems, unintended or emergent effects, and potential risks to democratic and legal processes, public health and safety, and the economy.
- General guidelines for public sector employee training, procurement, and use of GenAI tools, modeled after the White House’s AI Bill of Rights and the National Institute of Standards and Technology’s AI Risk Management Framework, will be released by January 2024. Trainings for state government workers should be made available by July 2024.
- A joint risk analysis report, to be completed by March 2024, will study the impact of GenAI tools on California’s critical energy infrastructure, including where their use could lead to mass casualty events and environmental emergencies. Responsible agencies will provide public recommendations for regular testing and monitoring based on this report.
- Guidelines to analyze the impact of state agencies’ use of GenAI tools on vulnerable communities, including whether they achieve equitable outcomes in their deployment and implementation, will be created by July 2024. These guidelines will inform whether state agencies should use a particular tool.
- These guidelines will be used to update the state’s contract terms for project approval and procurement of new GenAI tools by January 2025.
- California will create infrastructure to allow state agencies to carry out pilot projects and allow for “sandbox” testing of new GenAI applications.
- California will engage with the legislature and stakeholders, including civil society, academia, industry experts, and historically vulnerable and marginalized communities, in the development of these guidelines, reports, trainings, and pilot projects.
- California will pursue a partnership with UC Berkeley and Stanford to evaluate the impacts of GenAI on California and recommend efforts for the state to advance its leadership in the area, with the aim of hosting a joint summit in 2024 to discuss findings.
PRACTICAL CONSIDERATIONS
- Employers should be prepared for more auditing, monitoring, and reporting by the state regarding the use of GenAI tools. Employers should monitor their selection of GenAI tools, employees’ use of those tools in the workplace, and the outcomes of that use.
- Employers should review their use of GenAI tools and identify which uses, if any, may be considered “high risk” and likely subject to increased scrutiny.
- Employers should be aware of and review the new employee trainings that will be made available by the state and consider making changes to their own employee trainings.
- To the extent that GenAI becomes a greater part of employee duties and responsibilities, employers should review job descriptions, lines of oversight for GenAI tools, and technical trainings for affected employees.
- Employers should review the new procurement rules when they are released and identify any tools they currently use or contract for that may become non-compliant as a result.
- Employers should be aware of existing guidance and regulations regarding AI that already impact private employers.
- Employers should be aware of ongoing and potential future litigation clarifying the scope of these regulations. For instance, the California Supreme Court recently held that third-party providers acting on behalf of employers may be held liable for employment discrimination. Litigation may therefore be expected over whether such liability extends beyond personnel administration to a company’s agents with respect to the use of AI systems in employment decision-making.