When HR Meets AI: California Issues New Employment AI Regulations
In recent years, employers have increasingly used artificial intelligence (“AI”) tools to assist with a range of employment decisions, from screening applicants during hiring to determining employee pay and schedules. While AI tools can yield efficiency and cost savings for employers, they also risk introducing unintentional bias or discrimination if not properly administered.
To protect against such risks, the California Civil Rights Council recently published new regulations, effective October 1, 2025, aimed at regulating employer use of “automated-decision systems.” Under the new regulations, an “automated-decision system” is defined as “a computational process that makes a decision or facilitates human decision-making regarding an employment benefit,” and which may be “derived from and/or use artificial intelligence, machine-learning, algorithms, statistics, and/or other data processing techniques.” Most notably, the regulations:
1) Prohibit employers from utilizing automated-decision systems that discriminate against an applicant or employee (or a class of applicants or employees) on any basis protected by California’s Fair Employment and Housing Act (“FEHA”), subject to available defenses.
2) Extend the period for which employers must retain employment records under the FEHA, including records related to the use of automated-decision systems, from two (2) to four (4) years.
3) Provide that “agents” are “employers” for purposes of the FEHA, and expand the definition of “agent” to encompass third parties performing FEHA-regulated activities, such as “applicant recruitment, applicant screening, hiring, promotion, or [determining] pay, benefits, or leave, including when such activities and decisions are conducted in whole or in part through the use of an automated-decision system.”
Whether an employer has an available defense to a discrimination claim arising from its use of automated-decision systems will depend in part on any anti-bias testing and other proactive efforts (or lack thereof) to avoid unlawful discrimination, including “the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.” It would therefore be prudent for employers to proactively evaluate their use of automated-decision systems and consider whether those systems may result in bias or discrimination. Employers wishing to obtain further information may contact their trusted MSK Labor & Employment advisor.