The Civil Rights Council of the California Civil Rights Department (CRD) has amended its regulations concerning the Fair Employment and Housing Act (“FEHA”) to address automated-decision systems in the workplace. The regulations respond to growing concern that employers’ increasing use of artificial intelligence (AI) and automated-decision systems to make or facilitate employment decisions can result in “algorithmic discrimination.” These regulations will go into effect on October 1, 2025. All employers should note that the amended regulations extend the required retention period for certain employment-related records. In addition, all employers or other entities using automated-decision systems to facilitate human decision-making regarding employment benefits should carefully review the regulations and consult counsel if necessary. The full text of the amended regulations is available at the CRD’s website: https://calcivilrights.ca.gov/civilrightscouncil/ (the regulations are “Exhibit B” to the March 21, 2025 meeting).
Here are some key aspects of the final regulations that employers need to know:
What is an “Automated-Decision System”?
The new regulations define an automated-decision system (commonly known as “ADS”) as “a computational process that makes a decision or facilitates human decision making regarding an employment benefit,” including processes that “may be derived from and/or use artificial intelligence, machine-learning, algorithms, statistics, and/or other data processing techniques.” The new regulations also include definitions for “algorithm,” “artificial intelligence,” “machine learning” and “automated-decision system data.”
Covered ADSs perform tasks such as:
- Using computer-based tests or assessments, including questions, puzzles, or games, to predict an applicant’s or employee’s skills, dexterity, reaction time, and other abilities; measure a person’s aptitude or cultural fit; or screen applicants or employees;
- Directing job advertisements or recruiting materials to targeted groups;
- Screening resumes for particular terms or patterns;
- Analyzing facial expressions, word choice, and voice during online interviews; or
- Analyzing employee or applicant data acquired from third parties.
It is important to note that not all technologies fall under this definition. Basic software tools such as word processors, spreadsheets, navigation systems, web hosting, firewalls, anti-virus software, and data storage solutions, or other similar technologies, are excluded unless they specifically make decisions regarding employment benefits.
ADS-Related Discrimination
The new regulations make clear that employers and covered entities are prohibited from using an automated-decision system or selection criteria that discriminate against applicants or employees based on characteristics protected under the FEHA. This includes any qualification standards, employment tests, or proxies that may result in discrimination. The amendments clarify that numerous forms of discrimination are unlawful where the discrimination resulted, in whole or in part, from a covered entity’s use of an automated-decision system or selection criteria, and these changes include adding references to automated-decision systems throughout the FEHA regulations.
Additionally, the regulations emphasize the importance of conducting anti-bias testing and other proactive measures to prevent unlawful discrimination from the use of an automated-decision system or selection criteria. Evidence of such efforts, or the lack thereof, is relevant in any legal claims or defenses, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results. This means that any due diligence conducted to test, audit, review, and address potential unlawful discrimination resulting from the use of AI tools, or the failure to conduct such reviews, can be considered in legal proceedings. Essentially, these regulations encourage employers to conduct such testing; but the regulations neither require such testing, nor do they establish that testing is necessarily a defense to a claim of discrimination.
Note that the regulations prohibit the use of selection criteria, including proxies, to discriminate. A “proxy” is defined in the regulations as a “characteristic or category closely correlated with a basis protected by the Act.” Thus, under the regulations, it is unlawful to use a characteristic closely correlated with a protected basis in a way that discriminates against applicants or employees on the basis of that protected characteristic.
Pre-Employment Practices
The new regulations include extensive amendments to existing regulations regarding pre-employment practices.
For example, the prior regulations noted that the use of online application technology that limits or screens out applicants based on their schedule may discriminate against applicants based on their religious creed, disability, or medical condition. The amended regulations state that technology that ranks or prioritizes applicants based on their schedule may similarly be impermissible. All such practices having an adverse impact are unlawful unless they are job-related and required by business necessity, and unless there is a mechanism for applicants to request accommodations.
Furthermore, the amended regulations specify that automated-decision systems which measure an applicant’s skills, dexterity, reaction time, or other abilities may discriminate against individuals with certain disabilities or other protected characteristics. The regulations note that to avoid unlawful discrimination, an employer or covered entity may need to provide reasonable accommodation. Similarly, systems that analyze an applicant’s tone of voice, facial expressions, or other physical characteristics may result in unlawful discrimination based on race, national origin, gender, disability, or other protected characteristics. Employers may need to provide reasonable accommodations to avoid such discrimination.
The regulations clarify that impermissible pre-employment inquiries (including, for example, impermissible inquiries about criminal history) may not be made via automated-decision systems.
In addition, the regulations specify that automated-decision assessments, including tests, questions, or puzzle games that are likely to elicit information about a disability may constitute a “medical or psychological examination” and thus may be subject to regulations on such examinations.
Finally, the amended regulations tighten the restrictions on using qualification standards, employment tests, proxies, or other selection criteria that screen out or tend to screen out applicants or employees on the basis of disability, uncorrected vision, or uncorrected hearing. The new regulations clarify that this includes, but is not limited to, the use of automated-decision systems, and state that employers can only use such tests or other selection criteria if they are job-related for the position in question, consistent with business necessity, and (in a new requirement) there is no less discriminatory standard, test, or other selection criteria that serves the employer’s goals as effectively as the challenged standard, test, or other selection criteria.
Third-Party (Agent) Liability
The amended regulations include a new definition of the word “Agent”: any individual acting on behalf of an employer, either directly or indirectly, to perform functions traditionally executed by the employer or any other activities regulated by the FEHA. This includes tasks such as applicant recruitment, screening, hiring, promotion, and decisions regarding compensation, benefits, or leave, even when these activities are partially or wholly conducted through automated-decision systems. The regulations deem an agent of an employer an “employer” under the FEHA regulations.
The broad definition of “agent” means that third parties, including vendors and developers of AI systems, could be considered agents and thus subject to the same regulations as employers. This could introduce new liability issues for both developers and deployers of AI software, potentially requiring contract renegotiations to ensure compliance.
Additionally, the regulations clarify that prohibitions on aiding and abetting unlawful employment practices include engaging in a prohibited practice (such as assisting a person in unlawful employment discrimination) conducted in whole or in part through the use of automated-decision systems, potentially implicating third parties that design or implement such systems. Evidence (or a lack of evidence) of anti-bias testing and proactive efforts to avoid discrimination, including the quality, efficacy, recency, and scope of such efforts, the results of testing, and the response to the results, is relevant to claims of unlawful discrimination or available defenses. However, the regulations do not include language that would have created third-party liability for the design, development, advertising, promotion, or sale of such systems. This means that while third parties may be implicated in the use of these systems, they are not directly liable for their creation and marketing.
Recordkeeping Requirements
There are numerous sources of recordkeeping requirements applicable to California employers. The prior FEHA regulations required employers to retain personnel or other employment records dealing with employment practices for a period of two years. The amended regulations extend that period: personnel or other employment records created or received by any employer or other covered entity, dealing with any employment practice and affecting any employment benefit of any applicant or employee, must now be preserved for four years from the date of making the record or the date of the personnel action involved, whichever occurs later. The amended regulations specify that covered records include all applications, personnel records, membership records, employment referral records, selection criteria, and automated-decision system data.
“Automated-Decision Systems Data” is defined as any data used in or generated by automated-decision systems. This includes data about individual applicants or employees, data reflecting employment decisions or outcomes, and data used to develop or customize the system for a specific employer or entity.
Thus, all employers should note that this extended retention period applies to all such records, not only records related to automated-decision systems; and any employer using automated-decision systems as defined in the regulations should ensure it is preserving automated-decision system data.
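The whichever-is-later retention rule can be sketched in a few lines of code. This is a minimal illustration, not legal guidance: the function name and date inputs are hypothetical, and it simply applies the four-year period described above to the later of the record date and the personnel-action date.

```python
from datetime import date

RETENTION_YEARS = 4  # retention period under the amended FEHA regulations


def retention_end(record_date: date, personnel_action_date: date) -> date:
    """Return the date through which a record must be preserved:
    four years from the making of the record or from the personnel
    action involved, whichever occurs later."""
    start = max(record_date, personnel_action_date)
    try:
        return start.replace(year=start.year + RETENTION_YEARS)
    except ValueError:
        # Rare edge case: a Feb. 29 start whose target year is not a
        # leap year (e.g., 2096 + 4 = 2100); fall back to Feb. 28.
        return start.replace(year=start.year + RETENTION_YEARS, day=28)


# Example: the personnel action postdates the record, so the
# four-year clock runs from the personnel-action date.
print(retention_end(date(2025, 10, 1), date(2026, 3, 15)))  # 2030-03-15
```

In practice many employers simply apply the four-year period to the later date for every record in a personnel file; the point of the rule is that the clock never starts before the related personnel action.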
Next Steps
To ensure that any AI tools used in personnel decision-making processes comply with the new regulations and avoid potential bias, employers should consider:
- Conducting regular anti-bias testing of AI tools to identify and mitigate biases that could lead to discrimination.
- Evaluating all AI tools used for HR functions within your organization to ensure they comply with antidiscrimination laws and do not introduce bias.
- Developing an AI governance policy that includes guidelines for transparency, accountability, privacy, and fairness to mitigate risks and maximize the benefits of AI technology.
- Providing training for HR personnel and managers on the new regulations, the risks of algorithmic discrimination, and best practices for compliance.
- Establishing guidelines for managing relationships with vendors who develop, supply, or support AI technology to ensure these vendors adhere to the new regulations and do not introduce biases into your employment processes.
- Ensuring that automated-decision systems include mechanisms for applicants to request accommodations for disabilities, religious practices, or medical conditions.
- Retaining all required records for the mandated four-year period and ensuring that automated-decision system data is preserved in compliance with the regulations.
If you have questions about how these new regulations will affect your business or need advice about how to implement these new requirements, please contact us.
- Patricia L. Clark (pclark@wilsonturnerkosmo.com)
- Katie M. McCray (kmccray@wilsonturnerkosmo.com)
Wilson Turner Kosmo’s Special Alerts are intended to update our valued clients on significant employment law developments as they occur. This should not be considered legal advice.