What California’s Proposed AI Rules Mean for Employers

In this episode of The Workplace podcast, CalChamber Associate General Counsel Matthew Roberts and CalChamber Senior Policy Advocate Ashley Hoffman discuss regulations on the use of artificial intelligence (AI) tools in the workplace proposed by the California Civil Rights Council (CRC).

Regulation Concerns

Recently, the CRC proposed new regulations that could have significant implications for employers using AI in hiring and other employment decisions.

While discrimination is illegal regardless of whether decisions are made with pen and paper or through automated tools, Hoffman warns that the proposed rules go far beyond that.

One concern Hoffman identifies is the introduction of a new concept: the “agent.” This term could extend liability under California’s Fair Employment and Housing Act (FEHA) to AI developers, potentially holding them accountable if their tools are used in discriminatory ways.

Another key concern Hoffman raises is the broad scope of the regulations. The term “automated decision system” could encompass a wide range of tools, from hiring software to even seemingly simple systems like calculators or assessment tools. This expansive definition could lead to unanticipated liabilities for companies that create or use these tools, especially if their systems unintentionally result in biased outcomes.

Roberts points out that while the regulations aim to prevent discrimination, there is a risk that companies may avoid using AI tools altogether to sidestep potential legal exposure.

Hoffman agrees, saying that if an AI tool is used and unintentionally results in a disparate impact on a protected group, it could be deemed discriminatory—even if the tool was used in good faith.

Another concern Hoffman has with the proposed rules involves their record-keeping requirements.

“There’s still some concerns about record keeping and some of the feasibility for that, especially if you’re on the developer side, not the person actually using the tool. If you sell it to an entity that you no longer have contact with, what does that mean, as far as reliability or record keeping obligations and the like?” she asks.

Specific Concerns in Hiring, Screening

The proposed regulations contain language regarding AI tools that could potentially screen out certain applicants who have disabilities. Hoffman says that tools designed to assess skills, such as reaction time or dexterity, could unintentionally screen out candidates with physical or cognitive disabilities.

Many protections already exist under California and federal law, Hoffman explains, so the CalChamber is working to ensure that the regulations strike a good balance: encouraging innovation and the use of potentially helpful tools while also taking into account the impacts those tools could have on certain applicants.

The Legislative Landscape

Earlier this year, a bill introduced in the California Assembly, AB 2930 (Bauer-Kahan; D-Orinda), would have imposed significant requirements on employers and the developers of AI tools with respect to potential discrimination in employment. The bill failed, but Hoffman expects similar bills to be reintroduced next year.

“From an overarching perspective we want to make sure that anything the CRD [Civil Rights Department] is doing is also not getting ahead of these other agencies and entities because these issues are complex, this technology is ever changing, and we want to make sure we’re getting it right,” Hoffman says.

What Employers Can Do Now

While the CRC’s regulations are still in the proposal phase, Hoffman urges caution when using AI tools.

Under current law, employers are already subject to claims of discrimination if their use of AI leads to a disparate impact on protected classes. Employers should carefully monitor the outcomes of their AI-based hiring and employment decisions to ensure they are not inadvertently screening out qualified candidates from protected groups.

“The use of these tools may already be the cause of action for discrimination. We’ve seen the EEOC [Equal Employment Opportunity Commission] go after companies for that already. So let’s clarify that, but let’s also make sure we are not just dropping the hammer on any use of this to where we’re deterring innovation,” she says. “It’s a really tough balance, but I think, as an employer, it’s great to have these tools to make things more efficient, but you need to be very careful.”