California suggests taking aim at AI-powered recruiting software


A proposed new amendment to California’s employment discrimination laws would make AI-based employment decision-making software a source of legal liability.

The proposal would make it illegal for companies and employment agencies to use automated decision systems to weed out applicants who belong to a class protected by the California Department of Fair Employment and Housing. The broad language, however, means the rules could easily be applied to “applications or systems that can only be indirectly linked to employment decisions,” wrote attorneys Brent Hamilton and Jeffrey Bosley of Davis Wright Tremaine.

Automated decision systems and algorithms, both central to the proposal, are broadly defined in the draft, Hamilton and Bosley said. The lack of specificity means that technologies designed to assist human decision-making in even subtle ways could end up lumped in with hiring software, as could the third-party vendors who supply the code.

Strict record-keeping requirements are included in the proposed legislation, which doubles the record retention period from two years to four and requires anyone using automated decision systems to retain all machine learning data generated in the course of their operation and training.
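
The proposal does not spell out what retaining that machine learning data should look like in practice. As a minimal sketch only, assuming an invented JSON record format with a retention date four years out (none of the field names come from the draft text):

    # Hypothetical record-retention sketch: tag each piece of ML data
    # produced during operation or training with a four-year retention date.
    # The field names and JSON format are invented for illustration.

    import json
    from datetime import date, timedelta

    def retention_record(kind: str, payload: dict) -> dict:
        today = date.today()
        return {
            "kind": kind,  # e.g. "training_sample" or "screening_decision"
            "payload": payload,
            "created": today.isoformat(),
            "retain_until": (today + timedelta(days=4 * 365)).isoformat(),
        }

    record = retention_record(
        "screening_decision",
        {"applicant_id": "12345", "score": 0.72, "advanced": True},
    )
    print(json.dumps(record, indent=2))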

Training datasets put vendors on the hook, too: “Anyone who engages in the advertising, sale, provision, or use of any selection tool, including but not limited to an automated decision system, to an employer or other covered entity must keep records of the evaluation criteria used by the automated decision system,” the proposed text reads. It specifically notes that such records must also be kept for each client for which the vendor trains models.

A big target

Applicant Tracking Systems (ATS) and Recruitment Management Systems (RMS) are used almost universally, with a 2021 study finding that more than 90 percent of companies use such software to rank and screen candidates.

The same study suggests that HR software of the kind covered by the proposed California rules is one of the reasons employers are struggling to fill positions. It concluded that data points often serve as proxies for personal traits an employer may want to filter out, but the proxy and the person don't always match, leading to the exclusion of viable candidates.
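
To illustrate the kind of proxy filtering the study describes, here is a minimal sketch, assuming a made-up screening rule and made-up candidates rather than anything from a real ATS: a hard cutoff on employment gaps stands in for traits like reliability, and knocks out an otherwise strong candidate.

    # Hypothetical sketch of proxy-based screening, not real ATS code.
    # A hard rule on "employment gap" acts as a proxy for other traits,
    # and can exclude otherwise qualified candidates.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        years_experience: int
        employment_gap_months: int  # e.g. time off for caregiving or illness

    def passes_screen(c: Candidate) -> bool:
        # Naive rule: reject anyone with a gap longer than six months,
        # regardless of why the gap exists or how qualified they are.
        return c.years_experience >= 3 and c.employment_gap_months <= 6

    candidates = [
        Candidate("A", years_experience=8, employment_gap_months=14),  # strong, but filtered out
        Candidate("B", years_experience=3, employment_gap_months=0),
    ]

    for c in candidates:
        print(c.name, "advances" if passes_screen(c) else "rejected")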

That sort of unintentional filtering of viable candidates is not what the proposed California rules cover; they focus on how software can discriminate against protected classes of people, whether intentionally or not.

AI and automation tools have had problems with bias for years. California's proposed law offers no solution to that, and it could leave California businesses wondering how, if at all, to respond.

Hamilton and Bosley suggest that California employers review their ATS and RMS software for compliance with the proposal, deepen their understanding of how the algorithms they use actually work, be prepared to demonstrate that the results of their hiring processes are fair, and talk to their vendors to make sure they are doing what's needed to comply.
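
As one hypothetical illustration of what demonstrating fair results could involve, an employer might compare selection rates across groups using the four-fifths (80 percent) rule from the EEOC's Uniform Guidelines. The proposal itself does not prescribe this or any other test, and the numbers below are invented.

    # Hypothetical adverse-impact check using the four-fifths (80%) rule.
    # The proposal doesn't mandate this test; it's one common way to
    # sanity-check selection rates across groups.

    from collections import Counter

    # (group, selected?) outcomes from a screening tool -- invented numbers
    outcomes = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
             + [("group_b", True)] * 20 + [("group_b", False)] * 80

    applied = Counter(group for group, _ in outcomes)
    selected = Counter(group for group, picked in outcomes if picked)

    rates = {g: selected[g] / applied[g] for g in applied}
    highest = max(rates.values())

    for group, rate in rates.items():
        ratio = rate / highest
        verdict = "OK" if ratio >= 0.8 else "possible adverse impact"
        print(f"{group}: selection rate {rate:.0%}, ratio to highest {ratio:.2f} -> {verdict}")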

The 45-day public comment period for the proposed changes is not yet open, which means there is no timeline for the changes to be reviewed, modified, and submitted for adoption. ®
