New York City’s law requiring employers to audit automated employment decision tools for bias and to notify candidates about their use will be enforced beginning July 5, the city’s Department of Consumer and Worker Protection announced in an update accompanying the publication of a final rule implementing Local Law 144.
The law is part of a series of AI-related developments affecting HR departments. DCWP proposed its first version of the rule in September and revised the rule in December following a public hearing in November. A second public hearing in January yielded comments that DCWP said led to changes in the final rule.
Which tools are impacted?
NYC’s law defines “automated employment decision tool,” or AEDT, as “any computational process, derived from machine learning, statistical modeling, data analytics or artificial intelligence, that issues simplified output, including a score, classification or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.”
The rule provides further clarity around the phrase “substantially assist or replace discretionary decision making,” defining it to mean tools that do any of the following (sketched in code after this list):
- Rely solely on a simplified output, such as a score, tag, classification or ranking, with no other factors considered.
- Use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set.
- Use a simplified output to overrule conclusions derived from other factors including human decision making.
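To make the three-pronged test concrete, here is a minimal, non-authoritative sketch in Python. The function name, the weights mapping and the aedt_key parameter are all hypothetical illustrations, not anything defined by the rule; an actual coverage determination is a legal question.

```python
def substantially_assists_or_replaces(weights, overrules_other_factors,
                                      aedt_key="aedt_output"):
    """Toy reading of the rule's three criteria. `weights` maps each hiring
    criterion to its weight; `aedt_key` marks the AEDT's simplified output."""
    other_weights = [w for k, w in weights.items() if k != aedt_key]
    relies_solely = not other_weights    # criterion 1: no other factors considered
    outweighs_all = bool(other_weights) and weights[aedt_key] > max(other_weights)  # criterion 2
    return relies_solely or outweighs_all or overrules_other_factors  # criterion 3

# Example: an AEDT score weighted above every other factor triggers criterion 2
weights = {"aedt_output": 0.5, "interview": 0.3, "resume_review": 0.2}
print(substantially_assists_or_replaces(weights, overrules_other_factors=False))  # True
```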
A separate segment of the rule adds two criteria that define machine learning, statistical modeling, data analytics and AI tools as “mathematical, computer-based techniques” that generate a prediction — i.e., an expected outcome for an observation — or that generate a classification — i.e., an assignment of an observation to a particular group.
For example, an AEDT could predict a job candidate’s fit or likelihood of success, or it could assign a classification to candidates or groups of candidates based on the candidates’ skill sets or aptitudes, per the rule.
In addition, a computer must, at least in part, identify the tool’s inputs and other parameters or assign weight to them in a way that improves the accuracy of a given prediction or classification.
What will bias audits look like?
Under the rule, NYC employers may not use an AEDT that has not been audited for bias, and they may not continue using an AEDT if more than a year has passed since its most recent audit.
DCWP provided an example of a bias audit in the rule. At minimum, an audit must calculate the selection rate and impact ratio (illustrated in the sketch following this list) for demographic categories including:
- Sex.
- Race and ethnicity.
- Intersectional categories of sex, race and ethnicity.
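Under the rule’s definitions, which mirror the EEOC’s longstanding adverse impact analysis, the selection rate is the share of individuals in a category who are selected, and the impact ratio divides each category’s rate by the highest category’s rate. A minimal sketch in Python, with hypothetical category labels and data:

```python
from collections import Counter

def selection_rates(outcomes):
    """Per-category selection rate: number selected divided by number assessed."""
    totals, selected = Counter(), Counter()
    for category, was_selected in outcomes:
        totals[category] += 1
        if was_selected:
            selected[category] += 1
    return {c: selected[c] / totals[c] for c in totals}

def impact_ratios(rates):
    """Impact ratio: each category's selection rate divided by the highest rate."""
    highest = max(rates.values())
    return {c: rate / highest for c, rate in rates.items()}

# Hypothetical historical data: (category, was_selected) pairs
sample = [("male", True), ("male", True), ("male", False),
          ("female", True), ("female", False), ("female", False)]
rates = selection_rates(sample)   # male: 2/3, female: 1/3
ratios = impact_ratios(rates)     # male: 1.0, female: 0.5
```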
If the AEDT classifies candidates for employment, or employees being considered for promotion, into specific groups, the audit must perform these calculations for each such group.
The audit also must indicate the number of individuals assessed by the AEDT who are not included in these calculations because they fall into an unknown category.
Where an AEDT scores candidates and employees, the audit must calculate the median score for the full sample of applicants; the scoring rate for individuals in each category; the impact ratio for each category; and the number of individuals not included in these calculations because they fall into an unknown category.
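Here is a sketch of the scored-output case, assuming, per the rule’s definitions, that a category’s scoring rate is the share of its individuals scoring above the full sample’s median; the names and data are hypothetical:

```python
import statistics
from collections import Counter

def scoring_rates(scored):
    """Median of the full sample, plus each category's share scoring above it."""
    median = statistics.median(score for _, score in scored)
    totals, above = Counter(), Counter()
    for category, score in scored:
        totals[category] += 1
        if score > median:
            above[category] += 1
    return median, {c: above[c] / totals[c] for c in totals}

# Hypothetical AEDT output: (category, score) pairs
scored = [("white", 82), ("white", 77), ("black", 70),
          ("black", 88), ("hispanic", 65), ("hispanic", 91)]
median, rates = scoring_rates(scored)
# The impact_ratios() helper from the earlier sketch applies unchanged:
# divide each scoring rate by the highest scoring rate.
```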
The rule notes that an independent auditor may exclude from the required impact ratio calculations any categories that represent less than 2% of the data being used for the bias audit.
If such a category is excluded, the audit’s summary of results must include a justification for the exclusion, the number of applicants excluded and the scoring rate or selection rate for each excluded category.
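A brief sketch of how an auditor might apply the 2% floor; the function and counts are hypothetical, and note that excluded categories’ counts and rates still belong in the published summary:

```python
def split_small_categories(counts, threshold=0.02):
    """Separate categories below the 2% share that the rule allows auditors
    to exclude from impact ratio calculations."""
    n = sum(counts.values())
    kept = {c: k for c, k in counts.items() if k / n >= threshold}
    excluded = {c: k for c, k in counts.items() if k / n < threshold}
    return kept, excluded

# Hypothetical category counts from the audit data
counts = {"male": 510, "female": 480, "nonbinary": 10}
kept, excluded = split_small_categories(counts)
# excluded == {"nonbinary": 10}: 10/1,000 = 1% of the data, under the 2% floor.
# The summary must still report this category's count, its selection or scoring
# rate and a justification for the exclusion.
```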
Data for bias audits must be derived from the AEDT’s historical data, per the rule.
The data may be collected from one or more employers or employment agencies that use the same AEDT. However, an individual employer or agency may rely on such an audit only if it provided historical data from its own use of the AEDT to the independent auditor conducting the audit, or if it has never used the AEDT.
If there is insufficient historical data to conduct an audit, an employer or agency may rely on an audit that uses test data. If this is the case, the summary of results must explain why historical data was not used as well as how the test data was generated and obtained.
How must candidates be notified?
Employers and agencies must publicly disclose the date of the most recent bias audit as well as a summary of its results. They also must notify candidates and employees about the use of an AEDT at least 10 business days before using the tool.
The notice must include the job qualifications and characteristics that the AEDT will use to assess individuals, and must permit candidates to request an alternative selection process or accommodation if such an alternative is available.
The final rule specifies that an employer or agency may provide the notice by posting it in a clear and conspicuous manner in the employment section of its website; by including the notice in the job posting; or by sending the notice via U.S. mail or email.
The rule includes additional requirements with respect to the AEDT’s data retention policy.
Employers also should consider obtaining consents from applicants and employees regarding the use of AI because use of the tech may give rise to disparate impact claims under federal and state laws, Nicholas Pappas, partner at Dorsey & Whitney, said in an email to HR Dive.
“Advising applicants and employees how to contact the employer if reasonable accommodations are needed [is] incredibly important,” Pappas said. “And companies should provide adequate and reasonable accommodations to applicants and employees with disabilities.”