Keypoint: After a January hearing, New York City continues to consider comments on the proposed rules implementing a new law regulating employers’ use of automated employment decision tools, with enforcement to begin “in the coming months.”
New York City moves closer to implementing Local Law 144, the first major U.S. law governing the use of AI employment technologies. On January 23, 2023, the New York City Department of Consumer and Worker Protection (DCWP), the agency charged with enforcing the law, held a second public hearing on the law’s proposed rules to address several ambiguities related to key definitions and the scope of the law. Within the past week, the DCWP published a transcript of the hearing and announced that it would finalize its rules and begin enforcement “in the coming months.”
As we wrote in December 2022, Local Law 144 applies to employers and employment agencies in New York City who use “automated employment decision tools” (AEDTs) to screen applicants for employment or employees for promotional opportunities within the city. The law makes it unlawful to use an AEDT to screen candidates or employees for an employment decision unless (1) the AEDT is subject to an annual “bias audit” by an “independent auditor” before use; and (2) the results of the most recent bias audit and the AEDT’s distribution date are published on the employer’s or employment agency’s website. The law was enacted in December 2021 and was initially scheduled to take effect on January 1, 2023.
The DCWP published its initial proposed rules in September 2022 and held a public hearing on the rules in November. After receiving public comments, the DCWP published revised proposed rules on December 15, 2022, which prompted the second public hearing on January 23. While the DCWP initially stated it would begin enforcing Local Law 144 on April 15, 2023, the agency has since announced that it is continuing to review a “substantial volume of thoughtful comments” and plans to finalize the rules and “begin enforcement in the coming months.”
Below is a summary of the issues that received the most attention at the January 23 hearing. These issues provide insight into stakeholder concerns and the questions the DCWP is grappling with before enforcement begins.
Narrowing the definition of AEDT
Local Law 144 defines an AEDT as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making….” The initial rules proposed in September took a more expansive interpretation of “substantially assist or replace discretionary decision making” to include instances where use of a simplified output could “modify” conclusions made by human decision-making. The revised December rules take a narrower approach, clarifying that a “simplified output” only triggers the definition of an AEDT if the output is (1) the sole factor in an employment decision, (2) one of multiple factors in an employment decision but weighed more heavily than the other factors, or (3) used to “overrule conclusions derived from other factors including human decision-making.”
At the January 23 hearing, most commentators opposed this change, arguing it would exempt many tools from the regulation. Those commentators, representing advocates of stricter AI regulation, argued that an AEDT would rarely, if ever, weigh more heavily than, or overrule, human decision-making, which is inherent in any employment decision.
Narrowing the definition of “independent auditor”
Local Law 144 requires bias audits to be conducted by an “independent auditor.” The DCWP’s initial rules defined “independent auditor” to include a “person or group that is not involved in using or developing” the AEDT under audit. The revised December rules, however, further qualify the definition to “a person or group that is capable of exercising objective and impartial judgment on all issues within the scope of a bias audit of an AEDT.” The updated definition of “independent auditor” also excludes anyone that: (1) is or was involved in using, developing, or distributing the AEDT; (2) had an employment relationship with an employer or employment agency using the AEDT or with a vendor that developed or distributed the AEDT at any point during the bias audit; or (3) had a direct financial interest or a material indirect financial interest in an employer or employment agency using the AEDT or in a vendor that developed or distributed the AEDT at any point during the bias audit.
Representatives of employer organizations and HR vendors opposed this change at the hearing. They argued it effectively prohibits bias audits from being completed in-house, which they believed to be more efficient and effective than outsourcing audit responsibilities to individuals unfamiliar with the AEDT being used or with the employer’s practices.
Defining machine learning, statistical modeling, data analytics, or artificial intelligence
As mentioned above, an AEDT is defined by the law as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence….” The DCWP’s proposed rules from September and December both define “machine learning, statistical modeling, data analytics, or artificial intelligence” as “a group of mathematical, computer-based techniques:
- that generate a prediction, meaning an expected outcome for an observation, such as an assessment of a candidate’s fit or likelihood of success, or that generate a classification, meaning an assignment of an observation to a group, such as categorizations based on skill sets or aptitude; and
- for which a computer at least in part identifies the inputs, the relative importance placed on those inputs, and other parameters for the models in order to improve the accuracy of the prediction or classification; and
- for which the inputs and parameters are refined through cross-validation or by using training and testing data.”
Many commentators at the January 23 hearing opposed the inclusion of this definition, arguing it provided loopholes for employers to avoid the definition of an AEDT and the law’s requirements.
Other comments raised at the January 23 hearing included:
- Calls to define the employee notice requirements in greater detail;
- The desire for bias audits to test for efficacy in addition to disparate impact;
- Opposition to removing the requirement that quantitative results of a bias audit be made public; and
- Requests to shorten the 10-day notice requirements for candidates and employees.