LawFlash

NYC Postpones Enforcement of AI Bias Law Until April 2023 and Revises Proposed Rules

January 09, 2023

The New York City Department of Consumer and Worker Protection (DCWP) announced that it would postpone enforcement of New York City Local Law 144 (the AI Law) from January 1, 2023, to April 15, 2023. The DCWP also released further revisions to its proposed rules implementing the AI Law and invited additional public comment.

The New York City AI Law makes it an unlawful employment practice for employers to use automated employment decision tools (AEDTs) to screen a candidate or employee within New York City unless certain bias audit and notice requirements are met. The DCWP, the agency charged with enforcing the AI Law, released proposed rules clarifying those requirements on September 23, 2022—a mere 100 days before the AI Law went into effect.

Public comments on the rules noted the limited window the DCWP was providing for compliance and requested that the DCWP grant employers additional time to comply with the law. Commenters also highlighted several areas where further guidance was needed from the DCWP in final rules. In response, the DCWP recently announced that it was releasing a revised version of its proposed rules and would hold a second public hearing on January 23, 2023. It further stated that, in the interim, it would postpone enforcement of the AI Law until April 15, 2023.

REVISED PROPOSED RULES

The revised proposed rules make four significant alterations to the DCWP’s September 2022 proposal.

First, the revised proposed rules slightly narrow the definition of an AEDT. The AI Law defines an AEDT as any process “derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making.” The September 2022 proposal suggested that a tool whose output was used to “modify” human decision making could meet that definition. The revised proposed rules clarify that a tool’s simplified output meets this standard only if it is (1) the sole factor in an employment decision, (2) one of several factors but weighted more heavily than the others, or (3) used to overrule conclusions derived from other factors.

Second, the revised proposed rules confirm that multiple employers using the same AEDT can rely on the same bias audit, so long as each employer using the AEDT provides historical data to the auditor or can explain why such data is not available. “Historical data” is defined as data collected during an employer’s use of an AEDT to assess candidates for employment or employees for promotion. This means that under the revised proposed rules, an auditor can aggregate results across multiple organizations when conducting a bias audit, and each of those organizations can rely on that single audit to comply with the law. This critical modification will further anonymize employer data. If an employer does not have sufficient historical data to conduct a statistically significant bias audit, the revised proposed rules allow test data to be used instead.

Third, the revised proposed rules clarify the definition of “independent auditor.” The DCWP’s initial proposal said any person or group that was not involved in using or developing the AEDT could qualify as an independent auditor, raising the possibility that employers or vendors could have internal teams conduct the audits. The revised proposed rules state that an independent auditor must be “capable of exercising objective and impartial judgment” and must not have been involved in the development of the AEDT, have an employment relationship with an employer or vendor that uses or develops an AEDT, or have a direct or material financial interest in an employer or vendor that uses or develops an AEDT.

Fourth, the revised proposed rules explain that the AI Law does not create an independent alternative selection process or reasonable accommodation obligation. The law says that employers who use AEDTs must provide notices to certain employees and applicants that “allow” them to “request an alternative selection process or accommodation.” The revised proposed rules make clear that this only requires employers to include instructions on how an individual might request an alternative selection process or accommodation, “if available.” Of course, employers must still comply with their reasonable accommodation obligations under the Americans with Disabilities Act, the New York State Human Rights Law, and the New York City Human Rights Law.

Additionally, the revised proposed rules:

  • Provide that an AEDT cannot be used if more than one year has passed since the most recent bias audit.
  • Revise the calculation methodology used where an AEDT scores candidates.
  • Clarify that the required “impact ratio” must be calculated separately using scoring rates (instead of average scores) to compare sex categories, race/ethnicity categories, and intersectional categories (see the illustrative sketch following this list).
  • Distinguish test data (i.e., data used to conduct a bias audit that is not historical data) from historical data.
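
For illustration, below is a minimal sketch of how an impact ratio might be computed from scoring rates. It assumes that a category’s “scoring rate” is the share of candidates in that category who receive a score above the sample’s median score, and that the impact ratio is each category’s scoring rate divided by the scoring rate of the highest-scoring category; the category labels and scores are hypothetical, and the sketch is not a compliance tool.

  from statistics import median

  # Hypothetical AEDT scores keyed by demographic category (illustrative only).
  scores_by_category = {
      "Category A": [72, 88, 91, 65, 79, 84],
      "Category B": [58, 77, 82, 61, 69, 73],
      "Category C": [90, 55, 68, 74, 81, 77],
  }

  # Pool all scores to find the sample median.
  all_scores = [s for scores in scores_by_category.values() for s in scores]
  sample_median = median(all_scores)

  # Scoring rate: share of candidates in each category scoring above the median.
  scoring_rates = {
      category: sum(s > sample_median for s in scores) / len(scores)
      for category, scores in scores_by_category.items()
  }

  # Impact ratio: each category's scoring rate divided by the highest scoring rate.
  highest_rate = max(scoring_rates.values())
  impact_ratios = {category: rate / highest_rate for category, rate in scoring_rates.items()}

  for category, ratio in impact_ratios.items():
      print(f"{category}: scoring rate {scoring_rates[category]:.2f}, impact ratio {ratio:.2f}")

Consistent with the list above, this calculation would be repeated separately for sex categories, race/ethnicity categories, and intersectional categories.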

IMPLICATIONS FOR EMPLOYERS

Employers should continue to monitor updates from the DCWP as it completes its rule-making process. If the DCWP intends to meet its April 15, 2023 enforcement deadline and provide an adequate compliance period for employers, it will need to release final rules shortly after the January 23, 2023 hearing. Even in the absence of final rules, however, there are steps employers can take to prepare, including the following:

  • Reviewing existing use of AEDTs in hiring and promotion practices to determine whether they are covered by the AI Law.
  • Reviewing data retention policies applicable to data collected using AEDTs.
  • Training supervisors and managers as well as compliance, human resources, and legal professionals on the implications of the new law, including notice requirements and proper responses to employee communications and inquiries on hiring and promotion processes.
  • Communicating with any vendors who operate AEDTs for the business to confirm they are in compliance with the AI Law, including that they are prepared to conduct an independent bias audit as required by the AI Law.
  • Determining how the company will meet the independent audit and notice requirements of the AI Law and any final rules.

PUBLIC HEARING

The DCWP will hold an additional public hearing on the proposed rules on January 23, 2023, at 11:00 am. Prior to the hearing, any individual or group may submit comments on the proposed rules. Written comments can be submitted in advance by email to Rulecomments@dcwp.nyc.gov or online at https://rules.cityofnewyork.us/rule/automated-employment-decision-tools-updated/, which also includes a link to join the public hearing by video conference or phone. Comments can also be raised at the public hearing, but those wishing to speak at the hearing will be allotted up to three minutes of speaking time and must sign up in advance by calling (212) 436-0396.

NAVIGATING THE NEXT.

Sharing insights and resources that help our clients prepare for and address evolving issues is a hallmark of Morgan Lewis. To that end, we maintain a resource center with access to tools and perspectives on timely topics driven by current events such as the global public health crisis, economic uncertainty, and geopolitical dynamics. Find resources on how to cope with the globe’s ever-changing business, social, and political landscape, and stay up to date on developments as they unfold, at Navigating the NEXT. Subscribe now if you would like to receive a digest of new updates to these resources.

Contacts

If you have any questions or would like more information on the issues discussed in this LawFlash, please contact any of the following:

Authors
Sharon Perley Masling (Washington, DC)
Leora Grushka (New York)
Carolyn M. Corcoran (New York)