LawFlash

New Jersey Codifies Existing Disparate-Impact Rules

December 22, 2025

New Jersey’s Division on Civil Rights has adopted regulations, effective December 15, 2025, codifying the state’s longstanding disparate-impact framework under the New Jersey Law Against Discrimination. While the rule largely restates existing law, it also signals where the division may focus its enforcement efforts, particularly artificial intelligence–driven hiring tools, criminal-history and credit screens, and other automated screening practices.

THE REGULATIONS TRACK EXISTING LAW

N.J.A.C. 13:16 largely codifies the New Jersey Supreme Court’s longstanding three-step burden-shifting framework:

  1. The plaintiff must first establish that a facially neutral policy produces a disproportionate adverse effect on a protected group.
  2. The burden then shifts to the employer to demonstrate that the policy is necessary to achieve a substantial, legitimate, nondiscriminatory interest.
  3. If the employer meets that burden, the plaintiff may still prevail by identifying a less discriminatory alternative that serves the employer’s objective.

While Step Two uses “necessary to achieve a substantial, legitimate, nondiscriminatory interest” rather than the more familiar “business necessity” language typically seen in employment cases, New Jersey courts treat these formulations as functionally equivalent in disparate-impact cases.

One notable omission: the original draft regulations would have required plaintiffs at Step Three to identify alternatives that are “equally effective” as the challenged practice. That language was removed before adoption.

While that language would have been helpful to employers defending against disparate-impact claims, it remains to be seen whether its omission will have any practical consequence. At least some existing case law already provides that any allegedly less discriminatory alternative a plaintiff identifies must be as effective as the challenged employment practice. The absence of the phrase “equally effective” thus arguably reflects only that it would have been redundant with existing obligations.

New Jersey adopted these regulations as federal agencies are stepping away from disparate-impact enforcement. While the Equal Employment Opportunity Commission is no longer pursuing disparate-impact theories, New Jersey Attorney General Matt Platkin views the new regulations as a countermeasure: “As the [federal government] continues its . . . attempts to dismantle disparate impact protections at the federal level, it’s more important than ever that states take action to protect the civil rights of their residents—and that’s exactly what we’re doing.”

AI AND AUTOMATED TOOLS LIKELY TO DRAW ENFORCEMENT SCRUTINY

The regulations’ most significant practical effect lies in the New Jersey Division on Civil Rights’ (DCR’s) express attention to artificial intelligence and automated decision-making tools. Building on Attorney General guidance issued in January 2025, the rules explicitly identify automated decision-making tools as potential sources of disparate-impact liability.

The DCR provides an illustrative example: an AI system trained to evaluate job applicants by reference to a company’s existing workforce. Where that workforce is predominantly white and male, the algorithm may systematically favor candidates who resemble incumbent employees, producing, in the DCR’s view, discriminatory outcomes without discriminatory intent.

The regulations also flag criminal history screens, credit checks, physical requirements, and English-only policies as practices warranting heightened scrutiny.

While the DCR’s enumeration does not alter the governing legal standard, it identifies where the DCR may allocate enforcement resources going forward. To that end, employers should audit their screening tools and selection criteria and be prepared to explain, for example, why they use the screening methods they do, what alternatives they considered, and how they evaluated any AI or algorithmic tools for bias before using them.

Contacts

If you have any questions or would like more information on the issues discussed in this LawFlash, please contact any of the following:

Authors
August W. Heckman III (Princeton)
Richard G. Rosenblatt (Princeton / Philadelphia)
Christian A. Zazzali (Princeton)