What HR needs to know about NYC's Automated Employment Decision Tools regulations

HR leaders - are you ready? Starting Wednesday, July 5, enforcement begins for Local Law 144 and the Department of Consumer and Worker Protection (DCWP) rules. These regulations specifically address the use of Automated Employment Decision Tools (AEDT) found in software used during the job application or promotion process for candidates and employees in New York City.

Understanding the new AEDT regulation

Now, let's break it down in simpler terms. The law and regulations cover any AEDT - essentially, any computational process that uses machine learning, statistical modeling, data analytics, or artificial intelligence to produce a simplified output such as a score, classification, or recommendation. These tools are meant to substantially assist or even replace discretionary decision-making by humans in the employment process.
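To make "simplified output" concrete, here is a toy sketch of the kind of tool that falls in scope: a model that turns candidate attributes into a single screening score. The data and column names are made up for illustration, and this is not any particular vendor's product.

```python
# Illustrative only: a toy "AEDT-style" scorer with made-up features and data.
# The point is the output shape: a single score per candidate that could be
# used to rank or screen people, which is what brings a tool into scope.
import pandas as pd
from sklearn.linear_model import LogisticRegression

candidates = pd.DataFrame({
    "years_experience": [1, 4, 7, 2, 10],
    "skills_match_pct": [55, 80, 90, 60, 95],
    "hired_previously": [0, 1, 1, 0, 1],  # training label for this toy example
})

features = ["years_experience", "skills_match_pct"]
model = LogisticRegression().fit(candidates[features], candidates["hired_previously"])

# "Simplified output": a score, classification, or recommendation per candidate.
candidates["screen_score"] = model.predict_proba(candidates[features])[:, 1]
print(candidates[features + ["screen_score"]])
```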

To comply with these new requirements, employers need to take a few steps. First, determine whether any of the software your company uses for hiring or promotions qualifies as an AEDT - in other words, whether it supports decision-making by scoring, classifying, or recommending candidates or employees in NYC. This law is broader than similar laws in Illinois and Maryland, which focus narrowly on AI-analyzed video interviews and facial-recognition technology, so it covers commonly used HR software.

If you do use software with AEDT functionality, here's what you need to do: (1) make sure a bias audit has been conducted; (2) give applicants or employees at least 10 business days' notice that an AEDT will be used; (3) explain the job qualifications and characteristics the AEDT will consider in its assessment; (4) disclose the AEDT's data source and type, along with your data retention policy, if that information hasn't been published elsewhere; and (5) inform applicants or employees that they can request an alternative means of assessment or a "reasonable accommodation" under other laws.
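On the notice timing in step (2), here is a minimal sketch of a sanity check, assuming "10 business days" means ten weekdays between the date notice is given and the date of the assessment; confirm the exact counting rule, including holiday treatment, with counsel.

```python
# A minimal sketch of a notice-window check, assuming "10 business days" means
# ten weekdays between the notice date and the assessment date. Holiday handling
# and the exact counting rule should be confirmed with counsel.
import numpy as np

def notice_window_ok(notice_date: str, assessment_date: str, min_business_days: int = 10) -> bool:
    """True if at least `min_business_days` weekdays elapse before the assessment."""
    return int(np.busday_count(notice_date, assessment_date)) >= min_business_days

print(notice_window_ok("2023-07-05", "2023-07-19"))  # True: 10 weekdays elapse
print(notice_window_ok("2023-07-05", "2023-07-12"))  # False: only 5 weekdays
```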

But wait, there's more! The initial bias audit is just the beginning: employers must also repeat the bias audit annually. And here's the kicker: a summary of the bias audit results must be published on your website before the AEDT is used. These audits have to be conducted by independent auditors - parties with no financial interest in, or ties to, the employer or the software vendor. Read more here.
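For a sense of what a bias audit actually computes, here is a minimal sketch of the selection-rate and impact-ratio math for a simple pass/fail screening tool. The categories and outcomes are hypothetical, and a real audit must follow the DCWP rules and be performed by an independent auditor.

```python
# A minimal sketch of the selection-rate and impact-ratio math behind a bias
# audit of a pass/fail screening tool. Categories and outcomes are hypothetical;
# a real audit must follow the DCWP rules and be done by an independent auditor.
import pandas as pd

results = pd.DataFrame({
    "category": ["A", "A", "A", "B", "B", "B", "B", "B"],  # e.g., sex or race/ethnicity categories
    "selected": [1, 0, 1, 1, 0, 0, 1, 0],                  # 1 = advanced by the AEDT
})

# Selection rate: the share of candidates in each category that the tool selects.
selection_rates = results.groupby("category")["selected"].mean()

# Impact ratio: each category's selection rate divided by the highest selection rate.
impact_ratios = selection_rates / selection_rates.max()

print(pd.DataFrame({"selection_rate": selection_rates, "impact_ratio": impact_ratios}))
```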

Now, let's talk penalties. Violating any of these requirements can trigger civil penalties: a first violation can result in a $375 fine, while subsequent violations run at least $500 and up to $1,500 each. Violations of the notice requirements and the bias-audit requirements are penalized separately.
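As a rough illustration of how exposure can add up, the sketch below uses the figures above and assumes each additional day of noncompliant use is treated as a separate violation; it is a planning estimate, not legal advice.

```python
# A rough, worst-case illustration using the figures above ($375 first violation,
# up to $1,500 per subsequent violation), and assuming each additional day of
# noncompliant use counts as a separate violation. Not legal advice.
def estimated_exposure(days_in_violation: int,
                       first_penalty: int = 375,
                       max_subsequent_penalty: int = 1_500) -> int:
    """Worst-case estimate: one first violation, then the maximum for each later day."""
    if days_in_violation <= 0:
        return 0
    return first_penalty + (days_in_violation - 1) * max_subsequent_penalty

print(estimated_exposure(30))  # 375 + 29 * 1500 = 43,875
```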

Keep in mind that this new regulation is consistent with the EEOC's May 2023 guidance on assessing adverse impact when software, algorithms, and AI are used in employment selection procedures. It's all part of the effort to ensure fairness in employment decisions. And it's not just happening in NYC - other jurisdictions are considering similar bills and regulations on AI in employment decisions.


Want to learn more about these regulations and others around the world impacting HR?

Take our Regulations and Standards masterclass and be a leader in the space. 

Join Regulation Masterclass


Understanding your options

I think a lot of companies not using One Model are going to face some big challenges. In our experience, most AI tools are black boxes: you can't get a clear view of what data and features feed the models that generate the insights you rely on.

One Model is the complete opposite. Our models are built only from your data, their outputs show exactly which data is fed into each model, and they are fully customizable.

[Screenshot: exploratory data analysis and data-processing output from One AI]

In addition, One Model customers benefit from being able to build models beyond talent acquisition, including retention, diversity, and more.


Would you like to see our AI in action?

Meet with us today!


Understanding the bigger global picture

On a global scale, the EU's Artificial Intelligence Act is also making progress: the European Parliament adopted its official negotiating position on June 14, 2023. The Act covers the use of AI in many areas, including employment. If it goes into effect, AI used for employment purposes would likely fall into the "high-risk" category and face greater regulation.

So, as HR leaders, it's important to stay informed about these evolving regulations. Make sure to review your software tools, conduct the necessary audits, and provide the required notices to applicants and employees. And keep an eye out for any developments in your local jurisdiction or at the EU level. Good luck navigating these changes!

Written By

Dennis is a husband, dad of three, and tinkerer of many things electronic and mechanical. He currently leads One Model's marketing team and is enthusiastic about the potential for data analytics to bring fulfillment and success to organizations and their great people.

Ready to learn more?

Request a tailored demo to see how One Model could help you.