The New York City AI Hiring Law: What data scientists and compliance officers can do now
Data and Infrastructure Preparedness Steps
The New York City AI hiring law, Local Law 144 (Int. No. 1894-A), made headlines late last year as a brave new step on the path of AI regulation in the U.S. The law, which takes effect on January 1, 2023, regulates the use of “automated employment decision tools” in workforce decisions within the city, requiring independent bias audits, advance notices, data retention, and alternative processing options for applicants or employees who are subject to such tools. Six months later, the details on the audits, passing criteria, notices, and types of alternative accommodation are still lacking. While I am hopeful that the New York City Commission on Human Rights and the Committee on Technology will provide clarification on employer obligations and solicit public comments, let’s look at some practical steps that companies can take now to be prepared for 2023.
AI hiring tools taxonomy. Create a taxonomy of “automated employment decision tools”, internal or external, that are used in your organization. The law refers to “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons”. A report by Littler offers some helpful examples:
“This likely covers any computerized tool or algorithm-based software program used to identify, select, evaluate, or recruit candidates for any employment position. It may thus include any data-driven tools used to review résumés, conduct skills testing, rank applicants, “chat” with applicants, conduct behavioral analysis of prospective hires, assess employee performance and productivity, monitor field-based or remote employees, or determine compensation and promotions.”
Did your data science team write a program to extract and summarize results from take-home tasks? Does your engineering department use a skill assessment tool for intern selection? Both fall under the automated employment decision tool definition.
For the purpose of the user notice, the taxonomy should include the job characteristics and qualifications that each tool assesses, the type and source of data it uses, and the associated data retention policies. It should also include an alternative accommodation procedure for each tool, as we discuss below.
For the purpose of the bias audit, each tool in the taxonomy should also record a ‘distribution date’: the date of the tool version that the bias audit covered. Extra tip: you might want to use this opportunity to create a comprehensive taxonomy of AI technologies and their applications across the organization.
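To make this concrete, here is a minimal sketch of one taxonomy entry as a Python dataclass. All field names and example values are hypothetical; your organization’s schema will likely need more detail:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AEDTEntry:
    """One entry in a hypothetical automated employment decision tool taxonomy."""
    tool_name: str
    vendor: str                         # "internal" or the external vendor's name
    assessed_qualifications: list[str]  # job characteristics the tool evaluates
    data_sources: list[str]             # where the tool's input data comes from
    retention_policy: str               # reference to the applicable retention rule
    alternative_procedure: str          # the accommodation offered on opt-out
    distribution_date: date             # tool version covered by the bias audit

resume_screener = AEDTEntry(
    tool_name="resume_screener",
    vendor="internal",
    assessed_qualifications=["years of experience", "required skills"],
    data_sources=["applicant-submitted resume"],
    retention_policy="hiring-data-retention-v1",
    alternative_procedure="manual resume review by a recruiter",
    distribution_date=date(2022, 6, 1),
)
```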
Data collection. Design and implement data collection processes for the independent bias audit. The bias audit must assess disparate impact on the tool’s users based on their race, ethnicity, or sex, and these variables, along with the tool’s output, should be available for the audit (make sure to use definitions and categories that comply with federal law). While many questions about the scope, level of detail, and passing criteria of the audit remain open, it is obvious that the audit will not be technically possible without the required data.
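The passing criteria have not been published, but a plausible starting point is the selection-rate comparison used in federal disparate impact analysis (the EEOC’s four-fifths rule). Below is a minimal sketch, assuming a per-candidate audit extract with a demographic column and a binary selection outcome; the column names and data are hypothetical:

```python
import pandas as pd

def impact_ratios(extract: pd.DataFrame, group_col: str, selected_col: str) -> pd.DataFrame:
    """Selection rate per group, and each group's ratio to the highest rate."""
    rates = extract.groupby(group_col)[selected_col].mean().rename("selection_rate")
    result = rates.to_frame()
    result["impact_ratio"] = result["selection_rate"] / result["selection_rate"].max()
    return result

# Hypothetical audit extract: one row per candidate the tool scored.
extract = pd.DataFrame({
    "sex": ["F", "F", "F", "M", "M", "M", "M", "M"],
    "selected": [1, 0, 1, 1, 1, 0, 1, 1],
})
print(impact_ratios(extract, "sex", "selected"))
# Under the four-fifths rule, an impact_ratio below 0.8 would warrant scrutiny.
```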
Data retention. Design, operationalize and test data retention and disclosure protocols for the tools and models that are subject to the law.
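As a simple illustration, a retention check could look like the following; the tool names and the three-year window are placeholders, since the law does not specify durations:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-tool retention windows; the durations are assumptions.
RETENTION_WINDOWS = {
    "resume_screener": timedelta(days=3 * 365),
    "skills_assessment": timedelta(days=3 * 365),
}

def is_past_retention(tool_name: str, collected_at: datetime) -> bool:
    """True if a record collected for this tool has outlived its retention window."""
    return datetime.now(timezone.utc) - collected_at > RETENTION_WINDOWS[tool_name]
```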
Ten-day notice and hiring cycle. The law requires notice at least ten business days before an automated employment decision tool is used, which will likely lengthen hiring and promotion cycles. Communicating these requirements will set the right expectations for hiring managers and their teams, and making sure the notice protocols are part of the UX flow is instrumental.
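If the notice period is counted in business days, the earliest permissible use date can be computed mechanically. The sketch below skips weekends only and ignores public holidays, which is an assumption to revisit:

```python
import numpy as np

def earliest_tool_use(notice_date: str, business_days: int = 10) -> np.datetime64:
    """Earliest date the tool may be used after notice, skipping weekends."""
    return np.busday_offset(np.datetime64(notice_date, "D"), business_days, roll="forward")

print(earliest_tool_use("2023-01-02"))  # 2023-01-16
```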
Alternative selection process design. According to the law, an applicant or employee who is presented with a notice of automated tool use should be offered an alternative selection procedure. This might mean a human review of an application or a subject matter expert grading an assessment. Designing an alternative accommodation process with clear steps, designated resources, and a defined time scope will help you stay compliant with the law and on target with your hiring goals. It is equally important to ensure that an ‘opt-out’ selection is propagated through the UX data flow and reflected in the retained data.
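For instance, routing logic might carry the opt-out flag explicitly so that an opted-out application can never reach the automated scorer; the record shape and queue names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Application:
    candidate_id: str
    opted_out: bool = False  # set when the candidate requests the alternative process

def route(application: Application) -> str:
    """Send opted-out applications to human review; everything else to the tool."""
    return "human_review_queue" if application.opted_out else "aedt_scoring_queue"
```

Persisting the opt-out flag on the application record also makes the choice visible in the retained data, per the data retention step above.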
While there is a lot yet to be clarified about the New York City AI hiring law, this is an excellent opportunity for organizations to start asking questions about the impact and transparency of the AI tools they use.