Artificial intelligence in the workplace: A “New Civil Rights Frontier”

Image created through artificial intelligence at creator.nightcafe.studio

By Rebecca Capece and Julia Stewart

Special to North Carolina Construction News

When it comes to hiring qualified employees, a growing number of employers have started to rely on artificial intelligence (AI) to simplify the hiring process. At the same time, lawmakers across the country are scrutinizing the potential discriminatory impact of using AI in the workplace. As a result, there has been a significant increase in regulatory oversight and legislation at both the federal and state levels.

The concerns stem from the growing popularity of AI- and machine-learning-powered sourcing and recruiting platforms, as well as the use of algorithms in screening and interview software to analyze and rank job applicants. In fact, the Chair of the Equal Employment Opportunity Commission (EEOC), Charlotte Burrows, estimated in May 2022 that more than 80% of employers are using AI in some form in their work and employment decision-making.

Legislative and Regulatory Oversight

As a result of its concerns over the growing use of AI in employment decision-making, the EEOC has signaled that it will continue to focus on the use of AI in the workplace, calling it a “new civil rights frontier.” In the fall of 2021, the EEOC announced an initiative to ensure that the use of AI complies with federal civil rights laws. As part of this initiative, the EEOC stated that it planned to identify best practices and issue technical assistance to provide guidance on algorithmic fairness and the use of AI in employment decisions.

In May 2022, the EEOC issued guidance for employers on complying with the Americans with Disabilities Act while using AI.

On January 10, 2023, the EEOC released its 2023-2027 Draft Strategic Enforcement Plan (“SEP”) in the Federal Register, noting that one of its priorities would be eliminating barriers in recruitment and hiring, including by focusing on “the use of automatic systems, including artificial intelligence or machine learning, to target advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups,” as well as the use of “screening tools or requirements that disproportionately impact workers based on their protected status, including those facilitated by artificial intelligence or other automated systems, pre-employment tests, and background checks.”

On March 30, 2023, EEOC Chair Burrows announced at an American Bar Association conference that additional guidance from the agency on the use of AI in the workplace is forthcoming.

In addition, some states, including New York, California, Maryland, and Washington, have either enacted or are considering enacting legislation to address the use of AI in the recruitment process. In particular, New York City’s legislation, set to take effect in July 2023, prohibits employers from using AI employment selection tools unless the organization conducts a specific bias audit and makes the resulting data publicly available (a simplified sketch of the kind of calculation such an audit contemplates appears below). Employers would also be required to disclose their use of AI to job candidates who live in New York City.
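To make the audit requirement more concrete, the following is a minimal sketch, in Python, of the kind of selection-rate and impact-ratio calculation a bias audit of this sort contemplates. The group labels and counts are hypothetical, and the 0.8 threshold reflects the EEOC’s long-standing “four-fifths” rule of thumb rather than any specific requirement of the New York City law.

    # Hypothetical illustration only: group labels and counts are made up.
    # Selection rate = share of each group's applicants that the tool advances.
    # Impact ratio = a group's selection rate divided by the highest group's rate.

    outcomes = {
        "Group A": {"advanced": 120, "applicants": 300},
        "Group B": {"advanced": 45, "applicants": 150},
        "Group C": {"advanced": 20, "applicants": 100},
    }

    rates = {g: v["advanced"] / v["applicants"] for g, v in outcomes.items()}
    highest_rate = max(rates.values())

    for group, rate in rates.items():
        ratio = rate / highest_rate
        # The EEOC's "four-fifths" guideline treats ratios below 0.8 as a potential red flag.
        status = "review for adverse impact" if ratio < 0.8 else "no flag"
        print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} -> {status}")

An actual audit under the law would be performed on real applicant data by an independent auditor; this sketch is intended only to show how seemingly neutral screening tools can produce very different selection rates across groups.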

Common Uses of AI in Employment Decision-Making

AI can assist employers in performing hiring tasks such as analyzing resumes, and it can even perform facial analysis in interviews to evaluate a candidate’s stability, optimism, or attention span. While this can help streamline processes for employers, it can also create issues by enabling (even unintentionally) systemic discrimination and replicating human biases.

Although proponents of AI have claimed that it will eliminate human bias from the recruitment process, this is not always the case. For example, AI software may use algorithms to analyze a candidate’s facial movements, words, and speech patterns, and then evaluate the candidate by comparing those behaviors to the company’s past successful hires. This may in turn inadvertently eliminate candidates with disabilities from the hiring process.

Further, if an employer uses a third-party vendor to provide AI services during the hiring process, it may be difficult for the employer to maintain control over the process and to verify that the vendor’s programs, processes, or algorithms are not producing unintentional discrimination. This is especially true if the vendor’s programs or algorithms are treated as trade secrets or are otherwise confidential, as they may then be shielded from disclosure to employers.

What Are the Takeaways?

Employers need to be aware of the implications of using AI in hiring and should not assume that, because AI technology is handling tasks such as applicant screening, they no longer have to worry about preventing discrimination in the hiring process. Rather, employers need to understand how these AI tools work and take steps to ensure that their use does not disparately impact applicants in protected groups.

In addition, if employers use a third-party vendor to provide AI technology, they need to discuss these issues with the vendor and insist on transparency about how the vendor’s tools address and eliminate bias. EEOC Chair Burrows has noted that employers need to exercise due diligence and ask vendors “what’s under the hood” of their algorithms before using them to vet candidates.

For example, she has indicated that employers need to ask vendors whether any algorithm or other AI screening tool allows for reasonable accommodations in the hiring process, which the Americans with Disabilities Act requires for applicants and employees with disabilities. According to Burrows, “if the vendor hasn’t thought about that, isn’t ready to engage in that, that should be a warning signal.”

In summary, employers need to carefully weigh the use of AI in their screening and hiring processes.  Please contact the labor and employment attorneys at Maynard Nexsen if you have further questions about these issues.

Rebecca Capece and Julia Stewart are Associates at Maynard Nexsen.
