Resume-screening programs could raise discrimination concerns
If you’re a desirable employer in a competitive job market, you probably get dozens if not hundreds of applications when you post an opening on a job board like monster.com or indeed.com.
Thankfully, “artificial intelligence” (AI) programs exist to make sifting through these resumes much easier. Many use algorithms to identify the resumes that best match the position based on training, education or experience. Some can mine job candidates’ social media activity to learn about their political beliefs and social connections, and some go a step further, generating information about candidates’ spending habits or voter registration. But that’s not all. We’re moving toward a place where human interviewers can be supplemented or even replaced by AI tools: chatbots that ask screening questions, and video-analysis software that evaluates candidates’ facial expressions, speech patterns and word choices.
If you think this sounds great, you should also be aware that these programs could set you up for discrimination claims by rejected candidates. First, the programmers who design these applications, and the project managers in your own organization who work with them to customize the software for your needs, may carry cultural biases into the software in ways that disproportionately favor candidates from certain racial or ethnic groups. Second, even without any intent to discriminate, the algorithms’ screening criteria can have a disparate impact on minority groups, for example, by automatically eliminating applicants with GEDs rather than traditional high school diplomas.
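To see how an innocuous-looking screening rule can produce measurable disparate impact, here is a minimal sketch, using made-up applicant numbers, of the EEOC’s “four-fifths” rule of thumb: if one group’s selection rate falls below 80% of the most-selected group’s rate, that is commonly treated as evidence of adverse impact.

```python
# Hypothetical figures for illustration only: an automated screen that
# rejects applicants without a traditional high school diploma.

def selection_rate(selected, applicants):
    """Fraction of a group's applicants who pass the screen."""
    return selected / applicants

# Assumed applicant pools (invented numbers, not real data).
rate_group_a = selection_rate(48, 60)   # 48 of 60 pass -> 0.80
rate_group_b = selection_rate(12, 25)   # 12 of 25 pass -> 0.48

# Four-fifths rule: compare the lower rate to the higher one.
impact_ratio = rate_group_b / max(rate_group_a, rate_group_b)
print(f"Impact ratio: {impact_ratio:.2f}")  # 0.60 -- below the 0.80 threshold
```

An impact ratio of 0.60 would flag the diploma requirement for closer scrutiny even though the rule itself never mentions race or ethnicity, which is precisely the unintentional-discrimination scenario described above.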
This isn’t to say you shouldn’t take advantage of cutting-edge recruiting software. Just be aware of the issue, raise it with your software vendor, and discuss it with an employment attorney.