As technology has advanced in recent years, so have hiring tools. Among these tools are "algorithms": formulas developed by data analysts and computer programmers that help employers winnow hundreds or even thousands of online job applications down to a smaller pool that meets certain stated job qualifications, such as educational requirements or particular skills necessary for the position.
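To make the idea concrete, here is a minimal sketch of that kind of qualification screen. The applicant records, skill names, and degree requirement are all hypothetical; real screening systems are far more complex than this.

```python
# A toy screening filter: keep only applicants who meet the stated
# qualifications. All data and requirements here are hypothetical.

REQUIRED_SKILLS = {"python", "sql"}   # assumed stated skill requirements
MIN_DEGREE = "bachelor"               # assumed educational requirement

DEGREE_RANK = {"highschool": 0, "associate": 1, "bachelor": 2, "master": 3}

def meets_requirements(applicant):
    """Return True if the applicant clears every stated qualification."""
    has_skills = REQUIRED_SKILLS.issubset(applicant["skills"])
    has_degree = DEGREE_RANK[applicant["degree"]] >= DEGREE_RANK[MIN_DEGREE]
    return has_skills and has_degree

applicants = [
    {"name": "A", "skills": {"python", "sql", "excel"}, "degree": "bachelor"},
    {"name": "B", "skills": {"python"}, "degree": "master"},
    {"name": "C", "skills": {"python", "sql"}, "degree": "associate"},
]

shortlist = [a["name"] for a in applicants if meets_requirements(a)]
print(shortlist)  # only applicant A clears both hurdles
```

The filter itself is mechanical; the legal risk discussed below comes from which requirements an employer chooses and what those requirements correlate with.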
These algorithms also enable employers to subject applicants to personality tests, gather online information about potential candidates, and even reach out to people who might be a good fit but haven't actually applied.
However, these tools can also pose a danger. Employers may tune their algorithms to favor candidates who resemble their idea of a "top performer," but this can end up weeding out women, racial minorities, people with disabilities, or other groups protected by antidiscrimination laws.
Use of such tools could open an employer up to a lawsuit by members of these groups who claim they've suffered a "disparate impact." Even if the employer can show the practice is "job related," would-be employees may still have a case under federal law if they can prove another tool would have been just as effective without discriminating.
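One common starting point for spotting a potential disparate impact is the EEOC's "four-fifths" rule of thumb from the Uniform Guidelines on Employee Selection Procedures: if one group's selection rate is less than 80% of the highest group's rate, that is generally treated as evidence of adverse impact. The sketch below applies that rule to hypothetical pass counts; it is an illustration only, not a substitute for an actual adverse-impact analysis or legal advice.

```python
# A minimal sketch of the EEOC "four-fifths" rule of thumb.
# All pass counts below are hypothetical.

def selection_rate(selected, applied):
    """Fraction of applicants in a group who pass the screen."""
    return selected / applied

def impact_ratio(rate_group, rate_highest):
    """Ratio of a group's selection rate to the highest group's rate."""
    return rate_group / rate_highest

# Hypothetical screening outcomes for two applicant groups
rate_men = selection_rate(60, 100)    # 60% of male applicants pass
rate_women = selection_rate(30, 100)  # 30% of female applicants pass

ratio = impact_ratio(rate_women, rate_men)
print(round(ratio, 2))  # 0.5 -- below the 0.8 threshold, a red flag
```

A ratio this far below 0.8 would not by itself decide a lawsuit, but it is the kind of disparity that invites exactly the claims described above.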
One complicating factor is that this technology can be opaque: employers may not even realize their algorithms are having a discriminatory effect, while would-be employees may struggle to show they've been discriminated against. These issues are real, though, so if you're thinking of deploying "big data" tools to aid your hiring efforts, talk to an attorney first to make sure you're not walking into dangerous territory.