Software is your secret weapon against hiring bias

In a country where African-American candidates still have a harder time getting hired than their white peers, preventing bias is top of mind for most talent organizations. Humans are built to be biased, so there are really only two solutions to reducing hiring bias that I can think of:

  1. Training humans to be less biased, or
  2. Removing humans from the process as much as possible.

I’m biased. I like the second option best. Below are the ways that software can combat common biases in your recruiting and interviewing process:

Stereotyping

Stereotyping occurs when you assume something about someone because they belong to a particular group (for example, assuming all comedians are good at making up jokes on the spot). In hiring, this bias most often shows up as racism, sexism, or judgments about educational institutions. Assuming that all Harvard grads are high performers is a bias. Assuming all non-Harvard grads are worse performers than Harvard grads is also a bias. Software can address this by qualifying candidates against the exact same objective criteria while minimizing or outright masking the attributes that trigger stereotyping. (However, software can’t stop an employee from being racist. That’s a problem for other human beings to solve.)
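
To make the masking idea concrete, here is a minimal Python sketch. It assumes candidate records are plain dictionaries; the field names and values are hypothetical, not taken from any particular ATS.

    # Minimal sketch of attribute masking, assuming candidate records are
    # plain dictionaries. Field names here are hypothetical.
    MASKED_FIELDS = {"name", "photo_url", "school", "graduation_year", "address"}

    def mask_candidate(record: dict) -> dict:
        """Return a copy of the record with stereotype-triggering fields removed."""
        return {key: value for key, value in record.items() if key not in MASKED_FIELDS}

    candidate = {
        "name": "Jordan Smith",
        "school": "Harvard",
        "years_experience": 4,
        "screening_score": 87,
    }
    print(mask_candidate(candidate))  # {'years_experience': 4, 'screening_score': 87}

Reviewers who only ever see the masked record have nothing left to stereotype on, which is the whole point.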

First-impression bias

A good first impression is one of the most important things a candidate can give… except when it isn’t. First-impression bias can overwhelm a discussion about candidate fit even when the candidate did everything else right. Screening software can make an objective score on an objective test the first impression every candidate gives, eliminating the focus on anything but performance.

Similar-to-me effect

This type of bias is something private equity firms and sales floors across the nation have shown in spades. Giving preference to a person who is similar to you in ways that have little to do with actual job performance is a sure sign this bias is at work. The problem comes when you give more weight to the factors that predict friendship than to the factors that align with job success. Software is an easy fix here because it can evaluate all candidates only on objective factors (unless you program it to make friends, that is).

Question inconsistency

Question inconsistency isn’t just a bias but also poor hiring practice. When you ask different questions of different candidates, you hurt the candidates by giving them questions that may not be fair or relevant, and you hurt yourself by being unable to compare candidates (or your own performance) objectively. Administering a phone screen or interview with software that locks in the questions is the best way to go.
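
Here is a rough illustration of what locking in the questions can look like, assuming a simple screening script; the question text is invented for the example.

    # Every candidate gets the same questions, in the same order,
    # so screens stay directly comparable. Question text is made up.
    SCREEN_QUESTIONS = (
        "Walk me through a recent project you shipped.",
        "Describe a time you disagreed with a teammate and what you did.",
        "What does success in this role look like after 90 days?",
    )

    def build_screen(candidate_id: str) -> dict:
        """Build a screen for one candidate from the fixed question set."""
        return {"candidate_id": candidate_id, "questions": list(SCREEN_QUESTIONS)}

Because the question set is immutable and shared, an interviewer can’t quietly swap in a pet question for one candidate and not another.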

Cultural noise

Ever feel like someone is telling you what you want to hear? If you don’t pick up on it, that might be cultural noise bias in action. This one can sometimes backfire with software, too, because candidates have been known to keyword-stuff their resumes to get through an ATS screening program (kind of like telling the ATS what it wants to hear). If you use software to test candidates on their performance instead of asking them to answer questions, you avoid sweet talk and embellishment.

Recency

Recency bias is favoring the people you saw most recently because you remember more details about them. Software is great at saving data, and it lets you line up everyone you’ve met since day one and compare them in full detail. Think of it as advanced note-taking (where no one loses their notes).

Contrast effect

Contrast effect occurs when you interview someone who is much better or much worse than the adjacent candidates. The comparison emphasizes the difference and can unfairly improve or hurt your judgment of all of them. Software can be set up to display the company’s benchmarks during evaluation, giving every candidate equal footing. Perhaps one candidate is much better than the others, but a benchmark will remind you that the other candidates aren’t suddenly un-hireable.
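
A tiny sketch of benchmark-anchored scoring, with an invented benchmark value and made-up scores:

    # Compare each candidate to a fixed benchmark, not to whoever happened
    # to interview just before them. Numbers are illustrative only.
    BENCHMARK = 70  # hypothetical minimum passing score for the role

    def evaluate(scores: dict) -> dict:
        """Label each candidate against the benchmark rather than each other."""
        return {
            name: ("hireable" if score >= BENCHMARK else "below benchmark")
            for name, score in scores.items()
        }

    print(evaluate({"candidate_a": 95, "candidate_b": 74, "candidate_c": 72}))
    # {'candidate_a': 'hireable', 'candidate_b': 'hireable', 'candidate_c': 'hireable'}

A standout 95 doesn’t change the fact that a 74 and a 72 still clear the bar.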

Unfortunately, I left one off this list because I think the circumstances of meeting your candidate lock in the bias:

Halo/horn effect

If you let one attribute about someone outweigh every other attribute while you’re considering moving them to the next round or making them an offer, you may be falling prey to the halo or horn effect. (Halo meaning angel halo and horn meaning devil horns.) Even if you use software every step of the way, there is one common situation that helps this bias sneak through.

That situation is taking referrals. If you feel that the person who referred the candidate is upstanding and looks out for your well-being, then you already feel differently about the candidate. You are not coming into the interview as a neutral party. Referrals themselves are not bad (in fact, they’re great for company growth), but how you feel about a particular referral based on factors unrelated to the candidate’s job performance is the problem. On the flip side, if someone you trust has a strong negative opinion about the candidate, that can often be enough to reject them without further evaluation. Again, not always a bad thing, but also not objective.