Not All Bias is Bad Bias

Technology has helped business leaders root out bad, unconscious bias in their decision-making. But let’s not dismiss all bias and human judgment as something we’re desperate to get rid of.

There’s a growing idea that bias is public enemy No. 1 when it comes to evaluating and hiring talent.

I disagree. You might say I have a bias against the idea that bias is bad — at least in some respects.

In today’s technology-filled world, where more and more of our everyday decision-making is driven as much by algorithms as by raw human thought, if not more so, we’ve come to view human thinking as everything that is wrong with the status quo and technology as the primary solution to fix it. I’ve written about this before. There’s no doubt that human decision-making and logic are often flawed, and there’s also little doubt that, in many ways, technology is the best way to clean up the mess.

We’ve seen this on display time and time again, as humans have become better at using streams of data to make more effective and objective decisions about everything from the best way to exercise to where to eat dinner to whom to hire for that executive role you’re looking to fill.

Within this conversation is the idea that human thought often leads us astray because each of us carries a bias we frequently don’t even realize exists, which is why it’s referred to as unconscious bias. This bias, formally defined as a prejudice in favor of or against a thing, person or group compared with another, usually in an unfair way, affects many of the decisions we make.

Sometimes our bias is fully known and agenda-driven; that is, we know we have a preference and we’re acting on it for a specific reason. Unconscious bias, as I mentioned above, is the dirty culprit we turn to technology to get rid of. It’s the type of bias we aren’t aware of, and it leads us to make decisions that can produce undesirable outcomes.

As has been widely reported, unconscious bias often creeps into evaluating and hiring talent. We tend to want to hire people we like or who share our backgrounds and experiences. This bias is difficult to control precisely because it’s unconscious, which is why many companies have turned to technology and other methods to root it out of hiring, an effort meant to produce a fairer, more inclusive and diverse workforce and, in turn, better business results.

As a result, entry-level job candidates at consumer products company Unilever, for instance, don’t come into direct contact with human judgment on their candidacy until the final step of the interview process. The rest of the process consists of a series of computer-based assessments and tests, as well as a recorded video that applicants submit for evaluators to review.

Of course, that’s after candidates’ initial applications — consisting of a digital submission of their LinkedIn profiles — are examined by algorithms to weed out more than half of the initial talent pool. The company says this helps broaden its candidate pool and eliminate human bias. It has also made hiring faster and more accurate.

Still, I wonder if efforts like this go too far in eliminating the human element. Yes, when you’re a company fielding more than 200,000 applications for a given job, it’s not unreasonable to use technology to slim down the initial pool. But not letting human judgment enter the equation until the final round of interviews? That seems excessive.

The truth is, at least in my view, not all bias is bad bias. And when evaluating talent, I dare suggest we need human bias. We need it early and we need it often.

To be sure, I’m not suggesting we need more of the sort of unconscious bias that gets in the way of diverse and effective decision-making. We need less of that, and technology and other methods are still needed to help weed it out.

What I am suggesting is that, in order to truly capture the total picture of a candidate, we need to pair technological judgment with plenty of human judgment. We can take the objective measures technology spits out and weigh them against a collective of human biases to ultimately make the decision. Unilever does this by including human judgment in the final rounds of its hiring process, but many companies could benefit from including it earlier. Unilever is likely weeding out plenty of talent that is perfectly qualified but may not have passed technology’s smell test. That’s an opportunity pool smaller firms can take advantage of.

The bottom line is that bias is OK in my book so long as we’re aware of it. Furthermore, bias is OK in hiring if it’s pooled with the unique biases of a group of other people. If I’m hiring someone who will report to me and me only, it makes sense to let my bias creep into the equation, so long as I can admit to it and weigh it in my decision to hire or not hire a given candidate. After all, I’m the one who primarily has to work with this person.

For more robust, companywide hiring efforts, individual biases should be explained and put on the table through a hiring committee of multiple people with multiple perspectives and biases. Think of all the intangible assets a candidate might offer that technology could overlook, not to mention that many soft leadership skills are still hard to judge through an algorithm, despite what many leaders of assessment providers will tell you.

Technology has done wonders for business, and rooting out unconscious bias — especially in hiring — is one of those wonders. Nevertheless, let’s not allow all bias to turn into a dirty, four-letter word. Let’s embrace bias, admit to it, share it with others and use it along with technology to lead our businesses into the future.

Frank Kalman is Talent Economy’s managing editor. To comment, email editor@talenteconomy.io.