Robots are Color Blind: How Big Data is Removing Biases from the Hiring Process

The recent trend toward using online assessments – coupled with big data technologies – to select job applicants has picked up steam. A variety of news outlets have written about how large employers are increasingly turning to computer algorithms to determine who is and is not a good fit for a job[1]. Although the results consistently suggest that these “robot recruiters” are effective at picking employees who stay on the job longer and perform better, there is still some natural skepticism among outsiders as to whether a computer can replace human judgment when it comes to evaluating talent.

What’s important to recognize first and foremost is that the current system isn’t perfect and recruiters aren’t unbiased. In fact, a long line of research empirically documents a “like me” bias that leads recruiters to hire applicants who resemble themselves[2]. This may benefit job applicants who happen to have gone to the same school as the interviewer, but it tends to hurt anyone who didn’t. The inevitable outcome of this bias is that the person selected for a job is often not the most talented or skillful applicant but the one who seems like the best fit with the recruiter. What’s more, hiring like-minded individuals tends to reduce diversity in the workplace.

Contrast this with the algorithms that have been built to select the best applicants and validated against data from actual job outcomes. These algorithms are designed to make assessment decisions based on the factors that actually matter – those that have been statistically correlated with on-the-job outcomes. They’re also engineered to ensure that they have no “adverse impact” on protected groups (e.g., by gender, race, or age). In essence, they have been trained to select the most talented applicants and to ignore whether an applicant went to Harvard and plays squash. The data supports this claim: a forthcoming White Paper by researchers at the University of Toronto, Yale, and Northwestern analyzes hundreds of thousands of hires and finds that the adoption of job testing is associated with a 20% reduction in quitting.
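The most common yardstick for “adverse impact” is the EEOC’s four-fifths rule: a selection procedure is typically flagged if any group’s selection rate falls below 80% of the highest group’s rate. As a rough illustration of the kind of check described above (the group labels and numbers here are hypothetical, and real vendors’ validation procedures are more involved), it might look like this:

```python
# Illustrative sketch of an adverse-impact check using the EEOC
# "four-fifths rule". Group names and counts are made up for the example.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (number selected, number of applicants)."""
    return {group: selected / total
            for group, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Return True for each group whose selection rate is at least
    `threshold` times the highest group's rate, False otherwise."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {group: rate / top >= threshold for group, rate in rates.items()}

applicants = {
    "group_a": (50, 100),  # 50% selection rate (highest)
    "group_b": (35, 100),  # 35% selection rate -> ratio 0.70, flagged
}
print(four_fifths_check(applicants))
# group_b is flagged because 0.35 / 0.50 = 0.70 < 0.80
```

An assessment that fails this kind of check would be re-examined before deployment, which is the sense in which these tools are “engineered” to avoid adverse impact.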

If anything, it’s more likely that online assessments actually reduce bias in the hiring process. Consider, for example, that recruiters typically spend approximately 7 seconds screening each resume. What do they look for? Among other things, previous work history and job-relevant experience. Yet Evolv has released studies demonstrating that job hoppers and the long-term unemployed stay just as long and perform just as well as individuals with a more typical work history[3]. These are factors that shouldn’t play a role in the screening process, yet 2 to 6 percent of all job applicants are dismissed immediately because of an unusual work history. Pre-hire screening reduces personal biases by allowing these people to be considered on the basis of their true knowledge, skills, and abilities.

Although there is some natural trepidation about computer algorithms playing a bigger role in the hiring process, it’s important to realize that these algorithms aren’t meant to replace recruiters. They’re simply intended to arm recruiters with more information so they can make an informed decision. It’s an exciting era, not only because the technology has evolved to the point where it can issue recommendations on something as complicated as hiring, but also because this capability is going to give a fair shot to millions of job applicants who wouldn’t have been considered previously.


[1] Walker, Joseph. “Meet the New Boss: Big Data.” Wall Street Journal 20 September 2012: B1. Print.

Lohr, Steve. “Big Data, Trying to Build Better Workers.” New York Times 20 April 2013: BU4. Print.

Cukier, Kenneth. “Robot Recruiters: How Software Helps Firms Hire Workers More Efficiently.” The Economist 6 April 2013: 78. Print.

Peck, Don. “They’re Watching You at Work.” The Atlantic Monthly 20 November 2013: 74-84. Print.

[2] Rivera, LA. “Hiring as Cultural Matching: The Case of Elite Professional Service Firms.” American Sociological Review. December 2012. 77(6): 999-1022.

Cable, D.M. & Judge, T.A. “Interviewers’ Perceptions of Person-Organization Fit and Organizational Selection Decisions.” Journal of Applied Psychology. 1997. 82(4): 546-561.

[3] Housman M. “The Influence of Work History on Attrition: Does Previous Work History Predict Future Employee Outcomes?” San Francisco, CA: Evolv, Inc., 2012.

Housman M. “The Truth About the Long-Term Unemployed.” San Francisco, CA: Evolv, Inc., 2014.

 

