Humans just aren't that good at hiring - and what to do about it (pt.2)
The message from research is clear: sustainable business performance depends on your ability to attract and engage a diverse workforce. In my last post, I described how modern HR (with its desire for cultural fit and its aversion to otherness) is slowing down diversity - and with it the potential for growth and innovation in organizations.
In this post, I want to highlight two of the things we can focus on to hire better in the future: blind hiring and data-based decision making. Why?
If you're human, you're biased
According to a recent New York Times article, companies rely too much on flawed human judgment when they recruit. One study of top banking, law and consulting firms found that similarities in things like leisure activities and personality were the most important factors in candidate evaluations. Hiring now resembles choosing a romantic partner more than choosing an employee, says Lauren Rivera, an associate professor of management and sociology at Northwestern and the author of the study. (1)
And this tendency can be particularly harmful when it comes to race, as a landmark 2003 study showed. Researchers from the University of Chicago and M.I.T. created fake resumes with the same qualifications, giving half of them black-sounding names (Lakisha Washington and Jamal Jones) and the other half white-sounding names (Emily Walsh and Greg Baker). Those with "white" names received 50% more callbacks for interviews than those with "black" names. (2)
Other recent studies have found that applicants who are Muslims, mothers or gay are also less likely to be called back, even if the employers swear they want a diverse workforce and believe they’re doing everything in their power to create one. (3)
The conventional wisdom in classical economics was that humans are "rational actors" who make decisions and behave in ways that maximize advantage and utility while minimizing risk and cost. Many organizations still believe this. But over the past 20 years, research in situations like the above has shown that humans make decisions and act in ways that are anything but rational. In fact, we are consistently, routinely and profoundly biased. That is, we tend to make decisions and take action based on limited acquisition or processing of information, or based on self-interest, overconfidence, or attachment to past experience – without even knowing that this is happening. (4) The following figure displays some of the many diversity traits we are biased about in our decision making (5):
Bias has many faces and every one of us is biased. A large proportion of research on the topic comes from the US, but bias exists in Scandinavian organizations just as much as anywhere else in the world. A good place to start when identifying your own biases is with Harvard's Implicit Association Tests on topics including age, gender and sexuality. Try it out here; it's quite interesting: https://implicit.harvard.edu/implicit/demo/selectatest.html
In organizations, biased thinking is problematic because it causes people to make poor decisions – based on personal patterns instead of corporate interests. Hiring people who resemble us instead of people who are perfect for the job; developing products that meet our own criteria better than our customers' criteria; and "doing what we've always done" instead of thinking and acting progressively and innovatively are just a few severe consequences of biased thinking. (6)
So how can we overcome bias and recruit for true diversity? How do we expand our whole recruiting philosophy to include more professionals with a broad set of different experiences?
The first step might just be realizing that human judgement (and with it instinct, gut feeling and intuition) is not the best foundation for hiring decisions.
According to a recent study from the National Bureau of Economic Research, algorithms could in fact be capable of making hiring decisions with better results than humans - because they can reduce the impact of managerial mistakes or biases. (7)
And a recent study from Harvard Business School backs up this research. It found that when service-sector employers used a job test, they hired workers who tended to stay at the job longer — indicating that they were a better match. When employers overruled the test results to hire someone for more subjective reasons, the employees were significantly more likely to quit or be fired. (8)
So, knowing what we don't know and taking our own limitations into consideration is crucial if we truly want to hire for diversity. Only then can we take action to help ourselves overcome these limitations – for example, with the help of modern technology and strict focus.
How blind hiring & technology can help us to overcome bias
In the 1970s, symphony orchestras were still made up almost exclusively of white men. Around that time, many began to use a new method of hiring musicians: blind auditions. Musicians auditioned behind screens so the judges couldn't see what they looked like, and walked on carpeted floors so the judges couldn't tell from their footsteps whether they were women or men (the women often wore heels). The Boston Symphony Orchestra pioneered the practice in 1952, and more orchestras adopted it after a high-profile racial discrimination case was brought by two black musicians against the New York Philharmonic in 1969. Researchers from Harvard and Princeton took notice and studied the results; among other things, they found that blind auditions increased the likelihood that a woman would be hired by between 25% and 46%. That is huge.
Inspired by studies like the above, a couple of tech startups in Silicon Valley recently decided to do something about the diversity problem in their own industry.
One of these companies is GapJumpers, which works from the insight that, like musicians, coders create something that can easily be evaluated by their peers. The founders also realized from the study above that employers didn't need to see prospective employees' faces, or even learn their names. Eventually they would have to meet, but by keeping the process blind for as long as possible, the founders figured they could help reduce bias. And besides, they were already working with the best tool for masking identity in the history of humankind: software.
GapJumpers co-founders Kedar Iyer and Petar Vujosevic found a way to screen job applicants without showing employers any biographical information. Together with their clients, they create a list of skills required for the job, then design a relevant test that the applicant completes online. The first piece of information the hiring company sees is applicants’ scores, and, based on those, it selects candidates to interview. Only then does it see their names and resumes.
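The core of that workflow (skills test first, scores before names) is simple enough to sketch in code. This is a minimal illustration of the idea, not GapJumpers' actual system; the `Applicant` class and `blind_shortlist` function are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    resume: str
    test_score: float  # result of the job-relevant skills test

def blind_shortlist(applicants, top_n=5):
    """Rank applicants by test score alone; identities stay hidden
    until the shortlist is fixed."""
    ranked = sorted(applicants, key=lambda a: a.test_score, reverse=True)
    # The hiring company first sees only anonymous IDs and scores ...
    anonymous_view = [(f"candidate-{i}", a.test_score)
                      for i, a in enumerate(ranked[:top_n], start=1)]
    # ... and only the shortlisted candidates' names are revealed afterwards.
    revealed_names = [a.name for a in ranked[:top_n]]
    return anonymous_view, revealed_names
```

The point of the design is ordering: the selection decision is made entirely on the anonymous view, so biographical details cannot influence who makes the cut.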
By now, GapJumpers has conducted more than 1,400 auditions for companies like Bloomberg and Dolby Laboratories. According to the company’s numbers, using conventional resume screening, about a fifth of applicants who were not white, male, able-bodied people from elite schools made it to a first-round interview. Using blind auditions, 60% did.
And GapJumpers is just one of a handful of Silicon Valley start-ups spreading technological fixes for biased hiring practices.
Gild, for example, has developed a software that finds candidates based on code they have published online and strips out biographical information before recommending them to employers.
Textio, a start-up with clients that include Starbucks and Microsoft, scans job listings and highlights language that data have shown to turn off certain candidates. For example, saying a job requires a "rock star" will draw more men than women; saying it requires a "passion for learning" attracts more women than men. Textio's research has found that while most people dislike corporate jargon ("synergy," "push the envelope" and so on), applicants who are not white dislike it even more and are less likely to respond to job listings that use that sort of language. (9)
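At its simplest, this kind of scanning is phrase matching against lists derived from outcome data. The sketch below is only an illustration of the principle; the word lists and the `flag_listing` function are hypothetical, and a real tool like Textio builds its lists from large-scale response data rather than hand-picked terms.

```python
# Hypothetical example lists; a production tool derives these from
# data on how applicants actually respond to listings.
MASCULINE_CODED = {"rock star", "ninja", "dominant"}
JARGON = {"synergy", "push the envelope", "move the needle"}

def flag_listing(text):
    """Return phrases in a job listing that may deter some applicants."""
    lowered = text.lower()
    return {
        "masculine_coded": sorted(p for p in MASCULINE_CODED if p in lowered),
        "jargon": sorted(p for p in JARGON if p in lowered),
    }
```

A writer could run each draft listing through such a check and swap flagged phrases for more neutral alternatives before publishing.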
Unitive is a hiring platform that enables hiring managers to create job postings, review resumes and manage interviews, while getting decision-making assistance at every step of the process. "We found a way to operationalize psychological findings so that hiring managers avoid bias as much as possible," explains founder Laura Mather. She says the lack of diversity in tech was what led her to start Unitive in the first place. The startup recently completed a $7.5 million Series A round of funding.
Unitive's platform was inspired in part by a study in which participants were asked to select a chief of police from a pool of two candidates: one woman and one man. The male and female candidates were randomly and alternately assigned to two different resumes, one of which showed experience, the other an impressive educational background. Overwhelmingly, participants chose the male candidate in both scenarios. When asked to explain their reasoning, they would point to education or experience, whichever the man happened to be stronger in.
In another iteration of the study, participants were asked beforehand whether experience or education was more important, and thereby pre-committed to judging candidates on that criterion. In this version of the study, the discrimination against women stopped.
Unitive uses this idea of “pre-commitment” to ensure that managers are objective in their hiring decisions. The app makes managers commit right away, while posting a job, to the most important criteria for the job. Later, when the manager is reviewing (anonymized) resumes and interviewing candidates, the app continuously reminds him or her of these criteria. (10)
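The mechanism can be made concrete with a small sketch: the criteria weights are frozen before any candidate is seen, and every candidate is then scored strictly against those weights. This is a minimal illustration of pre-commitment, assuming a simple weighted-sum score; the `precommit` function is an invented name, not Unitive's actual implementation.

```python
def precommit(weights):
    """Freeze criteria weights before any candidate is reviewed,
    then score every candidate against exactly those weights."""
    total = sum(weights.values())
    frozen = {criterion: w / total for criterion, w in weights.items()}

    def score(candidate_ratings):
        # No post-hoc reweighting: only the pre-committed criteria count.
        return sum(frozen[c] * candidate_ratings.get(c, 0) for c in frozen)

    return score
```

Because `score` closes over the frozen weights, a manager cannot quietly shift emphasis to whichever criterion a favored candidate happens to be stronger in.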
What all of these organizations have in common is that they shift the employers' focus from resumes and gut feeling to skills and data. Thereby they achieve shorter recruiting processes, a stronger focus on crucial qualities and increased diversity in hires. The biggest plus for employers might be better matches between jobs and applicants, fewer people who quit or get fired, and thus longer tenures - saving employers the cost of recruiting replacements.
So does this mean that hiring managers will be useless in the future and replaced by robots soon? Probably not. As in other fields facing automation, we're more likely to see a gradual shifting of priorities than an outright robot takeover. Data-driven decision making will become increasingly relevant.
Data beats opinion - from Silicon Valley to Billund
Google has long liked to hire people who possess something it calls "Googleyness", a measure of how well they will fit in. It's not easily defined, but it includes things like enjoying fun and coping well with ambiguity. This emphasis on cultural fit isn't unique to Google; it seems to have spread across industries as work hours have lengthened and as offices have further blurred the distinction between work and leisure.
And it should come as no surprise that hiring for cultural fit can be self-reinforcing. In 2014, Google released data on the makeup of its employees for the first time: 2% were black, 3% were Latino, and 70% were men. And, as at most tech companies, Asian-Americans made up a disproportionately large share of employees.
Trying to overcome its diversity issue, Google now turns to data-driven decision making. Interviewers use standardized questions, mostly abandoning the brainteasers, according to Laszlo Bock, head of people operations. They stopped asking for SAT scores, a practice that research has shown underestimates the college performance of women and minority students. And they have tried to build diversity into the definition of "Googleyness", considering, for example, whether someone has taken an interesting life path or solves problems in a different way. (11)
Silicon Valley is in many ways an odd place to be at the forefront of solving corporate homogeneity, because its workforce is strikingly uniform. The diversity numbers from Facebook, Twitter, Microsoft, Yahoo and other companies look no better than Google's.
But it's also one of the few industries in which even the biggest companies are young and agile enough to quickly change the way they do business. That makes it highly interesting to follow what's happening in terms of diversity in the Valley.
In Denmark, few organizations have deliberately introduced measures to overcome their diversity issues. One of them is LEGO.
After being publicly mocked for his "men's club", CEO Jørgen Vig Knudstorp made LEGO's recruitment policy more female-friendly in late 2015, as Berlingske Business reports. The chief executive himself realized that many of LEGO's job postings had been too masculine in their wording, and that the company was thus partly responsible for its own diversity problem.
As a direct result, LEGO has worked on its job advertisements ever since to make them more appealing to a broader group of people - by adjusting language, colours and design. And LEGO's own tests show, according to Knudstorp, that the new job listings attract a broader field of applicants than the old ones. (12)
Organizational experiences such as LEGO's and Google's, combined with the latest research and the business case behind diversity, should serve as great motivation for more organizations to commit to diverse hiring in the near future. All we need is the kind of progressive hiring managers and courageous executives who dare to make the necessary organizational changes. Fingers crossed.
If you want to read more on diversity, visit our website: diversitybyilab.com
(4), (6): Howard J. Ross, Everyday Bias (2014)