“The funniest thing I heard recently was I had heard for a part that I was too sophisticated. And I was like, ‘Oh, that sounds nice.’ I like that feedback. I didn’t get the part, but I’m a very sophisticated person. And then I found out later that they actually said ‘old.’ I want to make a translation sheet for Hollywood that’s all the feedback your agents give you and what it really means.”
That’s actress Olivia Wilde discussing on “The Howard Stern Show” why she wasn’t hired for the 2013 hit film “The Wolf of Wall Street.” Instead, the part went to Margot Robbie, seven years Wilde’s junior.
While coded language and judgments based on factors like age, sex and race may be the norm – and even widely accepted – in Hollywood, it’s different for those of us who don’t work in show business. Hiring bias certainly isn’t widely accepted – but that doesn’t mean it isn’t widespread. Bias is almost hardwired into our human nature: Familiarity is comfortable, and what’s more familiar to us than ourselves?
Career expert Gail Tolstoi-Miller, author of the book “Networking Karma,” has seen some crazy examples of hiring bias in her line of work. One of her clients even refused to hire a candidate because she was wearing white pumps! At her networking events, Tolstoi-Miller doesn’t allow attendees to put job titles or company names on their nametags. “Everyone sees ‘CEO’ on a nametag and makes a beeline for them,” she says. “A janitor is just as important as a CEO.”
Anonymizing candidate selection is certainly an effective way to help remove hiring bias from the recruitment process. But it’s far from the only way. Consider implementing one or more of these methods to ensure that your organization’s hiring process is bias-free.
1. GapJumpers

This software program helps companies like Google and Dolby select talent the way the judges on “The Voice” select singers – through blind auditions. Instead of looking at a resume and assessing an applicant based on name, schools attended, companies worked for and titles held, the only thing employers can measure is candidates’ performance on a skills-based test. Researchers from Harvard and Princeton found that blind auditions increased the likelihood that female musicians would be hired by an orchestra by 25 to 46 percent.
In technology, there’s a perception that women simply aren’t applying for jobs, hence numbers like Google’s, where women account for only 18 percent of the technical workforce. But GapJumpers has found that 54 percent of its blind auditioners are women. Moreover, almost 59 percent of top performers in blind auditions are women. And it’s not just women who benefit from blind auditions: The process also benefits people with all types of educational backgrounds, be it state universities, community colleges or coding bootcamps. No longer do Ivy League grads automatically have the upper hand.
2. Textio

Companies like Twitter and Starbucks use Textio to help them write better job descriptions. Co-founded by a linguistics Ph.D., the software provides a field where employers enter their job descriptions and receive feedback as they type. Along the way, it uncovers key phrases and spots biases, highlighting words and phrases and classifying them as “negative,” “positive,” “repetitive,” “masculine” or “feminine.” (Want to attract more female candidates? Avoid terms like “rock star,” “ninja” and “killer.”) It also offers insights about the strengths and problems of a job description, like good use of active language or too many cliches and too much jargon. Each job description receives a score, along with recommendations for how to improve it.
3. Gender Decoder for Job Ads

This is another tool for writing better job descriptions. Product manager Kat Matfield developed the Gender Decoder for Job Ads after reading a research article that appeared in the Journal of Personality and Social Psychology, “Evidence That Gendered Wording in Job Advertisements Exists and Sustains Gender Inequality.” The researchers found that women were far less likely to apply to job ads with masculine-coded language, whereas men’s likelihood of applying to ads with feminine-coded language was negligibly affected. Matfield drew the original list of gender-coded words from this paper.
Masculine-coded words include terms like “competitive,” “dominant,” “decisive” and “ambitious.”
Feminine-coded words include terms like “collaborative,” “supportive,” “committed” and “nurturing.”
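The decoder’s basic approach – tallying masculine- and feminine-coded words in an ad and comparing the counts – can be sketched in a few lines of Python. This is a hypothetical illustration with abbreviated word lists, not Matfield’s actual code; the real tool matches word stems from the full lists in the research paper.

```python
import re

# Hypothetical, abbreviated word lists for illustration; the real
# decoder uses the full stem lists from the cited research paper.
MASCULINE = {"competitive", "dominant", "decisive", "ambitious"}
FEMININE = {"collaborative", "supportive", "committed", "nurturing"}

def decode(ad_text):
    """Classify a job ad by counting gender-coded words."""
    words = re.findall(r"[a-z]+", ad_text.lower())
    masc = sum(w in MASCULINE for w in words)
    fem = sum(w in FEMININE for w in words)
    if masc > fem:
        return "masculine-coded"
    if fem > masc:
        return "feminine-coded"
    return "neutral"

print(decode("We want an ambitious, competitive self-starter."))
# -> masculine-coded
```

An employer could run each draft ad through a check like this and rewrite until the result comes back neutral or matches the audience they want to reach.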
4. Implicit Association Test (IAT)

Developed at Harvard, the IAT helps to uncover thoughts that test-takers are unconsciously hiding from themselves, and measures attitudes and beliefs that people may be unwilling or unable to report. The IAT measures the strength of associations between a concept (say, gay people) and evaluations (good, bad) or stereotypes (stylish, clumsy). Employers can use this information to become more self-aware and proactively check their biases before they hire.
5. Search Party
The Search Party recruitment marketplace uses data science and machine learning to help employers find candidates. When you search, it serves up anonymous profiles that show just enough data to make an educated hiring decision while stripping out bias-inducing information like gender and ethnicity. By keeping the profiles anonymized until after you’ve decided whom you’d like to interview, you sidestep your own biases and interview the best candidates for the job.
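The anonymization idea is simple enough to sketch: strip the attributes that can cue bias before a profile ever reaches a reviewer. The field names below are hypothetical, not Search Party’s actual data model.

```python
# Profile fields that can cue bias (hypothetical field names).
BIAS_CUES = {"name", "gender", "ethnicity", "age", "photo_url"}

def anonymize(profile):
    """Return a copy of a candidate profile without bias-inducing fields."""
    return {k: v for k, v in profile.items() if k not in BIAS_CUES}

candidate = {
    "name": "Jordan Lee",
    "gender": "female",
    "skills": ["Python", "SQL"],
    "years_experience": 7,
}
print(anonymize(candidate))
# -> {'skills': ['Python', 'SQL'], 'years_experience': 7}
```

The key design point is ordering: reviewers rank the anonymized profiles first, and identifying details are revealed only after the interview list is set.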
6. Collaborative Hiring
A collaborative hiring process helps people to check their biases and uncover blind spots. If you enlist employees with a diverse set of experiences to help interview and assess candidates, you’re much more likely to end up with a diverse staff. Collaborative hiring helps to safeguard companies from a number of cognitive biases – the inherent thinking errors that humans make in processing information. Some of these biases include:
- Confirmation bias: The tendency for people to seek out information that conforms to their preexisting views, and ignore information that goes against their views
- Ingroup bias: The tendency to favor members of your own group
- Projection bias: The thinking that others have the same priority, attitude or belief as you do
- Selective perception: The tendency to perceive what we want to when taking in information, while ignoring stimuli that contradict our beliefs or expectations
- Status quo bias: A preference for the current state of affairs
Hiring software like Recruiterbox helps colleagues collaborate by allowing them to collect and share interview feedback, assign and delegate tasks, and share candidates with each other. By keeping a record of users’ activity and sending automated reminders, the software helps team members stay on the same page, which means high-quality candidates are less likely to be overlooked.
Beyond being the right thing to do and keeping you in compliance with EEOC regulations, removing hiring bias from the recruitment process is just good business sense. In a study of 366 public companies across the globe, consulting firm McKinsey found that gender-diverse companies are 15 percent more likely to outperform their less diverse counterparts, and ethnically diverse companies are 35 percent more likely to outperform. And what organization isn’t looking for a way to increase earnings?