Recognising our biases and presumptions is only the start of achieving the fairer, more inclusive recruitment processes our businesses need and demand. Chris Platts offers a practical guide on how to establish a level playing field when selecting candidates.
Over the past few years, employers have increased their focus on improving diversity and inclusion. However, change has been slow. A new report from Business in the Community reveals that black employees hold just 1.5% of top management roles in the UK private sector, and this figure has only increased by 0.1% since 2014.
Both the Black Lives Matter campaign and the coronavirus pandemic have highlighted, more effectively than anything else in recent times, the significant inequalities that still exist. Discrimination, whether associated with race, gender, age, sexuality, disability or religious belief, is sadly a reality that many face on a daily basis.
Equality and diversity
Currently, many businesses are making positive noises about their commitment to greater diversity and inclusion. There is an ambition to do better and to make real change happen. But what can be done to ensure the momentum is not wasted? A good place to start is to undertake a thorough review of the hiring process.
What is fair hiring?
A hiring process is considered fair when candidates are not intentionally singled out for discriminatory treatment and when the overall effect of the selection process does not disproportionately disadvantage members of any one demographic group.
Two theories are commonly used to describe workplace discrimination: “disparate treatment” and “disparate impact.”
Disparate treatment occurs when a candidate is affected by an intentional act of overt discrimination. It is surprisingly common; in one seminal study, candidates with “white-sounding” names received 50% more interview requests than equally qualified black applicants.
Disparate impact, on the other hand, occurs when a selection process disproportionately excludes candidates based on a protected characteristic. This is unintentional, but is perpetuated by traditional hiring methods.
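One common heuristic for spotting disparate impact is the "four-fifths rule" used by US regulators: if the selection rate for any group falls below 80% of the rate for the most successful group, the process may be having a disparate impact. A minimal sketch of the calculation follows; the group labels and figures are illustrative assumptions, not data from this article:

```python
# Four-fifths (80%) rule check for disparate impact.
# All figures below are illustrative, not from the article.

applicants = {"group_a": 200, "group_b": 150}  # applications received
hired = {"group_a": 40, "group_b": 15}         # offers made

# Selection rate per group, and the benchmark (highest) rate.
rates = {g: hired[g] / applicants[g] for g in applicants}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest
    flag = "potential disparate impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

In practice a check like this would be run for each stage of the funnel (CV screen, assessment, interview) so the stage causing any disproportion can be identified and fixed.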
Our intuitions are biased
Human instincts are powerful and, in many instances, they serve us well. When it comes to recruitment, however, they can lead to decisions based on irrelevant characteristics: job seekers rejected because it's assumed they won't fit in, not because they can't do the job.
Researchers have catalogued around 175 distinct cognitive biases that influence human behaviour. We may be aware of some of them, but it's unlikely we'll ever be conscious of them all. We all judge people, whether we realise it or not, from the first moment we see them. These judgments can be spot on, or they can be wildly off the mark. In an interview situation, basing a decision too heavily on intuition can have serious consequences.
All biases can be a barrier to greater diversity and inclusion, but “similar to me bias” can be a particular issue. We are programmed to gravitate towards the familiar – it is perceived to be the “safe” option. We feel more comfortable with people who look like us, sound like us, have a similar background and like the same things. This means hiring managers can easily be drawn to candidates who are most similar to them, rather than the ones most suited to the job.
Listening too much to intuition can mean the most suitable candidates are overlooked and the wrong people get hired. In research conducted with hiring managers, 50% admitted that their bad hires were the result of relying too heavily on gut instinct. This seems to be a particular issue for younger, potentially less experienced hiring managers: 61% of those under the age of 35 admitted their poor hires were the result of paying too much attention to gut feeling.
Tips on how to hire more fairly
A fair hiring process must mitigate the impact of these biases and provide objective data that allows candidates to be judged primarily on their ability rather than on irrelevant characteristics or preconceptions. This must happen at every stage of the process, from application to post-interview, to create conditions that are as fair as possible and offer an equal chance to all.
- Blind CVs: hide demographic information that can lead to bias, including names, pictures, hobbies and interests
- Introduce technology-enabled assessments that measure performance on specific “real-life” tasks that are predictive of success in the role
- Agree an ideal candidate profile based on job relevant attributes and skills
- Agree a scoring system against the ideal candidate profile – decide which attributes are most important and ensure everyone in the process is aligned and agreed on how to score each attribute
- Avoid forming a decision too quickly. Candidates are often nervous at the start of an interview so put them at their ease and make them feel as comfortable as possible
- Have multiple interviewers present to produce a range of opinions on each candidate that helps to avoid bias blind spots
- Ensure each interviewer scores candidates separately and has the confidence to put their own opinion forward
- Write down the scores for each candidate before group discussions
- Be systematic – go through each attribute one by one. Stick to the facts and avoid generic questions like “What did you think?”
- Challenge any bias – if anyone mentions something about the candidate they like or dislike, which isn't part of the criteria, discuss how relevant this is
- Create a log to track decisions – share it with all hiring managers to reinforce a consistent approach across the organisation
- Audit your recruitment process for disparate impact and revisit any technology solutions to develop improvements as necessary
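Several of the steps above – an agreed attribute profile, a shared scoring system, and independent scores recorded before any group discussion – can be combined into a simple structured-scoring calculation. The attributes, weights and marks below are purely illustrative assumptions:

```python
# Structured scoring sketch: each interviewer marks each agreed
# attribute independently (1-5); marks are combined using weights
# agreed before interviews begin. All names and numbers illustrative.

WEIGHTS = {"problem_solving": 0.4, "communication": 0.3, "domain_knowledge": 0.3}

# Independent score sheets, recorded before any group discussion.
scores = {
    "candidate_1": [
        {"problem_solving": 4, "communication": 3, "domain_knowledge": 5},  # interviewer A
        {"problem_solving": 5, "communication": 3, "domain_knowledge": 4},  # interviewer B
    ],
    "candidate_2": [
        {"problem_solving": 3, "communication": 5, "domain_knowledge": 3},  # interviewer A
        {"problem_solving": 3, "communication": 4, "domain_knowledge": 3},  # interviewer B
    ],
}

def weighted_score(sheet):
    """Weighted total for one interviewer's score sheet."""
    return sum(WEIGHTS[attr] * mark for attr, mark in sheet.items())

def candidate_score(sheets):
    """Average of the interviewers' independent weighted totals."""
    return sum(weighted_score(s) for s in sheets) / len(sheets)

for name, sheets in scores.items():
    print(f"{name}: {candidate_score(sheets):.2f}")
```

Averaging independently recorded totals, rather than scoring as a group, preserves each interviewer's opinion and keeps the final comparison anchored to the agreed criteria.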
If organisations concentrate their efforts on creating fairer hiring processes, greater diversity and inclusion will naturally follow. Can technology make recruitment fairer? Absolutely, provided the technology itself is designed to avoid the shortcomings produced by human bias.
The benefit of automated assessment and screening tools is that they can be stripped of such biases and developed to focus solely on job-relevant signals rather than the noise associated with demographic indicators. The brains of human CV reviewers cannot be stripped of bias in the same way.