Personnel Today

Artificial intelligence | Age discrimination | Latest News | Discrimination | Race discrimination

AI recruitment systems to be investigated over discrimination worries

by Adam McCulloch 14 Jul 2022
Photo: Shutterstock

The UK privacy watchdog is set to probe whether employers using artificial intelligence in their recruitment systems could be discriminating against ethnic minorities and people with disabilities.

John Edwards, the information commissioner, has announced plans for an inquiry into the automated systems that screen job candidates, including looking at employers’ evaluation techniques and the AI software they use.

Over recent years, concerns have mounted that AI, in many cases, discriminates against minorities and others because of the speech or writing patterns they use. Many employers use algorithms to whittle down digital job applications, enabling them to save time and money.

Regulation has been seen as slow to take up the challenge presented by the technology, with the TUC and the All-Party Parliamentary Group on the Future of Work keen to see laws introduced to curb any misuse or unforeseen consequences of its use. Frances O’Grady, TUC general secretary, said: “Without fair rules, the use of AI at work could lead to widespread discrimination and unfair treatment — especially for those in insecure work and the gig economy.”

Edwards pledged that his plans over the next three years would consider “the impact the use of AI in recruitment could be having on neurodiverse people or ethnic minorities, who weren’t part of the testing for this software”.

Autism, ADHD and dyslexia are included under the umbrella term “neurodiverse”.

A survey of recruiting executives carried out by consulting firm Gartner last year found that almost all reported using AI for part of the recruiting and hiring process.

The use of AI in the recruitment process is seen as a way of removing management biases and preventing discrimination, but it could be having the opposite effect, because the algorithms themselves can amplify human biases.
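To illustrate the mechanism, the sketch below is a purely hypothetical example, not taken from any system mentioned in this article: a simple screening model is trained on synthetic, historically biased hiring decisions and, although it is never shown the protected characteristic, it learns to penalise one group through a correlated proxy feature such as a writing-style score.

```python
# Hypothetical illustration only: synthetic data, no real recruitment system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected characteristic (e.g. group membership), never shown to the model.
group = rng.integers(0, 2, n)

# Proxy feature correlated with group membership
# (e.g. a writing-style score that differs for non-native English speakers).
proxy = rng.normal(loc=group * 1.5, scale=1.0)

# Genuine ability, identical in distribution across both groups.
ability = rng.normal(size=n)

# Historical hiring decisions: driven by ability, but penalising group 1.
hired = (ability - group + rng.normal(scale=0.5, size=n)) > 0

# Train only on apparently "neutral" features: ability score and the proxy.
X = np.column_stack([ability, proxy])
model = LogisticRegression().fit(X, hired)

# Equally able candidates are shortlisted at very different rates:
# the model has absorbed the historical bias via the proxy feature.
preds = model.predict(X)
for g in (0, 1):
    print(f"group {g}: shortlisted {preds[group == g].mean():.1%}")
```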

Earlier this year Estée Lauder faced legal action after two employees were made redundant by algorithm. Last year, AI-driven facial recognition software used by Uber was alleged to be, in effect, racist. And in 2018, Amazon ditched a trial of a recruitment algorithm that was discovered to be favouring men and rejecting applicants on the basis that they had attended female-only colleges.

A spokesperson for the Information Commissioner’s Office said: “We will be investigating concerns over the use of algorithms to sift recruitment applications, which could be negatively impacting employment opportunities of those from diverse backgrounds. We will also set out our expectations through refreshed guidance for AI developers on ensuring that algorithms treat people and their information fairly.”


The ICO’s role is to ensure people’s personal data is kept safe by organisations and not misused. It has the power to fine them up to 4% of global turnover as well as to order undertakings from them.

Under the UK’s General Data Protection Regulation (which is enforced by the ICO), people have the right to non-discrimination in the processing of their data. The ICO has warned in the past that AI-driven systems could lead to outcomes that disadvantage particular groups if the data set on which an algorithm is trained and tested is not complete. The UK’s Equality Act 2010 also offers people protection from discrimination, whether caused by a human or an automated decision-making system.

In the US, the Department of Justice and the Equal Employment Opportunity Commission warned in May that commonly used algorithmic tools including automatic video interviewing systems were likely to be discriminating against people with disabilities.

Legal comment

Senior counsel at Taylor Wessing, Joe Aiston, said that in addition to issues of unconscious bias “which inevitably regularly impact companies’ hiring processes where human decisions are being made”, care needs to be taken when using any form of artificial intelligence software when recruiting.


“Whilst some AI recruitment software is marketed as working to avoid biases and potential discrimination in the recruitment process, depending on the algorithms and decision-making processes used, there is a risk that such software could result in discrimination issues of its own. For example, if recruitment software analyses writing or speech patterns to determine who the weaker candidates might be, this could have a disproportionately negative impact on individuals who do not have English as a first language or who are neurodiverse. A decision made by AI to reject such a candidate for a role purely on this basis could result in a discrimination claim against the employer, despite that decision not having been made by a human.

“A particular issue for employers is that the software they may opt to use to streamline the selection process could be utilising discriminatory selection processes without their knowledge. It is therefore important that the supplier of the software is made to clearly set out what selection criteria and algorithms are intended to be used and how these will be applied, so that the company can assess any potential discrimination risk and so that this can be rectified.”

The law and the regulators were playing catch-up with this relatively new area of potential risk, Aiston added, but it was likely that further regulation would be introduced.

Natalie Cramp, CEO of data science consultancy Profusion, said the ICO’s investigation into whether AI systems showed racial bias was very welcome and overdue. This should only be a first step in tackling the dangers of discriminatory algorithms, she added.

“There have been a number of recent incidents where organisations have employed algorithms for functions such as recruitment, and the result has been racial or sexist discrimination. In many cases the problem was not uncovered for several months or even years. This is because the bias has either been built into the algorithm itself or has come from the data that has been used. Critically, there has then been little human oversight to determine whether the outputs of the algorithm are not only correct but also fair.

“These algorithms have essentially been left to their own devices, leading to thousands of people having negative impacts on their opportunities,” said Cramp.

“Ultimately an algorithm is a subjective view in code, not an objective one. Organisations need more training and education to both verify the data they use and challenge the results of any algorithms. There should be industry-wide best practice guidelines that ensure that human oversight remains a key component of AI. Organisations cannot rely on one team or individual to create and manage these algorithms.”

An ICO investigation alone will not tackle these issues, she added. “Without this safety net people will quickly lose confidence in AI and with that will go the huge potential for it to revolutionise and better all our lives.”
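The human oversight Cramp describes can start with something as simple as routinely comparing shortlisting rates across groups. The sketch below is a hypothetical illustration of such a check; the 80% threshold echoes the informal “four-fifths” guideline used by the US EEOC, and the function names and figures are invented for the example rather than drawn from the ICO, Profusion or any tool mentioned here.

```python
# Hypothetical oversight check: flag groups whose shortlisting rate falls
# well below that of the most-favoured group.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, shortlisted) pairs."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, shortlisted in decisions:
        totals[group] += 1
        selected[group] += bool(shortlisted)
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_flags(decisions, threshold=0.8):
    rates = selection_rates(decisions)
    best = max(rates.values())
    # A group is flagged if its rate is below the threshold share of the
    # most-favoured group's rate.
    return {g: (r / best) < threshold for g, r in rates.items()}, rates

# Made-up example: 40% of group A shortlisted versus 20% of group B.
sample = ([("A", True)] * 40 + [("A", False)] * 60
          + [("B", True)] * 20 + [("B", False)] * 80)
flags, rates = adverse_impact_flags(sample)
print(rates)   # {'A': 0.4, 'B': 0.2}
print(flags)   # {'A': False, 'B': True} - group B warrants investigation
```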

Adam McCulloch

Adam McCulloch first worked for Personnel Today magazine in the early 1990s as a sub editor. He rejoined Personnel Today as a writer in 2017, covering all aspects of HR but with a special interest in diversity, social mobility and industrial relations. He has ventured beyond the HR realm to work as a freelance writer and production editor in sectors including travel (The Guardian), aviation (Flight International), agriculture (Farmers' Weekly), music (Jazzwise), theatre (The Stage) and social work (Community Care). He is also the author of KentWalksNearLondon. Adam first became interested in industrial relations after witnessing an exchange between Arthur Scargill and National Coal Board chairman Ian McGregor in 1984, while working as a temp in facilities at the NCB, carrying extra chairs into a conference room!
