News from FINDHR: How to Counter Algorithmic Discrimination in Hiring – Toolkits for Policymakers, Software Developers and HR Professionals

Just Hiring!

FINDHR – a major European research project – is releasing new tools, approaches, and concrete recommendations aimed at tackling discrimination caused by AI hiring tools in job recruiting.

AI screening job applications? That’s already a reality.

Hiring is expensive and time-consuming. Employers often deal with hundreds of applicants per job opening, frequently under great time pressure. This pressure explains why more and more public and private employers are turning to Applicant Tracking Systems (ATS): algorithms that support them in pre-sorting or ranking candidates.

Proven Risks of Discrimination in AI-Assisted Hiring

AI-assisted hiring systems promise time savings for HR professionals. However, real-world experience shows that these systems can reinforce existing patterns of discrimination – or create new ones – often without the awareness of those using them.

As part of the Horizon Europe project «FINDHR – Fairness and Intersectional Non-Discrimination in Human Recommendation», we – a European, interdisciplinary network from academia, industry, and civil society – joined forces and spent the last three years working on solutions to counter algorithmic discrimination in hiring.

The FINDHR project focuses especially on intersectional discrimination, where combinations of personal characteristics (such as gender, age, religion, origin, or sexual orientation) generate new or multiplied forms of discrimination.

Our research demonstrates that discrimination in automated hiring is not a theoretical concern but a lived reality for many. Interviews conducted with affected individuals in seven European countries (Albania, Bulgaria, Germany, Greece, Italy, the Netherlands, and Serbia) revealed feelings of powerlessness and frustration, with applicants often receiving only automated rejections outside working hours, despite strong qualifications and repeated applications.

How to Counter Algorithmic Discrimination in Hiring?

Because the potential for discrimination in these systems is systematic and scalable, using AI in hiring demands a high level of responsibility from all stakeholders involved: software developers, HR professionals, and policymakers alike. To effectively address discrimination by AI systems in job applications, it must be tackled at all of these levels. That’s why we have created three tailored Toolkits – one for each group – offering key insights from our research, background information, and actionable recommendations to tackle algorithmic discrimination.

Find all three toolkits here: https://findhr.eu/toolkits/

About FINDHR

The toolkits are part of the FINDHR project. FINDHR (Fairness and Intersectional Non-Discrimination in Human Recommendation) is an interdisciplinary research project, funded from 2022 to 2025 under the EU’s Horizon Europe framework programme.

FINDHR is a consortium of 11 partners from academia (6 universities and research institutes), industry (2 companies), and civil society (3 NGOs), based in Belgium, Germany, Greece, Italy, the Netherlands, Spain, and Switzerland. Its research draws on interdisciplinary backgrounds and expertise, covering legal, computational, industrial, societal, and ethical perspectives. FINDHR offers an array of training, tools, and information. This includes reports describing the experiences of people facing a higher risk of being discriminated against in an AI-assisted hiring process.

All FINDHR resources can be found at: www.findhr.eu.

Disclaimer:

The Toolkits are part of a project that has received funding from the European Union’s Horizon Europe research and innovation program under grant agreement No 101070212.

Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union. Neither the European Union nor the granting authority can be held responsible for them.
