Welcome to the age of application algorithms

Derek Mobley: an experienced IT specialist, a Black American, over 40 - and frustrated. After more than 100 unsuccessful applications, he has had enough: not because he lacks motivation, but because he suspects that AI is systematically screening him out.

The software used in every case: an AI-supported applicant screening tool from Workday, one of the world's largest providers of HR software. A rejection just minutes after submitting? For Mobley, not an isolated incident but part of a pattern. He is now suing - and the case has the potential to shake up the global HR industry.

When algorithms learn discrimination

What looks like an isolated case has long since become systemic: more and more companies are using artificial intelligence to pre-screen applications. Efficient and cost-saving - but also highly risky, because when AI learns from old data, it also adopts old prejudices. Mobley's complaint, which many other applicants have since joined, rests on precisely this point: systematic age discrimination.

The court has allowed the lawsuit to proceed as a class action - a serious blow for Workday, which processes millions of applications every year. If the suspicion is confirmed, thousands of cases in the US could be affected - and the business model could come under pressure worldwide. After all, such tools have long been in use in Europe as well.

AI in the application process: curse or progress?

But AI in HR is not inherently bad, experts stress: used correctly, it can help discover talent that would otherwise be overlooked - especially applicants with non-linear CVs. It can also help identify training needs or hidden skills within an existing team.

HR consultant Andreas Günzel takes a similar view: AI can make recruiting faster and more targeted, but it must never be the sole decision-making authority; the human component remains indispensable. Yet this is precisely where the problem lies: in practice, AI recommendations are often not questioned but accepted blindly.

From Amazon to Workday: when AI repeats old mistakes

Amazon already provided a cautionary example: the company's in-house recruiting AI rated women as unsuitable because the training material consisted almost exclusively of applications from men. As a result, the software systematically downgraded CVs that signaled a female applicant. Amazon scrapped the project - but the damage was done.

The same risk now looms at Workday. If training data is unbalanced or biased, the discrimination is passed on to the AI. And if that AI then screens out applicants en masse because they appear too old, too female, too "deviant", a convenience tool becomes an automated filter for systematic discrimination.
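How historical bias propagates into an automated filter can be illustrated with a deliberately simplified toy model. Everything here is invented for illustration - the feature name, the data, and the counting-based "model" bear no relation to Workday's actual software:

```python
# Toy sketch: a screening "model" trained on biased historical decisions
# reproduces that bias. All data and feature names are hypothetical.
from collections import defaultdict

# Past hiring decisions: (features, hired). In this invented dataset the
# "gap_in_cv" feature (which might correlate with age) was historically
# penalized by human recruiters.
history = [
    ({"gap_in_cv": 0}, 1), ({"gap_in_cv": 0}, 1),
    ({"gap_in_cv": 0}, 1), ({"gap_in_cv": 0}, 0),
    ({"gap_in_cv": 1}, 0), ({"gap_in_cv": 1}, 0),
    ({"gap_in_cv": 1}, 0), ({"gap_in_cv": 1}, 1),
]

def train(history):
    """Estimate P(hired | feature value) by simple counting."""
    counts = defaultdict(lambda: [0, 0])  # value -> [hired, total]
    for features, hired in history:
        v = features["gap_in_cv"]
        counts[v][0] += hired
        counts[v][1] += 1
    return {v: hired / total for v, (hired, total) in counts.items()}

model = train(history)
print(model)  # {0: 0.75, 1: 0.25} - the old bias, now encoded as a score

def screen(applicant, model, threshold=0.5):
    """Automated pre-selection: reject anything below the threshold."""
    return model[applicant["gap_in_cv"]] >= threshold

print(screen({"gap_in_cv": 1}, model))  # False: filtered out automatically
```

The point of the sketch: no one programmed "reject applicants with CV gaps", yet the model does exactly that, because it optimizes for agreement with biased past decisions. Real screening systems are far more complex, but the feedback mechanism is the same.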

AI can open doors - or systematically close them

Those who outsource decision-making to algorithms still bear the responsibility. And if companies whose software screens millions of applications cannot ensure that applicants are not rejected across the board, what they deserve is not trust - but a lawsuit.
