The case of Mobley v. Workday
By February 2023, Derek Mobley had applied for more than a hundred positions. Many of these applications ran through Workday, a globally widespread HR platform that uses AI systems to pre-screen candidates. Mobley was rejected every time. The rejections often arrived within minutes; one came less than an hour after his application was submitted, at 1:50 in the morning.
Mobley is African American, over forty, and lives with anxiety disorders and depression. He filed a lawsuit against Workday, alleging systematic discrimination based on age, race, and disability.
Workday's defense was that it is only a software provider and that the hiring decisions are made by the employers. The court did not accept this argument and allowed the lawsuit to proceed; in May 2025, it was certified as a class action. Workday itself stated that around 1.1 billion applications were automatically rejected in the relevant period. Millions of affected applicants could join the lawsuit.
What German companies should learn from this
On its face, the case is about a recruiting tool, job rejections, and applicants who feel discriminated against. But the real question is aimed at every company that uses AI in recruiting.
Suppose you use an AI system for pre-selection. You rely on the technology to evaluate candidates objectively and fairly. A few months later, an internal analysis examines how the system actually works. The evaluation shows that the tool mainly favors men under forty, while other applicants systematically fall through the cracks. An applicant sues for age discrimination. Your company argues that the decision was made by the AI.
Under German law, this argument will not hold water. Many companies use AI systems without checking how they work. There are no clear processes for monitoring; awareness, documentation, and accountability are lacking. As soon as a lawsuit is filed, it becomes clear that responsibility remains with the company. Arguing that you merely used third-party software will not protect you.
What to do now
German companies should review their use of AI in the application process now. Among other things, this means:
- Assigning clear internal responsibility for AI systems
- Reviewing the decision criteria regularly
- Documenting how the systems work and what risks they carry
- Raising awareness in HR departments
- Developing internal guidelines for fair AI use
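What a "regular review of the decision criteria" can look like in practice is often unclear. One common starting point is a selection-rate audit: compare how often different applicant groups pass the AI pre-screen. The sketch below is a minimal illustration in Python using made-up data and hypothetical function names; the 0.8 threshold is the US EEOC "four-fifths rule", a heuristic rather than a German legal standard, but it is a widely used first signal that a system deserves closer scrutiny.

```python
from collections import Counter

def selection_rates(decisions):
    """decisions: list of (group, passed_screening) tuples.
    Returns the share of applicants per group who passed."""
    applied = Counter(group for group, _ in decisions)
    passed = Counter(group for group, ok in decisions if ok)
    return {group: passed[group] / applied[group] for group in applied}

def adverse_impact_ratio(rates):
    """Lowest group selection rate divided by the highest.
    Values below 0.8 (the 'four-fifths rule') flag possible
    adverse impact and warrant a deeper review."""
    return min(rates.values()) / max(rates.values())

# Hypothetical audit data: (age band, invited to interview?)
data = (
    [("under_40", True)] * 30 + [("under_40", False)] * 70
    + [("40_plus", True)] * 10 + [("40_plus", False)] * 90
)

rates = selection_rates(data)
print(rates)                        # {'under_40': 0.3, '40_plus': 0.1}
print(adverse_impact_ratio(rates))  # 0.1 / 0.3 ≈ 0.33 → well below 0.8
```

A result like this does not prove illegal discrimination on its own, but it is exactly the kind of documented, repeatable check that demonstrates the monitoring processes described above.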
The crucial question
Does your company use AI systems in recruiting? And has anyone checked whether those systems make decisions free of discrimination?
I would be happy to support you in making your processes legally compliant and identifying risks at an early stage.
