Employee Survival Guide® - Algorithmic Bias in Hiring: The Case of Derek Mobley vs. Workday Inc

Episode Date: December 13, 2024

Comment on the Show by Sending Mark a Text Message.

This episode is part of my initiative to provide access to important court decisions impacting employees in an easy-to-understand, conversational format using AI. The speakers in the episode are AI-generated and frankly sound great to listen to. Enjoy!

Can technology uphold fairness, or is it silently perpetuating bias? Discover the complex world of AI in the hiring process as we unravel the case of Derek Mobley versus Workday Inc. Mobley, a black man over 40 with mental health conditions, challenges the algorithms that he claims have unjustly barred him from over 100 job opportunities. Despite the court's decision not to categorize Workday as an employment agency, the episode prompts a pivotal discussion about the responsibilities HR tech companies might bear when their software influences employment outcomes. We grapple with the concept of disparate impact discrimination and what it means when unintentional practices result in a skewed playing field for protected groups.

From the courtrooms to the broader tech landscape, the implications of this case ripple across the HR industry and beyond. We weigh the necessity for transparency, accountability, and fairness in algorithmic decision-making while acknowledging the delicate balance with innovation. Listen as we delve into the potential for increased scrutiny and regulation of HR tech companies, and encourage job seekers to critically engage with the data that drives these systems. Join us in exploring how technology shapes our employment landscape and what needs to change to ensure it does so equitably.

If you enjoyed this episode of the Employee Survival Guide, please like us on Facebook, Twitter and LinkedIn. We would really appreciate it if you could leave a review of this podcast on your favorite podcast player, such as Apple Podcasts. Leaving a review will let other listeners know that you found this podcast's content important in the area of employment law in the United States. For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.

Disclaimer: For educational use only, not intended to be legal advice.

Transcript
Starting point is 00:00:00 Hey, it's Mark here and welcome to the next edition of the Employee Survival Guide, where I tell you, as always, what your employer definitely does not want you to know about, and a lot more. Welcome back. Today we're taking a deep dive into a case that's making waves in the world of tech and hiring: Derek Mobley vs. Workday Inc. It's not just a legal battle, it really gets you thinking. How are algorithms used in hiring? Can companies like Workday be held responsible if there's bias?
Starting point is 00:00:36 Yeah, it's a really interesting case, isn't it? It shows just how much AI is affecting our lives now, like even finding a job. And it all revolves around Derek Mobley, a black man, over 40, who claims he was rejected from over 100 jobs, and all of them used Workday software for some part of hiring. What's really striking is he wasn't just applying to one company or even one industry. All sorts of jobs, different sectors, and every time he hit this Workday wall. Okay, so before we jump in too deep, can you give us some background on Workday? What exactly do they do? Why are they the focus here? So Workday is a big name in HR tech, cloud-based software, stuff like HR, payroll, and well this is important for the case, talent
Starting point is 00:01:16 management. They work with tons of companies, lots of Fortune 500 firms even. So their software could be affecting a huge number of people applying for jobs. Wow, they're not just some small startup then. They're a major player in this hiring space. Exactly. And that's partly why this case is so big. It's not just about one guy looking for work. It's about the potential for bias,
Starting point is 00:01:37 algorithmic bias on a massive scale. OK, back to Mobley. He was applying for all these jobs each time running into Workday. What was that experience like for him? Well, he'd find postings on LinkedIn, pretty standard stuff, but clicking apply, he'd get redirected to a Workday platform on the company's website. So even though he's applying to different companies, it's always Workday behind the
Starting point is 00:01:59 scenes handling his application. That's right. Every time, a new Workday account, upload his resume, sometimes even these Workday assessments, like personality tests. Workday is collecting a lot of data, then. And that's crucial, right? Algorithms need data to learn, and that data, it's not just what's on your resume. Like what else?
Starting point is 00:02:16 Well, think about it. When you create a Workday account, maybe your age, location, education history. And those personality tests, they might show your personality traits: are you emotionally stable? How about risk aversion? Hmm, yeah, I see your point. It's all data that an algorithm could use to make decisions about you.
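As an aside, here is a minimal sketch of what that kind of applicant profile might look like once an application system collects it as structured data. The field names and values are hypothetical, for illustration only; they are not Workday's actual data model.

```python
# Hypothetical sketch of an applicant profile as structured data.
# Field names and values are illustrative; this is not Workday's data model.
from dataclasses import dataclass, field

@dataclass
class ApplicantProfile:
    years_experience: int
    education_level: str                 # e.g. "bachelors", "masters"
    location: str
    graduation_year: int                 # age can often be inferred from this
    personality_scores: dict = field(default_factory=dict)

profile = ApplicantProfile(
    years_experience=15,
    education_level="bachelors",
    location="Anytown, USA",
    graduation_year=2001,
    personality_scores={"emotional_stability": 0.4, "risk_aversion": 0.7},
)

# Any of these fields, alone or in combination, can act as a proxy for a
# protected trait once a screening model starts weighting them.
print(profile)
```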
Starting point is 00:02:34 Exactly, and this is where Mobley's concerns start. He argues Workday's tools are discriminatory. They use biased data, things like personality tests that might put certain people at a disadvantage. So he's not saying he's just unlucky. There's something wrong with how Workday's algorithms are making decisions. Yeah, inherently biased, that's what he's claiming,
Starting point is 00:02:52 against people like him. Black, over 40, and with mental health conditions like anxiety and depression. He was rejected from over 100 jobs. It's not just a few rejections here. And get this, some of those rejection emails, middle of the night, like 2 a.m. That is kind of creepy, you gotta admit. Definitely sounds like automation was involved. Makes you wonder how much human judgment was really there versus an automated decision made by Workday's software.
Starting point is 00:03:18 That's a big question, goes to the heart of this case. But before we go further, what does Mobley mean when he says Workday's tools are discriminatory? Does he mean they're designed to discriminate against certain groups? Not necessarily. He's arguing it's what's called disparate impact discrimination. Okay, disparate impact. Sounds like legal jargon. Can you explain that for us? It is legal stuff, but super important here. Disparate impact means even if a practice doesn't mean to discriminate, if it ends up disproportionately harming a protected group, well legally that can still be discrimination. Ah, so even if Workday wasn't trying to discriminate, if their algorithms have that effect, they could
Starting point is 00:03:58 still be in trouble. Exactly. That's Mobley's point. Even if companies using Workday mean well, the software itself can lead to bad outcomes, discrimination. Interesting. So it's not just about intent, but the actual impact. Right on. And in this case, it makes us face the possibility of algorithmic bias in a system that's relying more and more on AI for big decisions.
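For listeners who want to see how disparate impact is often measured in practice, one common screen is the EEOC's four-fifths (80%) rule: compare selection rates across groups, and a ratio below 0.8 is generally treated as evidence of adverse impact. A minimal sketch with purely hypothetical numbers, not data from the case:

```python
# Minimal sketch of the EEOC's four-fifths (80%) rule for adverse impact.
# All applicant counts below are hypothetical, not data from the case.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who make it past a screening step."""
    return selected / applicants

# Hypothetical outcomes from an automated resume screen.
rate_over_40 = selection_rate(selected=30, applicants=200)    # 0.15
rate_under_40 = selection_rate(selected=80, applicants=200)   # 0.40

impact_ratio = rate_over_40 / rate_under_40                   # 0.375
print(f"Impact ratio: {impact_ratio:.2f}")

if impact_ratio < 0.8:
    print("Below the 4/5 threshold: a result consistent with disparate impact.")
```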
Starting point is 00:04:20 Okay, we've got the background on Workday, Mobley's experience of constantly being judged by their algorithms, and this idea of disparate impact. What are Mobley's actual legal claims? What is he arguing in court? Actually a couple of different arguments, and they both hinge on whether Workday can be held liable for the discrimination, not just the individual employers. Okay, now I'm really interested.
Starting point is 00:04:43 What are those arguments? First one, Workday is an employment agency under laws like Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the ADA, that kind of thing. So he's saying they're in the business of finding people jobs, like a regular employment agency. That was his initial argument, yeah. Because Workday is so deep in the hiring process,
Starting point is 00:05:03 the gatekeepers, they should have the same anti-discrimination rules as any other agency. Makes sense. I mean, they are screening candidates, right? They are, but the court actually dismissed that specific claim. They said Workday doesn't technically procure employees, legally speaking, not actively finding people to fill jobs. They just provide the software, the platform. So Workday is off the hook then?
Starting point is 00:05:26 Not entirely. Here's where it gets a bit tricky, legally. The court did say, while not an employment agency, Mobley's got a case, a plausible one, that Workday was acting as an agent of those employers. Hold on. Employment agency and agent? What's the difference?
Starting point is 00:05:41 They both seem to be involved in helping companies find employees. It is a subtle but important difference. An employment agency, their main business is connecting job seekers and employers. Think headhunting firms, temp agencies, they actively go out and recruit and place people. So Workday isn't doing that, they're providing the software. Right, but as an agent of the employer, they take on some of the employer's responsibilities. The court's view: if Workday is doing core hiring functions on the employer's behalf, like screening and rejecting applicants, it can be treated as the employer's agent. That's exactly it. And that's why this case is so big for the whole HR tech world. If the court sides with Mobley on this agent idea, big precedent. Software companies
Starting point is 00:06:19 could be held accountable for algorithmic bias in their hiring tools. That is huge. But Workday is fighting back hard, I bet. Not just going to accept liability. Of course not. They've got their legal team working on their defense. Main argument. We're just the software provider, a neutral platform, basically saying our customers,
Starting point is 00:06:36 the employers, they set the hiring criteria, make the decisions. So don't blame us, blame the companies using our software. That's the gist. But it's not quite that simple. Why not? It can't be that easy, can it? What's Mobley's counter-argument? Well, think of it this way. Imagine buying a car, but the brakes are faulty. You get in an accident, you wouldn't just blame yourself, would you? You'd hold the carmaker responsible too.
Starting point is 00:07:02 Yeah, for sure. Especially if they knew about the bad brakes and didn't do anything to fix them. Exactly. And that's part of what Mobley's arguing. He's saying Workday knows their algorithms can be biased. There are studies out there showing how AI can carry over those biases from society, discriminate based on race, gender, all sorts of things. So he's saying Workday is aware, or should be aware, that their software could lead to discrimination. They can't just play dumb.
Starting point is 00:07:28 That's right. And he says they haven't done enough to deal with those potential biases. So we've got this back and forth, right? Workday saying, we're just the software guys, it's up to the employers to use it fairly. Mobley's side is, no, you built the tool, you knew it could be biased,
Starting point is 00:07:43 you're responsible for what it does even if you didn't want to discriminate. You've got it. It's a really complex situation, not black and white at all. Legal stuff, ethical questions, the court's got to figure it all out. Okay, the case hinges on whether Workday's an agent of the employers and if they can be held responsible for any discrimination. But there's something else I'm wondering.
Starting point is 00:08:05 Mobley hasn't said which companies he thinks actually discriminated against him, right? That's true, and Workday's using that in their defense. Okay, you say you were discriminated against, but by who? Show us the proof that specific employers were biased against you because you're black, over 40, or have a disability. So Mobley's got a challenge.
Starting point is 00:08:23 He has to show the connection. Workday's algorithms plus what specific employers did led to those unfair rejections. You got it. It's not enough to say Workday's software might be biased generally. He needs to show how that bias played out for him across all those job applications. And proving discrimination? Never easy, but proving it's from algorithmic bias, even tougher.
Starting point is 00:08:46 Totally agree. So what are the hurdles he's facing in proving his case? Well, for starters, he needs data showing exactly how Workday's algorithms were used in his specific applications. What were the screening criteria? What factors were weighted more heavily? How did he score on those Workday assessments?
Starting point is 00:09:02 I bet getting that data from Workday is an uphill battle. Probably, yeah. Companies guard their algorithms closely, trade secrets, that whole thing. Mobley might have to fight tooth and nail for that information. Okay, let's say he gets the data. What does he do with it? To prove his case.
Starting point is 00:09:18 He has to show a pattern of rejections that can't be explained by anything other than his race, age, or disability. For instance, if he consistently aced the skills assessments but kept getting rejected for jobs needing those skills, that could be evidence of bias. It all comes down to showing a clear connection: Workday's algorithms, what the employers did, and the discrimination he faced. Exactly. It's a high bar to clear.
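To make that concrete, here is a rough sketch of the kind of tally Mobley's side would need to assemble from discovery: assessment scores paired with outcomes. Every number and outcome below is invented for illustration; none of it comes from the case record.

```python
# Hypothetical illustration of the "aced the assessments, still rejected" pattern.
# Every score and outcome here is invented; none of it comes from the case record.
applications = [
    # (skills assessment score out of 100, outcome)
    (92, "rejected"), (88, "rejected"), (95, "rejected"),
    (90, "rejected"), (86, "rejected"), (93, "rejected"),
]

high_scoring = [(score, outcome) for score, outcome in applications if score >= 85]
rejected = [pair for pair in high_scoring if pair[1] == "rejected"]

print(f"Applications scoring 85+: {len(high_scoring)}")
print(f"Rejection rate among them: {len(rejected) / len(high_scoring):.0%}")

# A near-100% rejection rate despite strong scores is the kind of mismatch
# between qualifications and outcomes that would then need to be tied back to
# Workday's screening criteria to support a disparate impact claim.
```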
Starting point is 00:09:43 But if he can do it, big implications, not just for him, but for the whole industry. Okay, let's play this out. He wins the case. What then? What happens to Workday? Well, there's the money, of course. If the court says they're liable for discrimination, they could have to pay Mobley damages, could be a lot of money. But the bigger thing, the legal precedent. What do you mean by legal precedent? If Workday loses this case, could open the floodgates for lawsuits against other HR tech companies, send a message. You can't just say, we're a neutral platform. You've got a responsibility to make sure your algorithms are fair, don't lead to discrimination. So a win for Mobley could change the whole game for the industry.
Starting point is 00:10:21 Definitely possible. Companies like Workday might have to be way more transparent about how their algorithms work, more proactive about checking for bias, and taking responsibility for the decisions their software is involved in. That's a huge shift. Shows how important this case really is. It's not just one guy and his job search. It's about the role of algorithms in all our lives. Can technology make inequality worse? Or can it challenge it? Exactly. And as AI gets more and more powerful, that debate's only gonna get more intense.
Starting point is 00:10:49 The court's given Mobley a chance to revise his complaint, provide more specific evidence to back up his claims. It's a crucial moment for him to bolster his case and make those connections we've been talking about. He's gotta show that concrete link between Workday's algorithms and the rejections he faced, right? He needs to prove Workday's actions, as that agent of the employers, directly led to him being rejected from those jobs.
Starting point is 00:11:16 He's got a lot to do but if he pulls it off the impact could be massive Absolutely a case worth keeping an eye on. This deep dive has been fascinating. I can't wait to see how it all plays out. Thanks for helping us understand all the intricacies. Happy to do it. Where law, tech, and ethics meet. Always a lot to think about. OK, we've covered a lot.
Starting point is 00:11:38 Derek Mobley's story, Workday's role, disparate impact, the legal arguments. We even touched on what this case could mean for the whole HR tech world and algorithms in general. But let's take a step back for a second. What does this case really mean? Good point. It raises big questions going beyond just the legal stuff. How does tech fit into society? Can it do good or can it do harm? Yeah, like what does fairness mean when algorithms are involved? And who's
Starting point is 00:12:03 responsible when these systems make decisions that have real consequences for people? Exactly the questions we need to be asking. We can't just embrace every new technology without thinking critically about how it might affect people. Right. It's not about saying no to technology. It's about using it responsibly, ethically, in a way that benefits everyone.
Starting point is 00:12:21 Couldn't agree more. And cases like this, as messy and complicated as they are, they help us have those conversations, figure out how to navigate this new world. Okay, we've looked at Mobley's claims, Workday's defense, what this case might mean for the whole industry. But as we wrap up, what are the key takeaways for our listeners?
Starting point is 00:12:40 Especially when it comes to understanding how algorithms might be affecting their own job searches. Be aware. That's the biggest thing. Know that algorithms are being used more and more in hiring, and that those algorithms can be biased even if they weren't meant to discriminate. So not paranoia, just being informed. Right. Know how these systems work, what data they're using, what blind spots they might have, and speak up. Demand transparency and accountability from the companies using this technology. That's a great point. As job seekers, we have a right to know how these decisions are
Starting point is 00:13:12 made. Absolutely. And the more we know about these systems, the better we can navigate them, make sure they're being used fairly and ethically. It's about being empowered, not just letting the algorithms decide for us. Exactly. Technology is a tool. Any tool can be used for good or bad. It's up to us to decide how it's used, to make sure it reflects our values, our goals. Well said. Really thought-provoking deep dive.
Starting point is 00:13:33 Thanks for sharing your insights. My pleasure. And to all of you listening, thanks for joining us on the deep dive. We'll be back soon with another deep dive into a topic that will get you thinking. So to prove his case, Mobley really needs to paint a clear picture for the court. What kind of evidence will they be looking for specifically?
Starting point is 00:13:50 They need to see a direct link, you know, from those Workday algorithms to the rejections he got. Like, did the system red-flag something about Mobley that caused his applications to be automatically tossed out? Did he get consistently lower scores on Workday's assessments, scores that don't match his actual qualifications? That's what the court needs to figure out.
Starting point is 00:14:10 And on top of that, his legal team has to counter Workday's argument, the whole, we're just a neutral platform thing. That it's the employers who should be held responsible for using the software fairly. Exactly. They have to make a strong case that Workday was more than just a software provider. That they were acting on behalf of those employers like an agent and therefore share the blame
Starting point is 00:14:33 That's the heart of the matter. And if Mobley wins, could send shockwaves through the whole HR tech world. What kind of impact are we talking about? Paint us a picture. Imagine a future where companies like Workday, they're required to check their algorithms for bias regularly, to be open about the criteria they're using to
Starting point is 00:15:02 screen candidates and to be held accountable for any unequal impact their software might be having. That's the kind of change this could bring. So this case, it could really change how these companies do business, how they even design their products. It's definitely within the realm of possibility, and it could give job seekers more power, too. They could start demanding more transparency and fairness from the companies they apply to. It sounds like this case could be a real turning point in this whole debate about making algorithms accountable in the hiring process. But I'm sure there are some people who worry about
Starting point is 00:15:33 too much regulation in this area. What are some of those concerns? Well, some folks argue that too much regulation could stifle innovation, you know, hold back progress in the HR tech sector. The worry is that if companies are constantly looking over their shoulder, afraid of lawsuits about algorithmic bias, they might be less likely to create new and innovative tools. So it's a delicate balance, protecting job seekers from being treated unfairly, but also not squashing progress in the field. Exactly. That's what makes this case so complicated and so important. We have to face these tough questions head on
Starting point is 00:16:07 and find solutions that encourage innovation while ensuring fairness and justice. It's not an easy task. So as we wrap up this deep dive, what's the one key takeaway you'd want our listeners to remember about the Mobley versus Workday case? The big takeaway. Don't just sit back and watch.
Starting point is 00:16:24 We can't be passive in this new age of algorithms. We need to stay informed, stay engaged, and be willing to ask the hard questions. How is this technology being used? How is it affecting our lives? Those are the questions we need to be asking. Well said. It's about understanding the role technology
Starting point is 00:16:41 plays in our world. Thanks again for walking us through this complicated and fascinating case. It's been my pleasure. Always enjoy these conversations. And to all our listeners, thanks for joining us on the deep dive. We'll catch you next time with another deep dive into a topic that'll get those brain cells firing. Until then, keep those questions coming. If you like the Employee Survival Guide, I'd really encourage you to leave a review. We try really hard to produce information to you that's informative, that's timely, that you can actually use and solve problems on your own and at your employment. So if you'd like to leave a review anywhere you listen to our podcast, please do so.
Starting point is 00:17:18 And leave five stars because anything less than five is really not as good, right? I'll keep it up. I'll keep the standards up, I'll keep the information flowing at you. If you'd like to send me an email and ask me a question, I'll actually review it and post it on there. You can send it to mcaru at capclaw.com, that's capclaw.com.
