Unfolding the Eightfold Lawsuit
- Chad Sowash
- 9 hours ago
- 22 min read
The black box is back… and apparently it’s here to “enrich hiring and society.” Because nothing says progress like a robot scraping your internet crumbs, guessing your personality, and deciding your career destiny like it’s fantasy football.
In this episode of HR’s Most Dangerous Podcast, Chad & Cheese sit down with attorney Rachel Dempsey to break down the Eightfold lawsuit where AI hiring, billion-point data claims, and “ethical AI” marketing collide with actual law and common sense.
We dig into enriched talent profiles, mystery match scores, and the idea that bots predicting your future is somehow good for humanity. Spoiler: when the algorithm plays therapist, recruiter, and fortune teller… things get weird fast.
Class action stakes. Employer risk. AI vendors sweating behind the pitch decks.
PODCAST TRANSCRIPTION
Joel Cheesman (00:31.054)
Two guys whose favorite class in high school was detention. Hey kids, it's the Chad and Cheese podcast. I'm your co-host, Joel Cheesman. Joined as always, riding shotgun, is Chad Sowash as we welcome Rachel Dempsey, Associate Director at Towards Justice, a nonprofit legal organization that defends workers, and who is knee-deep in the Eightfold lawsuit. Rachel, welcome to HR's Most Dangerous Podcast.
Chad Sowash (00:36.75)
True, true.
Chad Sowash (00:45.964)
Esquire.
Rachel Dempsey (01:00.674)
Thank you. It's great to be here.
Joel Cheesman (01:02.926)
Good to have you, good to have you.
Chad Sowash (01:03.074)
Rachel, how do you become an Esquire? I don't know anything about this.
Joel Cheesman (01:07.042)
Go to law school and hurt.
Rachel Dempsey (01:08.423)
Go to law school and pass the bar, I think. You might just have to go to law school. I'm not sure.
Chad Sowash (01:11.406)
Is that it?
Joel Cheesman (01:12.097)
And
In her case, Yale Law School, which means this is rock bottom for her career; it can only go up from here. Thanks for taking a bet on us, Rachel. We appreciate it.
Rachel Dempsey (01:23.603)
This is more interesting than any law school class I ever went to.
Joel Cheesman (01:30.745)
Oh, oh, love that. Love that. So our listeners likely won't know you, although they will be probably hearing more of you as the eightfold case unfolds, but give us a little taste of who is Rachel.
Rachel Dempsey (01:43.539)
So I'm an attorney at Towards Justice. I'm the associate director here. We're a nonprofit law firm based in Denver, Colorado, but we do national work. We represent workers and consumers in litigation and in other advocacy that builds worker power and advances economic justice in Colorado and across the country. So we do both. Our original focus, I think, was on workers, and we've expanded a little bit into more consumer and competition law. And this case is really at the intersection of a lot of those legal spaces.
Chad Sowash (02:17.9)
Yeah, it definitely is. What about you though, Rachel? Taking a little time on the bunny hill there in Colorado. What's going on? What's going on with you? Tell us a little bit about you.
Rachel Dempsey (02:26.867)
I have a three-year-old who went skiing for the first time this week and took her very first ski lesson. I think she went down the bunny hill one time in three hours, and her report was that the hot chocolate was the best part. It was a great success.
Chad Sowash (02:38.222)
Sweet.
Joel Cheesman (02:44.748)
And no ACL sprains? The ACLs are fine? I know a lot of that's going down with the Olympics this year. Okay, good.
Rachel Dempsey (02:51.089)
All good. Yeah, as physically unbruised as a three-year-old can be.
Joel Cheesman (02:58.67)
Physically unbruised, yeah. Mentally...
Chad Sowash (02:59.874)
Yeah, mentally is something entirely different. Yes. That's what the hot chocolate's for. So if you can, give us a little bit of the background of Towards Justice, and also, how did you stumble upon this little company called Eightfold?
Rachel Dempsey (03:02.163)
We'll learn that in a few years.
Rachel Dempsey (03:25.789)
Sure. So like I said, at Towards Justice we take a broad view of worker power and economic justice, and we sort of started learning about these AI hiring companies. It's something I think that everybody who works with workers, who represents workers, has heard about and sort of understands is a really big issue for the workers that we talk to. They're facing, I think, a lot of difficulties in applying for jobs that they haven't necessarily seen before. So we started looking into these products about a year ago, and we've been working very closely with one of our senior fellows at Towards Justice, Seth Frotman, who until a year ago was the general counsel at the Consumer Financial Protection Bureau.
Chad Sowash (04:21.414)
hello.
Rachel Dempsey (04:22.451)
Yeah, and they had done some looking into these products and actually issued a bulletin about two years ago, suggesting that there could be FCRA liability, Fair Credit Reporting Act liability, for some of these products that track workers and create reports on them. So this case came out of that project. And I think Eightfold jumped out at us because a lot of huge companies use it. And then we came across a lot of workers who had had experiences with applying to jobs, sorry, with not being able to get jobs with companies that use Eightfold, and not understanding why.
Chad Sowash (05:07.768)
So for all of us dummies out there who don't know what FICRA is, or FCRA, which is what I've been calling it, can you give us a little background on that?
Rachel Dempsey (05:14.365)
You
Rachel Dempsey (05:19.955)
Sure, lawyers love a nickname for a law. The Fair Credit Reporting Act is a law passed in the 1970s that governs what information, and how, companies can use in creating credit reports that control people's access to credit, insurance, and jobs. So FCRA defines a credit report as information bearing on a consumer's credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living. The reason why I read through all of that legal language is because I think that, until I got involved in this, I thought that a credit report was just the thing that TransUnion or Equifax or Experian made to determine your access to a credit card or to a loan or whatever.
Joel Cheesman (06:10.562)
Mm-hmm.
Rachel Dempsey (06:18.099)
And it's actually a lot broader than that. And the legislative history makes that clear. Legislative history is another sort of lawyer word, which means what the lawmakers who passed the law were talking about, and how they explained the need for the law when they passed it. So the Fair Credit Reporting Act was passed in 1970, at the very beginning of credit reports as we know them. And at the time that it was passed, credit reports could include stuff like, you know, what your neighbor said about you, what your marital status was, whether you had a reputation as being, like, a drunk among the people that you hung out with. So what FCRA was intended to do was to provide some transparency into this process, right? Like, people were being denied credit because their neighbors, you know, said they threw loud parties or whatever.
Joel Cheesman (07:17.656)
Mm-hmm.
Joel Cheesman (07:24.398)
And not only that, just so I understand: that information was there, but I wouldn't have access to it. So it was sort of the... we'll say "black box" more than a few times in this interview. It's there, but I can't see it, which I think comes to the crux of some of this. So I think you've done a great job of setting the table. Talk about the players in this. Talk about the plaintiffs. You're working with another law firm. Let's get the players on the field.
Rachel Dempsey (07:31.289)
Exactly. Exactly.
Rachel Dempsey (07:51.367)
Yeah, totally. So Towards Justice is one of the law firms on the case. The other law firm on the case is a private plaintiffs' firm called Outten & Golden. They typically do employment cases, and this is sort of... employment and consumer law, I think, have a lot of overlap, particularly these days. So then the named plaintiffs in this case, Aaron Kistler...
Chad Sowash (08:12.984)
Mm-hmm.
Rachel Dempsey (08:19.251)
and Sruti Bamek are both just workers who have applied to a lot of jobs using Eightfold platforms and have really struggled to find a job. They both have a lot of experience. They're both relatively older workers, and they came to us having applied to just hundreds of jobs, and been denied for those jobs over and over again, and not understanding why.
Chad Sowash (08:49.122)
This sounds, I mean, much more layered than the Mobley versus Workday case, where, literally, that to me just seems like age discrimination. This seems much deeper, from the standpoint of, as Joel had said, collecting all that data, it being in a black box, and it not being validated or verified by the actual candidate, or user, or consumer, or whatever the hell you want to call them, right?
Rachel Dempsey (08:58.269)
Yeah.
Chad Sowash (09:16.75)
But there's literally no validation or verification. I'm sure you've seen the Eightfold patent. There's a lot of inference that happens. So that's gotta be... I mean, the patent, you take a look at some of their marketing materials that talk about enrichment data and actually taking from LinkedIn and GitHub
Rachel Dempsey (09:27.535)
Exactly. Yeah.
Rachel Dempsey (09:39.859)
Exactly.
Chad Sowash (09:43.68)
and all these different areas, and they're enriching a person's profile. And it's like, how do you know that that's valid data? So if you could dig into that a little bit, because this is so much more layered than the Mobley case.
Rachel Dempsey (09:55.239)
Yeah, and I think the Mobley case is just addressing a different issue. It's also a serious issue. I think there are reasons to think that this AI hiring software facilitates and recreates human biases, and so there are potentially, I think, discrimination cases that could come from this. This is different. This case applies to all users and deals with the right both to privacy and to transparency, to understanding how your data is being used. And again, this is based on Eightfold's public information. Eventually, I think, we will hopefully get insight into private information and be able to test that in discovery. But right now we just have access to its public information, and there's a lot of public information they put out there about their product. And based on that public information, I think there are two potential issues, both of which would
Chad Sowash (10:47.022)
Yes.
Rachel Dempsey (10:51.431)
violate the Fair Credit Reporting Act. One is that they do, in their patents and various marketing materials, talk about these enriched talent profiles. And the enriched talent profiles suggest that they find data from elsewhere on the internet about a specific candidate and add that data to the candidate's profile. I think they've denied that. Again, discovery will let us explore more into that, and our allegations are based on the public statements that they've made. The other issue, though, even setting aside whether for any individual candidate they, like, go to your LinkedIn and find, you know, blogs that you've written, whatever their marketing material suggests, even if they don't do that: for every applicant, they take the information that the applicant has given them, they take the information that the employer has given them, and then they
Chad Sowash (11:19.97)
Mm-hmm.
Rachel Dempsey (11:46.163)
break down that information and analyze it, and by that analysis, they make inferences about, you know, what jobs are like each other, what job titles are like each other, what sort of skills an employee is likely to be able to develop, what their career trajectory is, what jobs they're likely to have in the future. And all of those inferences are provided to the employer, and, you know, the employer sort of uses them to decide
Chad Sowash (11:59.395)
Mm-hmm.
Rachel Dempsey (12:15.399)
whether or not someone is eligible for a job. And those inferences are unverifiable, right? Some of them are predictions of the future. And I think that that in itself is a credit report under the definition of a credit report. And it feels, I think, very invasive to people to have this machine deciding what you want to do with your career. I mean, maybe it's wrong.
Joel Cheesman (12:42.254)
In case it ever comes out, Chad, that OnlyFans account is not me. It is not me. Okay. Rachel, Eightfold is a company with a two-billion-plus-dollar valuation. They've raised about 400 million dollars. I assume that they have something to say about this. What has been their response, other than just, it's not true?
Chad Sowash (12:42.491)
Right? Yeah.
Chad Sowash (12:50.83)
Check the credit card on that one.
Rachel Dempsey (13:04.307)
So their response has been: we are an ethical AI company, and what we use is just candidate data and employer data. Now, I am not the judge of who's ethical or not. That's not my job. I'm a lawyer. My job is to decide whether or not it looks like they're complying with the law. So setting aside whether they're ethical, which is not something that I'm opining on, again, the allegations in the complaint come from their marketing material.
Joel Cheesman (13:18.307)
Mm-hmm.
Chad Sowash (13:19.403)
Exactly.
Rachel Dempsey (13:33.989)
They're all cited. You can follow the citations and see the patents; they're all in the complaint. And regardless of whether, for any individual candidate, the individualized information comes only from, you know, the resume and the employer, again, these inferences, these calculations, this match score is being compared to... I think their big selling point is, you know, we compare people to one billion data points, right? Our decisions are informed by one billion data points. I mean, that is a lot of external data that they're using to draw these inferences, to create these reports that say things like: this person has these skills today, but they're likely to be able to develop these skills in the future. And based on the types of companies they've been at in the past, they're likely to go to this type of company in the future. So I think that, regardless of what data they're ingesting about any individual,
Chad Sowash (14:05.986)
Yeah. Yeah.
Rachel Dempsey (14:32.509)
what goes through their system and then comes out is, I think, a much more expansive report than that. This is not just them reciting what's on someone's resume.
Chad Sowash (14:41.902)
So for me, obviously, Joel and I have been in this space for a long time. We hear a lot of bullshit marketing terms, and one of the biggest bullshit marketing terms we hear right now is "ethical AI." There are literally no hard edges around what it actually means, but yet everybody's using it. And I want to dig into this patent real quick, because I have it up on my computer. So, as you were talking about: talent insights. Quality of education. I don't know what that means. Does that mean that I get a better quality of education from the different school that I go to? I mean, what does that mean? Career growth progression, skills depth, industry expertise, and then you get into the personality insights. This is kind of scary, okay? Talking about whether I'm a team player, an introvert, an extrovert, whether I played chess or, you know, was a high-endurance athlete. And then there's also the predictions piece that's associated with this. This is all set out in the patent. That is the framework on which Eightfold was built. And yet they're denying all of this.
Rachel Dempsey (15:55.923)
I think that's exactly right. I mean, there's a lot of information that they have said, and we're just repeating that information. Again, everything in the complaint is cited. We're just repeating it. And I do think it feels very invasive to people, and it's, I think, pretty clearly covered by these existing laws. I mean, again, these laws are from the 1970s, but even at the time that they were passed, their drafters were concerned with sort
Chad Sowash (16:03.874)
Gotcha. Fair.
Rachel Dempsey (16:24.979)
of computerized data collection very specifically. The line that I love is from one of the drafters, who said: the individual is in great danger of having his life and character reduced to impersonal blips and key punch holes in a stolid and unthinking machine. Now, obviously we've moved past key punch holes, but I think the overall concern is very much coming to bear in these products.
Joel Cheesman (16:49.806)
Rachel, what is a class action lawsuit, for those, again, that don't know? Describe what that means, how that could unfold, and then maybe what are some historical examples of how these cases resolve.
Chad Sowash (16:50.926)
Agreed.
Rachel Dempsey (17:05.565)
Sure. So a class action is a structure for a case where a single issue or a single legal concern affects a large group of people all at the same time. The idea is that every single person who's submitted an application, who's been affected by Eightfold, could file their own lawsuit. That would probably be, like, millions of lawsuits, right? That just doesn't make sense from an efficiency perspective. It would overwhelm the courts. None of it would work. You can't afford to hire your own lawyer for that. So a class action aggregates all of those claims together using the examples of a couple of named plaintiffs, a couple of people that have been affected in the same way as everybody else. So the way that those cases proceed is, generally, you go through what's called a class certification procedure, where the court decides if the lawyers are correct that it's more efficient for the case to move forward all at once. That's often one of the big early fights: whether you get class certification or not. Once a class has been certified, everybody's claims are in it together; people in the class get notice of their rights and have the opportunity to stay in the class or to leave it. And so all of that moves forward together. Generally, people don't have to affirmatively opt into the class. If a class is certified, they're a member of it. So people who are affected right now don't have to do anything in order to stay part of the lawsuit. And then eventually either it all goes to trial together or there's a settlement. And then obviously the alternative is that class certification is denied, and the cases go forward on their individual claims. We're optimistic, I think, that this is an appropriate case for class treatment.
Chad Sowash (18:55.65)
So it seems like, at least I believe I read, that there are over a billion profiles within the actual Eightfold system. And obviously we're just going to break it down to the US. So let's say that there are a hundred million, or let's say fifty million, people who've been impacted by this. How are they notified? Are they notified? And how do you get into the discovery process at this point?
Rachel Dempsey (19:28.156)
Yeah, so.
Joel Cheesman (19:28.398)
The good news is our listenership is about 50 million. So that'll cover most of the people that know about this.
Chad Sowash (19:31.288)
hahahaha
Rachel Dempsey (19:35.645)
So generally, when you get a class certified, notice goes out to everybody. I think that we've all gotten emails in our inbox that say, this is a class notice, whatever. Often they go to spam. Yeah, postcards, whatever. I actually check my spam pretty regularly, as a lawyer who does class actions, to be like, am I missing a class action here? Is there something I need to opt into, or submit a claim form for to get money? Right? I'm not going to leave free money on the table.
Joel Cheesman (19:36.098)
You're welcome.
Joel Cheesman (19:48.034)
Postcards.
Joel Cheesman (20:04.259)
All those sales leads in the spam inbox for lawyers. I had no idea. Never thought of that.
Rachel Dempsey (20:05.427)
But anyway, so if a class gets certified, I think generally class notice goes out. And then if there's a settlement, if there's, like, money, then class notice definitely goes out. Again, it's usually some combination of email and mail. And then in a case that's this big, there may also just be, like, press coverage where you learn about it. There are sort of, like, posted ways for a class case...
Joel Cheesman (20:29.454)
Mm-hmm.
Rachel Dempsey (20:35.271)
to get notified. Generally, though, I mean, one of the advantages of a class action is that you usually don't individually have to do anything to participate, potentially unless there's money on the table.
Chad Sowash (20:48.46)
So
Joel Cheesman (20:48.59)
Is there a website? That's usually something, isn't it? Like, go to this website? Yeah, there will be, okay. I think Chad owns Eightfold Sucks if you need that one in the future. Just a heads up.
Rachel Dempsey (20:51.559)
There will be. Yeah, there will be. There's not right now. Yeah, but there's generally a class website.
Chad Sowash (20:59.48)
Hahaha!
Rachel Dempsey (21:01.203)
The defendant usually has to agree and I do not think they're going to agree to that one.
Chad Sowash (21:03.843)
So.
Joel Cheesman (21:06.35)
They probably won't agree on that one. Sorry. Sorry, Chad. I know. I know.
Chad Sowash (21:06.862)
That doesn't... that's no fun. That's no fun at all. So going back to the class action side of the house. Back in 2017, iTalent was in a class action, but they went ahead and settled, so they didn't have to go to discovery. And to be quite frank, our industry and the world needs discovery, because we have no clue at this point, because of all the black boxes that are out there, not just Eightfold.
Rachel Dempsey (21:09.272)
you
Chad Sowash (21:33.103)
We've got so many black boxes that are out there. There has to be a precedent set. And there's not really one, or at least I don't know of one, especially in the hiring space. How do we, how do you... because, no pressure, this is on you, Rachel. How do we make sure this gets to discovery, so we can actually see the data and see what's happening? So that, number one, companies that are currently doing it can stop, and number two, ones that want to get into this space start to understand: shit, you know, maybe we need to do some redesign.
Rachel Dempsey (22:05.159)
Yes, excuse me. So as lawyers, and as lawyers representing a class, it's our ethical obligation to achieve whatever result we think is in the best interest of the class. Sometimes that does mean early settlement. If, for example (and I'm not saying that this is what happened in that case, or in any specific case), the defendant comes and says, you know, we messed up, we're going to change our practices, let's get this resolved before there's a lot of expensive litigation, then maybe you'll reach an early settlement. I mean, the other reason why you might reach an early settlement is that you realize your case isn't as strong as you thought. So there are situations in which you reach a pre-discovery settlement, and that is the best outcome for your class. As lawyers at Towards Justice, we do litigation in the public interest. We are trying to affect policy and bring attention to issues that we think need more light, and I think this is one of them. But we ultimately have the responsibility to the class to get the best possible result for them. And litigation is expensive, right? And discovery is expensive. So there are, I think, incentives on all sides to settle early, before all of the money that could potentially go to the class ends up going towards the legal battle instead. So I can't know how this case is going to shake out, and again, we have obligations that aren't just purely "get to discovery." That said, I do think discovery is a really important way to shine light on a lot of these issues, and it's one way that litigation can help advance the public interest.
Joel Cheesman (23:38.136)
Mm-hmm.
Chad Sowash (23:57.259)
yeah.
Joel Cheesman (24:01.262)
Mm-hmm. Rachel, this was filed in California. Explain to me again, as a layman: it'll be tried in California if it goes to court, and then can other states use it as precedent? How does the federal... because it's a federal law. Explain that to me. Help me make sense of that.
Rachel Dempsey (24:21.107)
Yeah, so the case was filed in California, and it's under both federal and California law. I think that the class is defined as everyone who was affected by Eightfold nationwide. The California claims only apply to people in California, because that's how state law works. FCRA applies to people nationwide. Honestly, I can't remember off the top of my head exactly how we defined the class. I think it is a nationwide class, so it would affect people nationwide. But there are also other state laws that provide similar protections to workers in other states. And, you know, I think that there's potential for follow-on cases in other states as well.
Joel Cheesman (25:15.892)
And historically, I mean, this is a company that serves global clients. Could this be a springboard for, say, the EU or other international governments to look at this and say, we're going to do the same here, because that's an issue here? I mean, that's a real existential threat to any AI company, I would think.
Chad Sowash (25:36.971)
yeah.
Rachel Dempsey (25:38.035)
Yeah, so I'm not an EU lawyer, right? I am a US Esquire. So I don't... yeah, yeah, I mean, absolutely. And I do think that the EU does have some stronger privacy protections than the US in a lot of ways. It's my understanding they are looking into these AI companies and these AI hiring platforms. And so I do think that is an issue that all of these companies need to be aware of.
Joel Cheesman (25:43.212)
We won't hold your feet to the fire. I'm asking, historically, if this happens, I assume.
Chad Sowash (25:47.235)
He
Joel Cheesman (25:52.526)
They do.
Chad Sowash (26:02.742)
And I'm sure they are paying attention to the news and seeing the headlines all over the place.
Joel Cheesman (26:08.974)
Are you saying they might have added motivation to stick it to our AI companies, Chad? Rachel, we have a lot of employers who listen to this show, and many of them are using, if it's not Eightfold, some other AI black box that can magically give you the perfect candidate for every job. What kind of risk... you mentioned Microsoft, I think, as an Eightfold user and part of this case, correct me if I'm wrong, but, like,
Chad Sowash (26:13.038)
Could be, could be.
Joel Cheesman (26:35.948)
what about employers freaking out about this stuff? What would you tell them in light of this case?
Rachel Dempsey (26:41.629)
So this case, the case that we have brought, is just against Eightfold. It's not against any of the employers. That said, I think that there is potentially liability on the part of the employers. One of the things about the Fair Credit Reporting Act is that it imposes potential liability both on the people that use the credit reports and on the people that create the credit reports. So I do think it's important to understand what these rights are.
Chad Sowash (26:54.968)
Mm-hmm.
Rachel Dempsey (27:11.481)
You know, their disclosure and correction rights. So I think employers should be aware of what those rights are, and just make sure that they're complying with the law and providing users, providing applicants, with all of the rights that they're entitled to.
Chad Sowash (27:25.614)
Well, it's interesting that you say that, and it's a great question, Joel, because on the Mobley versus Workday side of the house, they had asked for lists of Workday clients that were actually using, quote unquote, AI aspects of the product. I mean, Joel and I have talked about this for years. Many laws actually hold the employer solely responsible. We're starting to move to a shared responsibility where these black boxes are being created, although that doesn't let the employer off the hook, because they should be doing the due diligence, the constant auditing, and looking at analytics, and also at their hiring cohorts, to really understand what's happening in the black box. I find this really, really interesting. But there are so many other things that are happening out there, and I'm sure you already have a full plate as it is. You know, there are companies that are asking, and in some cases actually forcing, employers to provide hiring signals back to train LLMs, right? Large language models. And for me, that's way too far.
Rachel Dempsey (28:41.747)
Yeah.
Chad Sowash (28:48.534)
Right? To be able to actually take these hiring signals... I mean, first and foremost, the employer shouldn't be giving them to these companies. But as a candidate, that data is attached to me across all of these different systems. If these hiring systems are training large language models, I have no clue that this is happening, right? This is, again, the whole Fair Credit Reporting thing: I have no clue that this is happening. What I mean...
Rachel Dempsey (29:12.243)
Absolutely.
Chad Sowash (29:17.002)
It sounds like there could be prospective impact in many other areas, like large language model training and such.
Rachel Dempsey (29:25.587)
I think that's absolutely right. I think that using this data to train large language models... and again, that's one of Eightfold's big selling points, right? That they train their large language models on one billion skills, one billion people, et cetera. I think that there is potentially significant liability there, and that employers, as well as the reporting companies themselves, really need to be aware of that and think through it very seriously.
Joel Cheesman (29:53.804)
Rachel, we have a lot of people who use LinkedIn. I want your opinion on LinkedIn as a model, where someone voluntarily gives their profile, creates an account. And then, ideally, I guess, LinkedIn could do some sort of ranking of people in their system. Do you view a LinkedIn as susceptible to cases like this? Or are they sort of immune, because you're
Rachel Dempsey (30:11.784)
Yeah.
Joel Cheesman (30:20.3)
voluntarily going onto their system, you're agreeing to their terms of service, which, I assume, say we can do whatever the hell we want in terms of algorithms and whatnot. So do you think LinkedIn is immune to these kinds of cases? Could LinkedIn be legally bound to show people, like a credit report, what they are showing employers, or what employers can access in their algorithm? Where does LinkedIn, or maybe even an Indeed, which has a really big database,
Rachel Dempsey (30:27.218)
Yeah.
Joel Cheesman (30:49.986)
where do they play into these cases, if at all?
Rachel Dempsey (30:53.395)
So I don't wanna... this is a very lawyerly answer. I don't wanna make any comments on LinkedIn's specific potential liability. But generally speaking, I mean, we all agree to so many terms of service. We are all bound to a sort of infinite word count of these fine-print terms for every single thing we do online. That said, one of the things that's really powerful about the Fair Credit Reporting Act is that it requires not only consent, like, you know,
Joel Cheesman (30:57.469)
Hahaha
Chad Sowash (31:08.366)
It's
Rachel Dempsey (31:22.195)
the sort of click-through consent that companies get from you all the time where you have no idea what's in there; it requires standalone, clear, and conspicuous disclosures. So the sort of click-wrap agreement that we're all agreeing to all the time, via those little pop-up terms-of-service boxes, is not good enough. FCRA requires its disclosures to be made separately from any other disclosures. They can't be buried in a long contract; they need to be clear and conspicuous. They need to be in a format that people will read and understand as disclosures about this specific thing. So I think that's one of the things that's powerful and helpful about FCRA: you can't just bury the disclosures in the fine-print contracts, or behind the little hyperlink that says terms and conditions, or any of the various ways that companies get you to agree to these contract terms.
Joel Cheesman (32:08.91)
Yeah.
Joel Cheesman (32:19.938)
Those bastards.
Chad Sowash (32:20.566)
Rachel, you've given us so much to think about, so much to think about. This is Rachel Dempsey, Esquire, associate director at Towards Justice. Rachel, if there are, I don't know, prospective class action clients who want to reach out to you, who want to find out more on what they should do because they feel like they've been impacted, how would they actually reach out to you and start that process?
Rachel Dempsey (32:51.207)
Yeah, so you can find my contact information on the Towards Justice website. That's Towards with an S, which is something that I have to explain at least once a week. But yeah, my contact information is available there. Our intake form is available there. And we would love to hear from people.
Joel Cheesman (33:11.32)
Chad, I have a sneaking suspicion that we will hear from Rachel again in the future. That's another one in the can everybody. We out.
Chad Sowash (33:13.602)
Hmm?
Chad Sowash (33:19.449)
We out.





