Does LinkedIn Hate Women? w/ Martyn Redstone
- Chad Sowash
- 9 hours ago
- 20 min read

LinkedIn says its algorithm is neutral. Martyn Redstone says… not so fast. In this episode, the boys dig into a viral experiment that exposed how “proxy bias” quietly buries women’s voices while amplifying the same old ones. No overt discrimination, nope, just algorithms doing damage between the lines.
If you’re in HR, tech, or living on LinkedIn, this one might change how you see your feed forever.
PODCAST TRANSCRIPTION
Joel Cheesman (00:30.847)
It's the podcast your mom warned you about, also known as the Chad and Cheese podcast. What's up, everybody? I'm your cohost, Joel Cheesman, joined as always, Chad Sowash is riding shotgun, as we welcome Martyn Redstone to the show. He's an expert in AI governance within HR, an entrepreneur, and the writer of HAIR, which we'll get to in a second. But he's also the author of a post on LinkedIn entitled Is LinkedIn's Algorithm Biased Against Women? A Legal and
Chad (00:39.118)
What?
Joel Cheesman (00:58.313)
Technical Analysis. Martyn, welcome to HR's Most Dangerous Podcast.
Martyn Redstone (01:03.426)
Thank you very much, Chaps. Great to be here.
Joel Cheesman (01:06.175)
Alright, let's get to the HAIR real quick. You're follicly challenged, so I guess HAIR doesn't mean the golden locks of some of us. So, HAIR: explain.
Chad (01:09.156)
or lack of.
Martyn Redstone (01:11.534)
and I'm not here.
Martyn Redstone (01:19.626)
It's definitely ironic, that's for sure. Ultimately, it's a play on words: it's AI in HR, which literally spells HAIR. But the funny story is, when I launched it a couple of years ago (it started off as a LinkedIn newsletter), I had a message from somebody to say, I don't know how to tell you this, but you've called your latest newsletter HAIR and you don't have any.
Joel Cheesman (01:42.783)
Mm-hmm.
Martyn Redstone (01:44.41)
And I was just like, wow, thanks. I really didn't think that one through properly, did I? But no, it's a play on words.
Joel Cheesman (01:51.167)
AI in HR, that's clever. That's clever. Very, very British. Also ironic that we're three white guys talking about whether LinkedIn hates women. But let's learn a little bit about you before we get into the meat of the article. Tell us about Martyn.
Martyn Redstone (02:06.719)
Yeah, the quick story is I've been in recruitment for 20 years; literally, January will see 20 years. The last 10 years have been helping organizations with innovation in TA technology and processes. Seven years ago, I started up my own AI automation agency. And basically, for the last 12 to 18 months, I've been really focused on AI governance, making sure that people are doing things
sensibly, responsibly, ethically, ultimately making sure they just don't do stupid shit with AI.
Chad (02:41.616)
You're a whistleblower. I love that. We need those. We need a whistleblower.
Martyn Redstone (02:44.493)
I'm not a whistleblower. No.
Joel Cheesman (02:44.505)
Yeah, you're not busy at all, right? You got nothing going on if you're watching that space.
Chad (02:50.286)
So let's talk about how you got involved because Jane Evans and Cindy Gallop pulled together. They saw a problem with the LinkedIn algorithm. So they wanted to test it. So talk a little bit about that test and then how you actually got involved. And by the way, Cindy Gallop, I don't know if you know this or not, friend of the show.
Martyn Redstone (02:56.715)
Yes.
Martyn Redstone (03:13.195)
Yeah, yeah, I saw her earlier on this week and she mentioned that she'd been on the show so that's great to hear as well.
Joel Cheesman (03:13.279)
Love us some Cindy.
Chad (03:21.955)
Love her.
Joel Cheesman (03:23.264)
I still have the scars from where she smacked me around on the episode that we had her on, but that's another story.
Chad (03:26.146)
Hahaha
Martyn Redstone (03:27.905)
Now that's a video I really want to see. Send me the video privately afterwards; that sounds great. So yes, earlier this year, Cindy Gallop and Jane Evans, who between them have got almost 200,000 followers, saw their reach just fall off a cliff. And it wasn't just low engagement; substantive posts about female founders, women, feminism,
Chad (03:29.834)
It's not like you didn't deserve it. Come on.
Joel Cheesman (03:31.903)
True, true. Sorry. Go ahead, Martyn.
Martyn Redstone (03:56.047)
they felt that they were being specifically buried. And it felt to them like a bit of a shadow ban. So what they did was, along with two of their male peers, they posted exactly the same content at exactly the same time. So, two women, two men. And the men got between 50 and 140% reach across their follower numbers. And their follower numbers were four-digit numbers, so very low.
Chad (04:22.968)
Quite low. Yes. And that was like a combined number. Yeah. I don't even think it was 10,000 combined, was it?
Martyn Redstone (04:25.838)
Yes, combined. It wasn't. No, it wasn't 10,000 combined, compared to Jane and Cindy. Jane got 8% reach across her follower numbers. So that was the big experiment that kicked it all off, really: what the hell is going on here? So then we saw women like Megan Cornish change their profile settings to a male profile. They put mustaches on their photos.
They started to run it in a more male way. It's great; there's loads of pictures out there of them. The result was immediate spikes in visibility. So Dorothy Dalton got involved in this as well. We all know Dorothy through the HR and recruitment world. And she said to me, because she knows I'm weirdly into all of this AI bias and governance stuff, have you looked into this at all?
And if the answer is no, do you just want to look into it? So I did. I analyzed everything: all of the research, all of the experiments that had been happening. I analyzed the paper that LinkedIn released about the algorithm. I've got some juicy stories on that later as well. And I found a bit of invisible code, which really, for me, proved that there is plausibility behind proxy bias in the algorithm.
And that's kind of how I got involved. And since then, it's all gone a little bit crazy.
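The reach metric the experiment relied on can be sketched in a few lines. This is a toy illustration, not LinkedIn's actual data model; the figures are invented to mirror the percentages quoted above (50 to 140% for the men, 8% for Jane):

```python
# Reach as the experiment measured it: impressions on a post divided by
# the author's follower count, expressed as a percentage.
# All names and numbers are illustrative, not real LinkedIn data.

def reach_percent(impressions: int, followers: int) -> float:
    """Impressions as a percentage of the author's follower count."""
    if followers == 0:
        return 0.0
    return 100.0 * impressions / followers

# Invented figures in the spirit of the experiment: identical posts,
# identical timing, very different reach relative to follower counts.
male_peer = reach_percent(impressions=1400, followers=1000)        # 140.0
female_author = reach_percent(impressions=8000, followers=100000)  # 8.0
```

The point of the ratio is that raw impressions alone would hide the disparity; a large account can rack up more impressions while reaching a far smaller share of its own followers.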
Chad (05:55.68)
I don't think all the listeners understand what proxy bias is, so if you can, give a little insight into what proxy bias is.
Martyn Redstone (06:03.212)
Yeah, absolutely. So direct bias would be LinkedIn putting in the code: if user equals female, then suppress their content. That would be just direct discrimination. When we look at proxy bias, it's some of the markers that people might use to understand somebody to be female, or to be Black, or to be whatever characteristic. Yeah, exactly. I don't know what that is in British, but yeah. Rounders, there we go.
Joel Cheesman (06:14.58)
Mm-hmm.
Chad (06:25.262)
playing softball.
Chad (06:29.262)
you
Joel Cheesman (06:30.159)
Rounders. Rounders.
Chad (06:31.566)
Yeah.
Martyn Redstone (06:33.036)
Yeah, absolutely. So if we think about some of the things that are in there, in the algorithm, there's something called verbalization. And what that means is that they're looking at the tone that you write in. So in the new version of the algorithm, they stopped looking at your IDs and they started converting your profile into a text story that feeds a large language model. And this introduces something that we call semantic bias. And what that means is that the algorithm loves, and we're now hearing this term, agentic language.
You know, male-coded language, words like "dominated." And so it kind of penalises more female-coded, communal language, things like "collaborated." So ultimately, it's going to start penalising women through the way that they use language compared to the way that men use language. And that's what I mean by proxy bias. It's not direct discrimination, but it's picking up on markers that can create discrimination through that.
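A minimal sketch of the semantic proxy bias described here, assuming a hypothetical engagement scorer. The word lists and weights are invented; the point is that gender never appears as a variable, yet wording acts as its proxy:

```python
# Toy engagement scorer: no gender feature anywhere, but rewarding
# "agentic" wording and discounting "communal" wording lets language
# act as a proxy. Word lists and weights are invented for illustration.

AGENTIC = {"dominated", "led", "drove", "spearheaded"}
COMMUNAL = {"collaborated", "supported", "shared", "helped"}

def engagement_score(post: str) -> float:
    words = {w.strip(".,!?").lower() for w in post.split()}
    return 1.0 + 0.5 * len(words & AGENTIC) - 0.2 * len(words & COMMUNAL)

agentic_post = "I dominated the quarter and led the team."
communal_post = "We collaborated across teams and supported each other."
# The agentic post scores higher than the communal one, even though
# neither author's gender is ever seen by the scorer.
```

That is the shape of the problem: the discriminating signal lives in the features the system is allowed to see, not in the protected attribute itself.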
Chad (07:31.332)
It's like when Amazon's algorithm went haywire: it looked out into the market and saw, whoa, all these developers are male. So let's just get rid of the females from the entire funnel, because, you know, that will up our chances of actually finding more developers if we're just looking for males. It's kind of like the proxy bias side of the house, to some extent. And then also being able to kill, you know,
the types of universities that you might have gone to, which might be all-female universities, the types of sports that you play, those types of things.
Martyn Redstone (08:01.61)
Yeah, the interesting proxy markers in the Amazon case were things like which college you went to, which university you went to, what sports you play. If it said "female soccer team," straight away, you know. So yeah, that's where the proxy bias comes in. It's not direct discrimination, but it leads to it.
Chad (08:20.728)
You notice he said soccer. He said soccer Joel.
Joel Cheesman (08:20.959)
He did say soccer. He knows his audience. That's good. Fortunately, Amazon always does the right thing, and they canceled the algorithm. Just keeping up with their winning record of batting a thousand. There's another baseball reference for you. Curious, going back to the study: when you say they posted the same thing at the same time, that doesn't mean,
Martyn Redstone (08:23.68)
International audience, international audience. I'm not a fan of either soccer or football, so yeah.
Chad (08:31.083)
Isn't care.
Chad (08:36.042)
always does the right thing.
Martyn Redstone (08:38.871)
Yeah, I love it.
Joel Cheesman (08:49.437)
At two o'clock on Sunday, the 10th, they all posted at the same time. Was it a Sunday, and then the next Sunday? Then it seems like it would be really hard to have an apples-to-apples comparison. So I'm just wondering sort of what exactly happened. When you say they posted the same thing at the same time, what does that mean?
Martyn Redstone (09:10.198)
Yeah, so they all had exactly the same post, exactly the same words. So this was a little bit different. This is what kicked off the whole thing, really, more than anything else. So this wasn't really looking at the language, and this is the interesting bit. So: exactly the same post, and exactly the same day of the week, time in the day. So this was just to see whether or not there was any kind of direct suppression of women. And actually, potentially there was.
Joel Cheesman (09:38.431)
So it'd be like Chad and I posted the same thing with a little spin on adjectives and...
Martyn Redstone (09:44.151)
No, exactly the same. So the original experiment was exactly the same words at exactly the same time. So there was no spin on anything. But look, there's lots of other things going on in the algorithm, and there's lots of other things going on in other parts of how the system works, that potentially impacted that as well. We found, in the analysis of the algorithm, a prompt instruction that explicitly
weights historical activity at 70% and profile relevance at 30%. So it mathematically forces the AI to start prioritizing past visibility. If your posts haven't been doing well in the past, they're not going to do well in the future. And that can impact people who might have been giving birth to children and taking a bit of a break from LinkedIn. So there's lots of other thought processes around, again, more proxy bias coming in.
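The 70/30 weighting described here can be illustrated with a toy score. The weights are the only figures taken from the conversation; the function and input values are hypothetical:

```python
# Toy feed score with the described weighting: historical activity at 70%,
# profile relevance at 30%. Because history dominates, an author returning
# from a break stays down-ranked even with a highly relevant profile.
# The function shape and input values are invented for illustration.

W_HISTORY, W_RELEVANCE = 0.7, 0.3

def feed_score(historical_visibility: float, profile_relevance: float) -> float:
    """Both inputs normalized to the range [0, 1]."""
    return W_HISTORY * historical_visibility + W_RELEVANCE * profile_relevance

# Returning author: perfect relevance, but little recent visibility.
returning = feed_score(historical_visibility=0.1, profile_relevance=1.0)    # 0.37
# Established author: mediocre relevance, strong past visibility.
established = feed_score(historical_visibility=0.9, profile_relevance=0.4)  # 0.75
```

Even a maximally relevant profile cannot outscore a history-rich one here, which is the feedback loop being described: past visibility mathematically caps future visibility.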
Joel Cheesman (10:37.16)
Going back to the study, because I find Cindy Gallop very popular. One of my questions is going to be, you know, my followers aren't the same as your followers, quality versus quantity. But you could say there's probably no doubt, I haven't done the research, but Cindy Gallop has quality people following her and engaging with her. So was there anything in regards to number of comments, number of likes, or engagements? Like,
Chad (10:41.422)
Mm-hmm.
Chad (10:52.761)
hell yeah.
Martyn Redstone (10:53.688)
Hmm.
Chad (11:00.951)
impressions.
Joel Cheesman (11:01.608)
Can you, is there any breakdown of what exactly happened with each of the posts? Did one get no comments and the other got 300?
Martyn Redstone (11:08.992)
So all they measured was, you know, on LinkedIn you see how many impressions you got. So the only data that they captured, that they've made available, is how many impressions you got on that post, and they compared that to the number of followers they've got. Because logically you'd say, well, if somebody's following you, they should see your post, and so you should get 100% visibility across your followers. But it doesn't work like that. And that's what they've come to realize as well. There is a second
Chad (11:13.177)
Reach.
Joel Cheesman (11:25.18)
Okay.
Martyn Redstone (11:38.414)
paper, and actually I'll probably mention it now. So LinkedIn's paper that they released around their algorithm, which is the 360Brew paper, has been available on the academic repository arXiv since the beginning of the year. Over the last couple of weeks, it disappeared; it was taken down. Just such a shame. I kind of saved it, though. But there is a second paper, called Largest Hour of Troubled
Chad (11:46.916)
Mm-hmm.
Joel Cheesman (11:48.957)
Mm-hmm.
Chad (11:57.54)
magically.
Martyn Redstone (12:07.832)
Absolutely. And what that means is they also have things like popularity tokens going on as well. So this is the scout that looks for what to put in front of you. And we call it kind of the rich-get-richer token. It's a popularity token. So, like, the top 1% goes into the AI prompt to serve up content in front of you. So this is kind of hard-coding a bias,
where the algorithm is only now retrieving content that is already popular as well. So it's locking out new voices and, again, locking out people who've struggled with popularity or getting reach in the past. So it's a popularity-only echo chamber. It's very, very interesting what's going on. Very, very interesting.
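A rough sketch of the rich-get-richer retrieval stage described here, under the assumption that candidate selection is gated on existing popularity. The structure, threshold handling, and field names are all invented for illustration:

```python
# Toy "scout" retrieval: only posts already in the top 1% by popularity
# are eligible to be served, so past reach gates future reach.
# Structure, threshold handling, and field names are invented.

def retrieve_candidates(posts: list[dict], top_fraction: float = 0.01) -> list[dict]:
    """Keep only the posts whose popularity falls in the top fraction."""
    ranked = sorted(posts, key=lambda p: p["popularity"], reverse=True)
    keep = max(1, int(len(ranked) * top_fraction))
    return ranked[:keep]

# 200 established posts plus one brand-new voice with zero prior reach.
posts = [{"author": f"user{i}", "popularity": i} for i in range(200)]
posts.append({"author": "new_voice", "popularity": 0})
candidates = retrieve_candidates(posts)
# new_voice never reaches the feed stage: retrieval is gated on
# popularity it has had no chance to earn.
```

In a pipeline like this, the ranking model downstream never even sees the new voice, which is what makes it an echo chamber rather than just a ranking preference.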
Chad (12:53.816)
I would think it would be somewhat easy to set these experiments up, just from the standpoint that you can schedule the post at noon, everybody can schedule it at the exact same time, it goes out at the exact same time, and you just sit back and wait for the numbers to roll in. So was there any response from LinkedIn about this?
Martyn Redstone (13:07.757)
Hmm.
Martyn Redstone (13:14.037)
Yeah, so recently LinkedIn wrote an engineering blog post, and it specifically said in there: we don't use gender as a variable, so we can't be biased.
Chad (13:24.92)
And that's not what you said. Yeah. And that's not what you said, because we're talking proxy, not gender. We're not talking, you know, if female then this, or if male then that.
Martyn Redstone (13:32.183)
Exactly.
Martyn Redstone (13:36.015)
Exactly. And they actually mentioned in the post, what we do look at is, you know, this and this and this, which totally, utterly, you know, proved the point. You know, "the algorithm just reflects what the user wants." That was kind of their excuse. So that, for me, really proved it, because proxy bias doesn't need that gender variable. And they did say it looks at position, industry, language, et cetera. And so we say, well, this is
Joel Cheesman (13:55.87)
Mm-hmm.
Martyn Redstone (14:04.428)
the issue. If you're looking at all of these things, you're going to uncover and promote proxy bias. So we're pushing for LinkedIn to really get a grip on it and deal with it. It's really difficult; you're not going to mathematically remove every bit of bias. But in this day and age, we expect people to own it, to report on it, and to do something to try and mitigate it.
Chad (14:29.624)
to monitor it and then make the changes necessary.
Joel Cheesman (14:29.639)
How much is?
Martyn Redstone (14:32.654)
Exactly.
Joel Cheesman (14:32.798)
How much of a black box is LinkedIn's AI? Do they publish some of it but not all of it? Do they keep it all in-house? Talk about their openness.
Martyn Redstone (14:42.306)
Well, they had an academic paper that explained how the 360Brew algorithm worked, and they took that down. So for me, that's a black box now. You know, I think it probably didn't do enough to explain it, but it did enough to ask questions. And so I would say that now they've gone from semi-transparent to black box, which is a real shame. And there are, as we know, regulations all over the world that can start trying to force them
Joel Cheesman (14:52.626)
Okay.
Joel Cheesman (15:05.052)
Okay.
Martyn Redstone (15:11.992)
to be a little bit more transparent and open it up.
Joel Cheesman (15:14.846)
What are the odds that human beings are the problem? Because going back to Amazon, we were making decisions that were biased and we were doing things that were biased. Like, is there a chance that we're the problem and LinkedIn isn't so much?
Martyn Redstone (15:27.438)
Well, yes and no. I mean, ultimately, with any algorithm, humans are the problem, because humans are the ones that are creating the algorithms. So having an engineering blog saying "we don't use gender discrimination in our algorithms" isn't proof. They've not hard-coded in "if gender equals female, then downgrade post," but that's just not an excuse. So yeah, humans are the issue. There's a few
Joel Cheesman (15:29.275)
Okay.
Martyn Redstone (15:54.585)
points to that. First of all, the humans that create the algorithms are the issue. But second of all, some of the humans that create a bit of a cesspit when it comes to social media interaction, they're also going to be the problem as well. Because the more engagement you give to somebody, the more engagement you give to a post, based on what we know about the way that the second algorithm, the scout algorithm, acts, that's going to improve people's reach over time as well. And there are
Joel Cheesman (16:18.323)
Mm-hmm.
Martyn Redstone (16:22.592)
some, as we know, across most of social media, but even on LinkedIn, there are some bad actors, absolutely.
Chad (16:29.057)
And LinkedIn's not Skynet.
Joel Cheesman (16:29.302)
So cesspit, that's a cesspool, right? Just clarifying for our American audience.
Martyn Redstone (16:33.388)
Yes, cesspit, cesspool, same thing.
Chad (16:36.149)
And Joel, LinkedIn is not Skynet. They are run by humans, and therefore humans are the issue, period: the users, the end users.
Joel Cheesman (16:43.228)
I meant more the users, not the creators of the algorithm. Like, I think more of how we, the people, are using LinkedIn may be impacting it in ways that aren't healthy.
Martyn Redstone (16:52.878)
Well, it's a vicious cycle because humans are creating the algorithm, which is telling the system to increase engagement on how people are interacting with you. So if you've got bad people interacting with bad content, and that's the other thing that we know from people like Cindy and Jane and from Dorothy is that if they're getting abuse on the platform and they report it to LinkedIn, LinkedIn aren't even doing anything about it a lot of the time.
and they're saying it's within our community guidelines. And trust me, it's been eye opening because since I've been involved in all of this, I've been getting abuse as well on LinkedIn, which is just laughable because... From men, I mean, it's been great. Yeah, no, it's been brilliant. I've had people messaging me saying, you know, as a computer scientist, you should know better.
Chad (17:31.086)
You've been getting abuse from who?
Chad (17:39.001)
What?
Martyn Redstone (17:47.599)
You should be teaching these people how to write better LinkedIn posts rather than blaming the algorithm. D&I content is never going to be interesting. I've had somebody publicly calling me a male feminist. I don't even know what that means. I think the challenge is when you get... My wife might be hearing this. Stop it.
Chad (17:58.976)
my god.
Chad (18:05.602)
that you get laid and they don't. That's what that means. Go ahead, sorry, sorry. Well, hopefully she's the one who's doing the laying. I'm sorry, go ahead.
Joel Cheesman (18:12.06)
Martyn, as two guys with a podcast that's eight years old, if you don't have haters, you're doing it wrong. So take that for what it's worth.
Martyn Redstone (18:14.863)
I hope so, yeah.
Martyn Redstone (18:20.682)
Absolutely. I'm delighted that people are at least engaging with my content, which is great for the algorithm. But yeah, it's eye-opening, though, because you do tend to see kind of what's going on. I think I'm the furthest away from the stereotypical person that you would imagine to be called a male ally or a male feminist or something like that. I mean, these people are just...
Joel Cheesman (18:26.78)
Yes. Yes.
Martyn Redstone (18:46.594)
morons, really, aren't they? And they've got nothing better to do in their day than abuse people online. So I can't really take them seriously. It's: men, you must do better. I think that's probably the best way I can put it. You know, there's no reason for that kind of behavior, and it's just going to, you know, damage the experience of platforms like LinkedIn more so.
Chad (19:05.588)
It's interesting, because I had a great comparison that I threw out there about NFL teams and their quarterbacks, right? And how the Green Bay Packers literally have had the fewest quarterbacks since 1992, where I think the Cleveland Browns have had close to 40. But the Green Bay Packers have had like three, right? So there's literally just this comparison. And I was attributing that to
the development of talent. And if we take a look at how Green Bay develops talent differently than the Vikings or somebody else in their division, right? Or even the Cleveland Browns. It is an entirely different culture, right? We can learn from this. And in that post, you can say there are a lot of women who watch NFL football, don't get me wrong, but it's very much a male sport, right?
Martyn Redstone (19:51.31)
Mm-hmm.
Chad (20:01.474)
males playing it, it's very machismo, testosterone, whatever. At the end of the day, that post exploded. Now, I can't say whether it was because I tagged the Green Bay Packers or what it was, but it was a very, very, very male, you know, Tim Allen, Home Improvement kind of post. And it just exploded.
So that to me was interesting. And I showed it to Julie like 10 minutes after I posted it, and it had just exploded. And she's like, holy shit, you should be posting about, you know, NFL stuff more.
Joel Cheesman (20:36.146)
Yeah.
Martyn Redstone (20:36.398)
Yeah, and I mean, the thing is that we're still at that point where nobody really understands it well enough. There are lots of gurus out there that think they've cracked the algorithm, but we still don't understand well enough, you know, exactly how it works. But absolutely, you know, it could be because that was very, as you said, Tim Allen. You know, I've got no idea what you mean about quarterbacks and Packers and things like that, but Home Improvement was one of my favorites. I actually did watch that.
But yes, it could absolutely be that, or it could just be the fact that, you know, that first-hour rule. You just don't know. But there's certainly something weird going on that needs...
Joel Cheesman (21:16.19)
Hey, Martyn, you talk about legal implications in your post, and they tend to be European-focused. But I want you to talk about the UK Equality Act of 2010, EU acts that might impact this, and probably the lack of any regulation in the US, where nothing's going to happen with any of this stuff. Talk about the legal implications.
Martyn Redstone (21:25.912)
Mm-hmm.
Martyn Redstone (21:34.595)
Yeah, yeah, so as you said, I mean, especially over here in Europe, and I mean continentally, not politically, now that I'm in Brexit Britain, there's a lot of regulation about this. Like you said, you've got the Equality Act in the UK, and the Equality Act means that ultimately you need to treat these nine protected characteristics equally and not discriminate against them. One of them is gender.
And so there's potentially some issue with that regulation when it comes to the way that LinkedIn is managing their female members on the feed. The other one is the DSA, the Digital Services Act, in Europe, where there are rules around what they call VLOPs, very large online platforms, and how they also need to manage algorithmic bias as well. And then,
yeah, we do have the EU AI Act, but that's all over the place right now anyway, so let's kind of ignore that. But, you know, we've absolutely got those two that are potentially areas LinkedIn needs to be careful of. And they're already being investigated under one of those for something completely different in Ireland. So they need to be careful. I think the thing with me is that rather than just kind of brushing it away and saying "we don't use gender,"
they need to respect the lived experience of their users. And they need to take it on board and just be responsible and look into it. Even if they look into it and say, you know what? We've found all of this. There's nothing to be concerned about. We've tweaked some things slightly. Blah, blah. Brilliant. Be transparent. But don't just ignore and dismiss the lived experience of your users, because that's going to start impacting trust. And you don't want
Chad (23:24.12)
Did they investigate the experiment that Cindy and Jane did? I mean, that's the biggest question, because it's kind of like, if there's a murder, you go to the fucking crime scene, right? You check it out, you do all the investigations. Did they do any investigation whatsoever?
Martyn Redstone (23:38.323)
Not that I'm aware of. So the process that has happened since is that we had a member of parliament over here in the UK meet with LinkedIn to go over this. And since then, they created the engineering blog that I talked about earlier. And all of the responses have just been to point people towards that engineering blog. So they haven't given us any more detail around whether or not they've looked into
the original experiment, or looked into some of the work that I've done, or looked into some of the lived experience of other people. All we know is they've got this blog, and their boilerplate responses: look at this blog; we don't use gender. So it's a no. We're calling on LinkedIn to do something about it. And, you know, Jane and Cindy have created a petition that people can sign, and a website as well: FairnessInTheFeed.com.
And we're hoping that more and more people can join the call for LinkedIn to actually try and look into this. Just look into it; we're not even asking for a fix, because we don't know whether there is something to be fixed. But look into it, investigate it, and be transparent in your findings.
Joel Cheesman (24:47.912)
Do you think employers are at risk at all if LinkedIn does get pinched? Can that come down on employers at all? Are they pretty much safe?
Martyn Redstone (24:56.074)
I don't think so, because, I mean, I would hope that people aren't relying on the feed. But the interesting bit is limiting economic activity. So when you've got self-employed people who are using their feed to encourage, kind of, inbound leads and those kinds of things, and they're losing out on opportunity, that's potentially an issue, you know, because LinkedIn is all about creating economic opportunity for all. But on the employer side, probably not, because they do have mechanisms, again, that they have released
publicly around how they mitigate for bias and proxy bias in their recruitment products. So they do have methodologies, and actually that was part of what I wrote: we know you do this for your recruitment products, so why aren't you just doing it for your feed as well?
Joel Cheesman (25:41.343)
The unfortunate thing is this is what happens with monopolies, right? If only LinkedIn had a competitor that people could say like, I'm going over here. You've mentioned a few ways that people can quote unquote fight the power on this one. What are some other ways, whether quietly, passively, or really actively, people can get the word out on what's going on at LinkedIn and maybe get them to open up a little bit?
Martyn Redstone (25:44.526)
Mm.
Martyn Redstone (26:06.006)
Yeah, so I think it's just, you know, following the people that we mentioned, people like Cindy, people like Jane, people like Dorothy, and, you know, reposting what they've got to say. Like I said, the hashtag, #FairnessInTheFeed, is the one that they're all using, so follow that, find the posts on that, sign the petition. I don't think we can do it quietly, in terms of just reposting and engaging and what have you. But, you know, as I was saying earlier to another group of people,
this is power in the masses, you know. And I'm not one for kind of mass protest or anything like that. But, you know, I think the more people that push back on LinkedIn to say, we want you to be transparent on this, the more opportunity we have. And that's ultimately through signing the petition.
Joel Cheesman (26:50.814)
Sponsored t-shirts, Martyn, that's the answer. We need t-shirts on everybody about this issue.
Chad (26:52.952)
There it is. That's it. That's what I'm saying. Cheesman, shut up. That's our next promotion. That's Martyn Redstone. Martyn with a Y, kids. Martyn with a Y, not an I. Martyn, if people want to find out more about this, they want to get in contact with you, maybe they want to troll you a little bit, who knows, where can they find you?
Martyn Redstone (26:56.029)
Unisex t-shirts!
Martyn Redstone (27:03.853)
Ha
Joel Cheesman (27:12.359)
and
Martyn Redstone (27:18.028)
Yeah, LinkedIn is always the best place to find me. I'm pretty active on there. I'll accept requests from anybody. Yeah. If they had a competitor, if they had a competitor. But it's the best place to find me, you know, and I'll accept requests from anyone, trolls included. So, yeah.
Joel Cheesman (27:23.986)
Third irony of the show. How do you follow me? I'm on LinkedIn, everybody.
Chad (27:31.268)
That's right.
Joel Cheesman (27:31.4)
Yeah.
Chad (27:33.774)
Frames.
Chad (27:38.212)
You're so thirsty, Martyn. You're so thirsty.
Joel Cheesman (27:39.984)
Love it, love it, love it Martyn.
Martyn Redstone (27:42.728)
Totally, Business is business.
Joel Cheesman (27:45.182)
Yeah, and we haven't even talked about the gun target in the back there as a Brit. Anyway, maybe we'll bring you on again next year to talk about that. Chad, that is another one in the can. We out.
Chad (27:56.738)
WEEEE OUT





