The Chad & Cheese continue the well-pedigreed conversations, this time with a CEO/cofounder reppin' Harvard, Dartmouth, and MIT. Pymetrics' Frida Polli joins the boys on a quest for a better understanding of personality tests, matching soft skills to job openings, and the threats that A.I. backlash poses to businesses like hers. There are even tips for female entrepreneurs.
Anything this smart has gotta be powered by Sovren, with AI so human you'll want to take it to dinner, right?
PODCAST TRANSCRIPTION sponsored by:
I just don't want people to get the impression that again, cause I just don't believe in it that there's such a thing as people that are always going to be top talent. I think that's, that's kind of a myth and it leads us to a lot of unproductive places. And I think in the world of recruiting,
Hide your kids! Lock the doors! You're listening to HR’s most dangerous podcast. Chad Sowash and Joel Cheeseman are here to punch the recruiting industry, right where it hurts! Complete with breaking news, brash opinion and loads of snark, buckle up boys and girls, it's time for the Chad and Cheese podcast.
Aw, yeah, we're getting real cerebral on a Friday, kids. Welcome to the Chad and Cheese podcast. I'm your cohost Joel Cheeseman, joined as always by Chad Sowash, and today, holy shit, we're getting big-brained, everybody. Okay. We've got Frida Polli, co-founder of Pymetrics, and that's just the tip of the iceberg. She's a smarty pants: Harvard MBA, Dartmouth undergrad, PhD from Suffolk. My brain hurts already. Frida, welcome to the podcast.
Frida (1m 3s):
Thanks you guys.
Joel (1m 4s):
Let me apologize ahead of time for everything that's about to go down.
Frida (1m 7s):
No, I, this is a great way to spend a Friday morning.
Joel (1m 10s):
Unwind on a Friday.
Joel (1m 13s):
So what should we know about you that I didn't cover? You don't like Irish people. What else?
Frida (1m 18s):
I mean, they don't let them go drinking, a hundred percent.
Chad (1m 28s):
Not cool Frida, not cool.
Frida (1m 30s):
I don't know what you want to know about me. I was born in Italy. I'm not from this country, although I sound like it. I'm not an army brat, but I moved around a lot because my dad was in management consulting. More importantly, I guess, for the show: I spent 10 years at Harvard and MIT becoming a smarty-pants cognitive scientist, and that's where a lot of the science that we use at Pymetrics comes from. And then, you know, I went to the MBA program at Harvard, and that's where I saw recruiting firsthand and thought to myself, ooh, all that science we've been using to look at people's soft skills could really come in handy in this problem of people-to-job matching.
Frida (2m 13s):
And that's how Pymetrics was born. Because really, what I saw at HBS was it's not that people were confused about what was on people's resumes; that was pretty clear. What they wanted to do was figure out who they were as a person, who they were as an individual human being. And they were trying to, like, read tea leaves off a resume. They'd be like, oh, Chad played sports, he must be a team player. Or, I don't know, Joel had a side job in college, he must be hardworking. And I don't know if that's true or not, but the point is, you can actually assess that.
Joel (2m 44s):
I would fail Pymetrics.
Frida (2m 48s):
This is what recruiters are doing, trying to figure out who you are as a human being from your resume, when you can actually measure a lot of those things directly. And that's what we were trying to do.
Joel (2m 58s):
You started this thing in 2013, right?
Frida (3m 1s):
We started the science part of it in 2013, that's correct, but we didn't have a product in market until 2016.
Joel (3m 7s):
Talk about being ahead of the game. What the hell did you look at in 2013 and say, this is the wave of the future? What allowed you to create this, and how has it evolved in the last eight years?
Frida (3m 19s):
Yeah, well, again, it was my experience at HBS. So, at HBS, tons of companies come and everybody's looking for a job, right? And so it was watching that process and realizing that people were still relying on coffee chats and resumes and all these pretty outdated things to understand, hey, does this person have what it takes to do the job? And then also seeing the flip side of that, which is some colleagues wanting to go into investment banking when all their friends were like, but you like to sleep 15 hours a day, is that really a good fit? And then they'd get the job, and two days later it's the big "I hate my job." So just realizing the matching was not working that well. And a lot of it was because the resume stuff is clear; it's what's not on the resume, the soft skills, that people really want to understand.
Chad (4m 1s):
Is it really something we should be doing, looking at people's personalities and saying that person's perfect for this job, versus the skills they have? Because, I mean, there's soft skills versus the hard skills: certificates, certifications.
Frida (4m 16s):
So I would argue, absolutely. First of all, it's already what people are doing, right? Take the recruiting situation at any college or school: the resumes, honestly, all look the same. They've been formatted the same way, everyone's coming from the same school. It's very hard to distinguish. So recruiters are already asking, how can I determine what makes this person unique? Soft skills are what make people unique. And they're not about categories, like pigeonholing people. It's actually about understanding what really makes them truly unique and enabling them to be that unique person. That's the way that we view soft skills, in any case. So it's not about homogenizing people.
Frida (4m 57s):
It's really about bringing out their diversity. The second piece about soft skills is that, unlike hard skills, they are far more equally distributed. So I always say you could take the same person with a particular soft skill profile and have them raised in an impoverished environment versus an elite environment. You're going to have very different resumes, as we can all imagine, but that person is fundamentally the same person, right? So if we want to start talking about equalizing gaps, whether it's socioeconomic gaps or racial gaps or gender gaps, soft skills are the way to do that. If you're going to continue to rely on hard skills, well, hard skills are way more unevenly distributed, because of the different advantages that different groups have in life.
Frida (5m 37s):
And not only that: if we rely too much on hard skills, that's when we get into, you guys all remember the Amazon resume parser fiasco, right? Well, what was wrong with that resume parser? It was learning things like which colleges their top folks went to. For example, one of the signals it picked up was "women go to Barnard." Well, because they had more male engineers, and no male engineer had ever gone to Barnard, right? Because it's a women's college. Women play softball, men play baseball. So if you are literally training off of a resume that has so many proxy variables for gender, race, and socioeconomic status, you are almost invariably going to have some pretty big issues in terms of creating algorithms that are not biased, or creating processes that are not biased. Whereas soft skills are actually very equally distributed, so they're a good equalizing factor.
Frida (6m 26s):
We have this way that we look at people in terms of: are you sort of more biased to action and impulsive, or are you more thoughtful and really attentive to detail? Well, I have, since the age of zero, always been a little bit on the impulsive side, which makes me biased to action. That's a soft skill, right? Where you fall on that continuum. It doesn't make one end of the spectrum better or worse; it just means that I'm going to be more predisposed to jobs where I can do stuff, and somebody who falls on the other end of the spectrum is going to be better suited to jobs where they can think, consider, and be more deliberate and thoughtful. Does that make sense?
Joel (7m 4s):
Yeah. And isn't it fair to say, as automation takes over more and more of the hard skills, that the soft skills are really going to be what separates the top talent from everyone else?
Frida (7m 15s):
So it's funny that you use the phrase "top talent." From Pymetrics' point of view, from my view, top talent is top talent for that particular job at that particular company. Like, I think Frida Polli is top talent for Pymetrics. I don't think Frida Polli is top talent for everything, you know what I mean? It's all about matching. It's like Netflix versus Rotten Tomatoes, or like dating apps, right? Rotten Tomatoes assumes that some movies are always good and some movies are always bad. Netflix doesn't assume that. It says, hey, Joel and Chad, you like these kinds of movies, Frida likes those kinds of movies, and we're going to optimize so that people end up in the right sort of places, where they are going to perform better.
Frida (7m 55s):
So if you're thinking about top talent as top talent for that job and company, a hundred percent. I just don't want people to get the impression, because I just don't believe in it, that there's such a thing as people who are always going to be top talent. I think that's kind of a myth, and it leads us to a lot of unproductive places in the world of recruiting.
Chad (8m 14s):
Amen, sister. So let's talk a little bit more about soft skills, because whenever I hear a company say "soft skills," I'm always thinking to myself: how in the hell are they going to defend that against an OFCCP audit?
Frida (8m 30s):
Sure. Yeah. Well, when you think about O*NET, right, and the knowledge,
Chad (8m 32s):
No, I don't want to think about O*NET.
Joel (8m 34s):
Here we go. That's a mess.
Frida (8m 38s):
You know, we all don't want to think about O*NET, but unfortunately O*NET is an important thing in the world of employment. And so when you think about O*NET, or any way to think about jobs, there's knowledge, skills, and abilities, right? Knowledge is primarily more of the hard-skill domain, but skills and abilities can be things like attention to detail, for example, things that are what we're calling soft skills. And again, people talk about these things differently, but a lot of the things that we measure are actually very directly related to the KSAs of a particular occupation.
Frida (9m 20s):
So they are very defensible. And then on top of that, one can do job analysis, including interviews with subject matter experts and all sorts of other methods, to ensure that what you're measuring is related to the job, right? So you have the O*NET codes, you have a job analysis that you're conducting to confirm, or tweak, the initial idea that you based on O*NET. And then you're building models that presumably have good concurrent validation. And then, last but not least, you're doing some sort of validity analysis on the back end. So I think there are a lot of ways that you can validate these things, and we have clients subject to OFCCP that use our products.
Frida (10m 5s):
So I think it has great defensibility.
Chad (10m 8s):
And you've been up against that and defended against it? That's the big question, right? And that's the big stamp, I think, for any organization.
Frida (10m 17s):
I mean, to be completely honest, we haven't done that. And again, I know that OFCCP can audit you for a variety of different reasons, but they're typically going to audit you if they see signs of disparate impact.
Frida (10m 31s):
And so, again, I think I've mentioned this to you, but we have a way of creating our algorithms called fairness optimization, where we don't only optimize for performance; we optimize for performance and lack of disparate impact. So actually, our platform won't release an algorithm if it has disparate impact. So the likelihood that we're going to get flagged by an OFCCP audit looking for disparate impact is extremely low.
Chad (10m 53s):
That's the answer right there.
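For readers curious what a disparate-impact gate like the one Frida describes might look like in practice, here is a minimal Python sketch. It is a hypothetical illustration, not Pymetrics' actual fairness-optimization code: it applies the EEOC's four-fifths (80%) rule, the standard screening test for disparate impact, and refuses to "release" a model whose group selection rates fail it. The function and model names are made up for the example.

```python
# Hypothetical sketch of a disparate-impact release gate, using the
# EEOC four-fifths rule: a selection procedure shows adverse impact if
# any group's selection rate falls below 80% of the highest group's rate.
# This is NOT Pymetrics' implementation, just an illustration.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (selected, total)."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def passes_four_fifths(outcomes, threshold=0.8):
    """True if every group's rate is at least 80% of the best group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return all(rate / best >= threshold for rate in rates.values())

def release_model(model_name, outcomes):
    """Only 'release' a model if it clears the four-fifths check."""
    if not passes_four_fifths(outcomes):
        raise ValueError(f"{model_name}: blocked, disparate impact detected")
    return f"{model_name}: released"

# Group A selected 50/100 (rate 0.50), group B 45/100 (rate 0.45).
# Impact ratio 0.45 / 0.50 = 0.9 >= 0.8, so the model clears the gate.
print(release_model("demo-model", {"A": (50, 100), "B": (45, 100)}))
# prints: demo-model: released
```

In a real system the gate would run on held-out evaluation data during model selection, which is roughly what "won't release an algorithm if it has disparate impact" implies.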