Blendoor App Breaks Down Computer Bias In Hiring

ROBERT SIEGEL, HOST:

Laura told us that some companies are using computer programs to help them narrow the pool of applicants for certain jobs. Well, joining us now is Stephanie Lampkin, CEO of a company called Blendoor. It's an app designed to help companies find the right people to hire. Welcome to the program.

STEPHANIE LAMPKIN: Thank you.

SIEGEL: First of all, your app is supposed to break down bias in hiring. How does it do that?

LAMPKIN: So with Blendoor, we hide candidate name and photo and only show their skills, work experience and education. We also remove any indication of age. And so we're actually removing data as opposed to building algorithms to circumvent bias, in recognition that certain things - like gender, or the way a name sounds or how it's spelled - can contribute to response rate.
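
What Lampkin describes is data redaction rather than algorithmic de-biasing: identifying fields are dropped before a reviewer ever sees the profile. A minimal sketch of that idea in Python - with field names invented for illustration, not taken from Blendoor - might look like this:

```python
# Illustrative sketch only: strip identifying fields from a candidate
# profile before it reaches a reviewer. The field names here are
# invented for the example and are not Blendoor's actual schema.

IDENTIFYING_FIELDS = {"name", "photo_url", "birth_date", "graduation_year"}

def redact_profile(profile: dict) -> dict:
    """Return a copy of the profile with identifying fields removed,
    keeping only skills, work experience, and education."""
    return {key: value for key, value in profile.items()
            if key not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "photo_url": "https://example.com/jane.jpg",
    "birth_date": "1988-04-02",
    "skills": ["Python", "SQL"],
    "experience": ["Software engineer, 5 years"],
    "education": ["B.S. Engineering"],
}

print(redact_profile(candidate))
# {'skills': [...], 'experience': [...], 'education': [...]}
```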

SIEGEL: Why did you do this? What made you decide to make this happen?

LAMPKIN: So I came up with the idea in the fall after many of the companies in Silicon Valley released their employee diversity numbers. And their argument was that there just aren't enough women - qualified women and people of color - to hire, which I knew wasn't completely the case. And so I wanted to create a platform where we would actually have data to validate that this is not just a pipeline problem.

SIEGEL: You are a Stanford and MIT-educated - well, MIT MBA, but also a computer scientist?

LAMPKIN: I actually got an engineering degree from Stanford.

SIEGEL: You're an African-American woman.

LAMPKIN: That's correct.

SIEGEL: Did you find that you were typecast as somebody who was not right for the kind of job that you felt you were absolutely trained for?

LAMPKIN: Oh, yeah, absolutely. And in my first years at Microsoft, I experienced issues related to gender more so than race, which was surprising. It was definitely a wake-up call.

SIEGEL: Ultimately, you're blocking out key information about applicants until they get to the interview stage - right? - because eventually, they're going to sit down and talk to somebody if this succeeds.

LAMPKIN: Correct.

SIEGEL: If there actually are biases built into hiring, they could still come into play at that moment, couldn't they?

LAMPKIN: Yes. But by having this data - being able to track how far certain demographics of people make it in the recruiting pipeline - we think people will actually be a little bit more accountable. When you know you're being watched, it makes you a little bit more conscious about what you're doing.
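
The accountability idea rests on funnel metrics: tracking how far candidates from each demographic group get in the recruiting pipeline. A minimal sketch of that kind of metric, with invented stage names and sample data rather than anything from Blendoor:

```python
from collections import Counter

# Illustrative sketch only: count how many candidates from each
# (self-reported) demographic group reach each recruiting stage, so
# that drop-off points are visible. Stage names and data are invented.

STAGES = ["applied", "phone_screen", "onsite", "offer"]

def funnel_by_group(candidates):
    """candidates: iterable of dicts with 'group' and 'furthest_stage'.
    Returns {stage: Counter mapping group -> candidates who reached it}."""
    counts = {stage: Counter() for stage in STAGES}
    for c in candidates:
        reached = STAGES.index(c["furthest_stage"])
        for stage in STAGES[: reached + 1]:
            counts[stage][c["group"]] += 1
    return counts

sample = [
    {"group": "A", "furthest_stage": "offer"},
    {"group": "A", "furthest_stage": "phone_screen"},
    {"group": "B", "furthest_stage": "applied"},
    {"group": "B", "furthest_stage": "onsite"},
]

for stage, by_group in funnel_by_group(sample).items():
    print(stage, dict(by_group))
```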

SIEGEL: Is it correct to say that the assumption - I think the pretty hopeful assumption - behind Blendoor is that if job applicants were considered gender-blind, race-blind, age-blind, whatever, the results of who would advance would be much, much more diverse than they are when people are aware of those factors?

LAMPKIN: Yes, absolutely. So this was actually implemented by many symphonies years ago. Auditions happen behind a curtain, and everyone has to wear soft shoes - so you can't tell the gender of the person even based on their shoes - and gender diversity increased fivefold as a result of this blind audition strategy. And so we are trying to replicate that when sourcing for talent.

SIEGEL: Well, Stephanie Lampkin, thank you very much for talking with us today.

LAMPKIN: Thank you.

SIEGEL: Stephanie Lampkin, CEO of Blendoor, talking about her app that's designed to avoid bias in hiring practices. Transcript provided by NPR, Copyright NPR.
