Last summer, Jacky Alciné learned just how biased computers can be. Alciné, who is African-American, took a bunch of pictures with friends at a concert. Later he loaded them into Google Photos, which stores and automatically organizes images.
Google’s software is able to group together pictures of a particular friend, or pictures of dogs, cats, etc. But when it labeled a picture of one of Alciné’s friends, who is also African-American, it left him speechless.
“It labeled it as something else. It labeled her as a different species or creature,” says a horrified Alciné. Because the label is such a cliché, he doesn’t even want to say what creature it was. “I kind of refuse to. By saying that, I kind of reinforce the idea of it.”
I’m not going to reveal the animal label it applied to his friend, either. But the same mislabeling happened to other people with dark skin, and Alciné isn’t buying that it’s just some weird technical glitch.
“One could say, ‘Oh, it’s a computer,’ I’m like OK … a computer built by whom? A computer designed by whom? A computer trained by whom?”
Alciné’s conclusion is that there probably weren’t any black people on the team that designed Google Photos. Google says it did test the product on employees of different races and ethnicities, and it has apologized for what happened. The company says image-labeling technology is still in its early days and that it’s working to improve it.
Alciné’s experience is one of many strange biases that turn up in computer algorithms, which sift through data for patterns.
Most of us are familiar with suggestion algorithms used by Amazon and Netflix — if you liked this movie, you’ll probably like this one. For example, the computer may learn over time that viewers who liked The Terminator also enjoyed Ex Machina.
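Under the hood, suggestion systems like these can be as simple as counting which titles the same viewers tend to like together. Here is a minimal, hypothetical sketch of that idea (made-up viewing histories, not Amazon’s or Netflix’s actual method):

```python
# A toy "viewers who liked X also liked Y" recommender based on
# co-occurrence counts. All data here is invented for illustration.
from collections import Counter
from itertools import combinations

# Hypothetical data: each list is one viewer's liked titles.
histories = [
    ["The Terminator", "Ex Machina", "Alien"],
    ["The Terminator", "Ex Machina"],
    ["The Terminator", "Alien"],
    ["Ex Machina", "Her"],
]

# Count how often each pair of titles is liked by the same viewer.
pair_counts = Counter()
for liked in histories:
    for a, b in combinations(sorted(set(liked)), 2):
        pair_counts[(a, b)] += 1

def suggest(title):
    """Return titles ranked by how often they are co-liked with `title`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == title:
            scores[b] += n
        elif b == title:
            scores[a] += n
    return [t for t, _ in scores.most_common()]

# "Ex Machina" and "Alien" are each co-liked with "The Terminator" twice.
print(suggest("The Terminator"))
```

A real system works over millions of users and uses more sophisticated similarity measures, but the core idea is the same: the data about what people did drives what the next person is shown.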
But in another context, user feedback can harden societal biases. A couple of years ago, a Harvard study found that when someone searched Google for a name commonly associated with African-Americans, an ad for a company that finds criminal records was more likely to turn up.
The algorithm may initially have done this for both black and white people, but over time the biases of the people who did the search probably got factored in, says Christian Sandvig, a professor at the University of Michigan’s School of Information.
“Because people tended to click on the ad topic that suggested that that person had been arrested when the name was African-American, the algorithm learned the racism of the search users and then reinforced it by showing that more often,” Sandvig says.
He says other studies show that women are more likely to be shown lower-paying jobs than men in online ads. Sorelle Friedler, a computer science professor at Haverford College, says women may reinforce this bias without realizing it.
“It might be that women are truly less likely to click on those ads and probably that’s because of the long history of women making less than men,” she says. “And so perhaps (women are) thinking, ‘Oh, that ad isn’t really for me. I’m not as likely to get that job.’”
And so the algorithm determines it should no longer show those ads to women, because they don’t click.
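The dynamic Friedler describes is a feedback loop: a click-optimizing system shows each group whichever ad that group has clicked most, so a small difference in click rates hardens into a large difference in what people see. A minimal sketch with invented numbers (hypothetical groups, ads and rates, not any real ad platform’s logic):

```python
# Hypothetical click-through rates the system has measured so far.
# Group "women" clicks the senior-role ad slightly less often -- say,
# for the historical reasons Friedler describes -- but is equally
# qualified for the job.
measured_ctr = {
    ("women", "senior-role ad"): 0.05,
    ("women", "generic ad"):     0.06,
    ("men",   "senior-role ad"): 0.10,
    ("men",   "generic ad"):     0.06,
}

def ad_shown_to(group):
    """Click-optimizing rule: show the ad with the higher measured CTR."""
    ads = ["senior-role ad", "generic ad"]
    return max(ads, key=lambda ad: measured_ctr[(group, ad)])

print(ad_shown_to("men"))    # senior-role ad
print(ad_shown_to("women"))  # generic ad
```

Note the lock-in: once the system stops showing the senior-role ad to a group, it collects no new clicks from that group, so the very measurement that drove the decision is never revisited.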
What worries Friedler and other social scientists is that computer algorithms are being used in a growing number of ways. Pennsylvania is testing the idea of using algorithms to suggest sentences for convicted criminals, and some companies are using them to narrow the pool of job applicants.
And it can be hard to guard against bias, Sandvig says.
“The systems are of a sufficient complexity that it is possible to say the algorithm did it,” he says. “And it’s actually true — the algorithm is sufficiently complicated and it’s changing in real time. It’s writing its own rules on the basis of data and input; it does do things and we’re often surprised by them.”
Yet, Sandvig remains optimistic about fixing biased algorithms and actually using computers to guard against prejudice. There are already apps to keep discrimination out of the hiring process.
Sandvig and others say it’s important to talk about the problem because if humans don’t fix it, computers won’t do it themselves.