
Google Photos app labels black friends as “Gorillas,” Search Giant issues an apology

Google touted its Photos app, now detached from the company’s social network, Google+, as revolutionary and an enhancement to the existing photos experience. And indeed it is: anyone who has used Google’s photo services over the last few years remembers a time when you could barely search your 11,000+ photos for a specific picture (yes, I do have that many stored with Google). The app is leaps and bounds ahead of where Google’s photo tools were three years ago.

At the same time, however, this week’s story shows that Google Photos still needs some work if Google intends to drive wider adoption of the new app. US software developer Jacky Alciné says that he and a friend posed together in a picture that was backed up to his Google Photos account. What he found was startling: the app labeled him and his friend “gorillas.” Alciné posted the picture and label on Twitter and heard back from a Google representative within 90 minutes. Google apologized yesterday for the incident: “We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing,” the company said in a statement.

Yes, this is a shocking story for many. And, as someone of mixed racial descent, I can say the story certainly hits home for my black relatives. At the same time, however, we must keep in mind that Google’s “intelligent” software isn’t as intelligent as human beings, and more work must be done before we can fully trust software to be as “smart” as we are. It’s likely that the software was simply trained to label anything it sees as having “animal-like” traits as an animal. Sources also say that a bloody elbow was labeled “food” by the Photos app; I’m sure Google doesn’t condone cannibalism, either.

These stories are out of the ordinary, but they must be placed in perspective. The software is trained to identify animals, places, and objects that fit within the normal scheme of things; it isn’t yet “smart” enough to identify the people within photos. For example, I can’t have my photos upload automatically and receive a label telling me that Google recognizes a picture of me and my friends. Google Photos doesn’t do automatic person labeling; it leaves tagging up to the account’s owner. So the software identified the developer as a “gorilla,” which is no doubt insulting. But should the person whose elbow was labeled “food” be offended and assume that Google’s software wants someone to carve him up for lunch?
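To see why a classifier misfires this way, consider a minimal sketch of how automatic photo labeling typically works. This uses an off-the-shelf model (torchvision’s pretrained ResNet), not Google’s, and the file name is a placeholder; the point is that such software always returns its single best guess from a fixed vocabulary of labels, however poor that guess may be.

```python
# A sketch of generic top-1 image labeling with an off-the-shelf
# classifier. This is NOT Google's model; it only illustrates that such
# software always emits its best guess from a fixed label vocabulary.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()

def label_photo(path: str) -> tuple[str, float]:
    """Return the classifier's single best label and its confidence."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1)[0]
    confidence, index = probs.max(dim=0)
    # Whatever category scores highest becomes the tag; the model has
    # no notion of offense, only of which label best fits the pixels.
    return weights.meta["categories"][index.item()], confidence.item()

print(label_photo("photo.jpg"))  # "photo.jpg" is a placeholder path
```

Run against an ordinary snapshot, the function returns whatever category scores highest; there is no “human” fallback when every available label is a poor fit.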

Of course, it is offensive to the individual involved, and we’re sorry that Jacky Alciné had to endure this. What we want you, our readership, to consider is that this mistake can be seen in two ways: 1) as a racist label intended to harm the individual at the center of it, or 2) as the software identifying the individual as the animal whose face it thought he resembled. Yes, this is an error, but it’s one in which the Google Photos app mistook the developer for an animal instead of a human being. I wouldn’t read it as “the software labeled him a gorilla to mock his race,” but rather as “the software mistook a human being for an animal.” That, I think, is the problem with the Google Photos app: it doesn’t know how to label human beings as human beings. If the app labeled a child a “cat” or a “tiger,” would we be any less offended?

Things like this should never happen, but software is no smarter than its developers. In this case, no one at Google has yet programmed the software to be as smart as a human being, and more advanced facial recognition may well solve the problem in the future. In the meantime, we cannot keep letting stories that can be misread as matters of racial tension divide us.
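If the root cause is an over-eager guess, one blunt but immediate safeguard is to refuse to surface certain labels at all and to drop low-confidence guesses. The sketch below is an illustrative assumption, not Google’s actual configuration; the blocklist and threshold are made up for the example.

```python
# A hedged sketch of one simple safeguard: suppress labels on a
# sensitive blocklist and drop guesses below a confidence floor.
# The blocklist and threshold are illustrative, not Google's actual
# configuration.
BLOCKED_LABELS = {"gorilla", "chimpanzee", "ape", "monkey"}
MIN_CONFIDENCE = 0.6

def safe_label(label: str, confidence: float) -> str | None:
    """Return a label only if it is both confident and non-sensitive."""
    if label.lower() in BLOCKED_LABELS:
        return None  # never surface a blocked tag, however confident
    if confidence < MIN_CONFIDENCE:
        return None  # an uncertain guess is worse than no guess at all
    return label
```

Suppression is a trade-off: legitimate wildlife photos may go untagged, but the worst-case mistake cannot recur.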

This is not a racial story: the software is not mocking Alciné for being a black man. The software couldn’t have “meant” any harm, because it’s not a person. It was programmed to recognize animals, and it took what it thought was a good guess at a picture containing the developer. Yes, it’s a programming issue, and one that needs to be fixed, but it’s not a racial issue.

Humans know that black people are not gorillas, but how can we expect software to “know” that?
