Google Dev apologises after Photos app tags black people as "gorillas".
[URL="http://arstechnica.com/business/2015/06/google-dev-apologizes-after-photos-app-tags-black-people-as-gorillas/"]Source[/URL]
[QUOTE]Last month, Google rolled out an updated version of its Photos app that had been divorced from Google+ and bolstered with a few slight tweaks—in particular, its ability to automatically tag photos and generate albums based on objects it identifies, including "food" and "landscapes."
Unfortunately, that object database not only included wild animals but also conflated them with humans—specifically, on Monday, when an African-American man looked in his Google Photos collection and discovered an automatically generated album of him and his black female friend labeled "gorillas."[/QUOTE]
Someone's been a rustlin'.
They need to stop being such [del]baboons[/del] buffoons.
I had a friend whose toddler called black people monkeys :v:
In the supermarket one day; "DADDY LOOK! MONKEYS!"
[QUOTE=FreyasFighter;48093491]I had a friend whose toddler called black people monkeys :v:
In the supermarket one day; "DADDY LOOK! MONKEYS!"[/QUOTE]
I did that as a kid too; in my defense, you see about one black person every two months around these parts.
you can't make this shit up
this is how you know there weren't any black people in the office during development
Are we entirely sure that these weren't gorillas in human suits and the app wasn't trying to warn us?
AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird; no one trained the algorithm to classify black people as gorillas on purpose.
[QUOTE=bunguer;48093961]AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird.[/QUOTE]
And you'd expect the person who complained to realise this, given he's, you know, a programmer.
But no, gotta signal strength or something and get angry at everything.
[QUOTE=itisjuly;48093532]I did that as a kid too; in my defense, you see about one black person every two months around these parts.[/QUOTE]
When I was really young (4 or 5) I asked my black friend why he had a chocolate face. Our mothers were both present and my mother cringed badly.
Thinking back on it makes me want to punch myself in the balls.
No reason to apologize, but I see why they did (PR and all that)
[QUOTE=GrizzlyBear;48094000]When I was really young (4 or 5) I asked my black friend why he had a chocolate face. Our mothers were both present and my mother cringed badly.
Thinking back on it makes me want to punch myself in the balls.[/QUOTE]
If I wanted to punch myself in the balls over all the stupid things I did and said as a kid I'd be sterile now.
[QUOTE=bunguer;48093961]AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird; no one trained the algorithm to classify black people as gorillas on purpose.[/QUOTE]
thing is, the 'AI' could be trained on scraped webpage data;
there's always a chance the dataset includes a racial bias
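Something like this, say. A toy Python sketch of the problem (the numbers are completely made up; nobody outside Google knows what their actual training set looks like):

[code]
from collections import Counter

# Hypothetical label counts from a scraped dataset - purely illustrative.
labels = (["white_person"] * 9500
          + ["black_person"] * 300
          + ["gorilla"] * 200)

counts = Counter(labels)
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n} ({n / total:.1%})")

# A class sitting at 3% of the data gives the model far fewer examples of
# within-class variation (lighting, skin tone, angle) to learn from, which
# is exactly the kind of sampling bias that produces mistakes like this one.
[/code]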
[QUOTE=DrTaxi;48093978]And you'd expect the person who complained to realise this, given he's, you know, a programmer.
But no, gotta signal strength or something and get angry at everything.[/QUOTE]
If I got labelled an animal by facial recognition software I'd probably ask the developers too. I mean, the guy even asked what sort of sample image data they'd used to develop the image recognition. What I don't see is the guy running to every news site and saying 'LOOK GOOGLE IS RACIST!'. I think you're jumping the gun a bit.
[QUOTE=bunguer;48093961]AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird; no one trained the algorithm to classify black people as gorillas on purpose.[/QUOTE]
What does this have to do with political correctness?
[QUOTE=catbarf;48094115]If I got labelled an animal by facial recognition software I'd probably ask the developers too. I mean, the guy even asked what sort of sample image data they'd used to develop the image recognition. What I don't see is the guy running to every news site and saying 'LOOK GOOGLE IS RACIST!'. I think you're jumping the gun a bit.[/QUOTE]
No, but he did seem upset.
Though I was being somewhat facetious; if you feel it's the expression itself, not the intent, that's responsible for the harm in an insulting/offensive statement (and I'm [I]pretty sure[/I] this guy does), then of course you'd be upset about this like you would be about any other harm a bug causes.
[QUOTE=JohnnyMo1;48093944]Are we entirely sure that these weren't gorillas in human suits and the app wasn't trying to warn us?[/QUOTE]
This would make a good movie plot
[QUOTE=bunguer;48093961]AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird; no one trained the algorithm to classify black people as gorillas on purpose.[/QUOTE]
It isn't deliberate, but it's still a sign of shoddy testing. Like, did they not test any photos of black people or what?
Same year as searching Nigga House leading to the White House too :V
[QUOTE=CapellanCitizen;48094352]It isn't deliberate, but it's still a sign of shoddy testing. Like, did they not test any photos of black people or what?[/QUOTE]
Maybe they didn't have any within close reach, and they didn't bother?
[editline]1st July 2015[/editline]
[QUOTE=bunguer;48093961]AI doesn't know or care what political correctness is; it simply mislabelled the photo. If it had mislabelled a dog as a wolf, it would still be an impressive algorithm and no one would bat an eye. Having to apologize for something like that is weird; no one trained the algorithm to classify black people as gorillas on purpose.[/QUOTE]
What the fuck is with that political correctness shit? Why did you put that?
Aren't we like 5% away from being gorillas? You can say their algorithm was 95% right.
[url]http://googleresearch.blogspot.dk/2015/06/inceptionism-going-deeper-into-neural.html[/url]
This should help explain why it's having these sorts of issues, but I'm glad Google didn't say something like "you're holding the phone wrong/posing wrong."
I like how the official Google response started with "holy fuck".
Makes it seem like they actually care about getting this right instead of simply covering their ass like most companies would.
[QUOTE=CapellanCitizen;48094352]It isn't deliberate, but it's still a sign of shoddy testing. Like, did they not test any photos of black people or what?[/QUOTE]
I'm somewhat unconvinced that Google would make the beginner's mistake of training against [I]no[/I] photos of black people.
I also doubt this happened to more than an extremely tiny minority of photos of black people; otherwise, given how insanely large Google's userbase is, it would've been noticed within minutes of launch. But it's been more than a month since then.
For all we know, the probability of the algorithm identifying a black person as a black person and not a gorilla could be 99.9999%. That's a damn accurate classifier. But it only takes this one mess-up.
No, Google's mistake wasn't not training or testing enough. Their mistake was including categories that could conceivably cause offence if a subject was incorrectly put into them.
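To put rough numbers on both points (the volume and accuracy below are invented, and the blocklist is just my guess at the sort of mitigation they'll end up shipping):

[code]
# Back-of-envelope: a 99.9999% accurate classifier still fails at Google scale.
photos_per_day = 200_000_000            # hypothetical upload volume
accuracy = 0.999999                     # the hypothetical figure from above
print(photos_per_day * (1 - accuracy))  # roughly 200 misclassifications a day

# And the "don't include risky categories" fix: drop labels whose false
# positives would be offensive, no matter how confident the classifier is.
SENSITIVE_LABELS = {"gorilla", "chimpanzee", "monkey"}  # illustrative blocklist

def safe_labels(predictions):
    """predictions: list of (label, confidence) pairs from some classifier."""
    return [(label, conf) for label, conf in predictions
            if label not in SENSITIVE_LABELS]

print(safe_labels([("person", 0.41), ("gorilla", 0.38)]))  # gorilla dropped
[/code]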
You computers are all racist
[QUOTE=Zeke129;48094674]I like how the official Google response started with "holy fuck".
Makes it seem like they actually care about getting this right instead of simply covering their ass like most companies would.[/QUOTE]
This was a developer, not a PR guy. He probably is personally embarrassed that his team fucked up. I know I would be.
probably what happened is that they were training the machine learning model on wide shots of people
the algorithm learned to look for clothing, and in the image he showed there's very little clothing visible
the algorithm wasn't able to check off the clothing box on the "people" list
so it looked for the thing that was closest to matching
neural nets are hard
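If that's roughly what happened, the problem is less the net itself and more that it's forced to pick [I]something[/I]. A sketch with invented scores, just to illustrate the argmax issue:

[code]
import numpy as np

labels = ["person", "gorilla", "dog", "cat"]
scores = np.array([0.31, 0.34, 0.20, 0.15])  # made-up softmax outputs;
                                             # nothing here is confident

best = int(np.argmax(scores))
print(labels[best])  # "gorilla" - barely ahead, but argmax picks it anyway

# One plausible mitigation: refuse to tag anything below a confidence floor.
THRESHOLD = 0.6
tag = labels[best] if scores[best] >= THRESHOLD else None
print(tag)  # None - no album at all beats an offensive one
[/code]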
So will SJWs now stop using computers since they're racist?
To further explain what I meant about political correctness: gorillas have been used as a racist insult against black people, but the AI doesn't know this.
The algorithm probably also mistakes people for cats, dogs, and other animals, but since "gorilla" carries racist connotations, I figured political correctness was an easy way to describe the "drama".
In the eyes of the users it's worse to mislabel black people as gorillas than it would be to mislabel them as, say, lions.
I understand why it pissed off a couple of people, but as someone whose job is this type of work, I wouldn't want to get raked over the coals on social media because an algorithm made a mistake that's understandable from a technical point of view.
[QUOTE=JohnnyMo1;48093944]Are we entirely sure that these weren't gorillas in human suits and the app wasn't trying to warn us?[/QUOTE]
[IMG]http://www.sbmania.net/pictures/56b/436.png[/IMG]
?