• Flickr's new auto-tag function labeling black people as "ape" and concentration camps as "sport"
    10 replies, posted
[QUOTE]Flickr's new auto-tagging feature is generating some pretty offensive results. The photo sharing site, which is owned by Yahoo (YHOO, Tech30), launched image-recognition software two weeks ago that automatically creates tags for photos. The purpose was to make images more searchable in the system. For example, a black and white photo of London's Tower Bridge was automatically labeled "architecture," "outdoor," "city," "monochrome" and "skyline." But computer algorithms aren't perfect, and when they identify images incorrectly, the results can be disastrous. Some concentration camp photos received inappropriate tags, including "sport" and "jungle gym." Flickr had also been tagging some images of people as "ape" and "animal," including a photo of a black man named William taken by photographer Corey Deshon, according to the Guardian. The photo service had also labeled a white woman wearing face paint as "ape" and "animal," so Flickr's algorithm does not appear to be taking a person's skin color into consideration when auto-tagging them.[/QUOTE] [url]http://money.cnn.com/2015/05/21/technology/flickr-racist-tags/index.html[/url]
[quote] The photo service had also labeled a white woman wearing face paint as "ape" and "animal," so Flickr's algorithm does not appear to be taking a person's skin color into consideration when auto-tagging them.[/quote] So calling it racist is somewhat unfair because technically the algorithm is just being honest :v:
[QUOTE=Talishmar;47773181]So calling it racist is somewhat unfair because technically the algorithm is just being honest :v:[/QUOTE] Everyone just wants to be offended these days.
[QUOTE=revan740;47773235]Everyone just wants to be offended these days.[/QUOTE] [quote][b]Flickr's new auto-tags are racist and offensive[/b][/quote] Judging from this clickbait-as-fuck title, I'm inclined to agree with you. Humans look similar to apes because we share many features with them, especially humans whose skin tones are close to apes' skin tones. Concentration camps have big grassy fields, fences, and big concrete buildings. That's vaguely similar to a plethora of things. It's no surprise that an automatic algorithm would make a mistake there. Lo and behold, the programmers didn't think to put in a "potentially offensive" filter (as if they could cover everything that could be offensive anyway). Not that anyone should bother being offended by this shit anyway. There's literally no racist or offensive intent behind it, it's a simple and explainable mistake. CNN is just stirring up shit for people who are actively trying to be offended.
[QUOTE=revan740;47773235]Everyone just wants to be offended these days.[/QUOTE] Finding a reason to throw crap at a person or a group of people has been a trend for a long time, and it has only flared up since the internet came into being. Tricking SJWs into temper fits is hella lotta fun though.
These things do tend to happen with a new search system. People get offended too easily today.
[QUOTE=Gar92;47773622]These things do then to happen with a new search system, people get offended too easily today.[/QUOTE] People enjoy controversy. That's about it.
arbiet macht swole
Apparently robots and machines can now have racist thoughts
[QUOTE=SenhorCreeper;47774154]Apparently robots and machines can now have racist thoughts[/QUOTE] The Three Laws of Robotics: 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. 3. Kill minorities, especially John, his weird ethnic food stinks up the fridge.