TED: We're building a dystopia just to make people click on ads
[media]https://www.youtube.com/watch?v=iFTWM7HV2UI[/media]
The impression I had from the talk is essentially that the algorithms used to target specific demographics with specific ads on social media can be (and at least to an extent [I]are[/I]) used to produce an authoritarian-like society, in a world where we all know and can recognize the signs of authoritarianism, by subtly yet very effectively tailoring - for lack of a better word - people's views, opinions and political activity.
China is a step ahead of everyone with their [url=https://en.wikipedia.org/wiki/Social_Credit_System]Social Credit System[/url]
Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?
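Something like this, as a rough sketch. Everything here is hypothetical (the decoy list, the function names); a real scrambler would also need to actually load each URL from your browser so trackers see the traffic:

```python
import random

# Hypothetical pool of unrelated decoy sites; a real scrambler would need
# thousands of varied URLs to meaningfully skew a tracking profile.
DECOY_SITES = [
    "https://example.com/news",
    "https://example.org/sports",
    "https://example.net/cooking",
    "https://example.com/travel",
]

def scramble_plan(visits, rng=None):
    """Build a randomized visit schedule. A browser extension would then
    actually load each URL, burying your real interests in noise."""
    rng = rng or random.Random()
    return [rng.choice(DECOY_SITES) for _ in range(visits)]
```

The catch, as pointed out later in the thread, is that uniform random noise is easy to filter out statistically, so your genuine browsing patterns could still be recovered.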
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
I want to say that on top of this, military applications and cost-cutting measures - replacing editors, researchers and more - are where most AI effort goes.
AI has the potential to help, but it's being designed to sue us, kill us, scam us and replace us. These are the exact opposite of what many thinkers like Asimov had in mind for AI.
They're supposed to be companions in the cold, uncaring universe.
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
As far as I can tell, yes, you could. On the other hand, what's the point? All that's gonna do is make sure that ads and the like remain as irrelevant to you as if you were watching TV.
Could you use it on a grander scale, say as a concerned, well-meaning government making it automatic for all the citizens in your country? Maybe, but to achieve what?
[QUOTE=MrJazzy;52901303]As far as I can tell, yes, you could. On the other hand, what's the point? All that's gonna do is make sure that ads and the like remain as irrelevant to you as if you were watching TV.
Could you use it on a grander scale, say as a concerned, well-meaning government making it automatic for all the citizens in your country? Maybe, but to achieve what?[/QUOTE]
I dunno really, privacy's sake? It'd essentially make anyone using it an online grey spot.
[QUOTE=Trebgarta;52901309]
This isn't just about what you visit online and definitely not only about people who are conscious enough to do such a thing[/QUOTE]
I'm completely aware that it's not, but it's just a neat thought.
[QUOTE=duckmaster;52901326]I dunno really, privacy's sake? It'd essentially make anyone using it an online grey spot.[/QUOTE]
Sure but why would you want that? That'd defeat the purpose of the algorithms in the first place, and the purpose isn't necessarily or at all sinister. And the algorithms could potentially be altered to account for the scrambler.
I don't think it's a bad idea, I've just never heard of it before so I'm wondering if it could really be a practical way to fight the algorithms - I mean... should we fight the algorithms?
[QUOTE=MrJazzy;52901360]Sure but why would you want that? That'd defeat the purpose of the algorithms in the first place, and the purpose isn't necessarily or at all sinister. And the algorithms could potentially be altered to account for the scrambler.
I don't think it's a bad idea, I've just never heard of it before so I'm wondering if it could really be a practical way to fight the algorithms - I mean... should we fight the algorithms?[/QUOTE]
Anything that tries to predict what you're going to do should be destroyed.
[QUOTE=SunsetTable;52901798]Anything that tries to predict what you're going to do should be destroyed.[/QUOTE]
Like studies?
[QUOTE=MrJazzy;52901070]
The impression I had from the talk is essentially that the algorithms used to target specific demographics with specific ads on social media can be (and at least to an extent [I]are[/I]) used to produce an authoritarian-like society, in a world where we all know and can recognize the signs of authoritarianism, by subtly yet very effectively tailoring - for lack of a better word - people's views, opinions and political activity.[/QUOTE]
This is why I'm quite confounded by people who say they don't care about, or are even happy with, companies swapping and collecting tons of personal data without permission or transparency.
We don't have to go full tinfoil and reject all data collection; a lot of it is very important, useful, and safe. But current practices are dangerous as hell.
[QUOTE=SunsetTable;52901798]Anything that tries to predict what you're going to do should be destroyed.[/QUOTE]
time to kill everyone around me
[QUOTE=thelurker1234;52901852]This is why I'm quite confounded by people who say they don't care about, or are even happy with, companies swapping and collecting tons of personal data without permission or transparency.
We don't have to go full tinfoil and reject all data collection; a lot of it is very important, useful, and safe. But current practices are dangerous as hell.[/QUOTE]
Most importantly, consumers should really understand that the real money is in their data. Why should we be paying [i]them[/i] to collect our data, when that's where they make their money anyway?
Realistically speaking, they should be paying me for using their products at this point. And no, this is not a joke.
[editline]17th November 2017[/editline]
Some companies are doing something similar to this right now too. It would be bizarre in a world where companies sell data after [I]buying their consumers[/I].
[QUOTE=kharkovus;52901853]time to kill everyone around me[/QUOTE]
lol damn i was thinking the same thing
When she started talking about how YouTube selects videos for the "up next" queue based on its profile of you, I looked over and saw a TED video titled "Inside the Mind of a Master Procrastinator" or something along those lines.
Not sure if I should be offended or not.
[editline]18th November 2017[/editline]
Maybe if I sleep on it I'll know by tomorrow.
wow
we're really heading for a legitimate dystopian cyberpunk future
Great to know that social media is slowly ending up like state media but more powerful
[QUOTE=Killuah;52901149]China is a step ahead of everyone with their [url=https://en.wikipedia.org/wiki/Social_Credit_System]Social Credit System[/url][/QUOTE]
i was going to say "isn't that a black mirror episode" but it's literally referenced in the article
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
Wouldn't that just generate noise? I imagine that you should be able to still draw patterns from actual, legitimate use.
So a YouTube comment brought up an interesting idea. I'm rewording and expounding on it a bit here, and I'm using the term "Internet" to refer to the collective Internet itself and the algorithms operating on it:
"The Internet is made to solve our problems. If we live in a utopia, where we have no problems, then the Internet has no purpose. And what better way is there to have problems to solve, than to [b]create[/b] problems to solve?"
It's an interesting idea that I think might have some merit to it.
Machine learning, at its heart, is all about maximizing (or minimizing, which is just inverted maximizing) certain objectives. A model learns quickest when there is a large gradient between its guesses and the desired (maximized) result: the larger the contrast, the more extreme the change.
If these algorithms have determined - whether through the explicit intent of their programmers or through some unnoticed bias in their design that gravitates toward this end - that the best way to maximize their objectives is to harvest the most extreme gradients, the biggest changes in the data, then it stands to reason that they could also have determined that the best way to do so is to [b]generate[/b] or [b]aggravate[/b] those extreme gradients.
And I can think of few things that generate extreme gradients faster than extremism itself, be it the obvious, like politics, or more obscure things like opinions on veganism or YouTuber drama. And so the algorithms, in pursuing these extreme gradients, push these examples of extremism to the forefront of the collective consciousness, inspiring polarization in the public and, in doing so, reaping the extreme gradients that polarization generates.
I'm not saying it's right, or that this is what's happening.
But I am saying it's an interesting thought to entertain, in my opinion.
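To make the gradient intuition concrete, here's a toy caricature. Everything in it is made up (the engagement function, the single weight); no platform runs anything this simple. The point is only that if simulated engagement grows with how polarizing content is, plain gradient ascent drifts toward recommending ever more polarizing content:

```python
# Toy illustration: a one-weight "recommender" doing gradient ascent on
# engagement. Engagement is simulated so polarizing items provoke stronger
# reactions, and the update rule naturally drifts toward favoring them.

def simulated_engagement(polarization):
    """Stand-in for user behavior: assume reactions (positive OR negative)
    scale with how polarizing the content is."""
    return polarization ** 2

def learn_preference(steps=100, lr=0.1):
    w = 0.1  # how strongly the recommender favors polarizing items
    for _ in range(steps):
        # gradient of engagement(w) = d/dw (w^2) = 2w:
        # the more polarizing the recommendations already are,
        # the bigger the next update in the same direction.
        grad = 2 * w
        w += lr * grad  # gradient ascent: maximize engagement
    return w
```

With these made-up assumptions, `w` only ever grows: the system's preference for polarizing content compounds with every step, which is the feedback loop the post is describing.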
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
Someone already did. I don't remember the name, but it was a Chrome extension that got so popular it was causing problems for Google: people using it would register clicks on all ads without actually visiting them, producing profiles that were basically "this person likes EVERYTHING", which isn't very useful for targeted advertising. Google ended up blocking it to some extent.
[QUOTE=CommanderPT;52902737]Someone already did. I don't remember the name, but it was a Chrome extension that got so popular it was causing problems for Google: people using it would register clicks on all ads without actually visiting them, producing profiles that were basically "this person likes EVERYTHING", which isn't very useful for targeted advertising. Google ended up blocking it to some extent.[/QUOTE]
It's called AdNauseam. Google removed it from the Chrome store but it can still be installed manually.
It's horrifying to imagine there's content on the internet that I simply cannot see.
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
[url]https://chrome.google.com/webstore/detail/noiszy/immakaidhkcddagdjmedphlnamlcdcbg[/url]
[QUOTE=duckmaster;52901263]Theoretically, couldn't you make a scrambler that takes your online profile and visits thousands of random websites to skew the image they have of you?[/QUOTE]
Isn't [url]https://adnauseam.io/[/url] something like this?