[quote]“It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”[/quote]
"I'm sorry" says poor, disheveled and empty shell of a man as he wipes away crocodile tears on $100 bills.
https://www.youtube.com/watch?v=15HTd4Um1m4
That's all I can think of when the head of a multi-billion dollar company "apologises".
They've already done experiments showing that most people who are given more money in a Monopoly game become less likely to help others by giving them money. I'm sure it would be even more extreme with real dosh. Plus zucc is extremely antisocial and probably has extreme narcissism, judging by how he acted in college.
Hey now, I'm told I'm a purdy fart smellar and I ain't a jerk.
https://medium.com/@chrissaad/do-not-deletefacebook-because-of-cambridge-analytica-e8a2eec44730
Unpopular opinion here:
Rather, Cambridge Analytica purchased the data from another company who had built a Quiz app that went viral. This Quiz app had the data because they built something that many users decided was fun and interesting to try. Upon hearing about it, they each instructed Facebook to share their data (and some of their friends' data) with the Quiz app. This was something each user asked for, and Facebook — as part of their compliance with the spirit of data portability and user rights — did what it was told. Facebook did not, and does not, charge for this.
Further, the data was transferred from Facebook to the Quiz app under clear and standardized Facebook data usage policies that state that the Quiz app (and any other 3rd party app that engaged in the same process) must not store the data long-term or sell it to 3rd parties. These policies are intended to (in so far as is possible) protect users from unforeseen uses of their data.
So what, in fact, has happened is that Facebook users trusted a 3rd party app with their data, and that app violated their trust.
Like with most things, there is a tension between Freedom/Rights and Responsibilities. Data portability gives users the right to share their data with apps they trust. But it also gives them the responsibility to choose apps worthy of that trust.
...
Many are upset about the notion that if an app is free, then it is likely selling you and your data to other companies — in particular, advertisers.
This is only partly true. Crucially, selling your Data and selling your Attention to advertisers are two different things.
Selling your data involves companies giving large portions of their data-set wholesale to other companies without your explicit permission. To be clear, this is not data portability because it is not done at the user’s request or with their direct consent. It is also not what Facebook did in this case.
However it is exactly what the Quiz app did! They wholesale sold the data they obtained from Facebook (with user permission) to Cambridge Analytica (without user permission). This is against FB’s Terms of Service.
Selling your attention is different. Your data is never actually given to the advertiser. Instead, Facebook, Google, Twitter etc, each have tools that allow companies to upload and target ads at you based on your profile data. The advertiser never actually gets your data, though. This is common and accepted practice for free online apps (not to mention radio stations, TV networks, roadside billboards, magazines etc).
I think the real criticism here is a security issue: Facebook users should not have the ability to share their friends' data beyond mere association ("John so-and-so is my friend", but nothing about his profile). Afaik, Facebook removed this aspect of portability. Moreover, I think Facebook views their platform differently from how its users view it: if an app asks for a Facebook sign-in, then I think many people assume the application has been sanctioned by Facebook, when in fact it has merely agreed to a limited TOS that Facebook implicitly trusts the app not to break.
If this were limited to the individual users of the quiz app, then Facebook's responsibility would lie in not educating users enough about the risks of trusting 3rd party applications, or in not making it very clear that the 3rd party applications you're trusting are not officially sanctioned by Facebook, and that when you agree to share ___ information, they MEAN it.
But, since Facebook's data sharing policies allowed (to what extent I'm not sure) the quiz app to access users' FRIENDS' data, I think that was very poor judgement on Facebook's behalf.
I think Facebook is about as responsible in this regard as Microsoft is for someone getting a virus on their computer due to a security flaw: partially responsible, in that they should have fixed that flaw, but the fact remains that you shouldn't trust every website you visit.
What I think is unfair to Facebook are the wild assertions going around that Facebook is stealing YOUR data and YOUR FRIENDS' data and selling it, with your name and porn habits, in a big database to Madison Avenue. It's not.
I still can't believe people are mad at Zuckerberg for failing to "protect data" that they made public through his service, which explicitly states that your data can and will be sold to advertisers and that it is not private.
I think you're missing the "influencing elections and manipulating public opinion through unfair practices and psychological targeting" part.
I'm not missing anything.
How do you actually prevent that if people are choosing to post the information publicly? Facebook isn't the one that did that.
By doing more than saying 'hey stop that' on the honor system when people decide to become bad actors.
How do you propose that this be detected in any way, shape, or form?
They already detected it, so obviously they're aware at least some of the time when the information is used in bad faith. Presumably there's some metadata attached to the information they give out to ensure it doesn't get passed along or misused against their agreements.
The cease and desist was for violating the terms they agreed to, but it's not as though Facebook somehow knew in advance that they were working as a bad actor.
What else were they to do? They can (and do) revoke API keys if they continue to detect such behavior.
Violating the terms shows that they were working as a bad actor.
They could sue them for breach of contract, which they did breach.
You know that suing TOS violators is not even remotely reasonable for a company that size. Also, implying that it was somehow their responsibility to get involved in a legal battle over a breach of TOS is a bit nutty.
How do you "return data"?
That's super not how it works, unless you're referring to some extremely specific, esoteric data transfer method they used that I'm unaware of, like an encrypted non-replicable suitcase (something like how digital movies are transferred). But as far as I know, that's not the case.
Yeah, they probably should. But they also weren't obligated to. We have literally no idea how much Facebook actually knew about the situation and/or the scope of the violation. Also, we have no idea if Cambridge Analytica managed to "prove" that they were using the data in a legitimate way. They very well might have, seeing as they were using a popular game as a layer of indirection.
The Washington Post has compiled a list of every time Zuckerberg has apologised for this stuff since 2003:
https://www.washingtonpost.com/graphics/2018/business/facebook-zuckerberg-apologies/?utm_term=.4dc4e2dd61f5
Also, this seems like the best opportunity I'll get to share this: I posted a writing prompt in the FP discord the other day, 'what if garry ran Facebook instead of FP', and @Milk came up with this:
https://i.imgur.com/Z02rFmS.png
Sorry for being caught
Going from average/poor to having way too much money fucks you up, unironically. Money changes people. Their decisions shift away from an empathic humanist position (if it was there in the first place) and into more selfish views that prioritize wealth and hoarding. It's one thing to be born rich, because you're born into the lifestyle and hopefully raised to appreciate the needs and struggles of those less fortunate than you. But becoming newly rich in adulthood can exert deeply powerful changes on a person's thoughts and behaviour, and it also strongly affects the behaviour of others who learn that this person in their life has become massively wealthy.
Dragons hoarding their treasure and killing anyone who threatens even a single coin of their wealth ain't just a fairy tale trope, it's a warning metaphor.
If I were to guess, it stems from the undeniable fact that whenever you become freshly minted wealthy everyone around you suddenly starts going 'HEY WHERE'S MY SHARE?!'. I know damn sure that's why I'd hoard mine if it happened to me, I ain't givin' handouts to everyone who's got even a tangential connection to me. 'If you don't live in my house you don't have any claim to my wealth so fuck right the hell off with your begging' is the way I'll see it.
The hearing is underway
https://i.imgur.com/74Su9IJ.jpg
https://i.imgur.com/Xk2FARk.jpg
I don't know of any specific studies as I'm not one to pore through scientific journals I'll barely understand (also I'm lazy). This blogpost links to a few studies and also covers some of the factors and why: https://www.moneycrashers.com/money-changes-people-affect-behavior/
This is a Medium-like site and so basically a blog post, but it also describes how things change: https://steemit.com/money/@theroadtoriches/top-10-things-that-change-once-you-become-rich
TIME ran an interview/article in 2010 with the co-author of a study who found that the richer you are, the less empathy you seem to have, and the worse you are at reading facial expressions for emotional information (literally I'm too rich to need to care about other peoples' problems because I don't need their help with mine): The Rich Are Different
And this TED post contains links to a few studies: 6 studies of money and the mind | TED Blog
Finding #1: We rationalize advantage by convincing ourselves we deserve it
Fox just switched to Huckabee Sanders as Sen. Feinstein started mentioning Cambridge Analytica's CEO, claiming it was involved in something I presume was related to Steve Bannon, but I can't say for sure because they changed the streams over...
https://www.youtube.com/watch?v=6ValJMOpt7s
I feel like I should take a longer walk after work to visit the train station here in Chicago and take some pictures. I was in there Sunday and EVERY advertisement board and poster is a Facebook ad with ominous statements like the below, all '[FAKE NEWS/SPAM/SCAMS] are not your friends' and the like. It's like a scene out of a weird dystopian movie. The ads are spreading through the city alarmingly fast, on bus stops and billboards.
https://i.imgur.com/jnJ6U3l.jpg
https://i.imgur.com/IFgQO9E.png
apparently you need to hire more Burmese translators to take this off Facebook