Exhibit 170 – Mark Zuckerberg discussing linking data to revenue
Mark Zuckerberg email – dated 7 October 2012
‘I’ve been thinking about platform business model a lot this weekend… if we make it so devs
can generate revenue for us in different ways, then it makes it more acceptable for us to
charge them quite a bit more for using platform. The basic idea is that any other revenue
you generate for us earns you a credit towards whatever fees you owe us for using platform.
For most developers this would probably cover cost completely. So instead of everyone paying
us directly, they’d just use our payments or ads products. A basic model could be:
Login with Facebook is always free
Pushing content to Facebook is always free
Reading anything, including friends, costs a lot of money. Perhaps on the order of
$0.10/user each year.
For the money that you owe, you can cover it in any of the following ways:
Buy ads from us in neko or another system
Run our ads in your app or website (canvas apps already do this)
Use our payments
Sell your items in our Karma store.
Or if the revenue we get from those doesn’t add up to more than the fees you owe us, then
you just pay us the fee directly.’
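For illustration only, the fee-and-credit model the email describes can be sketched as a small calculation. The $0.10/user/year rate and the free login/publish tiers come from the email itself; the function and variable names below are my own assumptions, not anything Facebook implemented.

```python
# Sketch of the fee-offset model from the 2012 email (illustrative only).
# The $0.10/user/year read fee is quoted in the email; all names and
# structure here are assumptions for illustration.

READ_FEE_PER_USER_PER_YEAR = 0.10  # "Reading anything ... $0.10/user each year"

def annual_platform_bill(reading_users, ad_credit, payments_credit):
    """Return the cash a developer would still owe after credits.

    Login and pushing content are free, so only users whose data the app
    reads incur the fee. Revenue the developer generates for Facebook
    (ads bought or run, payments, Karma store sales) earns a credit that
    offsets the fee; if credits exceed the fee, nothing is owed.
    """
    fee = reading_users * READ_FEE_PER_USER_PER_YEAR
    credits = ad_credit + payments_credit
    return max(0.0, fee - credits)

# An app reading data for 1M users owes $100k/year; $80k of ad spend plus
# $30k of payments revenue would cover it entirely.
print(annual_platform_bill(1_000_000, 80_000.0, 30_000.0))  # 0.0
print(annual_platform_bill(1_000_000, 50_000.0, 0.0))       # 50000.0
```

Under this sketch, "for most developers this would probably cover cost completely" just means their credits usually exceed the read fee, so the direct payment in the last bullet is a fallback, not the norm.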
Call me a shill, but it doesn't sound like Marky Mark and the Zucky Bunch are talking about selling access to your friends' data. They're talking about (in the former) security risks associated with the friends API, and (in the latter) a system that would let developers pay for API access by serving Facebook ads in Facebook-integrated applications.
What alarms me isn't the data-access stuff, because I think that's pretty well documented. It's that Facebook shut down Twitter's API access for no reason other than that they're a competitor in video content, which sounds unethical to me.
So Facebook killed the 6 second video star?
There are a fair few emails in there that cover unethical behaviour: using the API to strong-arm competitors into going along with what Facebook wants, blocking competitors to try to crush them before they get a chance, etc., along with using analytics groups to find competitors to buy up or destroy before they even properly hit the market. It's all really fucking shifty and anti-competitive.
Essentially selling exclusive access to parts of the API that were shut off to everyone else because they're a genuine data-security problem is also really fucking shitty. One rule for us, another for super-corps like Netflix.
If anyone knew this would happen 9 years ago when Facebook hit it big, could this have been prevented?
The following is part of a discussion about giving Facebook's Android app permission to read users' call logs. It is dated 4 February 2015.
Michael LeBeau (Facebook product manager):
"As you know all the growth team is planning on shipping a permissions update on Android at the end of this month. They are going to include the 'read call log' permission... This is a pretty high-risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it."
If you ever wanted proof these corporations will sell you off just for a growth figure... Here you go.
are you offering to time travel and try it?
We knew climate change was going to be a thing for the longest time, did we prevent it? It would've been discounted as nonsense back then.
This shit is worthy of prison tbh
Treating your users like numbers rather than human beings isn't illegal
It absolutely should be.
It being illegal would destroy every corporation lmao
where's the issue
Make it so
I'm surprised nobody cares about this. Like, this is a lot less impactful than I thought it would be. Guess people are okay with their privacy being reaped.
People who get news from Facebook won't see this article lmao.
I know, but people could at least, ya know, try spreading it to other places instead of stopping at the first choice of FB and being complacent.
But what do you even mean by that? Business has been and always will be measured by quantifiable metrics; there is absolutely nothing wrong with companies using statistics to guide their strategies, because that's how strategies work. If you want to have an effective conversation about Facebook being shitty and desperately needing regulation, you need to focus on specific failings and trends, like doing zero due diligence when handing out mass amounts of user data to sketchy sources. "They treat users like numbers!" isn't going to cut it.
The part where they steal users' information, even from other apps, and sell it to everyone would be a start.
Mark's statement on this:
This week a British Parliament committee published some internal Facebook emails, which mostly include internal discussions leading up to changes we made to our developer platform to shut down abusive apps in 2014-2015. Since these emails were only part of our discussions, I want to share some more context around the decisions we made.
We launched the Facebook Platform in 2007 with the idea that more apps should be social. For example, your calendar should show your friends' birthdays and your address book should have your friends' photos. Many new companies and great experiences were built on this platform, but at the same time, some developers built shady apps that abused people's data. In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access.
This change meant that a lot of sketchy apps -- like the quiz app that sold data to Cambridge Analytica -- could no longer operate on our platform. Some of the developers whose sketchy apps were kicked off our platform sued us to reverse the change and give them more access to people's data. We're confident this change was the right thing to do and that we'll win these lawsuits.
At the same time as we were focusing on preventing abusive apps, we also faced another issue with our platform -- making it economically sustainable as we transitioned from desktop to mobile. Running a development platform is expensive and we need to support it. Back when the main way people used Facebook was on computers, we supported the platform by showing ads next to developers' apps on our website. But on mobile, Apple's policies prevent us from letting apps run within Facebook and apps take the whole screen anyway, so we needed a new model to support this platform to let people log in and connect with other apps.
Like any organization, we had a lot of internal discussion and people raised different ideas. Ultimately, we decided on a model where we continued to provide the developer platform for free and developers could choose to buy ads if they wanted. This model has worked well. Other ideas we considered but decided against included charging developers for usage of our platform, similar to how developers pay to use Amazon AWS or Google Cloud. To be clear, that's different from selling people's data. We've never sold anyone's data.
Of course, we don't let everyone develop on our platform. I mentioned above that we blocked a lot of sketchy apps. We also didn't allow developers to use our platform to replicate our functionality or grow their services virally in a way that creates little value for people on Facebook. We restricted a number of these apps, and for others we asked developers to provide easy ways for people to share their content outside of their apps and to Facebook if they wanted.
We've focused on preventing abusive apps for years, and that was the main purpose of this major platform change starting in 2014. In fact, this was the change required to prevent the situation with Cambridge Analytica. While we made this change several years ago, if we had only done it a year sooner we could have prevented that situation completely.
I understand there is a lot of scrutiny on how we run our systems. That's healthy given the vast number of people who use our services around the world, and it is right that we are constantly asked to explain what we do. But it's also important that the coverage of what we do -- including the explanation of these internal documents -- doesn't misrepresent our actions or motives. This was an important change to protect our community, and it achieved its goal.
Yeah that's totally not an attempt to save face by bullshitting people.
I see the zuccbot’s face saving algorithm is functioning adequately.
I don't know if you've seen many privacy debates here on Facepunch but it's distressing just how many people don't seem to have the slightest care in the world about their right to privacy. Many of them seem to subscribe to the rather easy to debunk "if you have nothing to hide, you have nothing to fear" hogwash. (Taxes, medical records, credit card info, SSN, etc. are ALL things people have legitimate reasons to hide and there is absolutely nothing wrong with that. Doesn't mean you're doing anything wrong either.)
I don't know for sure why people think it either, but it's the biggest slippery slope you can imagine when someone says "no worries if you have nothing to hide".
I think people say these things because they've never experienced having their privacy truly ripped from them and affecting their lives. Right now it's just some libertarian stooges selling off info to the Chinese. What's next? That's what we have to worry about. It starts simple and snowballs from there. Careful regulation is required, but this stuff got pretty much no attention. I'm pretty sure Zuckerberg planned it to happen this way so the damage to FB would be minimal. It seems to be working too.
This response is not only a non-answer to the monopolistic practices, it's also a bald-faced lie: Cambridge Analytica had access well into 2016, meaning either you didn't fix shit or you're hoping people will just nod their heads.
RMS be like