This is some fucking Orwellian garbage being presented under the guise of "the greater good"
Am I understanding this correctly? Is this Google guy suggesting we stop thinking of our personal information as private because greater good?
Finally, we can outsource our decisions to google!
This would be great for humanity if it were controlled by someone with humanity's best interest at heart. (No one)
He seems to want it to be something like a behavioral profile that gets passed to your kids and other people so that they can learn about you, I think? The reality is that most communications will probably be public record 100 years from now, but I really don't know whether this whole 'ledger AI' is a good idea at all. It seems overly convoluted and utopian.
Okay, what's the deal with the music here? Is this Google's acoustic mind control for its employees? Why the hell would they choose something so ominous for their evil Orwellian plan memo?
https://www.youtube.com/watch?v=4NTIfTzq9XQ
I knew that tone from somewhere.
But why did they repeat that part of the tone constantly throughout the video?
a pure AI would be alright if done properly
You're all selfish for wanting to protect your privacy, we need to break away from "privacy" so that advertising is easier.
They are just trying to make this shit sound good so people accept extreme datamining which means more profit for Google.
In summation...
Fuck Google.
You're saying with a straight face that an Ivy League engineer who's never even seen actual poverty or strife knows what's best for the daily life of someone working two jobs and weekend stints just to keep food on the table and the mortgage under control?
Also, the notion that Google would never misuse the data they scrape was blown out of the water YEARS ago.
fuck right off.
While I don't disagree with "fuck Google" in more ways than one, the idea is that an AI (not the engineers that made the AI) would have a concept of what might best help someone in such a position, particularly under the assumption that both a multi-generational and multi-person approach is used.
That AI (what I'll call, for lack of a better term, an ubermind) would reach conclusions that no individual could come up with, considering far more information than anyone or anything today has access to.
The engineers would not know how the system arrives at some conclusion. It isn't an "if-then" type thing.
Kinda good video on machine learning:
https://www.youtube.com/watch?v=R9OHn5ZF4Uo
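To illustrate the "not if-then" point, here's a toy sketch (everything in it is made up for illustration: the "hours online" feature, the labels, all of it). A hand-written rule is something an engineer can read; a trained model ends up as opaque numbers that happen to produce the same answers, with nobody having written the rule down.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hand-written rule: transparent "if-then" logic an engineer wrote.
def rule_based(hours_online):
    return hours_online > 5

# Learned model: fit a weight and bias from (input, label) pairs
# with plain gradient descent on a logistic loss.
data = [(1, 0), (2, 0), (4, 0), (6, 1), (8, 1), (9, 1)]
w, b = 0.0, 0.0
for _ in range(5000):
    for x, y in data:
        p = sigmoid(w * x + b)
        w -= 0.1 * (p - y) * x  # the "why" lives only in these numbers
        b -= 0.1 * (p - y)

def learned(hours_online):
    return sigmoid(w * hours_online + b) > 0.5

# Both agree on the training data, but only one is inspectable.
print(all(learned(x) == bool(y) for x, y in data))  # True
print(w, b)  # opaque parameters, not a readable rule
```

Scale that from one made-up feature to thousands of real ones and millions of parameters, and you get why even the people who built the system can't point to the line of code that made a given decision.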
And that is the chilling bit to me. Not that it is possible in theory (I think it probably is with enough personal information, computing power, etc). But that we are close to a society where every facet of our day to day is logged and analysed to such a degree - inching closer to the birth of said ubermind.
More chilling: the video's implicit goal felt altruistic (kinda), but what if the objectives for such an ubermind were "further Google's goals and aspirations", or maybe "marginalize this demographic", or "place societal pressure on those who exhibit X behavior"?
Think about it: the "worker bees" or "ants" (the video's metaphor for humans) mindlessly or carelessly followed suggestions that effectively suppressed some segment of the population or some of Google's competitors. The bees don't realize they're weaponized for the goals of the ubermind, but if most people buy brand X instead of Y, or visit restaurant A instead of B, the ubermind can measure success or failure against its goals.
So yeah, @27X, I agree "fuck Google" - but not because they're pretentious enough to assume they know the struggles of mankind. Because they might seek such an ubermind that they control.
It's inevitable. It'll be done behind a curtain if people don't accept it, so it's in their interest that people get more and more comfortable with it.
I'm not excited for the future
Why the fuck didn't we prepare for this? This is the logical extreme of a lack of properly implemented individual privacy laws.
Obligatory MGS2 Prediction.
https://www.youtube.com/watch?v=eKl6WjfDqYA
I would rather disconnect from everything than ever give up my privacy. We need data protection laws. Hell, you wanna scare 'em off this? Make them pay us for the data they take.
And then they actually start paying us something for the data - how many people will happily sell it, thinking they don't actually have anything of value (haha, I ripped off Google)? Millions of people sacrifice privacy on a daily basis one way or another - for free, without so much as a second thought. Give them a financial incentive, even a vague one, and they'll fucking do it in a heartbeat; if anything, Google knows that better than anyone.
Laws are absolutely essential here.
I highly doubt they would ever be as successful as they hope to be.
Until I stop getting Google ads for the exact shirt I bought online last week, I won't worry.
Yeah, we should all just give up now and make it easier for them instead.
I hope you're not implying that's what I'm saying, because I specifically say it's "in their interest". Not ours of course. We should do what we can to keep our individual rights protected.
We here at Google aren't exactly sure what we can get away with in the public eye. So you tell us!
20 to 30 year olds now are probably going to be the last generation in America to truly care about privacy rights.
Worse: somewhere in the video it posits that the point of human existence is to create sellable data.
Can't wait for this to be shown in schools. Please, European Union, stop this Black Mirror horseshit from becoming real.
The Bill of Rights is just a suggestion now
The Bill of Rights doesn't say anything about corporations, or really anything about privacy except the Fourth.
Does anyone else feel a little physically ill after watching this?
The way you were talking about how it's inevitable definitely made it sound like that's what you meant, at least to me.
https://www.youtube.com/watch?v=ubU-dB8B-94