• Apple: 'FBI could force us to turn on iPhone cameras and microphones'
[URL]http://www.theguardian.com/technology/2016/mar/10/apple-fbi-could-force-us-to-turn-on-iphone-cameras-microphones[/URL] [quote] If the FBI wins in its case against Apple to help it unlock the San Bernardino killer’s [URL="http://www.theguardian.com/technology/iphone-5c"]iPhone 5C[/URL], [B] it won’t be long before the government forces Apple to turn on users’ iPhone cameras and microphones to spy on them, according to the company’s head of services Eddy Cue.[/B] The FBI has demanded that Apple create custom software that bypasses certain security features of the company’s iOS to allow law enforcement to brute force the passcode of the gunman’s [URL="http://www.theguardian.com/technology/iphone"]iPhone[/URL] 5C. But according to [URL="http://www.theguardian.com/technology/apple"]Apple[/URL], making the modifications necessary in this case would set a dangerous precedent in offering backdoors into users’ smartphones. Cue [URL="http://www.univision.com/organizaciones/apple/apple-el-fbi-se-pone-del-lado-de-los-hackers-en-el-caso-del-iphone-de-san-bernardino"]said to Univision[/URL]: “[B]Someday they will want [Apple] to turn on [a user’s] camera or microphone. We can’t do that now, but what if we’re forced to do that?[/B] “[B]Where will this stop? In a divorce case? In an immigration case? In a tax case? Some day, someone will be able to turn on a phone’s microphone. That should not happen in this country[/B].” The [URL="http://www.theguardian.com/us-news/fbi"]FBI[/URL] is trying to access the locked iPhone of one of the San Bernardino killers and insists it needs Apple’s help due to the software protections built into iOS, which require Apple’s unique signature.
But security expert and NSA surveillance leaker Edward Snowden recently said that the FBI’s assertion that only Apple has the capability to unlock the phone is “[URL="http://www.theguardian.com/technology/2016/mar/09/edward-snowden-fbi-san-bernardino-iphone-bullshit-nsa-apple"]respectfully, bullshit[/URL]”.[/quote] [quote] Other security researchers have said that for Apple to modify the iOS software of the iPhone 5C to allow the FBI to guess the passcode via a machine without losing data [B]would be a slippery slope.[/B] [B]Apple has the backing of the majority of the technology industry, including Microsoft, Twitter, Facebook and Google[/B], which makes the most-used smartphone operating system, Android. The case will come to a head this month when Apple and the FBI go to federal court to contest the order. [B]The US government was recently dealt a blow in a similar but unrelated case over the [URL="http://www.theguardian.com/technology/2016/feb/29/apple-fbi-case-drug-dealer-iphone-jun-feng-san-bernardino"]unlocking of an iPhone in New York[/URL],[/B] which it is [URL="http://www.theguardian.com/technology/2016/mar/07/apple-iphone-new-york-case-fbi-encryption"]currently appealing[/URL] against.[/quote]
There were already reports that government agencies have so much of our data that they can't even process it all. Even if I agreed with this, it's dumb to add even more work if we can't deal with what we already have.
Dang, what will it be next? That Wayne Industries audio-imaging cell phone botnet in Batman?
[QUOTE=Lobotmik;49906223]Dang, what will it be next? That Wayne Industries audio-imaging cell phone botnet in Batman?[/QUOTE] well i'm not saying they have exactly that buuuuut.... the NSA DOES have a massive metadata-crunching machine that can identify whether a person is a potential terrorist
I was under the assumption that they already do this, they just don't advertise it.
[QUOTE=Sableye;49906235]well i'm not saying they have exactly that buuuuut.... the NSA DOES have a massive metadata crunching machine that can identify whether a person is a potential terrorist by massive data crunching[/QUOTE] The idea of leaving it to a computer to decide if you are the enemy or not is way too disconnected. It probably does save time but it seems just so open for error.
[QUOTE=pentium;49906374]The idea of leaving it to a computer to decide if you are the enemy or not is way too disconnected. It probably does save time but it seems just so open for error.[/QUOTE] It's not a decision that you're an enemy, it's a flag that tells them to look further. We had an incident at work a while back where someone was saying threatening stuff on an open facebook group and homeland security's systems flagged that individual automatically, they reviewed the case, saw it as legit, and contacted our company to warn us and whatnot. I'm not particularly for active human monitoring but if open channels are automatically monitored for keyphrases and shit that get properly reviewed by people it has the potential to mitigate a lot of possible crises like the above without being too invasive.
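For anyone curious what "automatically monitored for keyphrases and reviewed by people" might look like under the hood, here's a toy sketch. Everything in it (the watchlist phrases, the example posts, the output shape) is made up for illustration; real systems are obviously far more sophisticated than substring matching.

```python
import re

# Hypothetical watchlist; a real system would use far richer signals.
WATCHLIST = ["bomb threat", "shoot up", "kill everyone"]

def flag_posts(posts, watchlist=WATCHLIST):
    """Return posts that contain any watchlist phrase, queued for HUMAN review.

    A hit is a pointer saying "look closer", not a verdict.
    """
    flagged = []
    for post in posts:
        hits = [phrase for phrase in watchlist
                if re.search(re.escape(phrase), post, re.IGNORECASE)]
        if hits:
            flagged.append({"post": post, "matched": hits})
    return flagged

# The flag only creates a review-queue entry; a person decides what happens next.
queue = flag_posts([
    "anyone want pizza later?",
    "i swear i'm gonna Shoot Up this place",
])
```

The point of the design is exactly what the post above says: the machine only narrows millions of posts down to a handful a human can actually read.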
This is a confusing matter, imo. Only because they (Apple, MS, FB, Google) stand so strong "for the people" in this case... yet all of them constantly monitor what you do to try and sell you shit based on it. It's like they're masking their own spying by calling out the FBI, and people will eat that shit up. Unless I'm missing something here.
I am suspicious too about this whole shit.
[QUOTE=Robman8908;49906705]This is a confusing matter, imo. Only because they (Apple, MS, FB, Google) stand so strong "for the people" in this case... yet all of them constantly monitor what you do to try and sell you shit based on it. It's like they're masking their own spying by calling out the FBI, and people will eat that shit up. Unless I'm missing something here.[/QUOTE] You're not missing anything. They hate handing over their own stuff for free, and people love their privacy, so you make a case about privacy to gain public support while serving your internal goals.
[QUOTE=pentium;49906374]The idea of leaving it to a computer to decide if you are the enemy or not is way too disconnected. It probably does save time but it seems just so open for error.[/QUOTE] A computer is as open to error as the person who programmed it. I see no distinction between that and a human deciding potential threats.
[QUOTE=mix999;49906335]I was under the assumption that they already do this, they just don't advertise it.[/QUOTE] Yeah, I mean, why would anyone think otherwise? Apple itself sells a security feature that does JUST that: once you turn it on, it logs GPS coords AND takes pictures every time the phone is opened (you can call Apple about it). ANY Apple consumer has access to this. The fact that it's already in the hands of consumers should flick the switch in people's heads: the government has been able to do this for years. Any time a company comes out with technology, assume the government had access to it a few years prior. That's basically what US national security is about: the government holds onto tech, becomes the most versed with it, then companies and people start getting their hands on it. It happened with cellphones, it happened with computers, it happened with drones. It happens with a lot of shit (not everything, obviously), even mundane shit that isn't a security risk or doesn't have powerful uses.
[QUOTE=proboardslol;49906767]A computer is as open to error as the person who programmed it. I see no distinction between that and a human deciding potential threats.[/QUOTE] Because computers make mistakes on a whole different level than people do. An algorithm being wrong and the computer 'misjudging' someone is one thing. A line of code being wrong is a different thing entirely, because suddenly the program could be behaving incorrectly with no clear sign that it is doing so. With a human being you can follow their reasoning and figure out where mistakes were made; often they'll even be able to tell themselves when something's wrong. With a program it isn't always clear, and if a program has an error in its code (and it absolutely will, somewhere), it will make that mistake again and again and again.
[QUOTE=proboardslol;49906767]A computer is as open to error as the person who programmed it. I see no distinction between that and a human deciding potential threats.[/QUOTE] Even if they can get it to read human communication with minimal error, they still need to sort real threats from fake ones, which isn't that easy. Hell, I'm probably on some list from all the jihad videos I've watched. It just seems like an excuse to invade people's privacy, when most terrorists don't even bother hiding since they know the chances of getting caught are minimal. I bet Apple is just doing this so it looks like they care about your privacy. I really doubt they actually give a shit, particularly when they're supported by companies that love to collect personal information.
[QUOTE=Rufia;49907000]post[/QUOTE] [QUOTE=Chryseus;49907037]post[/QUOTE] Not that yall are all wrong but its worth restating: [QUOTE=DOG-GY;49906440]It's not a decision that you're an enemy, it's a flag that tells them to look further. We had an incident at work a while back where someone was saying threatening stuff on an open facebook group and homeland security's systems flagged that individual automatically, they reviewed the case, saw it as legit, and contacted our company to warn us and whatnot. I'm not particularly for active human monitoring but if open channels are automatically monitored for keyphrases and shit that get properly reviewed by people it has the potential to mitigate a lot of possible crises like the above without being too invasive.[/QUOTE] You have nothing to worry about watching videos. And if you're flagged it's not like the system is gonna auto launch a drone on you lol. Humans ultimately make the decisions. I imagine the behavioral pattern analysis is also probably a lot more sophisticated than to just flag people based off normal morbid curiosity.
[QUOTE=DOG-GY;49907221]Not that yall are all wrong but its worth restating: You have nothing to worry about watching videos. And if you're flagged it's not like the system is gonna auto launch a drone on you lol. Humans ultimately make the decisions. I imagine the behavioral pattern analysis is also probably a lot more sophisticated than to just flag people based off normal morbid curiosity.[/QUOTE] Ya, I didn't say they actually use the data. The only reason I mention it is that it was one of the recent leaks and I saw an article about it: [url]http://www.wired.com/2015/05/nsa-actual-skynet-program/[/url] The program was capable of identifying complex patterns, and indeed a journalist who frequently contacted terrorist organizations was flagged. But it's not like they would immediately launch missiles at these targets; they would start investigating.
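The core idea behind that kind of metadata scoring fits in a few lines. To be clear, everything below is invented for illustration (the feature names, weights, and threshold); the real program is only known from leaked slides and the Wired piece, so this is just the general shape of "weighted features in, review flag out":

```python
# Hypothetical feature weights; nothing here reflects any real system.
FEATURE_WEIGHTS = {
    "calls_to_flagged_numbers": 3.0,
    "frequent_sim_swaps": 2.0,
    "travel_to_flagged_regions": 1.5,
}

def risk_score(features):
    """Weighted sum over metadata features; unknown features contribute nothing."""
    return sum(FEATURE_WEIGHTS.get(name, 0.0) * value
               for name, value in features.items())

def flag_for_review(features, threshold=5.0):
    """True means 'a human should investigate', never 'this is a terrorist'."""
    return risk_score(features) >= threshold
```

The flagged journalist is exactly the false-positive case this shape produces: legitimate contact with flagged numbers drives the score up, which is why the output has to feed a human review queue rather than any automatic action.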
Soon you won't even be able to google terrorism without getting flagged.
[QUOTE=Robman8908;49906705]This is a confusing matter, imo. Only because they (Apple, MS, FB, Google) stand so strong "for the people" in this case... yet all of them constantly monitor what you do to try and sell you shit based on it. It's like they're masking their own spying by calling out the FBI, and people will eat that shit up. Unless I'm missing something here.[/QUOTE] You're right. A major difference is that Apple will use that info to try to sell you more crap, law enforcement however would use that info to try to take your rights away and possibly throw you in jail. Poor consolation though.
i dont want the fbi to have pics of me while taking shits