Tim Cook stands up for Apple users' rights and says no to the U.S. and FBI, who want Apple to create a backdoor
66 replies
[QUOTE]Apple chief Tim Cook has attacked the recent court order that compels Apple to unlock and decrypt the San Bernardino gunman's iPhone. "Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the US government," says an open letter published by Cook early this morning.
Late yesterday, a federal judge in California [URL="http://arstechnica.co.uk/tech-policy/2016/02/judge-apple-must-help-fbi-unlock-san-bernardino-shooters-iphone/"]ordered Apple to help the US government[/URL] (the FBI) unlock and decrypt the iPhone 5C belonging
to Syed Rizwan Farook, who shot up an office party in San Bernardino in December 2015.
In the past, Apple has helped extract data from iPhones when issued with an appropriate warrant. Since iOS 8, however, full encryption has been
enabled by default—a move that was seemingly introduced specifically to prevent such data-grabs by governments. "Unlike our competitors,
Apple cannot bypass your passcode and therefore cannot access this data," the company wrote on its website at the time. "So it's not technically
feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."
Now, however, Judge Sheri Pym has ordered Apple to introduce a backdoor to help the FBI unlock the iPhone—and, unsurprisingly, Tim Cook is
not best pleased.
Cook's "[URL="http://www.apple.com/customer-letter/"]message to our customers[/URL]" is quite impressively aggressive. It begins by explaining why we need encryption, moves onto a brief history of the San Bernardino case, and then explains exactly what the FBI actually wants from Apple.
Basically, right now there are measures in place to stop someone from picking up your iPhone and brute-forcing the code to unlock your phone. The FBI wants a backdoor that allows such a brute-force attack to take place. With direct passcode input through the iPhone's Lightning port, and no additional delay between passcode attempts, cracking the code would be very easy.
[/QUOTE]
[url]http://arstechnica.com/gadgets/2016/02/tim-cook-says-apple-will-fight-us-govt-over-court-ordered-iphone-backdoor/[/url]
Tim Cook's customer letter:
[url]http://www.apple.com/customer-letter/[/url]
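The arithmetic behind that last paragraph of the article is worth spelling out. Here's a rough sketch of why removing the inter-attempt delay makes brute-forcing trivial; the ~80 ms per-attempt figure is an assumption for illustration, not Apple's actual key-derivation timing.

```python
# Rough sketch: time to try every passcode, with and without an enforced
# delay between attempts. The 0.08 s per-attempt cost is an assumed figure.
def crack_time(num_digits: int, per_attempt_s: float, extra_delay_s: float = 0.0) -> float:
    """Worst-case seconds to exhaust every numeric passcode of the given length."""
    combos = 10 ** num_digits
    return combos * (per_attempt_s + extra_delay_s)

# 4-digit passcode, attempts fed straight in via the Lightning port:
fast = crack_time(4, 0.08)              # 800 s, i.e. roughly 13 minutes
# Same passcode with a 5-minute lockout enforced on every attempt:
slow = crack_time(4, 0.08, 300.0)       # over a month, worst case
print(f"No delay:    {fast / 60:.0f} minutes")
print(f"5-min delay: {slow / 86400:.0f} days")
```

With the delay (and the optional wipe-after-10-failures setting) in place, exhaustive search is impractical; strip it out and a 4-digit code falls in minutes, which is exactly what the FBI was asking for.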
dude can be a fuckwit sometimes but otherwise he's really good when it comes to customer data rights
Good.
Here's hoping that more huge tech companies like Google and Samsung take a stand with Apple against encryption backdoors.
[QUOTE=Bathtub;49758739]Here's hoping that more huge tech companies like Google and Samsung take a stand with Apple against encryption backdoors.[/QUOTE]
Google has taken a stand against adding backdoors (how would that even be possible, btw?) to Android's system encryption, iirc
[QUOTE=Bathtub;49758739]Here's hoping that more huge tech companies like Google and Samsung take a stand with Apple against encryption backdoors.[/QUOTE]
Pretty sure both MS and Google don't support backdoors either, but remember that Apple was the last tech company to sign up to the NSA's programmes, according to Snowden's leaked documents.
Apple never sold out on customer data until they were literally forced to.
[QUOTE=usaokay;49758786]Shouldn't there be specific cases that involve a threat to national security? I do understand that no one wants their shit to be obtainable from the Big Brother, but there's a concern when it comes down to attempting to access one of the San Bernardino shooter's phone.
I'm not disagreeing with Tim Cook. Just want to understand the moral ambiguity.[/QUOTE]
You can't make an encryption-breaking update that targets a single iPhone only. If you do, a backdoor is created for every iPhone on the planet.
[QUOTE=Starpluck;49758799]You can't make an encryption-breaking update that targets a single iPhone only. If you do, a backdoor is created for every iPhone on the planet.[/QUOTE]
This is why you sacrifice freedom for security and sacrifice security for freedom.
The problem is, in order to satisfy those extreme cases, you'd have to create a massive security hole that can then be used against you. A vault is only as secure as its door.
[QUOTE=usaokay;49758802]Okay, that clears it up.[/QUOTE]
I work at the DEA and we tried this loads of times. It is simply not feasible.
[QUOTE=SharpTeeth;49758815]The problem is, in order to satisfy those extreme cases, you'd have to create a massive security hole that can then be used against you. A vault is only as secure as its door.[/QUOTE]
This is exactly why they're against it, it's nothing but bad news for everyone involved.
If it were possible to securely allow Apple to break a phone's encryption without risking the security of everyone else's, I'd be for it, provided a warrant was involved. But you just can't have a backdoor that's guaranteed to only let the government use it.
having a backdoor for an encryption system entirely ruins the point of encryption in the first place. government shouldn't get special rights when it comes to consumer hardware
[editline]17th February 2016[/editline]
suck it up and crack it yourself if you want it that bad
From what I gathered from the Radio 4 interview this morning:
The FBI want Apple to remove the timeout for wrong passcode inputs so they can brute-force the combination.
You know the 'wait 5 minutes' delay you get after entering the code wrong a few times? They want Apple to remove that feature from the phone so they can use a computer to try every combination and get in.
The reason why there is a big deal about this case is that it sets world-wide precedent. If the USA can ask for this on a phone, so can Turkey, Saudi Arabia and other non-liberal countries.
[editline]17th February 2016[/editline]
It's also right there in the article:
[QUOTE]Basically, right now there are measures in place to stop someone from picking up your iPhone and brute-forcing the code to unlock your phone. The FBI wants a backdoor that allows such a brute-force attack to take place. With direct passcode input through the iPhone's Lightning port, and no additional delay between passcode attempts, cracking the code would be very easy.[/QUOTE]
While I am in full support of Cook, the situation can make for some interesting arguments.
Let's say, hypothetically, that the authorities were aware of an attack that would soon take place in a city, but they do not know who is attacking, how they are attacking, when, and which city. However, they have an encrypted phone that, if unlocked, could provide the full details; but it won't do them any good if they can't open it quickly. A backdoor into the device could save countless lives, but could mean the end to privacy in the future.
What is the right thing to do?
How dumb do you have to be to weaken encryption in a time where corporate espionage is rampant?
[QUOTE=Overwatch 7;49759114]
Let's say, hypothetically, that the authorities were aware of an attack that would soon take place in a city, but they do not know who is attacking, how they are attacking, when, and which city. However, they have an encrypted phone that, if unlocked, could provide the full details; but it won't do them any good if they can't open it quickly. A backdoor into the device could save countless lives, but could mean the end to privacy in the future.[/QUOTE]
Since the attackers aren't stupid, they probably won't use something with weak encryption anyway.
In the USA, UK, or any other western country, this is fine. I'd be totally happy for the UK authorities to have the power to request that the passcode protection be unlocked, so long as it's backed by a court warrant.
But at the same time I can understand why Apple doesn't want to do it because it means that all iPhones are vulnerable and they would have to grant this power to all countries, even non-liberal ones.
TBH, I think the FBI should be allowed to bruteforce it. Or else every criminal in the world will get an iPhone knowing that they can organise everything on there without ever giving the authorities any evidence.
[QUOTE=Mythman;49759157]TBH, I think the FBI should be allowed to bruteforce it. Or else every criminal in the world will get an iPhone knowing that they can organise everything on there without ever giving the authorities any evidence.[/QUOTE]
Encryption is nothing new; criminals can already do what you describe. To stop them, you would have to prohibit math, which is impossible.
[QUOTE=Mythman;49759157]
I think the FBI should be allowed to bruteforce it. Or else every criminal in the world will get an iPhone knowing that they can organise everything on there without ever giving the authorities any evidence.[/QUOTE]
If the FBI can bruteforce it, so can anyone else.
I'm glad Apple's got a backbone here. I mean, they always have; Steve wasn't a huge fan of government meddling, or meddling in general. He put up one of the first Canary pages of any major tech company.
Yes, it would help in the investigation if the FBI could crack the phone. But, at the same time, the deed has been done and the terrorists are dead. I'm not sure there's much to gain by cracking the phones of the dead, but we've got quite a lot to lose.
[QUOTE=SashaWolf;49759208]If the FBI can bruteforce it, so can anyone else.[/QUOTE]
Indeed. I do not mind security services having ways to access certain data in an emergency (provided good reason); the only issue is that a backdoor is open to anyone who can find the key, which makes it much more likely that some cybercriminal can just jump in and nab yer details, thank you very much sir, £3000 wired to my account.
As it stands there's no way to guarantee that government will have exclusive access.
[QUOTE=Overwatch 7;49759114]Let's say, hypothetically, that the authorities were aware of an attack that would soon take place in a city, but they do not know who is attacking, how they are attacking, when, and which city. However, they have an encrypted phone that, if unlocked, could provide the full details; but it won't do them any good if they can't open it quickly. A backdoor into the device could save countless lives, but could mean the end to privacy in the future.[/QUOTE]
There's no clear "right" answer for those of us who favor un-backdoored phone security to give. The concept and value of security and privacy are very abstract for most people and difficult to make concrete, whereas the dramatic and scary possibility of a terrorist attack is very tangible.
The only argument you can really make is that such a backdoor would need to already be in place for such a system to work, and while its use is clearly warranted in your scenario, there are many cases out there where it wouldn't be. I don't believe that the very rare possibility of a terrorist attack warrants granting such a power to anyone who knows how to use or abuse it.
I furthermore think that I should have the right to do whatever I want with my own data, including encryption. Though this might be an attack on Apple's encryption, ultimately the goal of the FBI is to do away with all obstacles that prevent them from getting at data. I don't believe that anyone has the right to prevent me from encrypting my data if I want to, nor should anyone be able to force me to decrypt it.
[QUOTE=woolio1;49759275]I'm glad Apple's got a backbone here. I mean, they always have; Steve wasn't a huge fan of government meddling, or meddling in general. He put up one of the first Canary pages of any major tech company.
Yes, it would help in the investigation if the FBI could crack the phone. But, at the same time, the deed has been done and the terrorists are dead. I'm not sure there's much to gain by cracking the phones of the dead, but we've got quite a lot to lose.[/QUOTE]
I think they're interested in busting any terrorist cell he was with, because he most likely had some encrypted chat app they'd want to get into to see who else is roaming about.
That's all I see them wanting.
[QUOTE=Starpluck;49758820]I work at the DEA[/QUOTE]
okay, this is offtopic, but what?!
holy shit dude, that's amazing.
back on topic, I'm happy that even Tim Cook knows better than to allow backdoors.
I don't quite understand why FP agrees with this.
I understand people have the right to keep their shit hidden, but when you're gathering evidence in a murder investigation, should he really have the same security rights as every other innocent person? If they believe there is evidence on the phone and want Apple to unlock it, I don't see what the issue is, especially if most non-court-related data is kept hidden.
[QUOTE=redBadger;49759664]I don't quite understand why FP agrees with this.
I understand people have the right to keep their shit hidden, but when you're gathering evidence in a murder investigation, should he really have the same security rights as every other innocent person? If they believe there is evidence on the phone and want Apple to unlock it, I don't see what the issue is, especially if most non-court-related data is kept hidden.[/QUOTE]
[QUOTE=Starpluck;49758799]You can't make an encryption-breaking update that targets a single iPhone only. If you do, a backdoor is created for every iPhone on the planet.[/QUOTE]
And for the love of Christ, please don't make us explain how that's a terrible idea.
[QUOTE=redBadger;49759664]I don't quite understand why FP agrees with this.
I understand people have the right to keep their shit hidden, but when you're gathering evidence in a murder investigation, should he really have the same security rights as every other innocent person? If they believe there is evidence on the phone and want Apple to unlock it, I don't see what the issue is, especially if most non-court-related data is kept hidden.[/QUOTE]
If Apple creates a backdoor, then that backdoor will literally apply to every iPhone on the planet. As Starpluck confirmed, there's no way to limit this to a single iPhone.
Innocent people will be made extremely vulnerable, because hackers will then be able to break the previously unbreakable encryption that Apple uses (and trust me, if the FBI can do it, so can everyone else). Millions of people have sensitive information on their phones - names, phone numbers, addresses, and email accounts. With the introduction of digital payment systems like Apple Pay, the risks are even higher than they were before.
This isn't only about privacy. This is about the trust held by the public that their technological devices are secure. If the government forces corporations to include weakened encryption, then that will irrevocably destroy the security that the Internet has been painfully developing since it first emerged in the 1960s.
Thank you for explaining; I was under the impression they could limit this to one phone.
Though it should be possible if they code a version specifically made for that phone, and install it over a wired connection rather than OTA so nobody else gets it.
The details of people's lives should always be sacred, whether it may help a few cases or not.
Having backdoors in people's phones would be a step up the stairs towards losing privacy, with even more people committing crimes out of fear of the gov'.
Well, until they make it law and raid their offices when they don't comply.
I'm not very idealistic; I have some websites that nobody uses, and I encrypt user info. If the NSA got a warrant and asked for my database, I'd turn all the data over, though it would probably be useless to them. I'd like to help out law enforcement if I were ever in that situation, but I couldn't compromise security against bad actors for the NSA's sake. It's either total encryption and security, such that the NSA can't easily get the data, or no encryption and security, such that EVERYONE can get the data.
[editline]17th February 2016[/editline]
[QUOTE=Starpluck;49758799]You can't make an encryption-breaking update that targets a single iPhone only. If you do, a backdoor is created for every iPhone on the planet.[/QUOTE]
Couldn't you simply find the specific iPhone you want, encrypt everything on that user's phone with one key (theirs), and then encrypt it again with a second key (yours), so that you and the user can both access the data? Or do you mean that Apple can't target a single iPhone for an update?
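The two-key idea that post describes is usually called key escrow or key wrapping: one data key encrypts the content, and that key is stored twice, wrapped under each party's own key. A toy sketch of the structure (XOR stands in for a real cipher like AES, and all the key names are illustrative; this is not secure, it only shows the shape):

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher; XOR with an equal-length random key
    # keeps the wrap/unwrap structure visible without any crypto library.
    return bytes(a ^ b for a, b in zip(data, key))

data_key = secrets.token_bytes(16)    # the key that actually encrypts the data
user_kek = secrets.token_bytes(16)    # key-encrypting key held by the user
escrow_kek = secrets.token_bytes(16)  # key-encrypting key held by a third party

# Store the data key twice, wrapped under each party's key:
wrapped_for_user = xor_cipher(data_key, user_kek)
wrapped_for_escrow = xor_cipher(data_key, escrow_kek)

# Either holder can recover the data key independently:
assert xor_cipher(wrapped_for_user, user_kek) == data_key
assert xor_cipher(wrapped_for_escrow, escrow_kek) == data_key
```

And that structure is exactly what the rest of the thread objects to: the escrow copy has to exist on every device in advance, so whoever holds (or steals) the escrow key can unwrap the data key on all of them, not just the one under warrant.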