He said that right before he discovered electricity and changed the world.
If it's not broken, fix it until it is
|
There are two types of people in this world: those that pronounce GIF with a soft G, and those who do not deserve to speak words, ever.
|
We're supposed to be the home of the brave, not the home of the pansies. +5 million
Jeremy Falcon
|
Yup. I read a book about 1984... things were really screwed up back then. Hope it doesn't happen again.
Mark
Just another cog in the wheel
|
I see a couple of problems with your argument:
1. Apple at the moment can claim that an attempt to unlock the phone constitutes an "undue burden" - cracking the encryption on the account is presumably non-trivial. Once it has been done, however, the FBI has both precedent and the existence of the cracking tool on its side. How long do you think it will be before they persuade judges to give them court orders for all (not just terrorist) cases?
2. Once the unlocking tool has been developed, do you really believe it will remain only in the hands of the "Good Guys"? I would not bet anything I care to lose that the tool won't be in the hands of the criminal fraternity by the end of the month. Given that Apple's encryption protects sensitive financial data, among other things, the consequences are not good.
I agree that a good look at the terrorist's phone might provide important clues, but some benefits are not worth the future cost.
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
If such a tool existed - and I were a Serious Criminal - I'd be prepared to spend BIG money in bribes to the appropriate Apple employee(s) to get a copy of the source.
How many people here would turn down - say - ten years' salary in a Swiss bank account? The criminals could earn that back in a day...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Having been a developer for 30+ years, I can't sit here and believe that Apple doesn't already have a way to open a phone.
Have you ever written encryption without a way to unlock it? How would you test it without an unlocking mechanism?
It already exists - Apple just doesn't want to give it up.
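To illustrate the testing point: a minimal round-trip check, using a toy XOR stream cipher (nothing like what Apple actually ships - purely an illustration). Note that having a decrypt path is how you test the code, but it is not by itself a backdoor, because decryption still demands the key:

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + block counter.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream cipher: the same operation inverts itself

# The round-trip test: you can only verify the cipher because decryption exists.
key = secrets.token_bytes(32)
msg = b"attack at dawn"
ct = encrypt(key, msg)
assert decrypt(key, ct) == msg
```

The "unlocking mechanism" here is the key, not a secret Apple-held tool - which is the distinction the whole thread turns on.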
If it's not broken, fix it until it is
|
A good encryption system is one that will not allow an attacker to decrypt a ciphertext even if he (a) knows the encryption/decryption algorithms and (b) has both plaintext and ciphertext of a set of messages encrypted with the key. If the key used has enough bits, the only way to crack the encryption is to attack the algorithm.
Once the algorithm is known to be sound, you test an encryption system by generating keys (or key pairs). You do not encrypt (and destroy the plaintext of) any important data.
No one can prove that many popular algorithms do not have built-in "back doors" (rumors about the NSA's work are legion), but if so - no one is talking...
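Some back-of-envelope arithmetic on the "enough bits" claim - the guess rate below is an assumption (a very well-funded attacker), not a real benchmark:

```python
# Why a long enough key forces the attacker onto the algorithm itself:
# exhaustive search over the keyspace becomes hopeless.
guesses_per_second = 10**12          # assumed rate for a very fast attacker
seconds_per_year = 365 * 24 * 3600

def years_to_exhaust(bits: int) -> float:
    """Worst-case time to try every key of the given length."""
    return 2**bits / guesses_per_second / seconds_per_year

for bits in (40, 128, 256):
    print(f"{bits:3d}-bit key: {years_to_exhaust(bits):.3g} years")
# 40 bits fall in about a second; 128 bits already need ~1e19 years.
```

Which is exactly why back doors and implementation flaws, not brute force, are the interesting attack surface.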
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
Daniel Pfeffer wrote: No one can prove that many popular algorithms do not have built-in "back doors"
Most widely used algorithms have open-source implementations, meaning you can look at the source and check for deficiencies or back doors. So... the algorithms themselves are pretty sound.
|
Very few people have the background in cryptography required to analyze an encryption algorithm. An algorithm with a vulnerability could be perfectly encoded, but still be vulnerable to attack.
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
"Very few" is different from "no one"; I believe you used the latter.
Daniel Pfeffer wrote: An algorithm with a vulnerability could be perfectly encoded, but still be vulnerable to attack.
Sure, that is true of anything in this world, but that's the rationale for open-sourcing projects: to allow people other than the original designers to assess vulnerabilities.
|
Once you let the Genie out of the bottle, you can't put it back in...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
This section seems a bit strange:-
Quote: Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.
If the file exists on the phone and was encrypted using an existing version of the data, how would installing a new version of the iOS allow easier unencryption?
Also - wouldn't doing that utterly corrupt the chain of evidence meaning anything discovered could not possibly be used in a civilian court of law?
|
No
Yes
If it's not broken, fix it until it is
|
They have enough evidence to go to any level of legal measure required. This is an attempt to get more information and intelligence.
|
I believe I read somewhere that there's currently a security measure that deletes the encryption key after too many failed login attempts. If I'm not mistaken, they're asking Apple to change that setting so that they can brute-force the password (i.e. make it so it doesn't delete anything when faced with a brute-force attack).
|
Duncan Edwards Jones wrote: If the file exists on the phone and was encrypted using an existing version of the data, how would installing a new version of the iOS allow easier unencryption?
My understanding is that if you attempt bad passwords X number of times, the phone bricks itself essentially. The "new" iOS being requested by the courts/FBI would allow unlimited attempts therefore making any phone that can have that OS installed brute forcible.
|
We answered the same thing at just about the same time, so I guess that is the stated story.
I can see the concern: if this "modified" version of the OS got out into the wild, anybody could brute-force an iPhone.
|
Vark111 wrote:
My understanding is that if you attempt bad passwords X number of times, the phone bricks itself essentially. The "new" iOS being requested by the courts/FBI would allow unlimited attempts therefore making any phone that can have that OS installed brute forcible.
Ten attempts. Then the phone doesn't just block access to the info; it erases it completely. After that no tool can recover it; there is nothing to recover.
The code in question is a 4-decimal-digit code, so a brute-force attack requires at most ten thousand tries (five thousand on average) - so few that it hardly qualifies as either "brute" or "force".
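To put numbers on that, a sketch of exhausting the 4-digit space once the erase-after-ten limit is out of the way. The `try_pin` oracle and the secret value are made up for illustration; a real attack would hit the device's unlock path:

```python
import itertools

SECRET_PIN = "0351"  # hypothetical passcode; the attacker does not know it

def try_pin(pin: str) -> bool:
    # Stand-in for a single unlock attempt against the device.
    return pin == SECRET_PIN

attempts = 0
for digits in itertools.product("0123456789", repeat=4):
    attempts += 1
    if try_pin("".join(digits)):
        break

print(attempts)  # 352 for this particular PIN; never more than 10_000
```

Real devices compensate for the tiny keyspace with hardware-enforced delays between attempts and the ten-attempt erase - which is precisely what the requested iOS build would have to disable.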
|
Duncan Edwards Jones wrote:
If the file exists on the phone and was encrypted using an existing version of the data, how would installing a new version of the iOS allow easier unencryption?
Unless the user specifies the full encryption key every time the encrypted information is accessed, the software does know the key. It is stored somewhere in the file system. Move that flash (/disk, for general PCs) over to another machine, as a secondary storage device, and the key can be read by that other machine.
Sure, the key is usually encrypted; you won't find it in cleartext. But the OS/Application knows how to decrypt it. It must know, in order to decrypt the info for the proper user. But in a standard version, the OS/App refuses to do it until the operator has authenticated himself. The special OS edition on the other machine may be willing to decrypt the key without the owner authenticating himself, e.g. presenting a password or fingerprint.
Couldn't that info, given by the user, be (part of) what encrypts the key, so that an intruder would have to know that?
But the OS knows that, too. It must know the PW (or some transformation of it) in order to check that the user gives the right one. So the alternate OS version may pretend that it has just read from the user a PW corresponding to the expected one, even if no user ever specified anything.
Whether you install the alternate OS version on the same device or you move the storage device (flash/disk) to another machine makes no essential difference, as long as there exists a possibility for loading a new OS version without logging in to the machine. In the old days, that wasn't always the case, but with modern automatic over-the-air updates and fixes, it is probably possible to replace all essential parts of the OS that way.
The only safe encryption is where you are the one generating the key, the only one knowing it, and you never present it to the OS or to any application. For standard PC use, I would like to have a USB dongle where I can load, say, my X.509 certificates into a flash area that is not addressable across the USB interface; only the processor in the dongle can see it. So the PC sends the ciphertext across the USB interface, the dongle decrypts it, and returns the cleartext to the PC across the USB interface. (Or it receives cleartext and returns ciphertext.) In many applications (such as S/MIME), the ciphertext will not be the full document text but e.g. a one-time 3DES or AES256 key, used for the text body, but in principle, the dongle could encrypt/decrypt the entire text body.
This dongle could itself require authentication. E.g. it can have a Bluetooth [Smart] interface to your smartphone, requesting a 6-digit PIN to be keyed on the phone. No keylogger on the PC will be able to pick it up (the way it can pick up any PIN, PW or key you type at the PC keyboard). So to access an encrypted document would require both the right USB dongle with the proper keys loaded, the right smartphone for authentication, and knowledge of the PIN requested by the dongle. (Plus, implicitly, the ability to unlock the smartphone, e.g. by fingerprint.) In principle, a keylogger may be installed on the phone, but the risk of the intruder knowing how those typed digits are actually used - as a PIN code for some independent dongle - is rather small.
The biggest problem is to make e.g. an email program use that dongle for decrypting/encrypting the one-time key (or the entire text body). Even if standard encryption APIs exist, there is a great risk that a common mail program insists on accessing the X.509 certificate itself; maybe it doesn't use that standard encryption API at all. So if I make myself such a dongle (in fact, I do have access to a programmable USB dongle that could do the job - I just have to learn to develop software for it!), I guess I would have to obtain some open-source email reader (such as Thunderbird) and adapt the source code for it. I guess that I might get the time to complete that project as soon as I retire as an old-age pensioner...
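One caveat to "the software does know the key": in a common design the file key is stored only in wrapped form under a passcode-derived key, so the OS alone cannot unwrap it by fiat. A minimal sketch - toy XOR "wrap", made-up passcode and iteration count, not Apple's actual scheme:

```python
import hashlib
import secrets

def derive_kek(passcode: str, salt: bytes) -> bytes:
    # Deliberately slow derivation: every brute-force guess pays this cost.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

salt = secrets.token_bytes(16)
file_key = secrets.token_bytes(32)   # the key that actually encrypts the data

# What goes to storage: the salt and the *wrapped* key - never the passcode.
kek = derive_kek("1234", salt)       # "1234" is a stand-in user passcode
wrapped = bytes(a ^ b for a, b in zip(file_key, kek))

# The right passcode unwraps the file key...
recovered = bytes(a ^ b for a, b in zip(wrapped, derive_kek("1234", salt)))
assert recovered == file_key

# ...while a wrong passcode yields garbage, not an "access denied" check
# that a patched OS could simply skip.
garbage = bytes(a ^ b for a, b in zip(wrapped, derive_kek("0000", salt)))
assert garbage != file_key
```

In a scheme like this, a special OS build can remove retry limits and delays, but it still has to guess the passcode; it cannot just "pretend" the right one was entered.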
|
More people have been killed by babies with guns than by terrorists. Don't let hoopla and propaganda cloud your judgement. Yes it was sad, but the media blew it up to play the fear card and make it seem like a much bigger problem than it really is. So, it's not worth Pandora's box being opened.
Jeremy Falcon
|
I say we ban babies!
|
Agreed. They don't do anything but cry and poop anyway. Who needs them.
Jeremy Falcon
|
It could be a marketing stunt on the part of Apple:
(1) Apple publicly announce that they refuse to unlock phones, knowing damn well that they will eventually unlock them.
(2) Their sales go up and they gain market share from Android users who think 'Apple have an ethical stance'.
(3) Apple then say that sadly they had no choice, and unlock the phone - they come out of it smelling of roses.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
It is not like that... Apple has no idea how to unlock the iPhone.
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.