- Here's iOS disk / PIN encryption as I understand it:
- Block 0 of the NAND is used as effaceable storage, and a series of encryption "lockers" is stored on it. This is the portion that gets wiped when a device is erased, as it is the base of the key hierarchy. Each locker holds a randomly generated encryption key, and that key is encrypted with a combination of a device-unique hardware key (the UID, which can't be extracted from the device) and the user's PIN. Locker #4 (the class 4 key, also referred to as the class D key) was not encrypted with the user PIN, and in previous versions of iOS (<8) it was used to "encrypt" most of the file system. Because the PIN wasn't included in the crypto, anyone with root-level access (such as Apple) could easily decrypt most of the file system contents (as most of Apple's app data was not using data protection at the time). Fast forward to iOS 8, and virtually the entire file system uses keys from lockers that *are* protected with the user's PIN. Hardware-accelerated AES has made fast encryption and decryption of the entire disk technologically possible since the 3GS; however, for no valid reason whatsoever, Apple decided not to properly encrypt the file system until iOS 8.
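- A loose sketch of that wrapping, in Python: every name here and the PBKDF2/AES-key-wrap construction are stand-ins, not Apple's actual KDF (which entangles the passcode with the UID inside the hardware AES engine), and it assumes the third-party "cryptography" package is installed:

    # Illustrative key hierarchy: a per-locker (class) key is wrapped by a key derived
    # from the device-unique UID key plus the user's PIN. On a real device the UID
    # never leaves the AES engine; here it's just random bytes for demonstration.
    import os, hashlib
    from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

    uid_key   = os.urandom(32)   # stand-in for the fused hardware key
    class_key = os.urandom(32)   # random locker key that actually protects file contents

    def derive_unlock_key(pin: str) -> bytes:
        # Stand-in KDF: binds the PIN to this device by mixing in the UID key.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), uid_key, 100_000)

    wrapped_locker = aes_key_wrap(derive_unlock_key("1234"), class_key)  # what sits in effaceable storage

    # Only the correct PIN, on this device, recovers the locker key:
    assert aes_key_unwrap(derive_unlock_key("1234"), wrapped_locker) == class_key

- The class D locker described above is the exception to this picture: its key was wrapped without the PIN, which is exactly why PIN-less root access was enough to decrypt most of the file system before iOS 8.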
- In order to deduce the PIN in iOS 8 (and access any of the file system), you need to iterate through all 10,000 possibilities. This takes about 20 minutes with code execution on the device. Because newer devices' boot loaders have been stripped down and several vulnerabilities have been addressed, getting root execution isn't presently feasible unless you are the rare owner of a low-level 0day or you have Apple's signing keys (to sign your own RAM disk). If you jailbreak your device, you remove a number of security mechanisms, which allows certain forensics tools to boot unsigned code capable of cracking a PIN. Documents recently released from the Snowden corpus suggest NSA was attempting to target developers using Xcode, and possibly Apple themselves. If NSA is in possession of Apple's signing keys, they'd be able to sign and boot a RAM disk of their own, or more likely use runtime 0days to gain privileged execution on a target's device.
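- For a sense of the mechanics: the ~20-minute figure works out to roughly 0.12 seconds per guess, and the attack itself is nothing more than walking the four-digit space. The sketch below is illustrative only; unwrap_locker is a hypothetical stand-in for the on-device check, which has to run on the device precisely because the UID key can't be pulled off it:

    # Illustrative only: enumerate the 4-digit PIN space and estimate wall-clock time.
    SECONDS_PER_GUESS = (20 * 60) / 10_000   # ~0.12 s per attempt, from the ~20-minute figure

    def brute_force_pin(unwrap_locker):
        """Try every 4-digit PIN until one successfully unwraps a protected locker."""
        for candidate in range(10_000):
            pin = f"{candidate:04d}"
            if unwrap_locker(pin):       # hypothetical on-device check; true only for the right PIN
                return pin
        return None

    print(f"worst case: {10_000 * SECONDS_PER_GUESS / 60:.0f} minutes")  # -> 20 minutes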
- This is one reason it is very important to use a complex passcode. 10,000 combinations only take about 20 minutes to exhaust, but a very long passcode could take years, decades, or longer to brute force. Thanks to the fingerprint reader in newer devices, combined with Apple's 24-hour timeout and other protections, users can really benefit from a strong, complex passphrase without the inconvenience of having to type it in very often. Not typing it in most of the time also prevents video surveillance from being used to steal your PIN.
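- Rough numbers on why length and character set matter, extrapolating the same ~0.12 seconds per on-device guess (worst-case exhaustion, illustrative only):

    # Worst-case time to exhaust a passcode space at ~0.12 s per on-device guess.
    SECONDS_PER_GUESS = 0.12
    SECONDS_PER_YEAR  = 365 * 24 * 3600

    def worst_case_years(alphabet_size: int, length: int) -> float:
        return (alphabet_size ** length) * SECONDS_PER_GUESS / SECONDS_PER_YEAR

    print(worst_case_years(10, 4))    # 4-digit PIN:          ~20 minutes
    print(worst_case_years(10, 8))    # 8-digit PIN:          ~0.4 years
    print(worst_case_years(62, 8))    # 8-char alphanumeric:  ~830,000 years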
- PIN/passcode protection is only designed to provide encryption for data *at rest*. If NSA were targeting you and had a 0day to gain code execution on your device, then whatever lockers were unlocked at the time could easily be used to steal data. A simple program could even be injected that waits for the crypto to become unlocked and then harvests the data back to a C&C server. Code execution is tricky, but given Apple's patch history, it's a very real threat that a PIN/passcode won't offer protection against.
- Additionally, certain parts of the file system encryption can be unlocked using the escrow bag included with an iTunes pair record. If your device is seized at an airport along with your laptop, for example, the pair record on your desktop could be used to access data on your device. This is much more involved than it used to be, however, as iOS 7 and lower had a number of encryption backdoors that would allow someone to bypass the backup encryption on the device. Even though those "diagnostic services" *cough* have been closed, it's still possible to decrypt and harvest most third-party application data with that pair record, so long as the device has not been power cycled since the PIN/passcode was last typed in. This is why most agencies now keep devices powered on while they're transported back to forensics. iOS 8 devices can be upgraded to beta versions, which re-enable these encryption backdoors, so if the investigating agent obtains the user's PIN or passcode, they could potentially dump all information from the device, even if backup encryption is enabled. Fortunately, upgrading to a beta at least requires a reboot... users who are running Apple public betas should be aware that these encryption backdoors can be accessed WITHOUT a reboot, meaning a forensics investigator only needs a pair record to access that data on your device. Fortunately, Apple has at least shut down wireless access to the most critical parts of that data.
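- For the curious, this is roughly what those desktop pair records look like; the path and field names below are from memory and may vary by OS and version, so treat it as a sketch (reading /var/db/lockdown typically requires root):

    # Illustrative: enumerate lockdownd pair records on a Mac and check for an escrow bag.
    import plistlib, pathlib

    LOCKDOWN_DIR = pathlib.Path("/var/db/lockdown")   # macOS location; Windows stores these elsewhere

    for record_path in LOCKDOWN_DIR.glob("*.plist"):
        with open(record_path, "rb") as f:
            record = plistlib.load(f)
        # The EscrowBag is the piece a forensics tool cares about: combined with a
        # device that hasn't been power-cycled, it can unlock certain protection classes.
        print(record_path.stem, "contains escrow bag:", "EscrowBag" in record)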
- Also of note, the fingerprint reader shuts down after 24 hours of inactivity, so compelling a user's fingerprint will do no good unless the court system is streamlined to grant such warrants quickly. Even this, however, is easy to thwart if the user sets an obscure finger as the only authenticated finger; after a few failed attempts with an index finger or thumb, for example, the reader also shuts down.
- So the current state of encryption is this: a four-digit PIN is in no way NSA-proof. NSA likely has 0days they could use to remotely backdoor your device, and if they don't, then I'd put my money on their ability to sign their own code to run on the device and brute force the PIN. A passcode will protect your data at rest much better than a PIN, but nothing will protect you from 0days if you are actually targeted. If you are actually targeted by NSA, however, chances are your iPhone wouldn't be the most vulnerable device, and data could be leaked from your desktop and other devices. There are also warrants to access your iCloud data, including iCloud backups, which contain an obscene amount of data that most people underestimate.
- But in terms of a non-NSA police agency seizing your locked device without a desktop pairing record: unless you have jailbroken it, the forensics tools available to these agencies are far more limited. A four-digit PIN can still be deduced by dusting latent prints off the screen, by video surveillance, by guessing, or with tools such as IP-BOX, which cuts power to the device between attempts before it can flush the failed-attempt count to disk. I am not convinced that Apple has completely fixed this vulnerability. A complex passcode is far more secure, and no law enforcement agency I'm aware of has any tools capable of brute forcing it, or even attacking it, on a non-jailbroken phone.