According to cryptographers at Johns Hopkins University, iOS does not take full advantage of its built-in encryption measures, potentially creating unnecessary security vulnerabilities (via Wired).
Using publicly available documentation from Apple and Google, reports about law enforcement circumventing mobile security features, and their own analysis, the cryptographers assessed the robustness of iOS and Android encryption. The study found that while the encryption infrastructure on iOS “sounds really good,” it goes largely unused:
“iOS especially has the infrastructure for this hierarchical encryption that sounds really good,” said Maximilian Zinkus, lead iOS researcher. “But I was absolutely amazed to see how much of it is unused.”
When an iPhone boots up, all stored data is in a state Apple calls “Complete Protection,” and the user must unlock the device before anything can be decrypted. While this is extremely secure, the researchers emphasized that once the device is unlocked for the first time after a reboot, a large amount of data moves into a state Apple calls “Protected Until First User Authentication.”
Since devices are rarely rebooted, most data sits in the “Protected Until First User Authentication” state rather than “Complete Protection.” The advantage of this less secure state is that decryption keys are kept in quick-access memory, where applications can reach them immediately.
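For app developers, this distinction maps onto iOS’s file protection APIs. The following Swift sketch (the file name and payload are illustrative) shows how a file can be opted into the stricter Complete Protection class, rather than the Protected Until First User Authentication class that third-party app data typically defaults to:

```swift
import Foundation

// Illustrative example: write a sensitive payload with the strictest
// Data Protection class. Files written with .completeFileProtection are
// only decryptable while the device is unlocked ("Complete Protection").
// Most app data defaults to the equivalent of
// .completeFileProtectionUntilFirstUserAuthentication, which stays
// decryptable from the first unlock until the next reboot.
let payload = Data("sensitive contents".utf8)
let url = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("secret.bin")

do {
    try payload.write(to: url, options: .completeFileProtection)

    // An existing file's protection class can also be tightened later.
    try FileManager.default.setAttributes(
        [.protectionKey: FileProtectionType.complete],
        ofItemAtPath: url.path
    )
} catch {
    print("Failed to apply file protection: \(error)")
}
```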
In theory, an attacker could find and exploit certain types of iOS security vulnerabilities to grab those encryption keys out of memory, allowing them to decrypt large amounts of data on the device. This is believed to be how many smartphone access tools work, such as those from the forensic access firm Grayshift.
It is true that attackers need a specific operating system vulnerability to gain access to the keys, and both Apple and Google patch many of these flaws as soon as they are spotted. But the exposure could be avoided altogether by hiding the encryption keys more deeply.
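One way to hide keys more deeply on iOS is to generate them inside the Secure Enclave, so the raw key material never sits in regular memory at all. A minimal sketch, assuming a passcode-protected device (the application tag string is illustrative):

```swift
import Foundation
import Security

// Sketch: create a P-256 key whose private half lives only in the
// Secure Enclave and is usable only while the device is unlocked.
// "com.example.mykey" is an illustrative application tag.
func makeSecureEnclaveKey() -> SecKey? {
    guard let accessControl = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .privateKeyUsage,
        nil
    ) else { return nil }

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: Data("com.example.mykey".utf8),
            kSecAttrAccessControl as String: accessControl,
        ],
    ]

    var error: Unmanaged<CFError>?
    return SecKeyCreateRandomKey(attributes as CFDictionary, &error)
}
```

Because the private key never leaves the Secure Enclave, a memory-scraping attack of the kind the researchers describe would not recover it, even on an unlocked device.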
“It just really shocked me because I came into this project thinking that these phones really do a good job of protecting user data,” said Johns Hopkins cryptographer Matthew Green. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement when the protections these phones actually offer are so bad?”
The researchers also shared their findings and some technical recommendations directly with Apple. An Apple spokesperson made a public statement in response:
“Apple devices are designed with multiple layers of security to provide protection against a wide variety of potential threats, and we are constantly working to add new protections for our users’ data. As customers continue to increase the amount of sensitive information they store on their devices, we will continue to develop additional protections in both hardware and software to protect their data.”
The spokesperson also told Wired that Apple’s security work is primarily aimed at protecting users from hackers, thieves, and criminals who want to steal personal information. They also noted that the types of attacks the researchers highlighted are very expensive to develop, require physical access to the target device, and only work until Apple releases a patch. Apple also stressed that its goal with iOS is to balance security and convenience.