
Apple and Serious Security: NOT an Oxymoron

Cryptographic Processor Has Utility Well Beyond Apple Pay

I know most of you read the first few words of the title up to the colon and thought, “Oh jeez, he’s back on his Serious Security Soapbox and using the Apple celebrity photo hack as a cautionary tale.”  Hardly. And lest anyone think I’ve been hard on Apple of late vis-à-vis their eponymous smart watch, I am going to build a veritable security fortress using the iPhone 6.

Let’s start with backup, as in backing up your data, as in that REALLY important thing that most people cannot be bothered with. I’ve taken backup VERY seriously since WAY back for the purely practical reason that storage media weren’t at all reliable:

  • I started backing up paper tape the first time one of my programs was shredded by a jam in the tape reader
  • I started backing up tape storage the first time one of my tape cartridges unspooled
  • I started backing up floppies the first time I left one on top of the monitor, which at the time was a thing called a “cathode ray tube” that emitted magnetic fields
  • I started backing up hard drives immediately, because those early 10 MB [sic] units experienced head crashes if you sneezed in their vicinity

My over-the-top backup regimen continues to this day, mostly due to muscle memory. Backups still save my bacon from time to time, mainly when I overwrite a file by accident and don’t discover the goof for a few weeks. Backing up your data is a REALLY good idea because, well, shit happens. If any of us knew all the possible shit that could happen, we could take appropriate measures … but shit happens in many super-creative ways, most often involving ourselves and occasionally involving bad people.

I will use the iPhone 6 in all my examples, not because I have one (here in Silicon Valley I seem to be the only iPhone 5 user who has not yet upgraded; I may be misreading the body language, but I am fairly certain I’ve seen, “Dude, your phone has been obsolete for, like, over a MONTH,” more than once), but because the iPhone 6 includes some very useful security features.

For the aforementioned reasons, let’s start with backup. And let’s assume you are backing up your data to iCloud. This is a double-edged sword: Apple deserves kudos for making device backup all but automatic, but now a TON of potentially sensitive data is sitting in the cloud waiting for something bad to happen. Said celebrity photo hack apparently was NOT done via iCloud backup, but it could have been. Don’t listen to me, listen to the hip-and-trendy dude from Wired, whose story begins:

In the space of one hour, my entire digital life was destroyed.

I’ll skip the obvious fact that online backup should be ONE PART of a proper backup regimen. Online backup needs to be more than automatic; it needs to be exceptionally secure. Like me, you may have this CRAZY notion that the Fourth Amendment was put there for VERY good reasons; or you may be worried that some teenager in Eastern Europe might hack your ‘private’ photos.

Here’s a pedestrian, broadly overlooked scenario: imagine the carnage possible should a ne’er-do-well get their hands on your contact list. All those names and numbers and email addresses are a friggin’ social engineering goldmine. Put just a TINY bit of imagination into the possibilities, and having your credit card number stolen becomes the least of your worries. So while you MUST back up this data—think about re-creating your contacts from scratch in this day and age—you want this backup EXCEPTIONALLY secure.

Let’s get to it. No surprise, we need to encrypt the backup; that measure is necessary but not sufficient. If you want your data secure, we need to encrypt the backup in a manner that can ONLY be decrypted by you. This means the encryption cannot take place in the cloud, because, well, shit happens in the cloud. Encryption MUST take place on your iPhone 6, and the key used for encryption needs to be locked deeply away.

Enter the “Secure Element” new to the iPhone 6 and iPad Air 2: a dedicated chip that Apple talked about in the context of Apple Pay.

With Apple Pay, instead of using your actual credit and debit card numbers when you add your card to Passbook, a unique Device Account Number is assigned, encrypted, and securely stored in the Secure Element. These numbers are never stored on Apple servers.

Apple is not going into any technical details of the Secure Element, so for the time being let’s assume it is a proprietary cryptoprocessor along the lines of the extremely common yet seldom used Trusted Platform Module (TPM). Assuming Apple has followed protocol—and I am giving them full credit in light of the size of the bet they are making on Apple Pay—consider the Secure Element a nearly un-hackable vault for your most sensitive information. (Note to Apple: was marketing not around when the engineers named this thing? You could have easily come up with something that inspired a bit more confidence. “Trusted Platform Module” sounds far better, and geez, I think Intel came up with that name.)

Let’s go to work. We begin by generating a strong 4096-bit private/public key pair; we want a good source of entropy to generate these keys, so let’s use the Touch ID fingerprint reader and your pinkie, a digit unlikely to be used in the course of unlocking your device. We feed the entropy generated by your pinkie into the Secure Element; the private/public key pair is generated and stored INSIDE that chip. Critical factor here if you are after exceptional security: your private key NEVER LEAVES the Secure Element. Your public key, on the other hand, is uploaded to iCloud for everyone to see.
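To make the entropy step concrete, here is a minimal Python sketch of the conditioning that would happen on the way into the chip. Fair warning: everything here is an assumption layered on Apple’s silence. `condition_entropy` and `pinkie_noise` are hypothetical names, and a real Secure Element would do this mixing (and the key generation it seeds) entirely in hardware, never exposing the result.

```python
import hashlib
import os

def condition_entropy(sensor_noise: bytes) -> bytes:
    """Mix raw fingerprint-sensor noise with the OS entropy pool.

    A real Secure Element would perform this conditioning in hardware
    and keep the result internal; we only model the mixing step.
    """
    # Hashing the concatenated sources means the output is unpredictable
    # as long as EITHER source is: a belt-and-suspenders construction.
    return hashlib.sha512(sensor_noise + os.urandom(64)).digest()

# Hypothetical raw scan bytes from a pinkie press on the Touch ID sensor.
pinkie_noise = b"\x13\x37" * 32

seed = condition_entropy(pinkie_noise)
print(len(seed))  # 64 bytes of conditioned entropy to seed keypair generation
```

Because fresh OS entropy is folded in on every call, even an attacker who replays the exact same pinkie press gets a different seed.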

Next step: we need a password to protect the backup itself. If you believe that the reliability kinks have been worked out of Touch ID, that will work great; otherwise, pick a good strong password. In either case, this “backup password” is fed into the Secure Element and, using your PRIVATE key, a key-derivation algorithm produces the 256-bit AES key used to encrypt your entire backup. The now-encrypted backup is sent to iCloud, where it looks like completely random gibberish.
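The password-to-key step is a textbook key-derivation problem, sketched below with the PBKDF2 function from Python’s standard library. The `device_secret` parameter is my stand-in for the private-key material that, per the scheme above, never leaves the chip; the salt handling and iteration count are illustrative assumptions, not anything Apple has published.

```python
import hashlib

def derive_backup_key(backup_password: str, device_secret: bytes) -> bytes:
    """Derive the 256-bit AES backup key inside the (modeled) Secure Element."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        backup_password.encode("utf-8"),
        device_secret,   # used as the salt: ties the derived key to this device
        200_000,         # iteration count: slows down offline password guessing
        dklen=32,        # 32 bytes = a 256-bit AES key
    )

key = derive_backup_key("correct horse battery staple", b"\x01" * 32)
print(key.hex())
```

The same password on a different device (different `device_secret`) yields a completely different AES key, which is exactly the property you want: stealing the password alone buys an attacker nothing.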

There are folks more paranoid than even me who will take issue with my next statement: 256-bit AES encryption is un-hackable. The fringe element will claim that not only nation states but sophisticated mobsters can crack 256-bit AES. I will simply point out that there is no evidence that properly implemented (FIPS 140-2, Level 2 or higher) 256-bit AES has ever been compromised; don’t take my word for it, I checked with folks in the super-security business who make their LIVING being paranoid.

Those of you paying close attention at home will note that burying our 4096-bit private key in the Secure Element does have its drawbacks. Namely, one of the primary reasons for backing up your phone is that you might lose it … in which case, you’re going to need that private key to pull the backup down to your new iPhone. Copying a page out of the Microsoft (gasp!) playbook for BitLocker (very good whole-disk encryption using the TPM), we need to physically store a copy of your private key on, say, an SD Card.

One can purchase a Lightning Connector to SD Card reader for all of $7 (or the low, low price of $40 if you want one from Apple). We could simply copy the 4096-bit private key out of the Secure Element, but where is the fun (and security) in that?  This private key backup is NOT something you are going to use every day—only when you are bringing up a new device—so we can require a long, strong password … given the physical nature of the SD Card, writing down this password is not as insane as usual.

Let’s backup your 4096-bit private key. We feed the long, strong key password to the Secure Element and use Diffie-Hellman Elliptic Curve encryption to scramble the daylights out of it BEFORE it leaves the Secure Element and is saved to the SD Card. If you’re properly security minded, you’ll hide the SD Card and the written key password in two different locations … two locations that hopefully you’ll remember next year when you need them.
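As a rough sketch of the wrap-before-export idea: the private key gets scrambled under the long, strong password before it ever touches the SD Card. Since Python’s standard library has neither ECDH nor AES, this sketch substitutes a scrypt-derived keystream for the article’s Diffie-Hellman step; the function names are hypothetical and the toy keystream is for illustration only, not production use.

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    """Toy SHA-256 counter-mode keystream, for illustration only.
    Real code would use an authenticated cipher such as AES-GCM."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def wrap_private_key(private_key: bytes, password: str, salt: bytes) -> bytes:
    """Scramble the exported private key under the long, strong SD-card
    password before it leaves the (modeled) Secure Element."""
    # scrypt is deliberately slow and memory-hard, punishing brute force.
    wrap_key = hashlib.scrypt(password.encode("utf-8"), salt=salt,
                              n=2**14, r=8, p=1, dklen=32)
    stream = _keystream(wrap_key, len(private_key))
    return bytes(a ^ b for a, b in zip(private_key, stream))

# XOR is its own inverse, so unwrapping is the very same operation.
unwrap_private_key = wrap_private_key

fake_key = b"\x42" * 512   # stand-in for a 4096-bit (512-byte) private key
blob = wrap_private_key(fake_key, "long strong password", b"sd-card-salt")
print(blob != fake_key)    # True: the on-card blob is scrambled gibberish
```

Note the design point this preserves from the article: the plaintext key exists only inside the chip; the SD Card only ever sees the wrapped blob.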

Last, but not least, let’s tackle data sharing. We’ll assume that you want to share ‘non-private’ photos with a select group of friends:

  • Encrypt the photos with your PRIVATE key and your friends’ PUBLIC keys; once again performing the encryption INSIDE the Secure Element
  • Upload the encrypted photos to iCloud
  • Each of your friends decrypts the photos using their PRIVATE key and your PUBLIC key, once again performing the decryption INSIDE the Secure Element

Repeating for emphasis: private keys NEVER LEAVE their respective Secure Element. Result? The encrypted data may be hacked and copied … but cannot be decrypted because the required private key is sequestered away inside the Secure Element.
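The sharing flow above boils down to classic Diffie-Hellman key agreement: each side combines its own private key with the other side’s public key and lands on the same symmetric key, without either private key going anywhere. The sketch below uses deliberately toy parameters (a 127-bit prime, laughably small for real use) purely to show the math; a real implementation would use a standardized group or an elliptic curve, with the private keys sequestered in the Secure Element.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration ONLY; real deployments
# use a standardized group (e.g., RFC 3526) or an elliptic curve.
P = 2**127 - 1   # a Mersenne prime, far too small for actual security
G = 3

def keypair():
    """Each party's private key stays inside its own (modeled) Secure
    Element; only the public half gets uploaded to iCloud."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_photo_key(my_priv: int, their_pub: int) -> bytes:
    """Derive a 256-bit symmetric key (for encrypting the shared photos)
    from the Diffie-Hellman shared secret."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Both ends derive the same key; neither private key ever left home.
k1 = shared_photo_key(alice_priv, bob_pub)
k2 = shared_photo_key(bob_priv, alice_pub)
print(k1 == k2)  # True
```

Anyone who hacks iCloud gets the public keys and the encrypted photos, neither of which is of any use without a private key locked in somebody’s chip.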

I’ve made quite a few assumptions about the capabilities of the Secure Element; suffice to say, all of the above is readily accomplished using the five-year-old TPM. The Secure Element has been mentioned only in the context of Apple Pay, so it may not have the flexibility to pull off all of the above. And, sadly, there does not appear to be a physically unclonable function (PUF) in the picture for beyond-exceptional security. Here’s hoping that the iPhone 7 incorporates both the requisite flexibility and the PUF in its “Securer Element.” In the meantime, here’s hoping Apple goes beyond payments with their shiny new cryptoprocessor.

14 thoughts on “Apple and Serious Security: NOT an Oxymoron”

  1. “You cannot ‘copy’ a private key ‘out of’ the secure element. NFC chips have never been breached, to my knowledge.”

    I agree completely!

    I am describing a mechanism for backing up the private key: first encrypt it (with a separate passphrase) using Diffie-Hellman Elliptic Curve encryption, and only then copy this encrypted version out of the Secure Element. This is the same mechanism Microsoft uses to back up the BitLocker key out of the TPM.

  2. ‘Apple Security’ may not be an oxymoron, but you may have accidentally penned another one: ‘useful security feature’. Security is usually a trade-off against utility/efficiency, and the effectiveness of a security feature is judged by history. If you are running a pool on the first breach victim of ‘Apple Pay’, let me know.

  3. Ryan–

    Spot-on #1: security is almost always a trade-off. Let’s consider Apple Pay as it exists (as opposed to my concoction). Even though setting up Apple Pay is relatively easy, many people will never go through the steps to activate it. And others will not trust it.

    Spot-on #2: only time will tell how secure Apple Pay (and the Secure Element) really are. Given that Apple Pay uses the credit cards you already have, a reasonable measure (say, in one year) will be “how often is Apple Pay compromised versus legacy credit card payment methods?” Today it doesn’t take much to do better than legacy credit card measures. 😉 But with the rollout of chip-and-PIN in the States (finally!), the answer is less clear.


