Security used to be the purview of the select few – those working on defense projects, financial systems, or other “high-security” devices and platforms. Today, however, with IoT, wearables, connected cars, smartphones, tablets, and so forth, everything is connected to everything, and practically every device owned by every human on earth is able to make safety- and financially-critical transactions. That means security has become important in just about every system we design today.
Unfortunately, security has to be designed into our systems from the ground up. There’s no such thing as a padlock and chain we can throw on top of our existing system-level design to shore it up. And, for real security, we need features built into the hardware itself, rather than relying solely on software to keep the fort safe.
Microsemi just announced that its SmartFusion 2 and IGLOO 2 FPGAs are the first in the industry to include a physically unclonable function (PUF) as the basis for a number of security capabilities. The PUF provides what the company calls a “biometric” identifier that uniquely identifies a particular piece of silicon. Like a fingerprint, every chip has its own code that cannot be duplicated.
FPGAs are somewhat unusual in the security picture because they sit at the intersection of the security interests of at least three different parties. First, there is the FPGA vendor, who wants security features that prevent unauthorized copies of its chips through cloning, overbuilding, “floor sweeping,” or other common industry tricks. Second, there is the OEM who develops the application for the FPGA and wants to secure its bitstream and design IP so that its system cannot be copied. Finally, there is the end user of the system, whose private data and transactions need to be secured.
Since FPGAs sit at the juncture of all of these, and since FPGAs are often involved in the communication path of the systems into which they’re designed, FPGAs are often used as the security enablers. Building a PUF into hard IP in an FPGA as Microsemi has done is a logical step in providing a high-grade hardware-based security platform for system designers.
Microsemi has a long history of providing products to the aerospace and defense industries, and the lineage of the SmartFusion 2 SoC FPGAs and IGLOO 2 FPGAs traces back to Actel, a company with its own major presence in the mil-aero space before Microsemi acquired it. So, it’s not surprising to see the company paving the way in security-centric features for FPGAs. For the PUF technology itself, Microsemi partnered with Intrinsic-ID.
The FPGA-based PUF can be thought of as a giant register of SRAM cells. Each cell is formed by a pair of cross-coupled gates, and it will power up in a (fairly) consistent state determined by manufacturing variations. Usually, one driver will be slightly stronger than the other, and that one will tend to “win” at power-up time, making the initial state of that bit consistent.
That sounds simple enough, right? Build a long chain of these guys and you’d have a private key of arbitrary length, and each chip would generate its own unclonable, consistent key at startup – based entirely on manufacturing variations.
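To see why this works, here’s a toy simulation – made-up numbers, not Microsemi’s implementation – that gives each cell a fixed, manufacturing-determined bias and then “powers up” two different chips:

```python
import random

def manufacture_chip(n_cells, chip_seed):
    """Fixed per-cell power-up bias set by manufacturing variation.
    Beta(0.2, 0.2) is U-shaped: most cells strongly favor 0 or 1,
    but a few are marginal."""
    rng = random.Random(chip_seed)
    return [rng.betavariate(0.2, 0.2) for _ in range(n_cells)]

def power_up(biases, rng):
    """One power-up: each cell independently settles according to
    its own bias (its chance of coming up as a 1)."""
    return [1 if rng.random() < b else 0 for b in biases]

chip_a = manufacture_chip(256, chip_seed=1)
chip_b = manufacture_chip(256, chip_seed=2)
rng = random.Random(42)

read_1 = power_up(chip_a, rng)   # chip A, first power-up
read_2 = power_up(chip_a, rng)   # chip A, second power-up
read_b = power_up(chip_b, rng)   # a different chip entirely

same_chip = sum(x == y for x, y in zip(read_1, read_2))
diff_chip = sum(x == y for x, y in zip(read_1, read_b))
print(f"chip A vs itself:  {same_chip}/256 bits agree")
print(f"chip A vs chip B:  {diff_chip}/256 bits agree")
```

Two reads of the same chip agree on most bits, while two different chips agree on only about half – which is exactly the “fingerprint” property the PUF exploits.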
Ah yes – we see your hand in the back: “What about temperature and other environmental factors?” Yes, smarty pants, this CAN be affected by environmental factors. And it also turns out that some bits are (randomly) more stable than others – having more asymmetry in the power of the two drivers. The solution is to start with more bits than you need, and to post-process a bit. By choosing only the most stable bits for the keys, and by having error-correcting code built in to take care of the random and environmental inconsistencies that crop up (the company estimates something like a 15% error margin in the raw bits), you can get a nice, consistent key that behaves just the way you want.
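A rough sketch of that flow, with invented flip rates and a simple majority vote standing in for the real error-correcting code (the actual Microsemi/Intrinsic-ID scheme is not public at this level of detail):

```python
import random

rng = random.Random(7)
N_CELLS, KEY_LEN = 512, 128

# Ground truth for each cell: its preferred power-up value and how
# often it flips. Roughly 80% of cells here are very stable; the
# rest are "wishy-washy." (Illustrative numbers, not real data.)
true_bits  = [rng.randrange(2) for _ in range(N_CELLS)]
flip_probs = [rng.choice([0.001] * 8 + [0.3] * 2) for _ in range(N_CELLS)]

def read_puf():
    """One power-up read: each cell flips with its own probability."""
    return [b ^ (rng.random() < p) for b, p in zip(true_bits, flip_probs)]

# Enrollment: read many times and keep only cells that never flipped.
# The chosen indices are non-secret "helper data"; the values are not.
reads  = [read_puf() for _ in range(32)]
stable = [i for i in range(N_CELLS) if len({r[i] for r in reads}) == 1]
helper = stable[:KEY_LEN]
enrolled_key = [reads[0][i] for i in helper]

# Reconstruction on a later power-up: majority-vote a few reads to
# mop up residual errors (a stand-in for the built-in ECC).
votes = [read_puf() for _ in range(5)]
key = [1 if sum(v[i] for v in votes) > 2 else 0 for i in helper]
print("key recovered:", key == enrolled_key)
```

Note that starting with far more cells than key bits is what makes this work: after discarding the flaky ones, there are still plenty of stable bits left over.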
As it turns out, there are also some cool things you can do with those wishy-washy bits that can’t make up their minds. They make an excellent seed for a random-number generator, which comes in handy in a lot of security applications. So, both the “good” and the “bad” bits of the PUF are put to good use – either because of their consistency, or because of their lack of it.
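Conceptually, the trick is just entropy conditioning: gather a lot of raw, biased noise from the marginal cells and hash it down to a uniform seed. A minimal sketch, using OS randomness to stand in for the unstable cells:

```python
import hashlib
import secrets

def condition(noisy_bits):
    """Whiten raw, biased PUF noise into a uniform 256-bit seed by
    hashing. Hashing can't add entropy, so you feed in more raw
    bits than you take out."""
    return hashlib.sha256(bytes(noisy_bits)).digest()

# Stand-in for the unstable cells' power-up values; in a real device
# this noise would come from the PUF hardware, not the OS.
noisy_bits = [secrets.randbelow(2) for _ in range(512)]
seed = condition(noisy_bits)
print("seed:", seed.hex())
```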
Microsemi says that the hardened PUF is already gaining a lot of traction with M2M and IoT designs. In addition to providing the basis for the end-user security features required by the market, the PUF allows OEMs to protect their own systems from counterfeiting and pirating, potentially saving a bundle on the bottom line. Putting those capabilities into a low-cost FPGA (which is often already in their system for other reasons) makes both logical and financial sense.
The results of the PUF’s magic are stored in each device’s eNVM as an industry-standard X.509 certificate signed by Microsemi. The certificate attests to the authenticity of the device and its contents – including serial number, date code, part number, speed grade, screening level, and so on. In larger devices, it also includes the device’s ECC public keys. A key-verification protocol binds this certificate to the private key generated by the PUF. A challenge will verify that the device is authentic and that the feature levels described in the certificate are correct – proving that the device hasn’t been counterfeited and that it hasn’t been fraudulently “upgraded.”
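The shape of such a key-verification challenge can be illustrated with a toy Schnorr-style proof of possession: the device proves it knows the PUF-derived private key without ever revealing it. To keep this runnable with only the Python standard library, it uses plain modular arithmetic rather than real ECC, and none of the names or parameters below reflect Microsemi’s actual protocol:

```python
import hashlib
import secrets

P = 2**521 - 1          # a Mersenne prime; toy multiplicative group
G = 3                   # fixed base element (fine for a demo)
ORDER = P - 1

def keygen(puf_key_material):
    """Derive the private scalar x from the PUF output; y = G^x is
    the public key that would appear in the X.509 certificate."""
    x = int.from_bytes(hashlib.sha256(puf_key_material).digest(), "big")
    return x, pow(G, x, P)

def hash_to_int(t, nonce):
    """Fiat-Shamir style challenge derived from commitment and nonce."""
    return int.from_bytes(
        hashlib.sha256(t.to_bytes(66, "big") + nonce).digest(), "big")

def prove(x, nonce):
    """Device side: commit to a random r, then answer the challenge."""
    r = secrets.randbelow(ORDER)
    t = pow(G, r, P)
    s = (r + hash_to_int(t, nonce) * x) % ORDER
    return t, s

def verify(y, nonce, t, s):
    """Verifier side: accept iff G^s == t * y^c (mod P)."""
    return pow(G, s, P) == (t * pow(y, hash_to_int(t, nonce), P)) % P

x, y = keygen(b"puf-derived key material")  # y goes in the certificate
nonce = secrets.token_bytes(16)             # fresh challenge per check
t, s = prove(x, nonce)
print("device authentic:", verify(y, nonce, t, s))
```

The private key never leaves `prove()`; only the commitment and response cross the wire, and a fresh nonce per challenge prevents replay.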
Upon receiving the device from Microsemi’s distribution, the OEM can verify its authenticity, and the device can then provide proof-of-possession of its private key. The OEM can then generate a user key-pair and enroll the device in its own PKI. From there, downstream applications can take advantage of the X.509 certificate for a wide range of application-specific security features and to connect to – and communicate with – other secured components of the system or network. Various classes of certificates can then be generated for application domains such as medical, mil-aero, and so on – all rooting back to the original private key-in-silicon maintained by the hardware PUF.
Since the private key is never passed to an external or software function, it is never visible to any human or machine in the system. This means security does not rely on “security by obscurity” or on any organization maintaining secret key data. The key is also multiply protected from attacks such as differential power analysis (DPA) – both by the architecture of the PUF itself and by anti-DPA IP built into the FPGA (licensed from DPA experts Cryptography Research, Inc.). Additionally, the devices have a robust set of anti-tamper capabilities, allowing critical data to be “zeroed out” if a tamper attack is detected.
While no system or device can be made 100% secure, this implementation appears to provide a very robust starting point. It’s not a magic bullet, of course. It is up to the OEM and the end user of the system to understand their responsibilities in the security chain. There are ample opportunities in any design to lock the front door tightly while leaving the back door wide open. As system designers, we have to look at security from a system-wide holistic perspective and not develop myopia with regard to any particular platform or technology.