
Obscurity and the Illusion of Security

Eric Raymond, a prominent voice in the open-source movement and author of The Cathedral and the Bazaar, stated it well: “Any security software design that doesn’t assume the enemy possesses the source code is already untrustworthy.” Decades earlier, Claude Shannon was even more succinct: “The enemy knows the system.” Security experts call this Kerckhoffs’ Principle, in honor of the 19th-century cryptographer Auguste Kerckhoffs, who first formulated it for cryptosystems. The underlying assumption is that any security-critical flaw will be found and exploited sooner or later, so at best, secrecy buys you only some delay.

Despite all this, there’s a widespread yet mistaken belief that the security of a system requires that its code be kept secret or obfuscated, an approach summarized as “security by obscurity.” Here are a few recent examples.

Open-Source vs. Closed-Source Code

One data point comes from a recent ruling of the Federal Communications Commission (FCC), the agency responsible for certifying that wireless communication devices won’t interfere with networks. The FCC stated, “A system that is wholly dependent on open-source elements will have a high burden to demonstrate that it is sufficiently secure to warrant authorization as software-defined radio.” Why is the bar higher for open-source than for closed-source code? Presumably because there is an expectation that even certified code contains plenty of security holes, and in open-source code these are easier to find and exploit than in closed-source code. This is classical security by obscurity. I hope it makes you feel all warm, fuzzy and secure. It doesn’t do it for me, though.

National Security by Obscurity

The second example is more worrying. Green Hills Software is a leading provider of operating systems for the aerospace and defense markets. In a white paper by CEO and founder Dan O’Dowd, published on the company’s web site, we find this intriguing statement: “Publishing the source code for the operating systems used in our most critical defense systems is analogous to publishing the wiring diagrams for our military base security systems. […] Unless an operating system has no vulnerabilities, publishing its source code is sure to reduce security.” Note that the company’s Integrity operating system has just obtained Common Criteria EAL6+ security certification, the highest ever achieved for a general-purpose OS, leading some to claim (incorrectly) that Integrity is “provably secure.” So, if it’s so secure, why does it depend on security by obscurity? Apparently, O’Dowd doesn’t believe it’s secure enough to publish the source code.

Do we really feel comfortable basing national security on obscurity? Kerckhoffs and Shannon weren’t, and neither am I.

Assembly-Coded or C-Coded

The third example is actually quite entertaining. Trango Virtual Processors, a provider of virtualization software for embedded systems (recently acquired by VMware), uses the tagline, “The Secure Virtual Processor.” We learn more about their idea of security from an FAQ on their web site, which states, “[The] hypervisor is small, and written in assembly language. […] As an assembly-coded product it is also much more difficult for hackers to decipher than C-coded products.” Again, classical security by obscurity.

But what can we really learn from that statement? A hacker wouldn’t actually have access to the assembler source, only the binary, as Trango keeps the source secret (they actually keep the documentation secret, too). Why should the binary code generated from assembly source be any more obscure than the binaries generated by a C compiler? Compiler output can be highly structured, although much of that structure gets lost when optimization is turned on. The structure of assembly code depends on — how shall I put this? — how competently it was written. So if the assembly code is more obscure than the (optimized) compiler output, it must be an unintelligible pile of spaghetti. Does that make you feel that you can trust it? I would think the exact opposite!
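
To put the comparison in concrete terms, consider a trivial routine and what an optimizing compiler typically makes of it. The function and the instruction sequence below are my own illustrative sketch (the registers assume the common x86-64 calling convention), not anything taken from Trango’s product; the point is simply that compiler output is already the kind of tidy, regular code a reverse engineer finds easiest to read.

    /* Illustrative only: a trivial C routine. */
    int max(int a, int b)
    {
        return a > b ? a : b;
    }

    /*
     * A typical optimized x86-64 translation is a handful of regular
     * instructions that any disassembler recovers immediately:
     *
     *   max:  cmp   edi, esi     ; compare a with b
     *         mov   eax, esi     ; result = b
     *         cmovg eax, edi     ; if a > b, result = a
     *         ret
     *
     * Hand-written assembly is only harder to decipher than this
     * if it is worse code.
     */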

Getting Real about Security

While some of this may be amusing, security in general is a serious issue – too serious to be left to amateurish obfuscation and secrecy, which are at best a third-class substitute. It’s time we got serious about security, rather than admitting failure and hiding behind obscurity. We obviously need systems that are designed for security. These exist, with Integrity being one of them. But we need more: we need proof that they really are secure, including proof that the implementation of the design is correct.

No system to date has such a proof, and the conservative assumption must therefore be that all systems are insecure. But we really can do better. The L4.verified research project at NICTA is showing us how. It is performing a complete formal verification of seL4, a version of the L4 microkernel. In other words, the researchers are developing a mathematical proof that the system’s implementation (in C and assembler code) satisfies the security properties for which the system was designed. Simply stated: a proof that the system is free of security-relevant bugs.
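
To make “proof that the implementation is correct” concrete, here is a hedged sketch of the kind of obligation such a verification discharges. The routine and its specification are invented for illustration – this is not seL4 code – but the real project is constructing machine-checked proofs of this kind about the kernel’s actual implementation.

    /*
     * Illustrative only, not seL4 code. Verifying this routine means
     * producing a machine-checked proof of a statement such as:
     *
     *   precondition:  buf points to at least len writable words, len >= 0
     *   postcondition: buf[i] == 0 for every 0 <= i < len, and nothing
     *                  outside buf[0..len-1] has been modified
     *
     * The proof covers all inputs and all execution paths, so trusting
     * the routine requires neither testing nor secrecy about its source.
     */
    void zero_words(unsigned long *buf, int len)
    {
        for (int i = 0; i < len; i++)
            buf[i] = 0;
    }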

A year ago, that proof covered an executable model that serves as a low-level design of the kernel. That already made seL4 the most deeply formally analyzed operating system ever. The researchers expect to complete the proof covering the implementation within the next couple of months. This will establish seL4 as the first really secure system.

Obscurity provides no security, only an illusion of it. Let’s get real security.
