
Obscurity and the Illusion of Security

Eric Raymond, a prominent voice in the open-source movement and author of The Cathedral and the Bazaar, stated it well: “Any security software design that doesn’t assume the enemy possesses the source code is already untrustworthy.” Decades earlier, Claude Shannon was even more succinct: “The enemy knows the system.” Security experts call this Kerckhoffs’ Principle, in honor of the 19th-century cryptographer who first formulated it for cryptosystems. The underlying assumption is that any security-critical flaw will be found and exploited sooner or later, so secrecy buys you, at best, some delay.

Despite all this, there is a widespread yet mistaken belief that the security of a system requires its code to be kept secret or obfuscated, an approach summarized as “security by obscurity.” Here are a few recent examples.

Open-Source vs. Closed-Source Code

One data point comes from a recent ruling of the Federal Communications Commission (FCC), the agency responsible for certifying that wireless communication devices won’t interfere with networks. The FCC stated, “A system that is wholly dependent on open-source elements will have a high burden to demonstrate that it is sufficiently secure to warrant authorization as software-defined radio.” Why is the bar higher for open-source than for closed-source code? Presumably because there is an expectation that even certified code contains plenty of security holes, and in open-source code these are easier to find and exploit than in closed-source code. This is classical security by obscurity. I hope it makes you feel all warm, fuzzy and secure. It doesn’t do it for me, though.

National Security by Obscurity

The second example is more worrying. Green Hills Software is a leading provider of operating systems for the aerospace and defense markets. In a white paper by CEO and founder Dan O’Dowd, published on the company’s web site, we find this intriguing statement: “Publishing the source code for the operating systems used in our most critical defense systems is analogous to publishing the wiring diagrams for our military base security systems. […] Unless an operating system has no vulnerabilities, publishing its source code is sure to reduce security.” Note that the company’s Integrity operating system has just obtained Common Criteria EAL6+ security certification, the highest ever achieved for a general-purpose OS, leading some to claim (incorrectly) that Integrity is “provably secure.” So, if it’s so secure, why does it depend on security by obscurity? Apparently, O’Dowd doesn’t believe it is secure enough to publish the source code.

Do we really feel comfortable basing national security on obscurity? Kerckhoffs and Shannon weren’t comfortable with it, and neither am I.

Assembly-Coded or C-Coded

The third example is actually quite entertaining. Trango Virtual Processors, a provider of virtualization software for embedded systems (and just acquired by VMware), uses the tagline, “The Secure Virtual Processor.” We learn more about their idea of security from an FAQ on their web site, which states, “[The] hypervisor is small, and written in assembly language. […] As an assembly-coded product it is also much more difficult for hackers to decipher than C-coded products.” Again, classical security by obscurity.

But what can we really learn from that statement? A hacker wouldn’t actually have access to the assembler source, only the binary, as Trango keeps the source secret (they actually keep the documentation secret, too). Why should the binary code generated from assembly source be any more obscure than the binaries generated by a C compiler? Compiler output can be highly structured, although much of that structure gets lost when optimization is turned on. The structure of assembly code depends on — how shall I put this? — how competently it was written. So if the assembly code is more obscure than the (optimized) compiler output, it must be an unintelligible pile of spaghetti. Does that make you feel that you can trust it? I would think the exact opposite!
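To make the point concrete, here is a toy sketch of my own (it has nothing to do with Trango’s actual code): a trivial routine written in C, with a hand-coded x86-64 assembly version shown in the comment below it. Once either version is compiled or assembled into a binary, a standard disassembler such as objdump -d recovers essentially the same compare-add-branch loop; the source language gives the attacker nothing extra to “decipher.”

/* checksum.c - sum the bytes of a buffer; the compiled binary is a
   short compare/add/branch loop, regardless of source language. */
int checksum(const unsigned char *buf, int len)
{
    int sum = 0;
    for (int i = 0; i < len; i++)
        sum += buf[i];
    return sum;
}

/* The same routine hand-written in x86-64 assembly (System V ABI,
   Intel syntax). Disassembling either binary yields a loop of roughly
   this shape, modulo register choice and whatever unrolling the
   compiler applies:

       checksum:
           xor   eax, eax          ; sum = 0
           test  esi, esi          ; len <= 0?
           jle   .done
       .loop:
           movzx edx, byte [rdi]   ; fetch next byte
           add   eax, edx          ; sum += byte
           inc   rdi
           dec   esi
           jnz   .loop
       .done:
           ret
*/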

Getting Real about Security

While some of this may be amusing, security in general is a serious issue – too serious to be left to amateurish obfuscation and secrecy, which are at best a third-class substitute. It’s time we got serious about security, rather than admitting failure and hiding behind obscurity. We obviously need systems that are designed for security. These exist, Integrity being one of them. But we need more: we need proof that they really are secure. This includes proof that the implementation of the design is correct.

No system to date has such a proof, and the conservative assumption must therefore be that all systems are insecure. But we really can do better. The L4.verified research project at NICTA is showing us how. It is performing a complete formal verification of seL4, a version of the L4 microkernel. In other words, the researchers are developing a mathematical proof that the system’s implementation (in C and assembler code) has the required security properties for which the system was designed. Simply stated: proof that the system is free from security-relevant bugs.
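The actual seL4 proof is carried out in the Isabelle/HOL theorem prover over the kernel’s code, but the flavour of such a proof can be conveyed by a toy example of my own, written here in Lean: state a specification, and have the proof assistant check that the implementation satisfies it for every possible input, not just for the inputs a test suite happens to exercise.

-- Toy illustration only: a tiny "sanitizer" and a machine-checked proof
-- that its output always satisfies the stated property.

-- Implementation: keep only byte values below 128.
def sanitize (bytes : List Nat) : List Nat :=
  bytes.filter (fun b => b < 128)

-- Specification, proved for all inputs: every value in the output is
-- below 128. The proof assistant rejects the theorem if the code
-- contains a bug that violates the property.
theorem sanitize_sound (bytes : List Nat) :
    ∀ b ∈ sanitize bytes, b < 128 := by
  intro b hb
  simp [sanitize, List.mem_filter] at hb  -- b came from bytes and passed the test
  exact hb.2

Scaling this idea from a ten-line toy up to a complete microkernel, including its C and assembly code, is exactly what makes the L4.verified effort so remarkable.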

A year ago, that proof covered an executable model that serves as a low-level design of the kernel, which already made seL4 the most deeply formally analyzed operating system ever. The researchers expect to complete the proof covering the actual implementation within the next couple of months. That will make seL4 the first system with a machine-checked proof that its implementation meets its security design: in other words, the first really secure system.

Obscurity provides no security, only an illusion of it. Let’s get real security.
