It’s computers all the way down. We know about recursion in software, but it’s surprising to find it occurring in hardware. How many computers are really inside your computer? One? A couple? Maybe a dozen? In reality, it’s probably hundreds.
Normal people count the average PC, Mac, or Linux box as one computer. As engineers, though, we know there’s really more than one processor inside. But how many? In the early days of the IBM Personal Computer Model 5150, the keyboard had its own 8048 microcontroller chip that translated the up/down key actions into the weird IBM “scancodes” that PCs have used ever since. So that’s two processors…
Today, your PC’s video output is probably handled by a GPU from Intel, AMD, or Nvidia. That’s one more processor — and a pretty elaborate one, too. GPUs are not simple machines, and they’re completely programmable, which makes them processors by any definition.
Depending on your GPU, it may have tens, hundreds, or even thousands of separate processing cores. Do we count each one separately, or treat them all as one GPU? Same goes for your computer’s main processor. It’s probably got four, eight, or more CPU cores.
Importantly, each one of those CPU cores is a complete processor, programmable and largely independent of its sibling cores on the same die. Each core might also be dual- or multithreaded, letting it juggle two or more instruction streams at once. Plus, there are security processors buried within the x86 processor itself, like the Intel Management Engine or AMD’s Platform Security Processor. Even the MMU can operate on its own. How many processors are we up to now?
Hard disk drives, SSDs, and optical drives all have their own controller ICs that contain one or more processors (probably ARM-based), and many of those are multi-core designs as well. Ethernet and Wi-Fi interfaces are processor-controlled, as are USB ports. Even some USB cables have processors inside: USB-C cables rated for higher power or speed carry their own “e-marker” chips. Pluggable SD cards have their own internal controllers, not just memory. Got a fancy gaming rig with addressable LEDs, PWM fans, an AIO cooler, a Corsair controller, and DIMMs that light up? Guess what’s controlling all of those.
Then there are the less obvious embedded computers inside smartphones, cars, HVAC systems, game consoles, coffee makers, and the zillion other devices that cognitive scientist Donald A. Norman calls “invisible computers.”
But then it starts to get weird. You could argue that a computer is anything that can run a program. More technically, it must be “Turing complete,” meaning it can run any arbitrary program at all, including compilers, emulators, applications, games, operating systems, viruses — anything. And Turing-complete machines turn out to be surprisingly common. Many are even accidentally Turing complete (or at least, not obviously deliberately so). Like, say, a deck of Magic: The Gathering cards. Or PostScript fonts. They can all run programs.
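To see why Turing completeness is so easy to stumble into, consider how little machinery it takes on purpose. Below is a minimal sketch (in Python, purely for illustration) of an interpreter for Brainfuck, an eight-instruction language that is nevertheless Turing complete given unbounded tape. Any system that can emulate something this small — a card game, a font engine — can, in principle, run any program.

```python
def run(program: str, tape_len: int = 30000) -> str:
    """Interpret the eight-instruction Brainfuck language.

    > < move the data pointer; + - adjust the current cell;
    . outputs the current cell; [ ] form a while-nonzero loop.
    """
    tape = [0] * tape_len
    out = []
    ptr = pc = 0

    # Pre-match brackets so loops can jump in one step.
    jump, stack = {}, []
    for i, ch in enumerate(program):
        if ch == '[':
            stack.append(i)
        elif ch == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i

    while pc < len(program):
        ch = program[pc]
        if ch == '>':
            ptr += 1
        elif ch == '<':
            ptr -= 1
        elif ch == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == '.':
            out.append(chr(tape[ptr]))
        elif ch == '[' and tape[ptr] == 0:
            pc = jump[pc]          # skip the loop body
        elif ch == ']' and tape[ptr] != 0:
            pc = jump[pc]          # loop back to the matching [
        pc += 1
    return ''.join(out)

# Compute 6 * 8 = 48 and print it as ASCII character '0':
print(run('++++++[>++++++++<-]>.'))   # → 0
```

The whole “computer” is one loop and a dictionary of bracket positions — which is roughly the scale of machinery that turns up, uninvited, inside the systems discussed below.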
As researcher Gwern Branwen points out, a mind-numbing number of everyday items are Turing complete, making it almost impossible to count the number of “computers” with any accuracy. For example, Adobe’s PostScript and Apple’s TrueType are both capable of running arbitrary programs. A digital font is, in essence, a program executed by a PostScript or TrueType interpreter, but there’s nothing in either definition that restricts it to merely rendering typefaces. Something as simple as a new font could deliver code, or it could inject malware. Both have already happened. One creative programmer implemented a game using nothing but the Times New Roman font. It’s playable anywhere downloadable fonts are supported, which is pretty much everywhere.
That’s not to say that PostScript would be a good choice of language for code development or deployment. Only that its interpreters are real, stack-based, Turing-complete machines.
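A stack-based interpreter of this kind is easy to picture. The toy below (a hypothetical PostScript-flavored subset written in Python — not real PostScript syntax, and the operator set is my own simplification) shows how an engine built to describe pages ends up with arithmetic, comparisons, procedures, and loops: everything a general-purpose computer needs.

```python
def execute(tokens, stack=None):
    """Run a PostScript-style token stream against an operand stack.

    Integers and procedure bodies (Python lists) are pushed; named
    operators pop their arguments off the stack, PostScript-fashion.
    """
    stack = [] if stack is None else stack
    for tok in tokens:
        if isinstance(tok, list):            # a procedure body: push, don't run
            stack.append(tok)
        elif isinstance(tok, int):
            stack.append(tok)
        elif tok == 'add':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == 'mul':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif tok == 'dup':
            stack.append(stack[-1])
        elif tok == 'gt':
            b, a = stack.pop(), stack.pop()
            stack.append(a > b)
        elif tok == 'if':                    # bool proc if → run proc when true
            proc, cond = stack.pop(), stack.pop()
            if cond:
                execute(proc, stack)
        elif tok == 'repeat':                # n proc repeat → run proc n times
            proc, n = stack.pop(), stack.pop()
            for _ in range(n):
                execute(proc, stack)
    return stack

print(execute([3, 4, 'mul', 2, 'add']))      # → [14]
print(execute([1, 3, [2, 'mul'], 'repeat'])) # 1 doubled 3 times → [8]
```

Nothing here knows anything about glyphs or pages — and that’s the point. Once an interpreter has a stack, arithmetic, and conditional repetition, what it computes is up to whoever writes the “document.”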
Branwen identifies other non-obvious computers, like sendmail configuration files, Excel spreadsheets, PowerPoint presentations (one researcher built a working Turing machine as a live PowerPoint demonstration), the C preprocessor (not the compiler), the MMU on an x86 processor (yes, it’s programmable, even though it executes zero instructions), the Border Gateway Protocol (BGP), PDF documents, Java generics, TypeScript’s type system, and games like Pokémon Yellow, Minecraft (perhaps predictably), and, weirdest of all, an ordinary deck of Magic: The Gathering cards. It’s like 1950s punch cards all over again.
Not surprisingly, some of these “accidental computers” have been exploited as bugs. Google’s Project Zero identified a font as a malware vector, which resulted in 15 separate CVE reports with severity ranging from minor to severe. The exploit could perform “… arithmetic, logic, conditional, and other [operations] anywhere on the exploited thread’s stack, with full control over what is overwritten and how… defeating all modern exploit mitigation techniques such as stack cookies, DEP, ASLR, SMEP and so on.” That’s some nasty lettering.
Branwen’s point is that there are so many Turing-complete machines lurking in unexpected places that it’s impossible to fully secure any system. “It’s hard enough to make a program do what it’s supposed to do without giving anyone in the world the ability to insert another program into your program. That we find these demonstrations surprising is itself a demonstration of our lack of imagination and understanding of computers, computer security, and AI.”
There are simply too many holes. The more we raise levels of abstraction and delegate tasks to subsystems, the more holes we open up. The whole point of delegating the task of, say, handling disk I/O, is to offload that work from the main CPU. But if the delegate (in this case, the disk controller) is untrustworthy, you’ve laid the groundwork for a palace coup. Quis custodiet ipsos custodes?
The hunt for accidental computers is both amusing and disheartening. It’s fun to experiment with turning a set of Tinkertoys into a mechanical computer, but sobering to realize you might build one without meaning to. As we add more CPUs and MCUs to our systems — because why not, they’re cheap — we open up more potential vulnerabilities. We trust that we have the ability to make our systems do what we want. But we must also remember that someone else could do the same.