
Why Your Computer is Slower Than a 1970s PC

Creeping Featurism Has Actually Slowed Down User Input

“I’m not deaf. I’m ignoring you.” – T-shirt wisdom

Devo and The Tubes warned us this would happen: it’s de-evolution, or the Completion Backward Principle. We’re regressing. It’s not our imagination. Our computers really are getting slower. 

Okay, so maybe not the entire computer, but one important part of nearly every modern machine is slower – a lot slower – than computers of two or three decades ago.

I blame software. 

Processors are faster, but user interfaces are slowing down. Specifically, the time from when you press a key on your keyboard or tap the touchscreen until the input shows up on your screen is taking far longer than it did in the 1970s, ’80s, or ’90s. We’ve added so many filters and abstraction layers that a modern PC, phone, or tablet can be ten times slower to recognize input compared to a lowly Commodore PET from 1977. 

Granted, every other part of our computers has improved by orders of magnitude since the first Star Wars movie, and that’s a good thing. But, in the interim, we’ve replaced simple keyboard switches with microcontrollers, scan loops, translation layers, abstraction layers, buffers, and (surprisingly) slower displays. These changes all add up, to the point where input latency has gone from 30–40 milliseconds in the 1970s to 300–400 ms today. 

Even tablets and smartphones, which didn’t exist in the Disco Era, exhibit order-of-magnitude differences in responsiveness. And perversely, some of the older handhelds show more alacrity than the newer ones. Here again, we seem to be progressing backwards. 

These numbers come to us courtesy of New York–based engineer Dan Luu, who tried to convince himself that his new machine wasn’t really slower than his old one… but failed. He used a high-speed 1,000-frame/sec camera to measure the time from keypress (or screen tap) to display update. In all, he tested over 50 different machines – guy’s got a big closet – and found that computers from the Carter Administration almost always outperformed newer machines when it comes to processing keystrokes. 
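
The arithmetic behind the camera measurement is simple. Here’s a minimal sketch (not Luu’s actual tooling, and the frame numbers below are hypothetical) of how a frame count from a 1,000-frame/sec camera turns into a latency figure:

    # Rough sketch (not Luu's tooling): count the camera frames between the
    # keypress bottoming out and the first visible change on the screen,
    # then scale by the camera's frame period.

    CAMERA_FPS = 1000  # frames per second of the measurement camera

    def latency_ms(frame_key_pressed: int, frame_screen_updated: int,
                   fps: int = CAMERA_FPS) -> float:
        """Convert a camera-frame count into milliseconds of input latency."""
        frames_elapsed = frame_screen_updated - frame_key_pressed
        return frames_elapsed * 1000.0 / fps

    # Hypothetical frame numbers, for illustration only:
    print(latency_ms(1200, 1230))  # 30.0 ms  -- Apple IIe territory
    print(latency_ms(1200, 2060))  # 860.0 ms -- Kindle 4 territory

At 1,000 frames per second each camera frame is one millisecond, so the frame count essentially is the latency.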

The fastest machine in his test was an Apple IIe, circa 1983, with a 30ms response time. The slowest was Amazon’s Kindle 4, at 860ms. That’s a 28:1 difference in response time over a (coincidentally) 28-year span. This, despite the Kindle’s 800-MHz ARM processor versus the Apple’s 1-MHz 6502. Clearly, CPU horsepower ain’t the problem. The culprit lies somewhere else, as we’ll see. 

There are some interesting groupings in his results, which suggest that certain operating systems, hardware platforms, and design decisions consistently affect response time. The three slowest machines in Luu’s experiment were all Kindles – and they were all much slower than the 50+ other devices he tested. That’s due to three factors all conspiring together: slow display, slow operating system, and low-cost hardware design. Amazon sells Kindles for cheap in the hope of making up for lost profit in content sales. Kindles are the razors to Amazon’s blades. They’re designed for lowest cost, not highest performance, and it shows. Kindles also use slow monochrome displays suited to reading. 

Tied for first place with the ancient Apple IIe was another Apple product, the brand new iPad Pro. In fact, Apple products occupy almost all the top dozen spots in the portable category. They’re significantly quicker than equivalent Android phones or tablets, often by a 2:1 margin. 

What’s the reason for this? Luu credits vertical integration. Unlike the legion of Android designers, Apple creates its own software to run on its own hardware. There’s no need for Android’s keyboard or screen abstraction; the company’s engineers can tune the user-interface stack from top to bottom without worrying that some OEM will want to integrate an unusual input or output device. 

To no one’s great surprise, Windows 10 lands squarely at the slow end of the latency spectrum. It’s not the worst by any means, but the 4.2-GHz Core i7-7700K machine he tested took 200ms – one-fifth of a second – to render user input. The same machine running Linux was a bit quicker, at 170ms, a difference barely large enough to notice. 

Which raises an important question: Can anyone even notice a 30ms difference in response time, or are we splitting interface hairs? The short answer is yes. Studies have shown that even a 2ms difference in visual response time is noticeable. Push latency much beyond double digits and you get a perceptible lag that causes people to make mistakes, like double-tapping an icon or misdirecting a mouse. Get into triple-digit latencies and people get headaches and generally start hating life. And your product. 

Latency depends on several factors, and Luu dug into why today’s machines are so slow. For one, screen refresh rates make a big difference. Because he’s measuring the time for output to appear on the screen, not just when it gets sent to a display controller, the screen’s update frequency is important. 

“We get a 90-ms improvement from going from 24 Hz to 165 Hz [refresh rate]. At 24 Hz, each frame takes 41.67ms, and at 165 Hz, each frame takes 6.061ms,” reports Luu. That’s basic math. What’s interesting is that the actual delay is much longer than just the frame rate. 
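
For the curious, here’s that math in a few lines of Python – a sketch matching the figures Luu quotes, not his code:

    # Frame time is just the reciprocal of the refresh rate, in milliseconds.
    def frame_time_ms(refresh_hz: float) -> float:
        return 1000.0 / refresh_hz

    print(round(frame_time_ms(24), 2))   # 41.67 ms per frame
    print(round(frame_time_ms(165), 3))  # 6.061 ms per frame

    # The frame-time difference is only about 36 ms, yet Luu measures a 90-ms
    # improvement -- roughly 2.5 frames' worth, a multiplier discussed below.
    print(round(frame_time_ms(24) - frame_time_ms(165), 1))  # 35.6 ms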

With a typical raster-scan display screen, you’d expect the first pixel to be displayed in the upper-left corner, ending at the lower-right, before the process repeats. At a nice round 30-Hz refresh rate, that’s about 33ms per update. Worst case, if you’re unlucky and just miss the screen update, you’d have to wait another 33ms for the next update to come around. On average, though, you’ll miss about half of each cycle, or about 17ms. 

But that’s not what happens. 

In practice, Luu sees latency of about 2.5 times the refresh period (the frame time), and it’s consistent across devices and display settings. Even with an infinitely fast display, today’s machines would barely match the display latency of “standard machines from the ’70s and ’80s.” 
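
Here’s a back-of-the-envelope sketch, assuming Luu’s roughly 2.5x multiplier holds, of how the idealized half-frame average stacks up against what actually gets measured at a few common refresh rates:

    # Idealized raster-scan expectation (about half a frame of waiting, on
    # average) versus the ~2.5 frames of display-related latency Luu measures.

    def frame_time_ms(refresh_hz: float) -> float:
        return 1000.0 / refresh_hz

    def ideal_avg_wait_ms(refresh_hz: float) -> float:
        return 0.5 * frame_time_ms(refresh_hz)   # miss half a cycle on average

    def observed_latency_ms(refresh_hz: float, multiplier: float = 2.5) -> float:
        return multiplier * frame_time_ms(refresh_hz)  # Luu's empirical rule of thumb

    for hz in (30, 60, 165):
        print(f"{hz:>3} Hz: ideal ~{ideal_avg_wait_ms(hz):.1f} ms, "
              f"observed ~{observed_latency_ms(hz):.1f} ms")
    #  30 Hz: ideal ~16.7 ms, observed ~83.3 ms
    #  60 Hz: ideal ~8.3 ms, observed ~41.7 ms
    # 165 Hz: ideal ~3.0 ms, observed ~15.2 ms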

Another contributor to sluggish user interfaces is the ever-growing stack of software. Instead of treating keyboard switches as switches, we massage them with microcontrollers, firmware, OS abstraction layers, keyboard translation, and more. By the time the keypress has been debounced, filtered, verified, recognized, and buffered, several milliseconds have passed and the OS (much less the application) still doesn’t even know the key’s been pressed. 
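
To see how those milliseconds pile up, here’s a toy model of a scanned, debounced keyboard feeding a USB host. The scan interval, debounce count, and polling period are illustrative assumptions, not any particular keyboard’s firmware:

    # Toy model (not any real keyboard's firmware) of how a scanned, debounced
    # matrix keyboard delays a keypress before the host OS ever hears about it.
    # The three timing constants below are assumptions for illustration.

    SCAN_INTERVAL_MS = 1      # microcontroller scans the key matrix every 1 ms
    DEBOUNCE_SAMPLES = 5      # consecutive identical readings needed to accept a press
    USB_POLL_INTERVAL_MS = 8  # a common full-speed USB HID polling interval

    def worst_case_keyboard_delay_ms() -> int:
        # Press lands just after a scan: wait almost one full scan interval,
        # then DEBOUNCE_SAMPLES more scans to confirm, then up to one USB poll.
        return (SCAN_INTERVAL_MS
                + DEBOUNCE_SAMPLES * SCAN_INTERVAL_MS
                + USB_POLL_INTERVAL_MS)

    print(worst_case_keyboard_delay_ms())  # 14 ms, and nothing has been drawn yet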

It takes longer for a keypress to get from the keyboard to the screen than it does for a TCP/IP packet to circle the globe, Luu notes with a mixture of wonder and frustration. 

Gamers and graphics artists are advised to crank up their monitor’s refresh rate as high as it will go. Get a faster (and more expensive) graphics card, and don’t rely on older HDMI connections, which can be limited to 60 Hz. 

Lovely and intuitive graphical user interfaces are generally considered A Good Thing and a technological step forward. But all that abstraction takes its toll, and not just in additional RAM and ROM. It slows things down, and unless you tune your hardware and software to minimize the visible delays, your users may be visibly upset. 

In his landmark business book, The Innovator’s Dilemma, Clayton Christensen argues that innovators flourish when incumbent vendors overshoot the market’s demand for shiny new features. Upstarts with “inferior” products sneak in with lower prices and less functionality because that’s all the customer really wanted. Let’s hope we’re not squandering all that new computing performance on features that create an opening for competitors. 

3 thoughts on “Why Your Computer is Slower Than a 1970s PC”

  1. Worse still for some people, the dramatic ‘opening a window’ visual event can trigger ‘Threshold Amnesia’, known to most of us as “Why did I come into this room just now?”
    This hindbrain protection survival interrupt clears your mental desktop to prevent you walking off a precipice, out of your cave, or into a clearing while lost in thought. We can assume that those who did not have this trigger, or had it set too low, or could mask this NMI, died off shortly after we learned to string two thoughts together and create our internal ‘reality’.
    Sadly, those of us with a low threshold trigger now find ourselves wondering ‘Why do I have this app/dialog/web page open?…’
    We lament the old character-based one-app screen with slow transitions between modes that gave you time, e.g., to decide what filename to use. Oh, and you could type as fast as you could spell; you didn’t need to wait for each character to appear. As a sensitive ADHD soul, I don’t look when opening apps and pages; I use the time to plan instead.

  2. To be fair, comparing the Apple //e with a Kindle is unfair. The point of the Kindle is using e-ink for reading, and e-ink is a technology that is still in development and very slow by nature. If you want a reflective screen that reads like paper, this is what there is – no alternatives (yet). Also, a Kindle (or any other e-reader) is a device whose purpose doesn’t require a fast response to user input. It is a device made for relaxed use.

    1. Not unfair (IMHO) but definitely a contributor. The e-ink display in Kindles is very slow, but that’s fine for reading novels. It has other advantages (daylight visibility, for one) but speed isn’t one of them.

