
The Human Factor

IoT Takes People Out of the Loop

The imminent IoT is the final phase of a computing revolution that began decades ago.

First, it was all about processor speed. From our puny 8-bit 1MHz microprocessors to today’s 64-bit multi-core multi-gigahertz behemoths, the pursuit of MIPS and FLOPS led us from machines that could barely run the most rudimentary programs to multiprocessing beasts that can crunch data at unimaginable speeds. Over time, however, processor performance gradually faded from the spotlight. Now, the incremental capability we gain with a doubling of processor speed is comparatively insignificant, and the focus in processor design has switched from raw performance to power efficiency.

Next, the big issue was storage. The newer, faster processors ran more complex software, which demanded more memory and more mass storage. Our applications could gobble up every order-of-magnitude capacity gain our memory technology could provide. The kilo-to-mega-to-giga-to-tera-to-petabyte storage progress meter was pegged to the right for the better part of fifty years. Over time, though, the repeated 1,000x-ing of available storage capacity caught up with our most ambitious demands. The average consumer could afford enough media to store their entire music and video collection on devices they could grab from the grocery checkout line. And, even the biggest of the big-data applications didn’t bump into the server storage ceiling. We had, for most practical purposes, enough storage.

Then, the battle was about bandwidth. Our fast-processing, massive data-storing systems needed to talk to one another, and they needed to do it at eye-watering data rates. Bits blasted through SerDes channels at speeds that would have previously been unimaginable. Packets were routed to their destinations with remarkable speed, accuracy, and efficiency. Our “series of tubes” was replaced by scores of enormous data pipes wrapping around the Earth. Over time, though, even our seemingly insatiable need for bandwidth began to slowly subside.

The burden then moved to the software sector. With all this speed, storage, and bandwidth, the imagination of application developers was challenged anew. The intelligent information infrastructure had the potential to accomplish the seemingly impossible, but the billions of lines of code required to realize that vision would take armies of programmers decades to complete. If hardware were to suddenly come to a complete stop, new software technology could keep us innovating for another half century at near-Moore’s Law rates.

But now, the focus must switch to the interface. For far too long our incredible computing infrastructure has depended on the keyboard, mouse, and screen for the vast majority of its interaction with humans. These decades-old technologies have barely budged. While other aspects of our technological revolution have leapt ahead at a dizzying pace, adding zeroes right and left to key capability metrics, the old keyboard, mouse, and monitor have been plugging along with basically the same capabilities they had in 1984. Sure, we’ve picked up a few more color pixels during the journey, but the fundamental power of our human interface has lagged severely compared with other elements of technology.

This is what the IoT is really about.

IoT is about new ways of getting information into our machines – bypassing the human-at-a-keyboard model. It’s about putting information to work in new and innovative ways that don’t require a video screen to impact our lives. It’s about a complete overhaul of our aging, obsolete, laggardly methods of human-machine interface. The keyboard and mouse must die. 

The IoT allows devices to gather and share data without human intervention. It allows our smart machines to take action directly, without first presenting information to a human on a display. It brings our technological takeover to the final stage, where humans can be unplugged, and where the machines can start actually serving people rather than the other way around. 

This final wave of revolutionary invention will require more inter-disciplinary engineering than any of the tasks before it. The full force of our semiconductor, software, and fabrication technologies will need to be paired with leaps in mechanical, sensor, actuator, and chemical technology. Giving machines the ability to truly see, feel, hear, taste, and smell the world is a monumental endeavor, the surface of which we’ve barely scratched. Endowing our creations with senses and abilities far surpassing our own is an ambitious but achievable task. But, in doing so, we will gain the ability to know and do things that were previously simply un-knowable and undoable.

Those who dismiss IoT over its first, wobbly steps with arguments like “I don’t need my toaster to talk to my refrigerator” or “IoT is nothing but a security nightmare” are missing the bigger picture. Consider, for example, the potential impact on healthcare. Our bodies are packed with information that could help us maintain our health, but we are currently not instrumented to collect it. Physicians attempt diagnoses with paltry single samples of data. When you go to the clinic, your heart rate, blood pressure, and other vitals are typically measured once – and not at the most useful time. The “best guess” diagnosis that results is notoriously inaccurate. But, if our vital signs and other important health-related data could be collected and analyzed continuously, trends could be established, responses to various situations could be monitored, and the accuracy and timeliness of diagnoses would skyrocket. And, we would most certainly discover new early warning signs for numerous conditions that we simply did not have the data to diagnose before. This would result in a dramatic improvement in the efficiency of healthcare. Physicians could focus on treating patients whose conditions are known with a high degree of certainty and who are diagnosed at the most advantageous time.
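The continuous analysis described above can be sketched in a few lines of code. The example below is purely illustrative – the window size, threshold, and sample data are hypothetical, not any real clinical algorithm – but it shows the basic idea: compare each new reading against the patient's own rolling baseline instead of a single office-visit snapshot.

```python
from collections import deque
from statistics import mean, stdev

def monitor_vitals(readings, window=10, threshold=2.0):
    """Flag readings that deviate sharply from the rolling baseline.

    readings: iterable of (timestamp, value) pairs, e.g. heart-rate samples.
    Returns a list of flagged (timestamp, value) pairs.
    """
    baseline = deque(maxlen=window)  # the patient's recent history
    alerts = []
    for ts, value in readings:
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            # Flag values far outside the patient's own recent trend.
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                alerts.append((ts, value))
        baseline.append(value)
    return alerts

# A gently varying resting heart rate with one abrupt spike at t=20.
samples = [(t, 60 + (t % 3) * 2) for t in range(20)] + [(20, 110)]
print(monitor_vitals(samples))  # only the spike is flagged: [(20, 110)]
```

The key design point is that the baseline is personal and continuously updated – exactly the kind of signal a once-a-year clinic measurement can never capture.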

Similarly, an IoT-driven revolution in safe transportation is already underway. Within a few years, the number of people killed or injured in transportation-related accidents should drop precipitously. The energy consumed by transport should be dramatically reduced, and the reliability and performance of our transportation should make a leap forward. Self-driving cars are certainly the most visible of these innovations, but the changes that will follow in their wake will be perhaps even more profound. 

There are numerous other examples of whole industries that will be created, re-imagined, or destroyed by IoT technology within the coming decades. And, as a result, life for most humans will, again, change dramatically – transformed by the IoT technologies we are all developing today.

Of course, like most powerful technology-driven transformations, the social impact will be the most difficult to manage. While the creation of revolutionary technology is inevitable, society’s ability to absorb rapid change is limited. As it has been with the information technology revolution of the past three decades, it is likely that entire generations of people will be confused and bewildered by the coming IoT future. People are a paranoid lot, and the kinds of new norms that IoT will bring will challenge and likely overwhelm our current social structure. And, our cultural and political systems will most likely be nowhere near up to the task of managing the kind of fear, distrust, and abuse that the coming wave of IoT innovation will bring in its wake.

Those of us who are designing the IoT must be ever mindful of that most important set of considerations – those that relate to how people will interact with and relate to what we are creating. As we engineer the human out of the system, we need to remember to engineer the system around the human. Often, the technical challenges are the easy part. The most difficult engineering task is to create technology that is approachable, accessible, and understandable – even though its workings may be mind-bogglingly complex. Since IoT is fundamentally designed to serve humanity, we need to be sure that it solves more problems than it causes.


One thought on “The Human Factor”

  1. Your comparison of IoT with all of the previous advances in computing is flawed because faster processors and more memory/storage/bandwidth were all demanded by consumers. IoT, on the other hand, is being pushed by manufacturers as a way to add dubious capabilities to their products in order to charge more for them. While you are dismissive of people who say “I don’t need my toaster to talk to my refrigerator,” that’s the way IoT is being pushed into the marketplace.

    As for the healthcare applications, sensors exist today which can gather large amounts of information continuously. The limitation isn’t the technology or bandwidth, it’s the cost.

    Whether IoT becomes an enabling technology for new and imaginative purposes beyond connected household appliances remains to be seen. Or maybe we will look back on it as this decade’s version of what 3D TV was in the last: A vendor-driven novelty that people thought was cool but nobody really wanted in their home.

