
The Human Factor

IoT Takes People Out of the Loop

The imminent IoT is the final phase of a computing revolution that began decades ago.

First, it was all about processor speed. From our puny 8-bit 1MHz microprocessors to today’s 64-bit multi-core multi-gigahertz behemoths, the pursuit of MIPS and FLOPS led us from machines that could barely run the most rudimentary programs to multiprocessing beasts that can crunch data at unimaginable speeds. Over time, however, processor performance gradually faded from the spotlight. Now, the incremental capability we gain with a doubling of processor speed is comparatively insignificant, and the focus in processor design has switched from raw performance to power efficiency.

Next, the big issue was storage. The newer, faster processors ran more complex software, which demanded more memory and more mass storage. Our applications could gobble up every order-of-magnitude capacity gain our memory technology could provide. The kilo-to-mega-to-giga-to-tera-to-petabyte storage progress meter was pegged to the right for the better part of fifty years. Over time, though, the repeated 1,000x-ing of available storage capacity caught up with our most ambitious demands. The average consumer could afford enough media to store their entire music and video collection on devices they could grab from the grocery checkout line. And, even the biggest of the big-data applications didn’t bump into the server storage ceiling. We had, for most practical purposes, enough storage.

Then, the battle was about bandwidth. Our fast-processing, massive data-storing systems needed to talk to one another, and they needed to do it at eye-watering data rates. Bits blasted through SerDes channels at speeds that would have previously been unimaginable. Packets were routed to their destinations with remarkable speed, accuracy, and efficiency. Our “series of tubes” was replaced by scores of enormous data pipes wrapping around the Earth. Over time, though, even our seemingly insatiable need for bandwidth began to slowly subside.

The burden then moved to the software sector. With all this speed, storage, and bandwidth, the imagination of application developers was challenged anew. The intelligent information infrastructure had the potential to accomplish the seemingly impossible, but the billions of lines of code required to realize that vision would take armies of programmers decades to complete. If hardware progress were suddenly to come to a complete stop, new software technology could keep us innovating for another half century at near-Moore’s-Law rates.

But now, the focus must switch to the interface. For far too long our incredible computing infrastructure has depended on the keyboard, mouse, and screen for the vast majority of its interaction with humans. These decades-old technologies have barely budged. While other aspects of our technological revolution have leapt ahead at a dizzying pace, adding zeroes right and left to key capability metrics, the old keyboard, mouse, and monitor have been plugging along with basically the same capabilities they had in 1984. Sure, we’ve picked up a few more color pixels during the journey, but the fundamental power of our human interface has lagged severely compared with other elements of technology.

This is what the IoT is really about.

IoT is about new ways of getting information into our machines – bypassing the human-at-a-keyboard model. It’s about putting information to work in new and innovative ways that don’t require a video screen to impact our lives. It’s about a complete overhaul of our aging, obsolete, laggardly methods of human-machine interface. The keyboard and mouse must die. 

The IoT allows devices to gather and share data without human intervention. It allows our smart machines to take action directly, without first presenting information to a human on a display. It brings our technological takeover to the final stage, where humans can be unplugged, and where the machines can start actually serving people rather than the other way around. 

This final wave of revolutionary invention will require more interdisciplinary engineering than any of the tasks before it. The full force of our semiconductor, software, and fabrication technologies will need to be paired with leaps in mechanical, sensor, actuator, and chemical technology. Giving machines the ability to truly see, feel, hear, taste, and smell the world is a monumental endeavor, one whose surface we’ve barely scratched. Endowing our creations with senses and abilities far surpassing our own is an ambitious but achievable task. And, in doing so, we will gain the ability to know and do things that were previously simply unknowable and undoable.

Those who dismiss IoT over its first, wobbly steps with arguments like “I don’t need my toaster to talk to my refrigerator” or “IoT is nothing but a security nightmare” are missing the bigger picture. Consider, for example, the potential impact on healthcare. Our bodies are packed with information that could help us maintain our health, but we are currently not instrumented to collect it. Physicians attempt diagnoses with paltry single samples of data. When you go to the clinic, your heart rate, blood pressure, and other vitals are typically measured once – and not necessarily at the most useful time. The “best guess” diagnosis that results is notoriously inaccurate. But, if our vital signs and other important health-related data could be collected and analyzed continuously, trends could be established, responses to various situations could be monitored, and the accuracy and timeliness of diagnoses would skyrocket. And, we would most certainly discover new early warning signs for numerous conditions that we simply did not have the data to diagnose before. The result would be a dramatic improvement in the efficiency of healthcare. Physicians could focus on treating patients whose conditions are known with a high degree of certainty and who were diagnosed at the most advantageous time.

Similarly, an IoT-driven revolution in safe transportation is already underway. Within a few years, the number of people killed or injured in transportation-related accidents should drop precipitously. The energy consumed by transport should be dramatically reduced, and the reliability and performance of our transportation should make a leap forward. Self-driving cars are certainly the most visible of these innovations, but the changes that will follow in their wake will be perhaps even more profound. 

There are numerous other examples of whole industries that will be created, re-imagined, or destroyed by IoT technology within the coming decades. And, as a result, life for most humans will, again, change dramatically – transformed by the IoT technologies we are all developing today.

Of course, like most powerful technology-driven transformations, the social impact will be the most difficult to manage. While the creation of revolutionary technology is inevitable, society’s ability to absorb rapid change is limited. As it has been with the information technology revolution of the past three decades, it is likely that entire generations of people will be confused and bewildered by the coming IoT future. People are a paranoid lot, and the kinds of new norms that IoT will bring will challenge and likely overwhelm our current social structure. And, our cultural and political systems will most likely be nowhere near up to the task of managing the kind of fear, distrust, and abuse that the coming wave of IoT innovation will bring in its wake.

Those of us who are designing the IoT must be ever mindful of that most important set of considerations – those that relate to how people will interact with and relate to what we are creating. As we engineer the human out of the system, we need to remember to engineer the system around the human. Often, the technical challenges are the easy part. The most difficult engineering task is to create technology that is approachable, accessible, and understandable – even though its workings may be mind-bogglingly complex. Since IoT is fundamentally designed to serve humanity, we need to be sure that it solves more problems than it causes.

 

One thought on “The Human Factor”

  1. Your comparison of IoT with all of the previous advances in computing is flawed because faster processors and more memory/storage/bandwidth were all demanded by consumers. IoT, on the other hand, is being pushed by manufacturers as a way to add dubious capabilities to their products in order to charge more for them. While you are dismissive of people who say “I don’t need my toaster to talk to my refrigerator,” that’s the way IoT is being pushed into the marketplace.

    As for the healthcare applications, sensors exist today which can gather large amounts of information continuously. The limitation isn’t the technology or bandwidth, it’s the cost.

    Whether IoT becomes an enabling technology for new and imaginative purposes beyond connected household appliances remains to be seen. Or maybe we will look back on it as this decade’s version of what 3D TV was in the last: A vendor-driven novelty that people thought was cool but nobody really wanted in their home.

