The imminent IoT is the final phase of a computing revolution that began decades ago.
First, it was all about processor speed. From our puny 8-bit 1MHz microprocessors to today’s 64-bit multi-core multi-gigahertz behemoths, the pursuit of MIPS and FLOPS led us from machines that could barely run the most rudimentary programs to multiprocessing beasts that can crunch data at unimaginable speeds. Over time, however, processor performance gradually faded from the spotlight. Now, the incremental capability we gain with a doubling of processor speed is comparatively insignificant, and the focus in processor design has switched from raw performance to power efficiency.
Next, the big issue was storage. The newer, faster processors ran more complex software, which demanded more memory and more mass storage. Our applications could gobble up every order-of-magnitude capacity gain our memory technology could provide. The kilo-to-mega-to-giga-to-tera-to-petabyte storage progress meter was pegged to the right for the better part of fifty years. Over time, though, the repeated 1,000x-ing of available storage capacity caught up with our most ambitious demands. The average consumer could afford enough media to store their entire music and video collection on devices they could grab from the grocery checkout line. And, even the biggest of the big-data applications didn’t bump into the server storage ceiling. We had, for most practical purposes, enough storage.
Then, the battle was about bandwidth. Our fast-processing, massive data-storing systems needed to talk to one another, and they needed to do it at eye-watering data rates. Bits blasted through SerDes channels at speeds that would have previously been unimaginable. Packets were routed to their destinations with remarkable speed, accuracy, and efficiency. Our “series of tubes” was replaced by scores of enormous data pipes wrapping around the Earth. Over time, though, even our seemingly insatiable need for bandwidth began to slowly subside.
The burden then moved to the software sector. With all this speed, storage, and bandwidth, the imagination of application developers was challenged anew. The intelligent information infrastructure had the potential to accomplish the seemingly impossible, but the billions of lines of code required to realize that vision would take armies of programmers decades to complete. If hardware were to suddenly come to a complete stop, new software technology could keep us innovating for another half century at near-Moore’s Law rates.
But now, the focus must switch to the interface. For far too long our incredible computing infrastructure has depended on the keyboard, mouse, and screen for the vast majority of its interaction with humans. These decades-old technologies have barely budged. While other aspects of our technological revolution have leapt ahead at a dizzying pace, adding zeroes right and left to key capability metrics, the old keyboard, mouse, and monitor have been plugging along with basically the same capabilities they had in 1984. Sure, we’ve picked up a few more color pixels during the journey, but the fundamental power of our human interface has lagged severely compared with other elements of technology.
This is what the IoT is really about.
IoT is about new ways of getting information into our machines – bypassing the human-at-a-keyboard model. It’s about putting information to work in new and innovative ways that don’t require a video screen to impact our lives. It’s about a complete overhaul of our aging, obsolete, laggardly methods of human-machine interface. The keyboard and mouse must die.
The IoT allows devices to gather and share data without human intervention. It allows our smart machines to take action directly, without first presenting information to a human on a display. It brings our technological takeover to the final stage, where humans can be unplugged, and where the machines can start actually serving people rather than the other way around.
This final wave of revolutionary invention will require more inter-disciplinary engineering than any of the tasks before it. The full force of our semiconductor, software, and fabrication technologies will need to be paired with leaps in mechanical, sensor, actuator, and chemical technology. Giving machines the ability to truly see, feel, hear, taste and smell the world is a monumental endeavor, the surface of which we’ve barely scratched. Endowing our creations with senses and abilities far surpassing our own is an ambitious but achievable task. But, in doing so, we will gain the ability to know and do things that were previously simply un-knowable and undoable.
Those who dismiss IoT over its first, wobbly steps with arguments like “I don’t need my toaster to talk to my refrigerator” or “IoT is nothing but a security nightmare” are missing the bigger picture. Consider, for example, the potential impact on healthcare. Our bodies are packed with information that could help us maintain our health, but we are currently not instrumented to collect it. Physicians attempt diagnoses with paltry single samples of data. When you go to the clinic, your heart rate, blood pressure, and other vitals are typically measured once – and not at the most useful time. The “best guess” diagnosis that results is notoriously inaccurate. But, if our vital signs and other important health-related data could be collected and analyzed continuously, trends could be established, responses to various situations could be monitored, and the accuracy and timeliness of diagnoses would skyrocket. And, we would most certainly discover new early warning signs for numerous conditions that we simply did not have the data to diagnose before. This would result in a dramatic improvement in the efficiency of healthcare. Physicians could focus on treating patients whose conditions are known with a high degree of certainty and who were diagnosed at the most advantageous time.
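To make the contrast with single-sample diagnosis concrete, here is a minimal sketch of the kind of trend analysis continuous monitoring enables. The function name, window size, threshold, and the heart-rate figures are all hypothetical illustrations, not a clinical algorithm: it simply flags readings that drift well outside a rolling baseline built from recent samples.

```python
from statistics import mean, stdev

def flag_trend(samples, window=7, threshold=2.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from the baseline of the previous `window` samples."""
    alerts = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# A week of stable resting-heart-rate readings (hypothetical),
# followed by a sustained rise that a one-off clinic visit might miss.
resting_hr = [62, 63, 61, 62, 64, 63, 62, 75, 78]
print(flag_trend(resting_hr))  # → [7, 8]
```

A single clinic measurement of 75 bpm would look unremarkable on its own; against a week of continuous baseline data, the same reading stands out immediately – which is exactly the shift in diagnostic power the paragraph above describes.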
Similarly, an IoT-driven revolution in safe transportation is already underway. Within a few years, the number of people killed or injured in transportation-related accidents should drop precipitously. The energy consumed by transport should be dramatically reduced, and the reliability and performance of our transportation systems should leap forward. Self-driving cars are certainly the most visible of these innovations, but the changes that will follow in their wake will be perhaps even more profound.
There are numerous other examples of whole industries that will be created, re-imagined, or destroyed by IoT technology within the coming decades. And, as a result, life for most humans will, again, change dramatically – transformed by the IoT technologies we are all developing today.
Of course, like most powerful technology-driven transformations, the social impact will be the most difficult to manage. While the creation of revolutionary technology is inevitable, society’s ability to absorb rapid change is limited. As it has been with the information technology revolution of the past three decades, it is likely that entire generations of people will be confused and bewildered by the coming IoT future. People are a paranoid lot, and the kinds of new norms that IoT will bring will challenge and likely overwhelm our current social structure. And, our cultural and political systems will most likely be nowhere near up to the task of managing the kind of fear, distrust, and abuse that the coming wave of IoT innovation will bring in its wake.
Those of us who are designing the IoT must be ever mindful of that most important set of considerations – those that relate to how people will interact with and relate to what we are creating. As we engineer the human out of the system, we need to remember to engineer the system around the human. Often, the technical challenges are the easy part. The most difficult engineering task is to create technology that is approachable, accessible, and understandable – even though its workings may be mind-bogglingly complex. Since IoT is fundamentally designed to serve humanity, we need to be sure that it solves more problems than it causes.