
The Three Foes of the IoT

Wafer Fabs Show That the Obstacles Are Money, Know-How and Fear

The Internet of Things will be a defining element of the next big era of computing, but building it is like making a lake with an eye-dropper. That’s because each IoT user has to figure out its return on investment (ROI), design an embedded system geared to it, and overcome its fears about security.

Back in the late 1980s, Mark Weiser of Xerox PARC defined this next phase as one in which computers are deeply embedded in everyday objects, creating smart environments. At the time, he called it ubiquitous computing, but now it’s commonly called the IoT.

A broad array of relatively small, low-power microcontrollers and wireless networks are enabling the IoT. The competition in this space is so varied and fierce that even giants such as Intel and Samsung abandoned latecomer efforts to enter it with their Quark and Artik chips. Established vendors from Renesas in Japan to STMicroelectronics in Europe and a dozen others more than fill the market’s needs.

Similarly, there’s no shortage of wireless networks to link MCUs. They span the spectrum from narrowband RFID and Sigfox to Wi-Fi 6 and 5G cellular, with everything from Bluetooth to Zigbee living somewhere in between.

A whole new family of wireless alternatives has emerged in the last few years to rival cellular. LoRa and Narrowband-IoT, a version of LTE, lead a pack of so-called low-power, wide-area (LPWA) networks.

So, the good news is that the IoT has plenty of available ingredients. And there are plenty of chefs, since the IoT is also an outgrowth of embedded systems design, a practice to which many engineers have dedicated their entire careers–spawning books, magazines and events.

Have It Your Way

The trouble is that IoT requires a special business and technology recipe for every company that embraces it. What works for a retail chain will make no sense for a factory, and what works for a factory will not serve a farm. Even smart cities cannot be created from a cookie cutter—what serves the needs of Albany won’t solve the problems in Albuquerque.

Each user first has to find its ROI for building a system. Each must ask what data it needs to capture in order to find insights that could lead to productivity gains, saving costs, or generating new business models.

This can be a real head scratcher. It requires a well thought-out report capable of convincing a board of directors to make a big upfront investment on a promise of cost savings or a new and untested revenue stream.
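That board-level case usually boils down to simple arithmetic: how long before cumulative savings cover the upfront cost? Here is a minimal sketch of that payback calculation; the function name and every dollar figure are illustrative assumptions, not numbers from the article.

```python
# Hypothetical back-of-the-envelope payback model for an IoT project.
# All figures below are illustrative assumptions.

def payback_years(upfront_cost, annual_savings, annual_operating_cost):
    """Years until cumulative net savings cover the upfront investment."""
    net_annual = annual_savings - annual_operating_cost
    if net_annual <= 0:
        return float("inf")  # the project never pays for itself
    return upfront_cost / net_annual

# Example: $2M to instrument a plant, $650k/yr saved, $150k/yr to run it.
years = payback_years(2_000_000, 650_000, 150_000)
print(f"Payback in {years:.1f} years")  # Payback in 4.0 years
```

A real pitch would also discount future savings and model the untested revenue stream’s risk, which is exactly what makes the report such a head scratcher.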

Once you get over that big hurdle, someone has to design the right embedded system to capture and analyze that data. The user’s IT department—if it has one—will not have embedded design engineers.

Much of today’s embedded-design expertise is buried inside the field-application engineering teams in a few dozen semiconductor companies. An industrial shift is needed to create a new kind of IoT vendor, one that has the business acumen of an Accenture or Deloitte combined with the technical chops of an NXP or Microchip.

The lack of such companies is one reason why the IoT is evolving at a glacial pace. But it is not the only reason.

Fear is a Powerful Foe

To get a better understanding of the state of the IoT, it is instructive to look inside semiconductor fabs. This is ground zero of the tech industry, the billion-dollar factories that crank out the thousand-dollar wafers that contain the chips that are in everything that plugs into the wall or has a battery.

In sophistication, fab operators stand at the pinnacle of industrial automation. The reason is simple—money.

A fab’s profits are directly tied to its yields—how many working chips it can cut out of each wafer. Yields depend on a thousand fine parameters—the level of light exposure in a lithography machine or the tolerances at which chemicals are deposited on or etched off a wafer’s surface.

Fab operators make a living measuring and fine tuning all these details across dozens of machines running hundreds of wafers an hour. They know something about the productivity born from gathering and analyzing data across multiple machines and responding in real time to the insights they sift out of it. Given these dynamics, wafer fabs stand at the forefront of the IoT.
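The fine-tuning described above is classically done with statistical process control: learn a parameter’s normal band from baseline readings, then flag anything that drifts outside it. Here is a minimal sketch of that idea; the parameter name, readings, and 3-sigma limits are illustrative assumptions, not details from any fab.

```python
# A minimal statistical-process-control (SPC) sketch of the kind of
# real-time parameter monitoring described above. All readings and
# names are illustrative assumptions.

from statistics import mean, stdev

def control_limits(baseline, k=3.0):
    """Compute classic k-sigma control limits from baseline readings."""
    mu, sigma = mean(baseline), stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def flag_drift(readings, low, high):
    """Return indices of readings that fall outside the control limits."""
    return [i for i, x in enumerate(readings) if not (low <= x <= high)]

# Hypothetical exposure-dose readings (mJ/cm^2) from a lithography tool.
baseline = [30.1, 29.9, 30.0, 30.2, 29.8, 30.0, 30.1, 29.9]
low, high = control_limits(baseline)

# New readings stream in; one has drifted out of spec.
new = [30.0, 30.1, 31.5, 29.9]
print(flag_drift(new, low, high))  # [2]
```

Multiply this loop by a thousand parameters across dozens of machines and you have the data-gathering problem the IoT is supposed to solve.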

I asked Tom Salmon about the top issue for IoT in wafer fabs. Salmon leads standards efforts for the SEMI trade group, which represents companies that make the capital equipment fabs use. His groups include about 25 fab owners running about 135 fabs, along with more than 60 of their suppliers.

Here’s what Salmon sees as the top IoT issue in his groups:

“Until recently it was a lack of understanding or consensus on the ROI. Companies were interested in IoT, but it hasn’t always been easy to get managers to pay for it,” he said.

“But, more recently, IP and cybersecurity have trumped ROI as the main issue that’s holding a lot of people back–issues with viruses getting in, cyber-attacks, or IP leakage–that’s holding people back more than ROI now,” he said.

“Locked Down Tight”

I tested Salmon’s observation as I walked the show floor at Semicon West earlier this year. Two engineers who work on optical inspection systems for Nikon validated it. They would like to offer remote access to their gear to provide the kind of diagnostics and updates common for PCs and smartphones, but the fabs “are locked down tight,” they said.

No data goes in, no data goes out. No Internet of Things. An internal network of things is OK, but the data on it will not leave the facility, or at least not the company. It will not be shared with other companies—sometimes not even with a fab’s top vendors. As for a cloud service—fuggidaboudit!

So, it seems that, at the bleeding edge of the IoT, we have a security problem. It’s the same security problem faced by Equifax and Target and every other company that has had its systems hacked over the last several years. It’s the same set of measures and countermeasures that have been the story of security since Day One.

Security is the next big challenge of the IoT, but it is not the only challenge, even in wafer fabs. A Rockwell Automation executive who specializes in the semiconductor sector said that there are still lively debates among capital equipment makers about which version of industrial Ethernet their systems should support. It’s the kind of small-minded internecine battle that vendors traditionally fight in an effort to get some edge.

I remain among the many who believe that the Internet of Things is the next big phase of computing. It will be huge. But it will take a loooooong time to get there.

