The Industrial Internet Reference Architecture

The IIC Tries to Think of Everything

Deep in the heart of Portland or Austin or Minneapolis or any of dozens of towns across the nation and the world, Makers are busily building components for the Internet of Things (IoT). Long dismissed as hobbyists unworthy of sales attention, many of these skilled designers are where the IoT rubber hits the road.

Way on the other end of abstraction, folks in the Industrial Internet Consortium (IIC) have been expending much effort trying to lay out an IoT framework that might promote interoperability and numerous other desirable traits. Many of these abstract characteristics may, at some point, be implemented in a concrete fashion by one of those engineers.

To me, these feel like opposite ends of a spectrum: high ideals for how things should be vs. practical considerations for how things have to be, this time anyway, in order to get a product finished. To be sure, not all IoT design is done by Makers; there’s much happening in large companies too. But the little guy captures, for me, that practical, entrepreneurial garage spirit.

On the abstract side, however, this summer saw the release of an Industrial Internet Reference Architecture. Now, don’t panic… this isn’t a hard standard specifying how to build an IoT. To me, it seems like two things:

  • A generalized model of the industrial IoT (IIoT), along with language to be used for promoting a cogent conversation with consistent terminology, and
  • A comprehensive list of things to think about when designing an industrial internet system (IIS).

And yes, based on its name, this is about the IIoT, not the Consumer IoT that might come to your home at some point. Both IoTs have a current tendency towards walled-gardenness, with Apple and Google vying to control our homes (along with other proprietary schemes) and with current machine-to-machine (M2M) installations dominated by specific companies.

Of course, this is about much more than just interop (which we’ll return to shortly). It’s a comprehensive hundred-page document that starts by laying out concerns from four different viewpoints and then addresses a series of issues that are nominally orthogonal to the viewpoints: Safety, Security, Resilience, various notions and levels of interoperability, Connectivity, Data Management, Analytics, Control, and, for advanced students, notions of dynamic and automated interop.

Four Ways to See the IIoT

The four viewpoints constitute a set of generalized requirements based on the needs of different stakeholders at different levels. They’re four levels of abstraction, each of which has a series of concerns.

  • The business viewpoint deals with your typical high-level business concerns. Why would someone buy this? Will we make money? Might we get sued for something? What will be our long-term cost?
  • The usage viewpoint deals more with “use cases” and their ilk. Who is going to use this? Do they all have the same or different roles? How will they interact with the system? What will the workflow look like?
  • The functional viewpoint starts to take the needs articulated in the first two viewpoints and cast them into an overall functional architecture. What high-level pieces will we need, and what will they do? What will talk to what? The document identifies five different functional domains, delving into further details for each one:
    • Control, at the lowest level, interacting with the physical systems;
    • Operations, at the mid-level;
    • Information, at the mid-level;
    • Application, at the mid-level; and
    • Business, at the top level.
  • The implementation viewpoint gets down to brass tacks, dealing with the specific system architecture, the protocols and components to be used, and how it all comes together. They identify five different architecture patterns, again deconstructing each one further:
    • Three-tier architecture pattern, with – surprise! – three tiers: the Edge tier (which includes edge gateways), the Platform tier (which includes various data services and operational considerations; these may be local or remote), and the Enterprise tier (which covers domain applications, business rules, and such). A minimal sketch of this pattern appears after this list.
    • Gateway-Mediated Edge Connectivity and Management architecture pattern, which models a local network connected to a wide-area network by a gateway.
    • Edge-to-Cloud architecture pattern, which is like the previous one, except that individual edge nodes can access wide-area connectivity directly (rather than having to go through a gateway).
    • Multi-Tier Data Storage architecture pattern, which addresses the various levels of data storage that might be implemented to balance capacity, performance, and locality.
    • Distributed Analytics architecture pattern, which is all about where data travels, where it’s reduced, and where it’s analyzed.
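
To make the three-tier pattern slightly more concrete, here’s a minimal Python sketch of how data might flow from edge to platform to enterprise. The class names and methods (EdgeGateway, PlatformTier, EnterpriseTier, and so on) are my own inventions for illustration only; the reference architecture describes roles and data flows, not classes or APIs.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SensorReading:
    device_id: str
    value: float


@dataclass
class EdgeGateway:
    """Edge tier: aggregates local devices and forwards upstream."""
    buffered: List[SensorReading] = field(default_factory=list)

    def collect(self, reading: SensorReading) -> None:
        self.buffered.append(reading)

    def flush(self, platform: "PlatformTier") -> None:
        platform.ingest(self.buffered)
        self.buffered = []


@dataclass
class PlatformTier:
    """Platform tier: data services, which may be local or remote."""
    store: Dict[str, List[float]] = field(default_factory=dict)

    def ingest(self, readings: List[SensorReading]) -> None:
        for r in readings:
            self.store.setdefault(r.device_id, []).append(r.value)

    def average(self, device_id: str) -> float:
        values = self.store.get(device_id, [])
        return sum(values) / len(values) if values else 0.0


class EnterpriseTier:
    """Enterprise tier: domain applications and business rules."""

    def flag_out_of_spec(self, platform: PlatformTier, device_id: str,
                         limit: float) -> bool:
        return platform.average(device_id) > limit


# Edge devices report through the gateway; the platform stores and reduces;
# the enterprise tier applies a business rule to the reduced data.
gateway, platform, enterprise = EdgeGateway(), PlatformTier(), EnterpriseTier()
gateway.collect(SensorReading("pump-1", 7.2))
gateway.collect(SensorReading("pump-1", 9.8))
gateway.flush(platform)
print(enterprise.flag_out_of_spec(platform, "pump-1", limit=8.0))  # True: average 8.5 > 8.0
```

The gateway-mediated pattern would look much the same, with the gateway as the only path from the local network to the wide area; in the edge-to-cloud pattern, individual nodes would reach the platform directly.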

Security, interop, deep learning, and more

Security permeates all viewpoints and all tiers. While it has a section all its own, each of the viewpoints includes security issues specific to that viewpoint. No other issue gets sprinkled as liberally throughout the document as security. In particular, they make a distinction between mandatory security considerations (arising out of regulations and other non-negotiable considerations) and other security issues; the latter may be subject to cost tradeoffs, while the former won’t be.
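
That split lends itself to a very simple structure when requirements get written down. The sketch below is my own, not anything the document prescribes: it just tags each security requirement as mandatory or negotiable so that only the latter flow into a cost tradeoff.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SecurityRequirement:
    name: str
    mandatory: bool        # e.g., driven by regulation -- not negotiable
    estimated_cost: float  # only matters for the negotiable items


requirements: List[SecurityRequirement] = [
    SecurityRequirement("encrypt safety-related telemetry in transit", True, 12_000.0),
    SecurityRequirement("hardware root of trust on every edge node", False, 45_000.0),
    SecurityRequirement("anomaly detection on operator logins", False, 8_000.0),
]

# Mandatory items go straight into the plan; the rest enter a cost tradeoff.
baseline = [r.name for r in requirements if r.mandatory]
tradeoff_candidates = sorted(
    (r for r in requirements if not r.mandatory), key=lambda r: r.estimated_cost
)
print(baseline)
print([r.name for r in tradeoff_candidates])
```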

The discussion of interoperability is particularly illustrative, and, interestingly, many of the concepts run parallel to a prior piece I did on interop – including the references to natural language syntax, semantics, and pragmatics. While I came to a somewhat dreary conclusion about the likelihood of interop becoming a thing in the face of companies preferring to keep control, this document doesn’t veer in that direction; it merely lays out the notions and their implications.

It defines three levels of interop that roughly correspond to those linguistic notions. The first is “integrability,” which they define as “the capability to communicate with each other based on compatible means of signaling and protocols.” Next is “interoperability” (which I’ve been using more generically, for lack of a better generic term); their definition is “the capability to exchange information with each other based on common conceptual models and interpretation of information in context.” Highest up is “composability,” which they define as “the capability of a component to interact with any other component in a recombinant fashion to satisfy requirements based on the expectation of the behaviors of the interacting parties.”

That all sounds relatively impenetrable, so they have a couple of examples to better illustrate the distinction. One involves an airplane cockpit: integrability refers to whether or not a pilot can physically fit into the cockpit and see the various instruments as well as out the window. Interoperability means that the pilot also understands what each of the instruments means. Composability means that this understanding is part of a broader grasp of a particular type of plane. How you fly a Cessna will be different from flying a 747, even if they share some of the same dials. It’s this bigger picture that makes something composable.
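
In code terms, one (entirely hypothetical) way to picture the three levels: integrability is a shared wire format, interoperability is a shared schema whose fields mean the same thing to both sides, and composability is a behavioral contract that lets one component stand in for another. None of the names below come from the document.

```python
import json
from abc import ABC, abstractmethod


# Integrability: both sides can exchange bits at all -- a compatible wire format.
def send(payload: dict) -> bytes:
    return json.dumps(payload).encode("utf-8")


def receive(raw: bytes) -> dict:
    return json.loads(raw.decode("utf-8"))


# Interoperability: both sides agree on what the fields *mean* -- a shared
# conceptual model, here reduced to field names and units.
TEMPERATURE_SCHEMA = {"sensor_id": str, "celsius": float}


def validate(message: dict) -> bool:
    return all(isinstance(message.get(k), t) for k, t in TEMPERATURE_SCHEMA.items())


# Composability: a component can be swapped in because its *behavior* is
# predictable -- callers rely on the declared contract, not the implementation.
class TemperatureSource(ABC):
    @abstractmethod
    def read_celsius(self) -> float:
        """Return the current temperature reading."""


class ThermocoupleA(TemperatureSource):
    def read_celsius(self) -> float:
        return 21.5  # stand-in for a real driver


msg = receive(send({"sensor_id": "tc-7", "celsius": ThermocoupleA().read_celsius()}))
print(validate(msg))  # True: the message is both exchanged and understood
```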

Later on they talk about how these notions can be made dynamic, allowing a system to learn about its own configuration and circumstances and to adapt to changes as they occur.

Of course, they don’t say how to build these in; rather, they’re raising them as issues to take into account during architectural planning.

Another interesting distinction they make gets to the familiar area of models. As programmers, we’re used to creating a model of some entity or system and then coding accordingly. The programmer understands the model, but the system itself doesn’t.

This is contrasted with ideas more closely tied to deep learning, where the system itself builds a model based on its experience. A machine-learned model may be completely different from a programmer-created model – and yet, because it’s based on the reality of experience (assuming it has been exposed to a suitably wide range of situations), it may be more realistic than the more idealized model a human is likely to build. On the flip side, the machine model may be completely impenetrable to a human, unlike the programmer’s model.
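
A toy contrast, in the same hypothetical spirit: the first rule below encodes the programmer’s idea of a fault, while the second derives its threshold from labeled data. The “learning” is deliberately trivial; the point is only that the two models can disagree, and that the data-derived one answers to experience rather than to anyone’s mental picture.

```python
from statistics import mean
from typing import List, Tuple


# Programmer's model: an engineer decides, up front, that anything above 80 C
# is a fault.
def hand_coded_is_fault(temperature_c: float) -> bool:
    return temperature_c > 80.0


# Data-derived model: the threshold comes from labeled observations, so it
# reflects whatever the equipment actually did -- for better or worse.
def learn_threshold(samples: List[Tuple[float, bool]]) -> float:
    faults = [t for t, is_fault in samples if is_fault]
    normal = [t for t, is_fault in samples if not is_fault]
    return (mean(faults) + mean(normal)) / 2.0


observations = [(60.0, False), (66.0, False), (82.0, True), (90.0, True)]
learned = learn_threshold(observations)  # 74.5 for this data

print(hand_coded_is_fault(78.0))  # False: the engineer's rule says 78 is fine
print(78.0 > learned)             # True: the data puts the line at 74.5
```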

These few examples don’t begin to penetrate all of the other ideas and notions spelled out in the document. As you can see, they’re pretty high level, and you might wonder what relevance they might have to Fred in the Shed with the soldering iron. Clearly, those of you building this stuff will mostly be represented through the implementation viewpoint. Hopefully you’re also able to interact as the functional – and even the usage – viewpoint is refined into specific implementation requirements.

At the very least, it’s a useful read – I’d be willing to wager that most of you can’t get through it without encountering at least one new thing that you hadn’t thought of before.

More info:

The Industrial Internet Reference Architecture (need to enter personal info to download)
