
Combatting Complexity

From Intelligent Design to Evolution in Engineering

I once had a colleague who defined a “system” as “the thing one level of complexity higher than you can understand.”

I always thought there was a bit of profound insight in that definition.

We humans have a finite capacity for understanding, and that capacity does not benefit from the bounty of Moore’s Law. Our brains don’t magically think twice as fast every two years, or have the capacity to consider exponentially more factors each decade. Our limited brains also don’t compensate well for the layer upon layer of encapsulated engineering accomplishments of our predecessors. Even comparatively simple examples of modern technology can be deceptively complex – perhaps more complex than a single human brain can fully comprehend. 

The historical engineering defenses against our limited human mental capacity involve two key mechanisms – collaboration and encapsulation. If we are designing some kind of electronic gadget and we can’t understand both the digital and the analog portions of our design, for example, we might take the digital side ourselves – and team up with an analog expert who can focus on the analog part. Somebody else takes over the mechanical design for the enclosure and such, and we may even reluctantly acknowledge that we’ll need other experts for things like manufacturing, testing, and (shudder) marketing and sales. Even though nobody on the team understands all the details of the whole product, as a group we are able to conceive, create, manufacture, distribute, market, sell, and support the thing. That’s the beauty and power of collaboration.

We also rely heavily on encapsulation. Somewhere, back in the old days, somebody figured out how to make logic gates out of MOSFETs. Somebody else figured out how to make processors, memory, and peripherals out of those gates. All those old folks quite helpfully encapsulated all that knowledge into handy microcontroller chips that we can simply drop into our design – even if we don’t have the foggiest clue how MOSFETs work. In practice, just about every design we create is somehow a hierarchical pyramid of encapsulated technology – most of which we don’t understand. We move ahead by riding the upper edge of these layers of abstraction, assembling the currently encapsulated components into higher-level systems – which may themselves be encapsulated as building blocks of future projects. We build systems out of components we mostly don’t understand using tools we mostly don’t understand. Yet, somehow, we are able to understand the result.
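
To make that layering concrete, here is a toy Python sketch (mine, not the author’s) of the same pyramid: a NAND primitive standing in for the transistor level, gates built from NANDs, an adder built from gates, and a word-wide adder built from adders. Each layer can be used without ever looking inside the one below it.

```python
# Toy illustration of engineering by encapsulation: each layer is built
# from the one below, and each can be used without knowing its internals.

def nand(a: int, b: int) -> int:
    """Pretend this is the lowest layer -- somewhere below here live MOSFETs."""
    return 0 if (a and b) else 1

# Layer 2: basic gates, built only from NAND. Users of these never see nand().
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# Layer 3: a full adder, built only from gates. Users never see the gates.
def full_adder(a, b, carry_in):
    s1 = xor_(a, b)
    total = xor_(s1, carry_in)
    carry_out = or_(and_(a, b), and_(s1, carry_in))
    return total, carry_out

# Layer 4: a ripple-carry adder, built only from full adders.
def add_words(x, y):
    """Add two equal-length bit vectors, least-significant bit first."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 3 + 5 = 8, computed entirely out of NANDs we never have to think about.
print(add_words([1, 1, 0, 0], [1, 0, 1, 0]))  # -> [0, 0, 0, 1, 0] (8, LSB first)
```

Nothing at the top layer even hints that NANDs exist at the bottom – which is exactly the point.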

These two coping mechanisms – encapsulation and collaboration – are imperfect at best. Frequently we see breakdowns where projects go astray because team members don’t share the same top-level vision for the thing they’re working together to create. Or, we see cases where the encapsulation of underlying technology is less than perfect, and systems fail at a higher level because of lower-level interactions between “encapsulated” technologies.

The idealistic side of our brains wishes that this weren’t true. It is easy to envision a world where every part of our system is designed from the ground up specifically for our purposes and to our specifications. In safety-critical systems, we often see a noble effort in that direction. In reality, however, with the complexity of modern technology, such a scenario is becoming increasingly unrealistic. It is practically impossible to traverse the entire scope of a complex system and determine that each part properly serves the goals of the whole. A high-level software application may use a low-level driver that was compiled with an open-source compiler. That version of the compiler may have targeted an older version of a processor architecture where the bit-order of a word was later changed. That whole application may be part of a small subsystem which itself is activated only in the rare case where… You get the picture. Complex systems created by humans are, by their nature, incomprehensible to humans.
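
The bit-order scenario above is hard to reproduce in a few lines, but a byte-order cousin of it makes the point concretely. This is an illustrative Python sketch, not from the article: the same bytes handed up from a lower layer decode to different values depending on an endianness assumption buried somewhere below.

```python
import struct

# The same four bytes, produced by some "encapsulated" lower layer...
raw = bytes([0x01, 0x00, 0x00, 0x00])

# ...mean entirely different things depending on a byte-order assumption
# buried in whoever compiled or configured that layer.
little = struct.unpack("<I", raw)[0]  # little-endian reading
big    = struct.unpack(">I", raw)[0]  # big-endian reading

print(little)  # 1
print(big)     # 16777216

# A higher layer that checks for a flag value of 1 works on one target and
# silently fails on the other -- nothing in the high-level code hints at it.
```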

This fallacy that we can somehow completely understand the systems we design has been an underlying assumption in the creation of our engineering design flow itself. The traditional “waterfall” development process assumes that we can completely understand our system before we build it. It asks us to create a detailed specification, a project plan that will let us design something that meets that specification, and a testing plan that will allow us to evaluate whether or not we have succeeded. It is a methodology based on the belief that we can intelligently design a complex system in a vacuum, and then have that system perform well in the real world. With the reality of today’s complex hardware/software/connected systems, it is also a load of rubbish.

I’m going to point out the unpopular (in this forum) truth that software is much more complex than hardware. Because of this truth, software engineers hit the complexity wall earlier, realizing that the “waterfall” development concept was inadequate and unworkable for modern systems. The entire “agile software” movement is a reflection of that realization – that the only practical way to make robust software systems is to allow them to evolve in the real world, adjusting and adapting to problems as they are encountered. It is not enough to simply apply a bunch of really intelligent and extremely conscientious software developers to a problem. Whatever thing they design in a clean room will not work correctly. Their inability to predict all of the situations that may arise in real-world use and to visualize how their system should respond to those events will inevitably lead to problems.

There are those who argue that hardware has now hit that same complexity wall, and that the only workable solution is something like agile development for hardware as well. Whether you believe that or not, it should be obvious that our systems (of which software is an increasingly large component) have now crossed that line. Furthermore, now that we have systems that are parts of vast, interconnected, super-systems (like the IoT), it should be doubly clear that our technology is now fully in an evolutionary model, and the days of sending a team of bright engineers away to a nice quiet space to create something new on a blank sheet of paper (sorry, I meant “on a nice, responsive tablet”) are over.

Today, our systems are truly at a level of complexity higher than anyone can actually fully understand. Because of that, they will increasingly “evolve” rather than being “designed”. Perhaps, we will reach a point where system design is more akin to breeding livestock – choosing subsystems and components that seem to behave the way we want, connecting them together, and watching to see what happens.
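
As a rough illustration of what that “breeding” might look like in code, here is a minimal evolutionary loop in Python. Everything in it is invented for illustration: the three-parameter “configuration,” the fitness function standing in for watching the real system behave, and all of the population numbers.

```python
import random

random.seed(1)  # reproducible illustration

# Hypothetical "subsystem configuration": three tuning parameters we don't
# fully understand, scored by how the assembled system behaves when run.
def fitness(candidate):
    # Stand-in for "connect it up and watch what happens": here, behavior
    # happens to be best when the parameters approach (0.2, 0.5, 0.9).
    target = (0.2, 0.5, 0.9)
    return -sum((c - t) ** 2 for c, t in zip(candidate, target))

def mutate(candidate, scale=0.1):
    # Small random tweaks, clamped to the valid range [0, 1].
    return tuple(min(1.0, max(0.0, c + random.gauss(0, scale))) for c in candidate)

# Start from a herd of random configurations and breed the best performers.
population = [tuple(random.random() for _ in range(3)) for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]                     # selection
    offspring = [mutate(random.choice(survivors))  # variation
                 for _ in range(15)]
    population = survivors + offspring

best = max(population, key=fitness)
print(tuple(round(p, 2) for p in best))  # settles near (0.2, 0.5, 0.9)
```

Nobody “designed” the winning configuration; it was selected for, which is precisely the shift the paragraph above describes.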

In any case, the deterministic, control-oriented version of engineering we all learned in school will necessarily change to something else. Don’t panic, though. If engineering education prepared us for anything, it is the reality of constant change. That’s what keeps us employed. But, beyond employment, that’s what makes engineering a constant source of inspiration and excitement.

3 thoughts on “Combatting Complexity”

  1. Even though Moore’s Law makes our systems twice as complex every two years, we still use the same old brains to design them. At some point, that means our systems are more complex than we can understand. (And, for some of us, that happens sooner than for others…)

    What happens then?

  2. In a war we have generals, with a chain of command and significant specialization, all the way down to foot soldiers and cooks. None of them is expected to know every job, and do it well, from top to bottom.

    It’s foolish to expect members of engineering teams to have the same grand breadth.

    We just need to accept that the “architect” has the vision to direct the product’s evolution in a responsible way that will meet the market goals. With that in place, other people can build teams to execute the implementation of each subsystem they are tasked with, using specialists to deal with the nail-biting complexity of each narrow part of the problem.

  3. I’ll provisionally agree that software is more complex than hardware, and has hit the “complexity wall” sooner.
    This, however, has more to do with economic realities related to software production than any real necessity for software to be that complex.
    WordStar was distributed for years on a 140K floppy. Only slightly more functionality appears on 6 CDs or 2 DVDs today.
    Only with the advent of manycore and multicore CPU architectures has there been a real reason for greater complexity. The “code bloat” created by a software industry that is responsive to perceived market demand is the real reason for software complexity.
    “Push it out the door and let the next generation of hardware solve the speed issues of a slow piece of software” has been the battle cry for a few decades.
    And the experts who studied massive parallelization strategies immediately pronounced the same design goals that were required to get work done when machine resources were the constraining factors.
    Back to the basics.
    Go figure.

