
Treading on Thin Air

Engineering the Second Generation

Somewhere, in a nondescript cubicle in building number umpteen of a multi-billion-dollar multinational multi-technology conglomerate, an engineer sits at a lab bench staring at an eye diagram on a six-figure scope. It’s the same every day. Any time he is not in a meeting or writing a status report, he sits in this lab and eats and breathes signal integrity. He has almost no concept of the end product that will incorporate his work. His entire universe is jitter, pre-emphasis, equalization, noise, amplitudes, and bit-error rates. For him, time stands still – in the picoseconds.

Across town, another engineer is also worried about signal integrity. He is including some DDR4 memory on his FPGA board, and he needs to be sure his design will work. He will worry about signal integrity for less than one day. He has prototyped his project on an FPGA development board with a couple of add-on modules plugged in with expansion connectors. His system is using an application he wrote on top of a Linux TCP/IP stack, a six-axis gyro/accelerometer, a GPS, a camera, and some brushless motors – all coordinated by an FPGA with an embedded processor. He will check his DDR4 interface with an EDA tool that simplifies the operation almost to a pushbutton. This engineer is doing just about every aspect of the design, from the FPGA work to the board layout to the applications software. He is intensely aware of the requirements of his end product.

The second engineer relies almost completely on engineers like the first. He is truly working in a “jack of all trades, master of none” scenario. In order for him to be successful, an entire legion of engineers like the first must do their jobs, boil the essence down to a few variables or API calls, and package up the whole thing into a tight bundle of encapsulated engineering expertise. It is this kind of collaboration with those we’ve never met that empowers us to create the remarkable systems we are seeing today. 

The technology ecosystem truly has a food chain, beginning with bare-metal technologies and working its way – layer by layer – up the pyramid to the system-level design. As the structure gets taller, the lower layers become ever broader, requiring a greater number of increasingly specialized engineering disciplines to keep the whole thing supported. The number of distinct electronic engineering specialties today is greater than it has ever been before, and there is no sign of the trend letting up.

Our postgraduate engineering education is designed to focus us on particular narrow areas of technology. This is an ever-moving target. Universities fund departments to put together programs in what are perceived as the “hot” topics of the day, and, at any given time, the most popular tracks are those tackling today’s most exciting and interesting problems. Few electronic engineering programs divert students into mature, stable areas of development, even if there is considerable work remaining to be done there. 

The result is an age gradient in the technology chain, with the lower levels occupied by older engineers who have spent long careers refining their solutions at what have become supporting levels of technology. The higher levels trend toward younger engineers who went to school when various higher-level challenges were in fashion. Almost like sedimentary rock, the layers of technology are attached to the generations of engineers who produced and refined them. 

This presents an obvious and disturbing problem. What happens when the bottom of the pyramid goes into retirement? Does our whole technological structure run the risk of collapsing onto itself because nobody is left who truly understands the deepest and most fundamental levels of technology?

Fortunately, in semiconductors, the base level has been re-engineered every two years for the past five decades, which has kept a good deal of fresh blood circulating to the bottom of the pyramid. The engineers designing with 14nm FinFETs could not rest on the laurels of those who understood one-micron planar CMOS. The bottom-to-top churn of Moore’s Law has had the side effect of constantly renewing our attention on every level of the food chain, preventing knowledge decay at the lower levels. 

But what about software? Does our arsenal of algorithms become so robust and canned that we lose touch with the fundamentals? Will there be a time in EDA, for example, when nobody understands the innermost workings of some of the most sophisticated tools? Is there a risk that the brilliant and subtle optimizations and refinements of engineers of the past will be lost as art, existing only deep in the lines of the legacy code they left behind? 

We are just now reaching the time when the first generation of engineers who created the Moore’s Law revolution has retired. We have not yet seen the age that the second and later generations will experience, in which our base technologies were all optimized by engineering ghosts from the past.

As our systems become more complex, constructed by plugging-and-playing scores of independent components whose entangled fractal complexity is completely understood by no one, will we reach a point where our systems take on organic characteristics? When nobody fully understands a system, will its behavior become less deterministic, and will something more akin to a personality emerge? And does this mean there will be a time when we diagnose and treat our designs in a manner that more closely resembles today’s medical practice? Will concrete engineering decisions of the distant past become the subject of future research discoveries?

It is entirely possible that engineering archeology will one day be an important discipline, and those who can reverse-engineer the lost miracles of the past will be just as highly regarded as those who create new solutions for the future. 

As I stare at the latest FPGA development board and realize that I cannot identify all of the components mounted on it, I can feel my contact with the bare-metal layer begin to erode. Even though that board will enable me to create the most sophisticated designs of my career, that increased sophistication comes with a loss of innocence, a frightening failure of gravity that lifts our technological feet out of contact with terra firma. Taking the plunge of trust and depending on technology that we do not fully understand is more challenging for an engineer than for most other personalities. Our identity is often wrapped up in our command of our domain from almost a molecular level, and feeling that positive contact slipping away can be terrifying.
