
Treading on Thin Air

Engineering the Second Generation

Somewhere, in a nondescript cubicle in building number umpteen of a multi-billion-dollar multinational multi-technology conglomerate, an engineer sits at a lab bench staring at an eye diagram on a six-figure scope. It’s the same every day. Any time he is not in a meeting or writing a status report, he sits in this lab and eats and breathes signal integrity. He has almost no concept of the end product that will incorporate his work. His entire universe is jitter, pre-emphasis, equalization, noise, amplitudes, and bit-error rates. For him, time stands still – in the picoseconds.

Across town, another engineer is also worried about signal integrity. He is including some DDR4 memory on his FPGA board, and he needs to be sure his design will work. He will worry about signal integrity for less than one day. He has prototyped his project on an FPGA development board with a couple of add-on modules plugged in with expansion connectors. His system is using an application he wrote on top of a Linux TCP/IP stack, a six-axis gyro/accelerometer, a GPS, a camera, and some brushless motors – all coordinated by an FPGA with an embedded processor. He will check his DDR4 interface with an EDA tool that simplifies the operation almost to a pushbutton. This engineer is doing just about every aspect of the design, from the FPGA work to the board layout to the applications software. He is intensely aware of the requirements of his end product.

The second engineer relies almost completely on engineers like the first. He is truly working in a “jack of all trades, master of none” scenario. In order for him to be successful, an entire legion of engineers like the first must do their jobs, boil the essence down to a few variables or API calls, and package up the whole thing into a tight bundle of encapsulated engineering expertise. It is this kind of collaboration with those we’ve never met that empowers us to create the remarkable systems we are seeing today. 
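
To make the idea concrete, imagine – purely as a hypothetical sketch, with invented names – what the first engineer's career of signal-integrity expertise might look like once it has been boiled down to an API the second engineer can call. Everything below the function signature (the channel models, the equalization sweeps, the pass/fail margins) is the specialist's encapsulated knowledge; the generalist sees only a verdict:

```python
# Hypothetical sketch only: si_toolkit-style names like
# check_ddr4_interface() are invented for illustration and do not
# correspond to any real EDA tool's API.

from dataclasses import dataclass


@dataclass
class MemoryInterfaceReport:
    """What the system designer actually sees: a verdict, not the physics."""
    passed: bool
    worst_eye_height_mv: float
    worst_eye_width_ps: float
    notes: list


def check_ddr4_interface(board_file: str,
                         data_rate_mtps: int = 2400,
                         vref_dq: float = 0.60) -> MemoryInterfaceReport:
    """One 'pushbutton' call hiding years of specialist work.

    In a real tool, this would extract the routed topology from the
    board database, run channel simulations across corners, sweep
    equalization settings, and compare eye margins against JEDEC
    limits. Here we fake the result to show the shape of the
    abstraction, not the physics behind it.
    """
    # ... thousands of engineer-hours of modeling live behind this line ...
    eye_height_mv, eye_width_ps = 142.0, 310.0  # placeholder results
    passed = eye_height_mv > 115.0 and eye_width_ps > 250.0
    return MemoryInterfaceReport(
        passed=passed,
        worst_eye_height_mv=eye_height_mv,
        worst_eye_width_ps=eye_width_ps,
        notes=[f"Simulated at {data_rate_mtps} MT/s, VrefDQ = {vref_dq:.2f}"],
    )


# The second engineer's entire encounter with signal integrity:
report = check_ddr4_interface("my_fpga_board.brd")
print("DDR4 interface OK" if report.passed else "DDR4 interface FAILED")
```

The point of the sketch is the asymmetry: a one-line call for the generalist, and a career's worth of modeling for the specialist who made that one line trustworthy.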

The technology ecosystem truly has a food chain, beginning with bare-metal technologies and working its way – layer by layer – up the pyramid to the system-level design. As the structure gets taller, the lower layers become ever broader, requiring a greater number of increasingly specialized engineering disciplines to keep the whole thing supported. The number of distinct electronic engineering specialties today is greater than it has ever been before, and there is no sign of the trend letting up.

Our postgraduate engineering education is designed to focus us on particular narrow areas of technology. This is an ever-moving target. Universities fund departments to put together programs in what are perceived as the “hot” topics of the day, and, at any given time, the most popular tracks are those tackling today’s most exciting and interesting problems. Few electronic engineering programs divert students into mature, stable areas of development, even if there is considerable work remaining to be done there. 

The result is an age gradient in the technology chain, with the lower levels occupied by older engineers who have spent long careers refining their solutions at what have become supporting levels of technology. The higher levels trend toward younger engineers who went to school when various higher-level challenges were in fashion. Almost like sedimentary rock, the layers of technology are attached to the generations of engineers who produced and refined them. 

This presents an obvious and disturbing problem. What happens when the bottom of the pyramid goes into retirement? Does our whole technological structure run the risk of collapsing in on itself because nobody is left who truly understands the deepest and most fundamental levels of technology?

Fortunately, in semiconductors, the base level has been re-engineered every two years for the past five decades, which has kept a good deal of fresh blood circulating to the bottom of the pyramid. The engineers designing with 14nm FinFETs could not rest on the laurels of those who understood one-micron planar CMOS. The bottom-to-top churn of Moore’s Law has had the side effect of constantly renewing our attention on every level of the food chain, preventing knowledge decay at the lower levels. 

But what about software? Does our arsenal of algorithms become so robust and canned that we lose touch with the fundamentals? Will there be a time in EDA, for example, when nobody understands the innermost workings of some of the most sophisticated tools? Is there a risk that the brilliant and subtle optimizations and refinements of engineers of the past will be lost as art, existing only deep in the lines of the legacy code they left behind? 

We are just now reaching the time when the first generation of engineers who created the Moore’s Law revolution has retired. We have not yet seen the age that the second and later generations will experience, in which our base technologies were all optimized by engineering ghosts from the past.

As our systems become more complex, constructed by plugging-and-playing scores of independent components whose entangled fractal complexity is completely understood by no one, will we reach a point where our systems take on organic characteristics? When nobody fully understands a system, will its behavior become less deterministic, and will something more akin to a personality emerge? And does this mean there will be a time when we diagnose and treat our designs in something that more closely resembles today’s medical practice? Will concrete engineering decisions of the distant past become the subject of future research discoveries?

It is entirely possible that engineering archeology will one day be an important discipline, and those who can reverse-engineer the lost miracles of the past will be just as highly regarded as those who create new solutions for the future. 

As I stare at the latest FPGA development board and realize that I cannot identify all of the components mounted on it, I can feel my contact with the bare-metal layer begin to erode. Even though that board will enable me to create the most sophisticated designs of my career, the increased sophistication comes with a loss of innocence, a frightening failure of gravity that lifts our technological feet out of contact with terra firma. Taking the plunge of trust and depending on technology that we do not fully understand is more challenging for an engineer than for most other personalities. Our identity is often wrapped up in commanding our domain down to almost the molecular level, and feeling that positive contact slip away can be terrifying.
