
Treading on Thin Air

Engineering the Second Generation

Somewhere, in a nondescript cubicle in building number umpteen of a multi-billion-dollar multinational multi-technology conglomerate, an engineer sits at a lab bench staring at an eye diagram on a six-figure scope. It’s the same every day. Any time he is not in a meeting or writing a status report, he sits in this lab, eating and breathing signal integrity. He has almost no concept of the end product that will incorporate his work. His entire universe is jitter, pre-emphasis, equalization, noise, amplitudes, and bit-error rates. For him, time stands still: in the picoseconds.

Across town, another engineer is also worried about signal integrity. He is putting some DDR4 memory on his FPGA board, and he needs to be sure his design will work. He will worry about signal integrity for less than one day. He has prototyped his project on an FPGA development board with a couple of add-on modules plugged into its expansion connectors. His system runs an application he wrote on top of a Linux TCP/IP stack and ties together a six-axis gyro/accelerometer, a GPS receiver, a camera, and some brushless motors, all coordinated by an FPGA with an embedded processor. He will check his DDR4 interface with an EDA tool that reduces the operation almost to a pushbutton. This engineer is doing just about every aspect of the design, from the FPGA work to the board layout to the application software. He is intensely aware of the requirements of his end product.

The second engineer relies almost completely on engineers like the first. He is truly working in a “jack of all trades, master of none” scenario. In order for him to be successful, an entire legion of engineers like the first must do their jobs, boil the essence down to a few variables or API calls, and package up the whole thing into a tight bundle of encapsulated engineering expertise. It is this kind of collaboration with those we’ve never met that empowers us to create the remarkable systems we are seeing today. 

The technology ecosystem truly has a food chain, beginning with bare-metal technologies and working its way – layer by layer – up the pyramid to the system-level design. As the structure gets taller, the lower layers become ever broader, requiring a greater number of increasingly specialized engineering disciplines to keep the whole thing supported. The number of distinct electronic engineering specialties today is greater than it has ever been before, and there is no sign of the trend letting up.

Our postgraduate engineering education is designed to focus us on particular narrow areas of technology. This is an ever-moving target. Universities fund departments to put together programs in what are perceived as the “hot” topics of the day, and, at any given time, the most popular tracks are those tackling today’s most exciting and interesting problems. Few electronic engineering programs divert students into mature, stable areas of development, even if there is considerable work remaining to be done there. 

The result is an age gradient in the technology chain, with the lower levels occupied by older engineers who have spent long careers refining their solutions at what have become supporting levels of technology. The higher levels trend toward younger engineers who went to school when various higher-level challenges were in fashion. Almost like sedimentary rock, the layers of technology are attached to the generations of engineers who produced and refined them. 

This presents an obvious and disturbing problem. What happens when the bottom of the pyramid goes into retirement? Does our whole technological structure run the risk of collapsing onto itself because nobody is left who truly understands the deepest and most fundamental levels of technology?

Fortunately, in semiconductors, the base level has been re-engineered every two years for the past five decades, which has kept a good deal of fresh blood circulating to the bottom of the pyramid. The engineers designing with 14nm FinFETs could not rest on the laurels of those who understood one-micron planar CMOS. The bottom-to-top churn of Moore’s Law has had the side effect of constantly renewing our attention on every level of the food chain, preventing knowledge decay at the lower levels. 

But what about software? Does our arsenal of algorithms become so robust and canned that we lose touch with the fundamentals? Will there be a time in EDA, for example, when nobody understands the innermost workings of some of the most sophisticated tools? Is there a risk that the brilliant and subtle optimizations and refinements of engineers of the past will be lost as art, existing only deep in the lines of the legacy code they left behind? 

We are just now reaching the point where the first generation of engineers who created the Moore’s Law revolution has retired. We have not yet seen the age that the second and later generations will experience, in which our base technologies were all optimized by engineering ghosts from the past.

As our systems become more complex, constructed by plugging-and-playing scores of independent components whose entangled, fractal complexity no one completely understands, will we reach a point where our systems take on organic characteristics? When nobody fully understands a system, will its behavior become less deterministic, and will something more akin to a personality emerge? Does this mean there will come a time when we diagnose and treat our designs in a way that more closely resembles today’s medical practice? Will concrete engineering decisions of the distant past become the subject of future research discoveries?

It is entirely possible that engineering archeology will one day be an important discipline, and those who can reverse-engineer the lost miracles of the past will be just as highly regarded as those who create new solutions for the future. 

As I stare at the latest FPGA development board and realize that I cannot identify all of the components mounted on it, I can feel my contact with the bare-metal layer begin to erode. Even though that board will enable me to create the most sophisticated designs of my career, that increased sophistication comes with a loss of innocence, a frightening failure of gravity that lifts our technological feet out of contact with terra firma. Taking the plunge of trust and depending on technology that we do not fully understand is more challenging for an engineer than for most other personalities. Our identity is often wrapped up in commanding our domain almost down to the molecular level, and feeling that positive contact slipping away can be terrifying.
