feature article

In the Shadow of Moore’s Law

Lithography Pulls Progress in its Wake

For the last half century, a large segment of the world’s electronic engineers has worked in the insane vortex of Moore’s Law. Entire engineering careers from college through retirement have been spent with the only constant being exponential change. Whatever you learned in engineering school was obsolete before your first job promotion. Whatever you managed to design one year, you had to double to stay competitive two years later. Just about every tool, technique, or technology you invested time in learning had lost relevance within a couple of years, and your professional life was a nonstop exercise in re-education.

But Moore’s Law was all about lithography – trying to cram as many transistors as possible onto a slice of silicon. Applications in the direct path of digital design were dragged by their hearts through the Moore’s Law vortex. But many peripheral technologies didn’t experience Moore’s Law directly. Instead, they evolved much more slowly, rocking in the wake of the ongoing exponential change in digital electronics. That evolution is interesting to study because now, with Moore’s Law winding down, previously pedestrian areas of engineering are poised to take center stage, picking up the slack where Moore left off.

Consider, for example, high-current components. The physics governing how much copper it takes to carry a given current hasn’t changed over the years, so advanced lithography has little value there. But progress in fields like materials science has given us new transistors with significant new advantages. Great strides have been made in wide-bandgap semiconductors, giving us better power transistors and new ways of making LEDs. Combined with today’s digital elements, they enable completely new classes of applications.
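
That physics is easy to see in the rules of thumb board designers still use. As a rough illustrative sketch (in Python, not anything taken from the article), the widely cited IPC-2221 formula ties a copper trace’s current-carrying capacity to its cross-sectional area and allowed temperature rise, constants that no amount of lithography will move. The function names, copper weight, and temperature rise below are assumptions chosen for illustration.

```python
# Minimal sketch of the IPC-2221 ampacity estimate for an external PCB trace:
#   I = k * dT**0.44 * A**0.725
# where I is current in amps, dT is temperature rise in deg C, A is copper
# cross-section in square mils, and k is about 0.048 for external layers.
# Everything below (names, copper weight, temperature rise) is illustrative.

def copper_area_sq_mils(current_a, temp_rise_c=10.0, k=0.048):
    """Copper cross-section (square mils) needed to carry current_a amps."""
    return (current_a / (k * temp_rise_c ** 0.44)) ** (1 / 0.725)

def trace_width_mils(current_a, copper_oz=1.0, temp_rise_c=10.0):
    """Trace width in mils, assuming 1 oz copper is about 1.378 mils thick."""
    thickness_mils = 1.378 * copper_oz
    return copper_area_sq_mils(current_a, temp_rise_c) / thickness_mils

if __name__ == "__main__":
    for amps in (1, 5, 10, 20):
        print(f"{amps:>2} A -> ~{trace_width_mils(amps):.0f} mil trace (1 oz copper)")
```

Run as-is, this lands near 12 mils for 1 A and close to 300 mils for 10 A at a 10 °C rise, in line with the usual published trace-width charts, and those numbers are the same today as they were decades ago.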

The analog and RF worlds haven’t ridden the crest of Moore’s Law either, but their domains have been profoundly affected by the technology that has. Analog functions have been wholesale converted to digital over the past few decades, and the once-proud art of analog design has, in many cases, been reduced to building the best possible ADCs and DACs so that the digital whiz kids could take over the heavy lifting. Advances in MIMO, beamforming, and the DSP technologies that drive them have revolutionized antenna design and performance, and those changes can be traced directly to the exponential improvement in digital semiconductor lithography.
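
To see why beamforming is at heart a DSP problem, and therefore a direct beneficiary of cheap digital logic, consider the classic delay-and-sum weights for a uniform linear array. The sketch below is illustrative Python/NumPy only; the element count, spacing, carrier frequency, and steering angle are assumptions, not anything taken from the article.

```python
import numpy as np

# Minimal delay-and-sum beamforming sketch for a uniform linear array (ULA).
# Element count, spacing, carrier frequency, and steering angle are all
# illustrative assumptions.

c = 3e8                # propagation speed, m/s
fc = 2.4e9             # carrier frequency, Hz
lam = c / fc           # wavelength, m
n_elems = 8            # number of antenna elements
d = lam / 2            # half-wavelength element spacing
steer_deg = 30.0       # desired beam direction, degrees from broadside

def steering_vector(theta_deg, n=n_elems, spacing=d, wavelength=lam):
    """Per-element phase terms for a plane wave arriving from theta_deg."""
    k = np.arange(n)
    phase = 2 * np.pi * spacing * k * np.sin(np.radians(theta_deg)) / wavelength
    return np.exp(-1j * phase)

# Beamforming weights: conjugate-match the steering direction, normalize gain.
w = steering_vector(steer_deg) / n_elems

# Sweep the arrival angle; the array response should peak near 30 degrees.
angles = np.linspace(-90, 90, 361)
response = np.array([abs(w.conj() @ steering_vector(a)) for a in angles])
print(f"Peak response at {angles[np.argmax(response)]:.1f} degrees")
```

The math itself is a handful of lines; the reason it belongs in this story is that modern radios compute and apply weights like these per antenna, per subcarrier, and per user in real time, a workload that only cheap digital logic makes practical.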

MEMS may seem like an area that would benefit directly from the miracles of lithography advancement, but, in practice, there is little reward for packing more sensors onto a chunk of silicon. While it’s easy to come up with ways to use a million (or a billion) transistors instead of a thousand, it’s hard to think of an application that needs more than a dozen or so gyros or accelerometers. So, while sensors play a critical role in enabling the IoT (which is built heavily on the bounty of Moore’s Law), those technologies are not going to see the kind of sustained, years-to-decades exponential improvement we have seen in semiconductors.

Software engineering has an almost inverse relationship with Moore’s Law. Advances in semiconductors have delivered vast improvements in computational performance, memory and storage capacity, and network bandwidth. Our human ability to conceptualize and write efficient, bug-free code has not come remotely close to keeping up. In order to develop the unbelievably complex applications that take advantage of today’s computing technology, software engineering has had to adopt a “quantity over quality” approach. Sure, you can write much faster and more efficient code in assembly than you can with a modern high-level language compiler, but the ~1000x productivity advantage you get from using modern software development, debug, and verification tools comes in more than a little bit handy in getting your job finished before the next ice age.

The software problem is an enormous one. There is no miraculous lithography engine making software engineers twice as productive or intelligent every two years so that they can continue to develop ever more complex applications in the same amount of time. The industry has adapted by making modest improvements in software development tools and languages, adding vastly more manpower to the software engineering profession, and building up an impressive stock of reusable software IP. This has permitted an increasingly large army of software engineers to develop the most complex pieces of engineering ever conceived by humans. You may think the latest Xeon processor is a complicated piece of silicon, but it’s trivial compared with the complexity of the programs it will execute. An iPhone may seem like a daunting chunk of hardware technology, but it is dwarfed by the complexity of the software and app ecosystem it drives.

In an interesting feedback loop, one segment of the software engineering industry – electronic design automation (EDA) – has created extraordinarily complex software specifically for the purpose of driving Moore’s Law forward. EDA has constantly pushed the boundaries of software engineering and computational power in order to facilitate the design of each new process shrink. The frantic two-year cycle of Moore’s Law meant that EDA had to re-invent itself almost entirely every two years. The placement and routing tools that could deal with 10,000 gates back in the early 1980s relied on n-squared algorithms that could run for days to finish one design. Throw the many millions of gates from one of today’s chips at those methods and algorithms, and no number of modern servers would ever finish the problem.
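
The arithmetic behind that claim is worth a quick sketch. With an n-squared algorithm, the work grows with the square of the design size, so scaling from ten thousand gates to tens of millions multiplies the runtime by a factor in the millions. The figures in the illustrative Python below are assumptions chosen only to show the scaling, not numbers from the article.

```python
# Back-of-the-envelope scaling for an O(n^2) place-and-route algorithm.
# The baseline runtime and the modern gate count are illustrative assumptions;
# only the ratio matters.

baseline_gates = 10_000          # early-1980s design size cited above
baseline_days = 2.0              # assumed runtime for that design, in days
modern_gates = 50_000_000        # assumed gate count for a modern chip

scale = (modern_gates / baseline_gates) ** 2   # quadratic growth in work
runtime_years = baseline_days * scale / 365

print(f"Work grows by a factor of {scale:.1e}")
print(f"Naive runtime: about {runtime_years:.1e} years")
```

A couple of days becomes something on the order of a hundred thousand years, which is why the algorithms themselves, not just the servers, had to be reinvented with every generation.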

Because of this constant crushing pressure to double the performance, capacity, and capability of EDA tools every two years, the EDA industry had little time to fuss over niceties like polished user interfaces or glitzy cross-tool integration. And, as the challenges of chip design became more formidable, previously separate silos such as synthesis and place-and-route had to be combined. Now, with Moore’s Law slowing down, the pressure on EDA to run at full speed simply to remain in the same place may ease, and we seem to be entering an era in which EDA tools can catch up to the rest of the software industry in their underlying architecture and usability. If EDA succeeds, it may make chip design itself more accessible, less risky, and less expensive, which could pave the way for a much wider range of new applications to take advantage of what Moore’s Law has brought us. It will be interesting to watch.

