
Soft Everything

Designing Complexity

We talk a lot in these pages about programmable this and programmable that. In our efforts to make slivers of silicon do our increasingly complex bidding, we need some way to communicate our intent to our chips and to incite them to behave accordingly. In our happy little engineering silos, of course, we separate all these types of “programming” out into various disciplines – firmware, middleware, OS, application software, drivers, FPGA fabric, analog configurations, transceiver settings… The list goes on and on.  

All of this “programming” ultimately determines the behavior of our device. We create tribes of “engineers like us” who specialize in one flavor of programming or another, adorn ourselves with tools and tricks and folklore that enable us to get some reasonable results in our chosen area, and mostly ignore the neighboring disciplines.

As Moore’s Law marches forward, the hardware portion of what we’re programming is increasingly often finished before we ever get our chip, already integrated onto a single device along with the hardware that supports all the other types of programming. The latest SoC devices include diverse arrays of hardware – processors, peripherals, memory, analog circuits, power and system monitoring hardware, specialized accelerators – just about anything we might want to use in our application is thrown into the SoC. With silicon real estate becoming ever less expensive while the cost of designing a new chip continues to rise exponentially, the trend among chip makers is to build everything you might possibly need into one device and let you sort out which parts will be doing what by programming that device later.

As FPGAs have become increasingly sophisticated and capable, they have also come to look more and more like other types of SoCs. If you take away the FPGA fabric, some modern FPGAs look pretty much like any other SoC you could buy. You’ve got multiple high-performance processors, various types of memory, versatile bus structures, configurable I/O, a variety of peripherals, accelerators for specialized tasks like audio and video, analog blocks for functions like control, timers and controllers and built-in power supplies… Layering the LUTs into the mix just adds another dimension of programmability.

If we follow these trend lines out toward the horizon (and we don’t have to follow them very far these days), we see that most electronic system design becomes programming. Since most of our system is on one chip, we don’t spend much time choosing all the various chips that go on our board anymore. Similarly, with only one SoC doing most of the work, we don’t have to spend as much time on board design and layout (although the board design we now have to do has become quite a bit more complicated owing to huge pin counts and crazy data speeds).

Still, the engineers making a system from a current or future SoC will do most of their design work essentially programming their SoC using a massive array of software tools. This tool collection has been gathering for a while, of course, in all of our tribal engineering silos. For embedded software and firmware development, we have a well-worn set of standard tools – most of them open source – that get us from main() to working, tested code. For FPGA design, the various vendors and their third-party partners have generously supplied us with a mostly-suitable set of design tools that can at least get the LEDs on our development boards blinking in the order we desire. As our needs have broadened, so has our tool chest. Today’s SoCs always come with at least some set of supporting tools, and often those tools are quite sophisticated.

The FPGA companies seem to be most on top of this all-tools-for-all-things trend. Altera, Lattice, and Xilinx all supply tool suites with impressive lists of capabilities – embedded software development and debug, DSP algorithm design and acceleration, signal integrity assurance and monitoring, power analysis and optimization, system-level design, IP integration… Oh yeah – and even FPGA design. These increasingly powerful and integrated tool suites cover the duties of most of our design teams and tend to consolidate our skills. It’s easier to dip over from hardware into the software side just a bit if you and the software folks are using the same tools.

IP is also becoming increasingly integrated. In the simple olden days, IP blocks arrived as a few thousand lines of VHDL or Verilog that you could synthesize into your design. If you were lucky, a helpful IP supplier might even include some vectors or test programs to help you figure out if the thing was doing what it was supposed to do in your design. Today, IP isn’t really worth its salt if it doesn’t include the whole software stack, appropriate drivers, metadata, and even a sample reference design or application that lets you bring up your new widget without writing a single original line of code.

Of course, all of these highly integrated point-and-click, drag-and-drop, plug-and-play IP and tool solutions, combined with mega-integrated SoC platforms, create a potential problem for engineering as a career. With design abstraction raised this far, a lot of powerful systems that would previously have required a high level of expertise to engineer can now be slapped together in an afternoon by any halfway-competent technician using a not-too-expensive tool suite. You don’t need a top-flight engineer to drag and drop a few IP blocks and push the “GO” button.

So, where does our engineering expertise go in this brave new world where everything is programmable and tools shoulder an increasing share of the engineering workload? Well, we will always need engineers doing the hard-core, low-level stuff. If the dudes who understood PCI had stopped working on it back when we had 33 MHz PCI IP for our FPGAs, we’d be in a pretty sad situation today. As progress marches forward, we’ll need increasingly specialized people designing the IP components that everybody else uses to realize their system designs.

We’ll also need more and better tool developers. The EDA industry is a mess today, and the revolution in SoCs has only put commercial EDA in a more precarious position. If it’s not careful, the EDA industry may find that it has created the tools that engineered its own demise. Regardless of that, however, there will be a need for insanely complicated design tools and for engineers with the skills to create them. Whether those tools come from independent EDA companies or from in-house development teams at SoC companies, they’ll require a sustained, monumental engineering effort to keep up with the demands of both their customers and their silicon platforms.

Between those root-level engineers and high-level system engineering, there will likely be a gap caused by programmability. Simple systems will be designable by almost anyone. Complex systems will be designed by people with extreme domain-specific knowledge. Between those two, well, hand a design kit to your middle-school kids and let them go to town.

In short, engineers will have to adjust their contributions to the times and to the technology if they want to continue their careers. But then again, that’s the way engineering has always worked.

One thought on “Soft Everything”

  1. A good and valid article. We can now go more directly from the design to a working system. FPGAs become the universal whiteboard for design.

    But beware of backward-looking assumptions. You do not need as many engineers to design last year’s or the last decade’s systems. If you have absorbed all the IP associated with CPUs, GPUs, and I/O devices, you are not done yet, even though you can make a Really Big Computing System.

    Look *outside* the chip for new problems for the firmware (i.e., software and IP) to solve, as in system design. For example, robotics and mechatronics involve sensors and actuators. Sensors come in all sizes and types, and actuators tend to be big, power-driven devices that have a large set of their own (expensive/valuable) peculiarities.

    And these sensors and actuators are not necessarily enumerable. For example, a camera generates pixels. These pixels must be interpreted into images or recognized as 3D objects. Some (R&D) software will be required. And actuators come in all flavors, from motors and gears to hydraulics and piezoelectric drives.

    More importantly, the sensors and actuators are expensive and central to many applications. They are the “dog,” while the electronics is the “tail” – as in, the tail does not wag the dog.

    So, yes, the engineering design jobs of last year will decline in number, while the jobs of this and next year will grow.
