
Soft Everything

Designing Complexity

We talk a lot in these pages about programmable this and programmable that. In our efforts to make slivers of silicon do our increasingly complex bidding, we need some way to communicate our intent to our chips and to incite them to behave accordingly. In our happy little engineering silos, of course, we separate all these types of “programming” out into various disciplines – firmware, middleware, OS, application software, drivers, FPGA fabric, analog configurations, transceiver settings… The list goes on and on.  

All of this “programming” ultimately determines the behavior of our device. We create tribes of “engineers like us” who specialize in one flavor of programming or another, adorn ourselves with tools and tricks and folklore that enable us to get some reasonable results in our chosen area, and mostly ignore the neighboring disciplines.

As Moore’s Law marches forward, the hardware portion of what we’re programming is increasingly often completed before we get our chip, and it has been integrated onto a single device with the hardware to support all the other types of programming. The latest SoC devices include diverse arrays of hardware – processors, peripherals, memory, analog circuits, power and system monitoring hardware, specialized accelerators — just about anything that we might want to use in our application is thrown into the SoC. As silicon real estate has become increasingly inexpensive and the cost of creating a chip has continued to rise exponentially, the trend among chip makers is to build everything you might possibly need into one device and let you sort out which parts will be doing what by programming that device later. 

As FPGAs have become increasingly sophisticated and capable, they have also come to look more and more like other types of SoCs. If you take away the FPGA fabric, some modern FPGAs look pretty much like any other SoC you could buy. You’ve got multiple high-performance processors, various types of memory, versatile bus structures, configurable IO, a variety of peripherals, accelerators for specialized tasks like audio and video, analog for functions like control, timers and controllers and built-in power supplies… Layering the LUTs into the mix just adds another dimension of programmability.

If we follow these trend lines out toward the horizon (and we don’t have to follow them very far these days), we see that most of electronic system design is programming. Since most of our system is on one chip, we don’t spend much time choosing all the various chips that go on our board anymore. Similarly, with only one SoC doing most of the work, we don’t have to spend as much time on board design and layout (although the board design we now have to do has become quite a bit more complicated owing to huge pin counts and crazy data speeds).

Still, the engineers making a system from a current or future SoC will do most of their design work essentially programming their SoC using a massive array of software tools. This tool collection has been gathering for a while, of course, in all of our tribal engineering silos. For embedded software and firmware development, we have a well-worn set of standard tools – most of them open source – that get us from main() to working, tested code. For FPGA design, the various vendors and their third-party partners have generously supplied us with a mostly-suitable set of design tools that can at least get the LEDs on our development boards blinking in the order we desire. As our needs have broadened, so has our tool chest. Today’s SoCs always come with at least some set of supporting tools, and often those tools are quite sophisticated.

The FPGA companies seem to be most on top of this all-tools-for-all-things trend. Altera, Lattice, and Xilinx all supply tool suites with impressive lists of capabilities – embedded software development and debug, DSP algorithm design and acceleration, signal integrity assurance and monitoring, power analysis and optimization, system-level design, IP integration… Oh yeah – and even FPGA design. These increasingly powerful and integrated tool suites cover the duties of most of our design teams and tend to consolidate our skills. It’s easier to dip over from hardware into the software side just a bit if you and the software folks are using the same tools.

IP is also becoming increasingly integrated. In the simple olden days, IP blocks arrived as a few thousand lines of VHDL or Verilog that you could synthesize into your design. If you were lucky, a helpful IP supplier might even include some vectors or test programs to help you figure out if the thing was doing what it was supposed to do in your design. Today, IP isn’t really worth its salt if it doesn’t include the whole software stack, appropriate drivers, metadata, and even a sample reference design or application that lets you bring up your new widget without writing a single original line of code.

Of course, all of these highly-integrated point-and-click, drag-and-drop, plug-and-play, IP and tool solutions combined with mega-integrated SoC platforms create a potential problem for engineering as a career. With all of this raising of design abstraction, a lot of powerful systems that would previously have required a high level of expertise to engineer can now be slapped together in an afternoon by any halfway-competent technician using a not-too-expensive tool suite. You don’t need a top-flight engineer to drag and drop a few IP blocks and push the “GO” button. 

So, where does our engineering expertise go in this brave new world where everything is programmable and tools shoulder an increasing share of the engineering workload? Well, we will always need engineers doing the hard-core, low-level stuff. If the dudes who understood PCI had stopped back when we had 33MHz PCI IP for our FPGAs, we’d be in a pretty sad situation today. As progress marches forward, we’ll need increasingly specialized people designing the IP components that everybody else uses to realize their system designs.

We’ll also need more and better tool developers. The EDA industry is a mess today, and the revolution in SoCs has only put commercial EDA in a more precarious position. If they’re not careful, the EDA companies may find that they have created the tools that engineered their own demise. Regardless, there will be a need for insanely complicated design tools and for engineers with the skills to create them. Whether those tools come from independent EDA companies or from in-house development teams at SoC companies, they’ll require a sustained, monumental engineering effort to keep up with the demands of both their customers and their silicon platforms.

Between those root-level engineers and high-level system engineering, there will likely be a gap caused by programmability. Simple systems will be designable by almost anyone. Complex systems will be designed by people with extreme domain-specific knowledge. Between those two, well, hand a design kit to your middle-school kids and let them go to town.

In short, engineers will have to adjust their contributions to the times and to the technology if they want to continue their careers. But then again, that’s the way engineering has always worked.

One thought on “Soft Everything”

  1. A good and valid article. We can now go more directly from the design to a working system. FPGAs become the universal whiteboard for design.

    But beware of backward-looking assumptions. You do not need as many engineers to design last year’s or the last decade’s systems. Even if you have absorbed all the IP associated with CPUs, GPUs, and I/O devices, you are not done yet, even though you can make a Really Big Computing System.

    Look *outside* the chip for new problems for the firmware (i.e., software and IP) to solve, as in system design. For example, robotics and mechatronics involve sensors and actuators. Sensors come in all sizes and types, and actuators tend to be big, power-driven devices with a large set of their own (expensive/valuable) peculiarities.

    And these sensors and actuators are not necessarily enumerable. For example, a camera generates pixels. These pixels must be interpreted into images or recognized as 3D objects. Some (R&D) software will be required. And actuators come in all flavors, from motors and gears to hydraulics and piezoelectric drives.

    More importantly, the sensors and actuators are expensive and central to many applications. They are the “dog,” while the electronics is the “tail” – as in, the tail does not wag the dog.

    So, yes, the engineering design jobs of last year will decline in number, while the jobs of this and next year will grow.
