
Designing for the Long Haul

Considering EDA Tool Longevity

There is a story in the American South about two “country boys” who walk into a lumber yard and ask to buy some wood. The lumber yard attendant asks what size they want. They look confused. He gives them some choices: “two-by-fours, two-by-sixes, four-by-fours…”

They step aside for a minute to confer with each other. “We’ll take two-by-fours.” 

The lumber yard attendant takes them to the aisle with the two-by-fours. “How long do you need them?” 

The two step aside to confer again, whispering frantically to each other.

“A long time – we’re building a house!”

When we choose EDA tools, we typically don’t think much about longevity. Our design horizon often extends no further than beta test or production, and it’s hard to think much about time beyond that – except to visualize ourselves on a tropical beach with an umbrella-bearing drink watching the sun set – both literally and on our project-related stress.

If you design custom ICs, your planning process is probably more robust, but your design horizon is still relatively near. In all likelihood, your chip will be obsolete in a couple of years, and nobody will care what design tools you used for your original project. If you created IP of value, it will be easily archived and theoretically re-usable. (You DID follow all of the relevant industry standards and guidelines to make your IP re-usable, didn’t you? Yeah, we knew you did. Mum’s the word.)

If you design something with larger geometries, however, like PCBs, cable harnesses, or more elaborate systems, your design isn’t captured in something as simple as a few lines of VHDL or Verilog. Again, for most applications, this isn’t a problem. If your design needs to be changed in a few years, you will still have most of the same tools, and if the EDA vendor was responsible in their subsequent releases, you’ll still be able to read those old data files in the new version and make the required changes (even though looking at some of the things you did design-wise “way back then” may make you cringe a little. Easy does it! Fix ONLY the things that you need to change. Don’t go on a cleaning spree. Move away from the mouse…).

What happens, though, when a LOT of time passes between the time a design is first created and the time we need to get in and do something important to update it? For those of you who spend your careers in fast-paced markets like consumer electronics, you may find such a scenario difficult to imagine. But, in many heavy-iron industries like automotive, there are lengthy standards in place to deal with just such eventualities. What happens when you need to make a hardware change in a design that was deployed thirty-odd years ago in a system that’s still in the field? Have you still got a working copy of your GenRad HiLo simulator? Do you still remember GHDL? Does that Apollo workstation in the basement still fire up? How about that CALMA station?

Uh Oh.

In this fast-paced world where we sometimes find ourselves racing the clock to get our designs shipped before they become obsolete, it’s easy to forget about the long haul. It’s hard to justify spending valuable engineering time thinking about creating a design time capsule for a future engineer (presently age two) that may be working in the company that bought the company that acquired the division that was sold to the big conglomerate that will buy the start-up you’re working for next year. THAT dude is just on his own, right?

For EDA companies, the challenge is similar. The focus is on making money this year, or even this quarter. Customers buy tools based on what capabilities they bring to the table right now and how they handle the massive design challenges we face with today’s frenetic pace of technology evolution. You don’t often get a customer coming to your EDA sales team asking, “How will I fire up this tool in thirty years if I need to make a design change?” 

“You mean on your head-mounted quantum computer with neural user interface?”

EDA companies do a great job handling this problem for the reasonable time frames we’ve all dealt with in our careers. They make sure old design data is generally forward compatible and that standards exist for some manner of interoperability in cases where an EDA company might go out of business unexpectedly. Daisy and Valid customer support aren’t answering too many calls these days.

However, we are only now reaching the time when this becomes a real issue for the industry. Before the early 1980s, most design was done on paper, and those rolls of blueprints seem to be pretty forward-compatible. In the 1980s, however, a large percentage of the world’s engineering moved over to electronic design automation systems, and we are just now reaching the era where that will start to matter. Engineering careers span something in the range of thirty years, so the people who did those designs are beginning to exit the industry, and second and third generations of EDA-using engineers are already coming up in their places. The tools those early EDA users adopted are in a woeful state today: most ran on proprietary computing systems, long-obsolete operating systems, and proprietary design data formats. If you wanted to go back and fix that one flaw in the flight control system your company designed in 1984, you’d be in a bit of a pickle.

Well, almost.

There is a bit of a bright side to this scenario. Some of the longer-lived EDA companies have noticed this problem and actually found a way to turn it into a benefit – for themselves AND their long-term customers. Mentor Graphics, for example, has a team dedicated exclusively to supporting customers on long-obsolete tools. For a price, they will go on a software archaeological dig, unearth that copy of XYZ-whatever tool source code from 1985, port it to a modern operating system and computing platform, and help you get your grandpa’s design up and running so you can replace that one TTL part that isn’t available anymore. In Mentor’s case, they don’t even restrict themselves to Mentor tools. They have a repertoire of tool technology from long-dead startups that might be just what you need to save your bacon someday.

Services like this are highly profitable for the EDA companies that provide them and potential lifesavers for the companies that use them. While they address only a fraction of the design archival issues that we all face, they do bridge the gap to an era where EDA itself was new and naïve and mass-produced computing hardware couldn’t handle the needs of electronic designers. However, we can all make great strides for the future by simply thinking about the longevity of our designs, anticipating what kind of changes might be required in the normal evolution of our product, and leaving at least some breadcrumbs for future forensic designers to follow in re-tracing our footsteps.
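To make that concrete, here is a minimal sketch, in Python, of what those breadcrumbs might look like: a “design time capsule” manifest recording the tools and environment a design was built with, plus a checksummed inventory of the design files. The directory name, tool names, and manifest layout here are hypothetical illustrations, not any industry-standard archival format.

```python
# Minimal sketch of a "design time capsule" manifest generator.
# The directory name, tool names, and manifest layout are hypothetical
# illustrations, not an industry-standard archival format.
import hashlib
import json
import os
import time

def sha256_of(path):
    """Checksum each file so a future engineer can detect bit rot."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(design_dir, tools):
    """Record what the design was built with and what the files were."""
    files = {}
    for root, _, names in os.walk(design_dir):
        for name in names:
            path = os.path.join(root, name)
            files[os.path.relpath(path, design_dir)] = sha256_of(path)
    return {
        "archived_on": time.strftime("%Y-%m-%d"),
        "tools": tools,  # which simulator, schematic editor, OS, etc.
        "files": files,
    }

if __name__ == "__main__":
    manifest = build_manifest(
        "flight_control_1984",  # hypothetical design directory
        {"simulator": "GenRad HiLo", "os": "Apollo Aegis"},
    )
    with open("MANIFEST.json", "w") as f:
        json.dump(manifest, f, indent=2)
```

Even a file this simple tells a future forensic designer which tools and operating system to hunt for, and the checksums let them verify that the archived files survived the intervening decades intact.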

Well, time to go back and watch some of those 8mm family movies.

Image courtesy of the Computer History Museum

3 thoughts on “Designing for the Long Haul”

  1. Kevin, to clarify and add to your long-term life-cycle planning observations: I wish design teams indeed would consider the long-term implications of their tool and flow choices more than they do today.

    That submarine in the harbor of San Diego doesn’t get updated with new electronics every other year; maintenance happens with flows approved and signed off long ago, and when unforeseen corner cases poke their nasty heads up, deploying new tools on old designs is often not even an option.

    Indeed, at Mentor we have a strategy of supporting our end-users on the flows of their choice, which are not always the flows the EDA companies prefer to work on. New technology is always more exciting and definitely more in the limelight than legacy flows. So at Mentor we try to provide as long a life-cycle as is needed for our paying customers by keeping these applications “fresh” until end-users no longer have their own value-chain support requirements.
    And sometimes we do indeed have to dig in the archives to provide an early-’90s compiler to resurrect a locomotive sitting on the sidelines with a DOS 3.0 issue, until that issue is fixed and recompiled with the approved compiler version (just to name one example).

    I’d like to invite design teams that currently depend on in-house proprietary tools, maybe because commercial versions are not available, to factor those tools into their life-cycle decisions.
    At Mentor we will consider in-sourcing those tools and keeping them “fresh” for those teams, so they can focus on what they do best, which is innovating and creating the next generation of products, while Mentor keeps their proprietary tools available to them until they can either be replaced by commercial versions or the support need has ended.

    Thanks for bringing this “boring” topic (but with possibly very expensive ramifications) to the attention of your readers.

    Aaik van der Poel, Continuing Engineering Division, Mentor Graphics

  2. Planning for HW obsolescence and bit rot takes some thought too. It’s getting hard to find running PDP-11 systems and early Sun systems from the ’70s and ’80s that will still power up, boot, and run without crashing. Backup tapes, floppies, and other media suffer serious bit rot over time, if you can even find a functional device to read them.

    Simulators for older architectures are one very viable way to send the older platforms to some place “nice” and still have cycle-accurate execution of an older OS with its applications. Doing this as an ongoing sustaining-engineering operation certainly makes sense when you are forced to support proprietary tools whose vendor is long gone.

    Software and data archival takes significant thought and resources… a real librarian function, deliberately funded and staffed, is certainly necessary to avoid the head-scratching later.
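As a minimal sketch of the “librarian function” the commenter describes, and assuming the hypothetical MANIFEST.json layout from the earlier capsule sketch, a periodic integrity check might look like this in Python. It re-hashes every archived file and flags anything missing or silently corrupted, which is about the cheapest bit-rot detector available.

```python
# Sketch of a periodic archive-verification pass, assuming the
# hypothetical MANIFEST.json layout from the capsule sketch above.
import hashlib
import json
import os
import sys

def sha256_of(path):
    """Re-hash a file so its checksum can be compared to the manifest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_archive(design_dir, manifest_path):
    """Report files that are missing or whose contents have changed."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    problems = []
    for rel_path, recorded in manifest["files"].items():
        path = os.path.join(design_dir, rel_path)
        if not os.path.exists(path):
            problems.append(f"MISSING  {rel_path}")
        elif sha256_of(path) != recorded:
            problems.append(f"CHANGED  {rel_path}")
    return problems

if __name__ == "__main__":
    issues = verify_archive("flight_control_1984", "MANIFEST.json")
    for line in issues:
        print(line)
    sys.exit(1 if issues else 0)
```

Run against fresh copies of the archive media on a regular cadence, a check like this turns bit rot from a surprise thirty years out into a routine maintenance item.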
