
Catapult Levels Up

Mentor Attacks ESL Subsystem Design

In my book, ESL is a serious contender for the title of “worst technical term of the decade.” As we’ve discussed before, the ESL label was possibly created by Dataquest in an attempt to create a category that could hold all of the EDA products that didn’t fit cleanly into any of the previously established tool categories. As such, ESL turned into more of a “bucket” than a “category” as it snowballed down the mountain of misfit design software, accumulating technologies such as transaction-level simulation tools, graphical block-based design environments, high-level language modeling, behavioral hardware synthesis, alternative hardware description languages, digital signal processing analysis and design tools, software/hardware co-development aids, and teaching English to non-native English speakers.

OK, maybe that last one wasn’t Dataquest’s fault.

Despite the shortcomings of the category, however, many of the products and technologies that have been labeled “ESL” are very promising in their own right. Mentor Graphics’ Catapult C is one such tool. We’ve discussed Catapult for a couple of years now, explaining its position as a power tool for the hardware designer – a product that takes a fully behavioral ANSI C/C++ description of an untimed algorithm and explores the tradeoff space of hardware area/gate count, operating frequency, power, latency, and throughput for single hardware modules.
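To make that starting point concrete, here’s a minimal sketch (my own illustration, not code from Mentor’s documentation) of the kind of untimed, fully behavioral C++ that a tool like Catapult consumes. Notice that nothing in the source says how many multipliers to use, how deep the pipeline should be, or what the interfaces look like:

```cpp
#include <cstddef>

// Hypothetical example: an untimed, fully behavioral 8-tap FIR filter in
// plain C++. There are no clocks, ports, or registers here; decisions about
// resource allocation, pipelining, and interfaces are left to the high-level
// synthesis tool during architectural exploration.
void fir_filter(const int coeffs[8], const int samples[8], int &result)
{
    int acc = 0;
    for (std::size_t i = 0; i < 8; ++i) {
        acc += coeffs[i] * samples[i];
    }
    result = acc;
}
```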

Now, Mentor has announced a new, higher-level version of Catapult called Catapult SL. The basic idea is to take the capabilities of Catapult C up one level of hierarchy to the multi-module subsystem level. I hear your question – “Why is that ‘subsystem’ instead of ‘system’?” While it seems that the concept of “system” is elusive and can always be extended to cover something not in the scope of your original definition, today’s systems-on-chip (SoCs) generally contain a strong software component. Catapult SL is focused exclusively on the design of high-performance hardware (albeit from algorithms described in a software language). As such, Mentor isn’t ready to declare victory in the full, system-level design arena just yet. Catapult’s goal is to accelerate the process of developing hardware – in a big way.

With today’s design challenges, a gain of just a few percentage points in productivity isn’t enough to persuade most designers to completely change their methodologies. Most custom hardware today is still designed at the register transfer level (RTL) using hardware description languages (HDLs) such as VHDL and Verilog. One of the promises of ESL technologies is to improve productivity beyond what is possible with RTL/HDL hardware design methodologies. “We feel that even 50% or so productivity improvement – which is what a lot of the methodologies are touting – isn’t enough for what designers need,” says Shawn McCloud, high-level synthesis product line director at Mentor. “In order to handle today’s complex designs on today’s schedules with current market windows, we need to deliver orders of magnitude in productivity improvement. That’s what Catapult is designed to do.”

The idea of raising the level of abstraction in hardware design amounts to moving the line between what we designers do manually and what is automated. As we have moved from designing at the transistor level through the gate level, the module level, the register-transfer level and the block level, one thing has remained constant – the human (that’s you and me) has always started by feeding our tools with a description of the structure of the circuit. We have moved from creating structural descriptions based on very small elements like transistors and gates to creating structural descriptions based on very large, complex elements like processors, busses, peripherals, and I/O interfaces.

The human has always been 100% responsible for beginning with the abstract algorithm for solving a particular problem and creating a hardware structure that can implement that algorithm. Once that structure is specified (at any of these levels of abstraction), automated tools can take the reins and guide us to a successful final solution. Traditionally, the designer/tool line has been analogous to the partition between hardware and software engineers – hardware engineers created general-purpose physical structures that allowed software engineers to create solutions based on algorithmic rather than structural descriptions.

However, any time the problem didn’t work with that dual-discipline solution (i.e., the combination of general-purpose hardware with software algorithms didn’t have the required performance), a hardware specialist had to be called in to create a custom structure. This structure had to be conceived in the mind of that hardware engineer based on the algorithm to be implemented and design constraints like power, performance, and cost (area). This custom hardware designer also had to strike a balance between the throughput of the datapath portion of the algorithm and the size and characteristics of the pipes feeding it data. This is tricky business, and some trial and error is often required to find the optimal structure. Usually, however, that trial and error is so expensive and exhausting to carry out that the first semi-suitable structure is chosen and becomes the “official” design. Coding and validating the RTL description of that design is time-consuming and painstaking work, and there is seldom time to try a second or third alternative.

With Catapult, the hardware designer explores hardware structure alternatives rapidly, choosing one that seems to offer the best tradeoff of engineering factors. The tool can automatically generate architectures from the behavioral/algorithmic description, and the designer can then interactively and iteratively modify the design constraints to find alternative architectures. Catapult’s strength has always been its ability to come up with highly optimized architectures for each set of constraints. The designer can rapidly experiment with considerations such as pipeline depth, parallelism, and interface protocols and strategies to find a well-balanced architecture that suits his particular situation.
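To give a feel for the knobs involved, the sketch below shows the same idea on a simple loop nest. The comments mark where constraints would apply; they’re illustrative, not actual Catapult directive syntax:

```cpp
// Illustrative only: the same untimed C++ source can map to very different
// hardware depending on the constraints applied in the tool.
void mat_vec(const int mat[16][16], const int vec[16], int out[16])
{
    // Outer loop: a candidate for pipelining, starting a new row every cycle
    // or every few cycles, trading registers for throughput.
    for (int row = 0; row < 16; ++row) {
        int acc = 0;
        // Inner loop: a candidate for partial or full unrolling, spending
        // more multipliers and adders in exchange for lower latency.
        for (int col = 0; col < 16; ++col) {
            acc += mat[row][col] * vec[col];
        }
        out[row] = acc;
    }
}
```

Unrolling the inner loop spends multipliers and adders to cut latency; pipelining the outer loop spends registers to raise throughput. The point of the tool is that the designer can flip between such alternatives by changing constraints rather than rewriting the source.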

Before Catapult SL, that capability was limited to the scope of a single function that mapped to a single datapath/control/memory/interface structure. With Catapult SL, a hierarchy of C functions can be broken down into an interconnected system of hardware machines, each containing datapath, control, memory, and interface elements. All of the parts of that hardware subsystem can then be taken into account simultaneously when exploring for the best hardware architecture. This holistic approach is increasingly important in applications such as 3G wireless, image processing, SDR, and HD video. Catapult SL allows those applications to be developed with the block-level capabilities of Catapult combined with new channel synthesis algorithms for architecting inter-block connection networks.
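In source-code terms, the new scope looks something like the hypothetical sketch below, where each function could become its own hardware block and the arrays passed between them become candidates for synthesized channels (FIFOs, ping-pong buffers, or shared memories) chosen during exploration. The function names and sizes are mine, purely for illustration:

```cpp
// Hypothetical function hierarchy of the sort a subsystem-level flow would
// decompose into interconnected hardware blocks. Each function below could
// become its own datapath/control machine; the intermediate array is a
// candidate for a synthesized inter-block channel.
static const int N = 256;

static void preprocess(const int in[N], int mid[N])
{
    for (int i = 0; i < N; ++i)
        mid[i] = in[i] >> 2;        // e.g., a scaling/conditioning stage
}

static void transform(const int mid[N], int out[N])
{
    for (int i = 0; i < N; ++i)
        out[i] = mid[i] * mid[i];   // e.g., a compute-intensive stage
}

void subsystem_top(const int in[N], int out[N])
{
    int mid[N];                     // inter-block channel candidate
    preprocess(in, mid);
    transform(mid, out);
}
```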

As complex as the design portion of these projects can be, the verification and validation task is equally daunting. Catapult addresses this issue by creating verification models at various levels of abstraction in the form of SystemC models that can be used in high-level system simulation/verification environments. Catapult now also boasts a push-button power analysis capability that takes advantage of these SystemC models to generate switching information for third-party power estimation tools. This allows the power consumption of architectural alternatives to be comparatively evaluated before a final RTL structure is chosen.
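One verification pattern this supports is the familiar software one: the original untimed source doubles as the golden reference, and the generated model is checked against it with the same stimulus. The self-contained sketch below is my simplified stand-in for that idea; in a real flow, the model under test would be the generated SystemC (or RTL co-simulation) model rather than a stub:

```cpp
#include <cstdio>
#include <cstdlib>

// Untimed golden reference, in the same style as the earlier FIR sketch.
static void fir_ref(const int coeffs[8], const int samples[8], int &result)
{
    int acc = 0;
    for (int i = 0; i < 8; ++i)
        acc += coeffs[i] * samples[i];
    result = acc;
}

// Stand-in for the synthesized model. In a real flow this would be the
// tool-generated SystemC or RTL co-simulation model of the same function.
static void model_under_test(const int coeffs[8], const int samples[8], int &result)
{
    fir_ref(coeffs, samples, result);   // stub: behaves like the reference
}

int main()
{
    const int kTests = 1000;
    for (int t = 0; t < kTests; ++t) {
        int coeffs[8], samples[8];
        for (int i = 0; i < 8; ++i) {
            coeffs[i]  = std::rand() % 256;
            samples[i] = std::rand() % 256;
        }
        int golden = 0, dut = 0;
        fir_ref(coeffs, samples, golden);
        model_under_test(coeffs, samples, dut);
        if (golden != dut) {
            std::printf("mismatch at test %d\n", t);
            return 1;
        }
    }
    std::printf("all %d tests matched the untimed reference\n", kTests);
    return 0;
}
```

The switching activity produced while simulating the generated model is the raw material the power analysis capability hands off to third-party estimation tools.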

While many of Catapult’s customers are ultimately targeting custom ASIC hardware, most of their designs are at least prototyped in FPGAs. Approaching design algorithmically makes the FPGA-to-ASIC transition less perilous – offering an option to optimize the architecture for the vagaries of each technology, such as ASICs’ faster switching times and FPGAs’ register richness. Although most teams opt to keep the RTL constant across implementations to avoid verification headaches, the ability to create technology-specific architecture optimizations can sometimes make the difference between an FPGA prototype that actually works and one that simply verifies that the algorithm is correct.

Mentor is apparently well aware that Catapult is a highly valued tool for high-end design teams. Its price tag – $350K (US) for a one-year license for the new SL product and $140K for the previous version, now renamed Catapult BL (for Block Level) – puts this family in a range where only serious shoppers will inquire. Apparently there are lots of those folks in the world, however, because Mentor says Catapult has been blowing the doors off its sales targets and has already been adopted by over fifty major electronics companies, with numerous production designs and ASIC tapeouts to its credit. When you consider the overall cost of a system design project, and, beyond that, the potential value of beating the leading edge of a critical market window with a product, even prices like these seem a pittance.

