
Zen and the Art of Modelling FPGAs

What is the sound of one hand clapping? If a tree falls in the forest and no one is there, does it make a sound? Why is a product specification?

The answer is not obvious: the product specification is a statement of what the product should and shouldn’t do. At the very highest level, this is simple. “We want a family of FPGAs for motor control.” Then it starts getting complicated. Before you know what has happened, you have a three-volume specification, setting out in painstaking detail every attribute and activity of the end product – possibly.

But there are huge problems with this. Firstly, how does the person requiring the product know that the requirement is accurate?

How does one know the specification is complete, and whether it could be improved by adding features? There are a million stories where the user sees the end result and realises that they could have improved it immeasurably if they had just added something extra to the specification.

What do you do when the spec is changed during the development cycle? Multiple studies have identified incomplete and changing specifications as by far the biggest contributors to project delays and even failures.

How does the implementation team know that what they are building is what the specification requires? And this conundrum is doubled when you have both hardware and software teams working together.

Finally, how do you know the finished product matches the spec?

Issues like this have dogged projects ever since Noah was forced to take a saw to the Ark to let the giraffes in. Yet there are tools that can at least alleviate these problems, even if they might not totally solve them. One of them, argues The MathWorks, is modelling.

The MathWorks has been around for over a quarter of a century, and its language, MATLAB, has become the de facto standard for mathematical computing. (If you want to see how flexible it can be, there is a discussion of solving Sudoku with MATLAB on The MathWorks web site: http://www.mathworks.com/company/newsletters/news_notes/2009/clevescorner.html.) MATLAB is also widely used to describe algorithms for implementation in FPGAs, and there are a number of tools and techniques for converting MATLAB code to RTL.

But MATLAB becomes even more powerful when used with the other major MathWorks product, Simulink. Designed as a multi-domain simulator, Simulink has evolved into a model-based design tool. Adding in a selection of support tools makes it an interesting approach for FPGA design.

Building the model is normally done through assembling (by drag and drop) blocks of functions from a series of libraries. A simple version might be a waveform generator whose output is amplified. The block libraries include viewers, such as an oscilloscope, which will display what is happening. The developer can change parameters in the blocks, like the amplitude of the waveform and the gain in the amplifier.
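The flavour of this block-based approach can be sketched outside Simulink, in plain Python. The block functions and parameter names below are illustrative only, not MathWorks APIs: a waveform generator feeds an amplifier, and each "block" exposes the parameters a developer would tune.

```python
import math

# Conceptual sketch, in plain Python rather than Simulink, of the block
# pipeline described above: a waveform generator feeding an amplifier.
# The block functions and parameter names are illustrative, not MathWorks APIs.

def waveform_generator(n_samples, amplitude=1.0, freq_hz=50.0, sample_rate=1000.0):
    """Produce n_samples of a sine wave with the configured amplitude."""
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / sample_rate)
            for t in range(n_samples)]

def amplifier(signal, gain=2.0):
    """Scale every sample by the configured gain."""
    return [gain * s for s in signal]

# Compose the blocks; changing a parameter is a one-character edit.
output = amplifier(waveform_generator(100, amplitude=0.5), gain=4.0)
peak = max(abs(s) for s in output)  # roughly what an oscilloscope block would show
```

The point of the block structure is that rewiring the pipeline or retuning a parameter never touches the blocks' internals, which is what makes drag-and-drop experimentation cheap.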

Obviously MATLAB and Simulink are closely coupled: MATLAB algorithms can be imported directly into Simulink, and many of the blocks are expressions of MATLAB modules.  (In fact, the code underlying the model is a form of MATLAB.) MATLAB can be opened within Simulink, and functions created within MATLAB can be stored as objects within the Simulink library. Simulink also permits the definition of signals and parameters, and add-on products provide further capability.  A particularly useful add-on is Stateflow, which allows the development of state charts within Simulink.
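Stateflow itself is graphical, but the state-chart idea behind it can be sketched as a transition table. The states and events below are hypothetical, loosely themed on the motor-control example, and are not taken from any real specification.

```python
# Minimal state-chart sketch (hypothetical motor-control states and events;
# Stateflow is a graphical tool, this only illustrates the underlying idea).
TRANSITIONS = {
    ("IDLE",    "start"):       "RUNNING",
    ("RUNNING", "stop"):        "IDLE",
    ("RUNNING", "overcurrent"): "FAULT",
    ("FAULT",   "reset"):       "IDLE",
}

def step(state, event):
    """Advance the chart; events with no defined transition are ignored."""
    return TRANSITIONS.get((state, event), state)

state = "IDLE"
for event in ["start", "overcurrent", "start", "reset"]:
    state = step(state, event)
# The "start" event in FAULT is ignored; only "reset" releases the fault.
```

Making the transition table explicit like this is exactly what lets later verification tools reason about which states and transitions a test suite has actually exercised.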

Once the model has been assembled, it can be exercised as a simulation and tested. The SystemTest software can be used to develop test sequences and store test results. The simulation allows the blocks and modules, and the system as a whole, to be exercised with different parameters, giving the opportunity to try different configurations and optimise the design.

The joy of building a model is that you can look at it from different viewpoints, taking a high-level view for overall discussions, such as confirming that the system modelled is that which was intended, and then zooming in to lower levels for detailed analysis. By looking at the model in action, it is far easier to identify that small feature that could be added (or removed) to significantly enhance the end product.

Before generating code, a further add-on, appropriately called Simulink Verification and Validation, can be applied to verify and validate the model. This uses a formal-methods approach to examine the design from several angles. It measures how much of the model the simulation and test tools cover, identifying parts that are never exercised, and, equally, it looks for unnecessary design constructs. The written specification will still exist (you can't get away from those three volumes), but now as a Word or HTML document, possibly with Excel tables. Simulink modules can be linked to the relevant requirements in the specification, making it possible to check both that every requirement is satisfied and that every module is justified by a requirement.
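That two-way traceability check is simple to state as code. The sketch below uses invented requirement IDs and module names purely for illustration: it flags requirements no module implements, and modules no requirement justifies.

```python
# Sketch of the traceability check described above: every requirement should
# be satisfied by some module, and every module justified by some requirement.
# The requirement IDs and module names here are invented for illustration.
links = {                      # module -> requirements it implements
    "pwm_generator": {"REQ-01", "REQ-02"},
    "fault_monitor": {"REQ-03"},
    "debug_port":    set(),    # linked to nothing: an unnecessary construct?
}
requirements = {"REQ-01", "REQ-02", "REQ-03", "REQ-04"}

covered = set().union(*links.values())
unsatisfied = requirements - covered                        # specs nobody implements
unjustified = sorted(m for m, r in links.items() if not r)  # modules nobody asked for
```

Both lists should be empty before code generation; anything left in either is a conversation to have with whoever owns the specification.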

If the design is going to be used in safety-critical and similar areas, there are further tools to confirm that the validation and verification conform to the relevant standard, such as IEC 61508, ISO 26262, and DO-178B.

Now that you have a model that represents the requirements, you don’t want to sit down and start writing code, with all the possibilities of introducing misunderstandings and changes. The MathWorks has several alternatives. It has RTL code generators, for both VHDL and Verilog, which will map into a range of synthesis tools, such as Cadence Encounter RTL Compiler, Mentor Graphics Precision, Synopsys Design Compiler, and Synplicity Synplify. It also has two FPGA-specific code generators, for Xilinx and Altera.
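The MathWorks generators are far more sophisticated than this, but the basic move of turning a model parameter into RTL can be sketched as template expansion. The gain-block module below is hypothetical, chosen only to show a model parameter (the gain) landing directly in generated Verilog.

```python
# Illustrative sketch only: real model-to-RTL generators perform scheduling,
# pipelining, and resource sharing. This shows the bare idea of a model
# parameter (gain) flowing into generated Verilog for a multiplier block.
def emit_gain_module(name, gain, width=16):
    return f"""module {name} (
  input  signed [{width - 1}:0]     din,
  output signed [{2 * width - 1}:0] dout
);
  assign dout = din * {gain};
endmodule
"""

rtl = emit_gain_module("gain4", 4)
```

Because the RTL is produced mechanically from the model, a change to the model regenerates consistent code, which is the substance of the "correct by construction" claim.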

The RTL code can be verified using functional verification products, including Cadence Incisive, Mentor Graphics ModelSim, and Synopsys VCS. In all cases, the RTL is “correct by construction,” which should remove the requirement to debug code.

But FPGAs are no longer just logic gates. Simulink can export executable C code, which may be compliant with standards like MISRA-C, to target microprocessors, including the ARM cores now appearing in FPGAs, and the DSP blocks embedded in the fabric.

The validation tools that are part of the tool chain can be used to exercise the design, and the results can be compared with those from the model. Any discrepancies can then be investigated and their causes tracked down.

Equally, the first physical implementation of an FPGA can be tested, using the test vectors created with SystemTest, and the results compared with the model.
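The back-to-back comparison works the same way at every stage: drive the golden model and the implementation with identical vectors and flag differences beyond a tolerance. The functions and numbers below are invented for illustration; the small offset stands in for fixed-point rounding in the hardware.

```python
# Sketch of the back-to-back comparison: run the same test vectors through
# the golden model and the implementation, and flag mismatches beyond a
# tolerance. Everything here is invented for illustration; the tiny offset
# stands in for fixed-point rounding in real hardware.
def golden_model(x):
    return 4.0 * x                 # reference behaviour from the model

def implementation(x):
    return 4.0 * x + 1e-6          # e.g. a result read back from the FPGA

vectors = [0.0, 0.25, -0.5, 1.0]
mismatches = [x for x in vectors
              if abs(golden_model(x) - implementation(x)) > 1e-3]
# an empty list means the implementation tracks the golden reference
```

Choosing the tolerance is a design decision in itself: too tight and fixed-point effects cause false alarms, too loose and real bugs slip through.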

Getting all this set up and running for the first project is going to be time-consuming, but once it is in place, there are considerable benefits.

At the first stage, just being able to exercise the specification as a model provides a way of identifying issues with the specification and resolving them. And there is ample evidence that the later these issues are found, the more it costs to correct them. Even if the issues can be corrected, there are manpower costs and project slippages. In extreme cases, a faulty specification can mean that the product never sees the light of day.

The other source of significant costs and delays in a project is when changes are made to the initial requirements as the project progresses. A model-based approach means that the model is re-jigged and re-exercised, and new code is generated. As we have seen, the model is the golden reference against which all other validation can be measured, and a revised model provides the data to evaluate verification at the later stages.

If the requirement is a family of devices, then a base model can be preserved and variant models created for each member of the family.

Zen is not concerned with worldly things, nor is it concerned with posing questions that can be answered. I would justify the title in three ways: it provides a tantalising hook to get you to read on, the article is as much about Zen as is Pirsig’s Zen and the Art of Motorcycle Maintenance, and finally, the question of “Why is a specification?” is still unresolved.
