
Zen and the Art of Modelling FPGAs

What is the sound of one hand clapping? If a tree falls in the forest and no one is there, does it make a sound? Why is a product specification?

The answer is not obvious: the product specification is a statement of what the product should and shouldn’t do. At the very highest level, this is simple. “We want a family of FPGAs for motor control.” Then it starts getting complicated. Before you know what has happened, you have a three-volume specification, setting out in painstaking detail every attribute and activity of the end product. Possibly.

But there are huge problems with this. Firstly, how does the person requiring the product know that the requirement is accurate?

How does one know the specification is complete, and whether it could be improved by adding features? There are a million stories where the user sees the end result and realises that they could have improved it immeasurably if they had just added something extra to the specification.

What do you do when the spec is changed during the development cycle? Multiple studies have identified incomplete and changing specifications as by far the biggest contributors to project delays and even failures.

How does the implementation team know that what they are building is what the specification requires? And this conundrum is doubled when you have both hardware and software teams working together.

Finally, how do you know the finished product matches the spec?

Issues like this have dogged projects ever since Noah was forced to take a saw to the Ark to let the giraffes in. Yet there are tools that can at least alleviate these problems, even if they might not totally solve them. And one of them is modelling, argues The MathWorks.

The MathWorks has been around for over a quarter of a century, and its language, MATLAB, has become the de facto standard for mathematical computing. (If you want to see how flexible it can be, there is a discussion of solving Sudoku with MATLAB on The MathWorks web site: http://www.mathworks.com/company/newsletters/news_notes/2009/clevescorner.html) MATLAB is also widely used to describe algorithms for implementation in FPGAs, and there are a number of tools and techniques for converting MATLAB code to RTL.
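
MATLAB that converts cleanly to RTL tends to be written in a streaming, one-sample-per-call style rather than as large vectorised matrix operations. As a minimal sketch of that style (the function name, filter length, and coefficients are purely illustrative, not taken from any MathWorks example):

% A four-tap FIR filter written one sample at a time, the style that
% MATLAB-to-RTL flows generally favour (all names and values illustrative).
function y = fir4_step(x)
    persistent z                    % delay line carried between calls
    if isempty(z)
        z = zeros(1, 4);
    end
    b = [0.25 0.25 0.25 0.25];      % filter coefficients
    z = [x, z(1:3)];                % shift the new sample into the line
    y = b * z.';                    % multiply-accumulate
end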

But MATLAB becomes even more powerful when used with the other major MathWorks product, Simulink. Designed as a multi-domain simulator, Simulink has evolved into a model-based design tool. Adding in a selection of support tools makes it an interesting approach for FPGA design.

Building the model is normally done through assembling (by drag and drop) blocks of functions from a series of libraries. A simple version might be a waveform generator whose output is amplified. The block libraries include viewers, such as an oscilloscope, which will display what is happening. The developer can change parameters in the blocks, like the amplitude of the waveform and the gain in the amplifier.
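
The same simple model can also be assembled from the MATLAB command line rather than by drag and drop. The sketch below uses standard Simulink library blocks; the model and block names are illustrative.

% Assemble the waveform-generator / amplifier / scope example from a script.
mdl = 'amp_demo';
new_system(mdl);
open_system(mdl);

add_block('simulink/Sources/Sine Wave',    [mdl '/Source']);
add_block('simulink/Math Operations/Gain', [mdl '/Amplifier']);
add_block('simulink/Sinks/Scope',          [mdl '/Scope']);

add_line(mdl, 'Source/1',    'Amplifier/1');
add_line(mdl, 'Amplifier/1', 'Scope/1');

% The parameters mentioned in the text: waveform amplitude and amplifier gain.
set_param([mdl '/Source'],    'Amplitude', '2');
set_param([mdl '/Amplifier'], 'Gain',      '10');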

Obviously MATLAB and Simulink are closely coupled: MATLAB algorithms can be imported directly into Simulink, and many of the blocks are expressions of MATLAB modules.  (In fact, the code underlying the model is a form of MATLAB.) MATLAB can be opened within Simulink, and functions created within MATLAB can be stored as objects within the Simulink library. Simulink also permits the definition of signals and parameters, and add-on products provide further capability.  A particularly useful add-on is Stateflow, which allows the development of state charts within Simulink.
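
For a flavour of the kind of MATLAB that can live inside a model, the body of a MATLAB Function block implementing a clipping amplifier might look like the sketch below; the function name, gain, and limits are made up for illustration.

function y = soft_limit(u)  %#codegen
% Illustrative body of a MATLAB Function block: an amplifier whose
% output is saturated to the range -1..+1.
gain = 10;                       % illustrative gain
y = max(min(gain * u, 1), -1);   % amplify, then clip
end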

Once the model has been assembled, it can be exercised as a simulation and tested. The SystemTest software can be used to develop test sequences and store test results. The simulation allows the blocks and modules, and the system as a whole, to be exercised with different parameters, giving the opportunity to try different configurations and optimise the design.
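
Such experiments can also be driven from a script. SystemTest has its own test-management front end, but a minimal sketch of a parameter sweep, reusing the illustrative amp_demo model from earlier, might look like this:

% Re-run the simulation with different amplifier gains and keep the results.
mdl = 'amp_demo';                 % the illustrative model built earlier
gains = [1 2 5 10];
results = cell(size(gains));
for k = 1:numel(gains)
    set_param([mdl '/Amplifier'], 'Gain', num2str(gains(k)));
    results{k} = sim(mdl, 'StopTime', '10');   % run and log the outputs
end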

The joy of building a model is that you can look at it from different viewpoints: taking a high-level view for overall discussions, such as confirming that the system modelled is the one that was intended, and then zooming in to lower levels for detailed analysis. By looking at the model in action, it is far easier to identify that small feature that could be added (or removed) to significantly enhance the end product.

Before generating code, a further add-on, appropriately called Simulink Verification and Validation, can be applied to verify and validate the model. This uses a formal-methods approach to examine the design for a raft of things: it measures how much of the model is covered by the simulation and test runs, identifying the parts that are never exercised, and, equally, it looks for unnecessary design constructs. The written specification will still exist (you can’t get away from those three volumes of paper), but it will now also exist as a Word or HTML document, possibly with Excel tables. Simulink modules can be linked to the relevant requirements in that specification, making it possible to check both that every requirement is satisfied and that every module traces back to a requirement.
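
Model coverage, for instance, can be collected from the command line as well as through the tools' own interface. A minimal sketch, again using the illustrative model name, and assuming the standard coverage settings fields:

% Collect decision, condition, and MC/DC coverage for the model and
% write an HTML report.
mdl = 'amp_demo';                 % the illustrative model built earlier
test = cvtest(mdl);
test.settings.decision  = 1;
test.settings.condition = 1;
test.settings.mcdc      = 1;
covdata = cvsim(test);
cvhtml('coverage_report', covdata);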

If the design is going to be used in safety-critical and similar areas, there are further tools to confirm that the validation and verification conform to the relevant standard, such as IEC 61508, ISO 26262, or DO-178.

Now that you have a model that represents the requirements, you don’t want to sit down and start writing code, with all the possibilities of introducing misunderstandings and changes. The MathWorks has several alternatives. It has RTL code generators, for both VHDL and Verilog, which will map into a range of synthesis tools, such as Cadence Encounter RTL Compiler, Mentor Graphics Precision, Synopsys Design Compiler, and Synplicity Synplify. It also has two FPGA-specific code generators, for Xilinx and Altera.
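
From the command line, generating RTL for the part of the model destined for the FPGA is essentially a one-liner; both names below are illustrative placeholders.

% Generate VHDL (or Verilog) for a subsystem of the model.
makehdl('fpga_design/Controller', 'TargetLanguage', 'VHDL');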

The RTL code can be verified using functional verification products, including Cadence Incisive, Mentor Graphics ModelSim, and Synopsys VCS. In all cases, the RTL is “correct by construction,” which should remove the requirement to debug code.
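
A matching HDL testbench, which replays the model's stimuli and expected responses in the RTL simulator, can be generated in the same way (names again illustrative):

% Generate an HDL testbench so that ModelSim, Incisive, or VCS can
% check the generated RTL against the model's recorded behaviour.
makehdltb('fpga_design/Controller', 'TargetLanguage', 'VHDL');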

But FPGAs are no longer just logic gates. Simulink can also export executable C code, which can be made compliant with standards such as MISRA C, for target microprocessors, including the ARM cores that are now appearing in FPGAs, and for DSPs embedded in the fabric.
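
A sketch of the C path is below; the target file and build options depend on the code generator and processor actually in use, so treat the settings as assumptions.

% Point the model at an embedded-C target and build it; 'ert.tlc' is
% the Embedded Coder target (an assumed configuration choice).
mdl = 'fpga_design';              % illustrative model name
set_param(mdl, 'SystemTargetFile', 'ert.tlc');
slbuild(mdl);                     % generate and build the C code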

The validation tools that are part of the tool chain can be used to exercise the design, and the results can be compared with those for the model. Any discrepancies can then be investigated to track down their cause.

Equally, the first physical implementation of an FPGA can be tested, using the test vectors created with SystemTest, and the results compared with the model.
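
A simple comparison of the two result sets might look like the sketch below; the traces and the tolerance are illustrative placeholders rather than real captured data.

% Compare a response captured from the FPGA with the model's output.
model_output = 10 * sin(2*pi*(0:99)/50);                % stand-in model trace
hw_capture   = model_output + 1e-4 * randn(1, 100);     % stand-in hardware trace
tol = 1e-3;
err = max(abs(hw_capture - model_output));
if err > tol
    warning('Hardware differs from the model by up to %g', err);
end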

Getting all this set up and running for the first project is going to be time-consuming, but once it is in place, there are considerable benefits.

At the first stage, just being able to exercise the specification as a model provides a way of identifying issues with the specification and resolving them. And there is ample evidence that the later these issues are found, the more it costs to correct them. Even if the issues can be corrected, there are manpower costs and project slippages. In extreme cases, a faulty specification can mean that the product never sees the light of day.

The other source of significant costs and delays in a project is when changes are made to the initial requirements as the project progresses. A model-based approach means that the model is re-jigged and re-exercised, and new code is generated. As we have seen, the model is the golden reference against which all other validation can be measured, and a revised model provides the data to evaluate verification at the later stages.

If the requirement is a family of devices, then a base model can be preserved and variant models created for each member of the family.

Zen is not concerned with worldly things, nor is it concerned with posing questions that can be answered. I would justify the title in three ways: it provides a tantalising hook to get you to read on, the article is as much about Zen as is Pirsig’s Zen and the Art of Motorcycle Maintenance, and finally, the question of “Why is a specification?” is still unresolved.
