
Making FPGAs Cool Again – Part 2

How Tools Unlock the Hardware's Power Capabilities

A couple of weeks ago we looked at the state of FPGA low-power design from the standpoint of hardware. We saw a range of features, from very little to branded feature sets. But none of that matters without tools: tools are the window into the silicon, and no silicon feature has a shred of value unless a tool uses it (as the scores of now-defunct PLD businesses run by "the cheapest silicon always wins and software is annoying" types can attest). And in a domain like low-power design, the tools can have features of their own even if there are no explicit hardware features to exploit: more intelligent use of plain-vanilla silicon can reduce power as well.

So this week we look at the ways in which various FPGA vendors have addressed the power problem in their tools. There are really two parts to that story: analysis and synthesis. It doesn’t make sense to try to design for low power if there’s no way to see how much power you’re consuming.

Seeing is believing

Back in the day, once a single ICC number on the datasheet stopped being meaningful, there was pretty much no way to know how much power your design would consume. The only answer you could get was, "It depends, and it's too hard to figure out." Eventually, spreadsheet approaches came around where you could put together a simplistic power model and at least kind of eyeball your ballpark power. Early attempts to model power relied solely on a "percent activity" measure that was used to derive some estimate of dynamic power. This, while somewhat sound academically, was a bit of a cop-out for the vendors, since who can estimate, over dozens or hundreds of unrelated signals, what the overall activity percentage is? It makes sense as an early rough estimate, but as a final design power calculator it gives you more of a bracketing capability – if the power is acceptable for some overestimate of activity, then it should be OK in real life. Today, with much, much larger designs, and with multiple modes of operation and multiple clock domains, an activity estimate becomes much less satisfying in the final stages of a design.
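For a sense of what that percent-activity approach amounts to, here's a minimal Python sketch of the underlying αCV²f arithmetic. The per-resource capacitance, voltage, and frequency figures are invented placeholders, not any vendor's characterization data; the point is only how a single activity percentage brackets the answer.

```python
# Rough sketch of the old "percent activity" style of dynamic-power estimate.
# All coefficients below are illustrative placeholders, not characterized data.
def dynamic_power_mw(n_luts, n_ffs, toggle_pct, c_eff_pf=0.02, vdd=1.2, f_mhz=200):
    """Crude alpha*C*V^2*f estimate: every resource is assumed to toggle
    on 'toggle_pct' percent of clock cycles."""
    alpha = toggle_pct / 100.0                      # activity factor
    c_total = (n_luts + n_ffs) * c_eff_pf * 1e-12   # total switched capacitance, farads
    f_hz = f_mhz * 1e6
    return alpha * c_total * vdd**2 * f_hz * 1e3    # milliwatts

# Bracketing: if power is acceptable at a deliberately high activity guess,
# real life should be OK too.
print(dynamic_power_mw(20000, 15000, toggle_pct=12.5))
print(dynamic_power_mw(20000, 15000, toggle_pct=25.0))
```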

Most analysis tools now allow you to take a Value Change Dump (VCD) file from your simulation and apply it to the power analysis tool. This lets the tool look at realistic signal activity and can give a much truer sense of power consumption – if the simulation is good. If the VCD file has good coverage, including all modes and corner cases in roughly the frequency with which they would actually occur, then your power profile will be a good reflection of what you can expect in real life. If not, well, your mileage will vary.
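Conceptually, what the analysis tool extracts from the VCD is a per-signal count of value changes, from which real activity factors fall out. The sketch below is a deliberately naive Python parser that handles only scalar and simple vector changes; a production tool also deals with $dumpvars sections, signal aliasing, hierarchy, and glitch filtering.

```python
# Minimal sketch of the first step a power tool takes with a VCD:
# count value changes per signal identifier to derive activity factors.
from collections import Counter

def toggle_counts(vcd_path):
    toggles = Counter()
    in_dump = False
    with open(vcd_path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("$enddefinitions"):
                in_dump = True                 # header is over; changes follow
            elif in_dump and line and line[0] in "01xz":
                toggles[line[1:]] += 1         # scalar change, e.g. "0!" -> id "!"
            elif in_dump and line and line[0] in "bB":
                _, ident = line.split()        # vector change, e.g. "b1010 #"
                toggles[ident] += 1
    return toggles
```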

The other variable in the accuracy of power analysis tools is the nature of the power model itself. All vendors will provide nominal power, but worst-case conditions will be, well, worse. Actel believes their estimate to be good to within 10% or so, which will improve when they finish their qualification and can tighten up the numbers. Altera also claims high accuracy for their models.

Of course, getting an accurate estimate of power takes time, since you need a good simulation set and a well-evolved design. On the other hand, if you're still in the early days and just want a rough power number, Lattice has a downloadable power estimator that doesn't require any RTL. You simply estimate the switching activity and the percentage of device resources your design will require; the tool then estimates the routing needed for those resources, based on statistics they've run, and gives you a power estimate. Altera provides a similar tool for early estimation, with the high-accuracy detailed estimators available once the detailed design is in place; the intent is to allow refinement as the design progresses.
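The spreadsheet-style arithmetic behind such early estimators looks roughly like the sketch below. Every coefficient here (per-LUT power, routing overhead factor, static power) is a made-up placeholder; the real tools bake in statistics gathered from characterized, placed-and-routed designs.

```python
# Spreadsheet-style early estimate, usable before any RTL exists.
# All coefficients are invented placeholders for illustration only.
def early_power_estimate_mw(device_luts, util_pct, activity_pct,
                            mw_per_lut=0.002, routing_overhead=1.6, static_mw=40):
    used_luts = device_luts * util_pct / 100.0
    # Routing power is assumed to scale with logic usage; the vendors derive
    # this factor from statistics over many real designs.
    dynamic = used_luts * mw_per_lut * (activity_pct / 100.0) * routing_overhead
    return static_mw + dynamic

# 60% of a 50k-LUT device, guessing 15% switching activity
print(early_power_estimate_mw(device_luts=50000, util_pct=60, activity_pct=15))
```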

Actel, as part of their focus on power and battery applications, allows you to enter the amp-hours of the battery you plan to use, and their tool will estimate the battery life, eliminating one more manual calculation that would otherwise be required to get to the bottom line for a battery-based application. They have also provided lots of granularity for viewing power: you can break power consumption down by block, by clock domain, by rail, or by type of logic. If you're not using a VCD, you can specify the percentage of time spent in different modes of operation to refine an otherwise coarse estimate.
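The battery-life calculation itself is just capacity divided by average draw; a minimal sketch follows, where the derating factor is an invented placeholder (real batteries lose capacity with temperature, age, and discharge rate in ways a single constant doesn't capture).

```python
# The manual calculation the tool folds in: battery capacity over average current.
def battery_life_hours(capacity_mah, avg_current_ma, derating=0.85):
    """Estimated hours of operation; 'derating' is a crude placeholder for
    capacity lost to temperature, aging, and discharge rate."""
    return capacity_mah * derating / avg_current_ma

# e.g. a 220 mAh coin cell at 0.5 mA average draw
print(battery_life_hours(capacity_mah=220, avg_current_ma=0.5))
```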

There are also tools that can point to design changes that may reduce power. Actel will identify logic hazards, which are a source of wasted power: any spurious switching consumes dynamic power, so eliminating hazards lowers power. They are looking to add the ability to eliminate the hazards automatically in the future, but for now the tool points them out and you eliminate them. Altera has a Power Advisor tool that will scan the design and point out various design changes through which power can be reduced.
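To see why a hazard wastes power, consider the classic static-1 hazard in a two-input mux. The toy Python model below fakes the delayed inverter on the select line with an explicit second input; the glitch it exposes is exactly the kind of spurious transition that burns dynamic power, and the added consensus term is the textbook fix (not necessarily how Actel's tool would repair it).

```python
# Toy illustration: f = a&s | b&~s with a = b = 1 should hold f = 1 while s
# switches, but if the inverted select lags, f dips to 0 and back -- a glitch.
def mux(a, b, s, s_n):            # s_n models the (delayed) inverter output
    return (a and s) or (b and s_n)

def mux_no_hazard(a, b, s, s_n):
    return (a and s) or (b and s_n) or (a and b)   # consensus term a&b added

a = b = 1
# time steps: before the edge, mid-transition (inverter hasn't flipped yet), after
steps = [(1, 0), (0, 0), (0, 1)]   # (s, s_n)
print([mux(a, b, s, s_n) for s, s_n in steps])            # [1, 0, 1]  <- glitch
print([mux_no_hazard(a, b, s, s_n) for s, s_n in steps])  # [1, 1, 1]  <- clean
```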

Taking control

When it comes to actually reducing power, options become a bit more limited. Actel, Altera, and Xilinx have different forms of power-driven fitting. Actel's fitter has an option that, once performance constraints have been met, keeps running to reduce power by tweaking clock routing and shortening net lengths, typically yielding 10-30% lower power.

Altera’s PowerPlay tool can take a similar approach, going back and reducing power after performance is hit until slack disappears. This tool uses the low-power settings on the logic array blocks (LABs), clocks, and memories to reduce power where possible. This can actually be a complicated business, since a LAB can host more than one logic function, and in fact the “register packing” function can borrow an unused register from one LAB for some other function. So the tool is smart enough to know not to borrow a low-power LAB register for a high-speed function if that will kill the slack. The synthesis tool will map user RAMs for less power and select logic block inputs in a manner that reduces capacitance on active signals. Altera claims typical power reduction of 10% as a result of the automatic optimization steps.
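The slack-versus-power trade-off described here boils down to a decision like the caricature below; the delay penalty and the numbers are invented for illustration and are not Altera's actual figures or algorithm.

```python
# Caricature of the slack-aware choice: use a low-power resource only if the
# extra delay won't consume the remaining timing slack on the path.
def choose_lab(path_slack_ns, low_power_penalty_ns=0.15):
    if path_slack_ns - low_power_penalty_ns > 0:
        return "low-power LAB"     # timing still met; save some dynamic power
    return "high-speed LAB"        # slack too tight; don't trade timing for power

for slack in (0.80, 0.10):
    print(slack, "->", choose_lab(slack))
```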

Xilinx’s XST synthesis tool and their ISE fitter each have a low-power setting that has shown a 6-10% power reduction across their suite of benchmarks. These settings do such things as creating enable logic in front of the memories and using the MREG pipeline register in the DSP48 instead of the PREG, as well as attempting to minimize net lengths. They also have an optimize-for-area setting that can be used to reduce static power, but that setting causes the timing constraints to be ignored, so it’s kind of an either-or thing.

One thing that doesn’t seem to have appeared on the radar here yet is the use of a power-intent description. While a couple of standards are evolving in IC-land (the UPF and CPF formats), they haven’t been taken up by any FPGA vendors. Xilinx did say they were looking at creating their own internal format, but they weren’t ready to say anything more than that it wouldn’t be UPF or CPF. Hopefully, if they do indeed roll their own, there will be a clear explanation of why the existing external standards couldn’t work.

Meanwhile, it does feel like power is getting more attention in FPGA silicon and tool design, so we’re on a curve now, and things should continue to evolve. Some companies have made power a centerpiece of their strategies; others are simply accommodating it. Given the occasionally irrationally fierce competitiveness of this market, any advantage that one company gains will not escape the vigilant, covetous eyes of the others, and tit will presumably be met with tat – hopefully all to the benefit of FPGA designers, as FPGAs, once notorious power consumers, try to reposition themselves as active participants in an increasingly power-aware world. We may even see, at some point, a novel demonstration of low power… perhaps some kind of circuit driven by a primitive battery, like maybe a grapefruit…
