High Power Tools for Low Power
Gabe Moretti, Senior Editor
The subject of design for low power (it is a miracle we do not yet have "DLP" in the vocabulary) continues to be on the minds of EDA people. I talked about it with Krishna Balachandran, product marketing director for low power at Cadence.
I asked Krishna how pervasive the need to conserve power is among the Cadence user community.
Krishna responded: “The need to conserve power is pervasive in IP and chips targeting mobile applications. Interest in conserving power has spread to include plug-in-the-wall target applications because of government regulations regarding energy usage and the need to be green. Almost all integrated circuits designed today exercise some power management techniques to conserve power. Power conservation is so pervasive that it has now become the third optimization metric in addition to area and speed. Another noticeable trend is that while power management techniques are widespread in both purely analog and purely digital designs, mixed-signal designs are increasingly becoming power-managed.”
Given how pervasive the focus on reducing power seems to be, I wondered how tools for low power help in deciding tradeoffs. It is, I know, quite an open-ended question, and I got a very long answer from Krishna.
“Power estimation and analysis must be done early and accurately to maximize the possible tradeoffs. If a design is optimized for power only at the hardware level, without comprehending how the software will take advantage of the power management in the hardware, power estimates will be inaccurate and problems will be found too late in the product development cycle. Power estimation is increasingly done on hardware emulation platforms, which are excellent at identifying peak-power issues by running real-life software application scenarios on the hardware being designed, allowing appropriate power-architecture modifications at an early stage.
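To illustrate why early architectural estimates carry so much leverage, consider the standard first-order CMOS dynamic-power relation P = α·C·V²·f. This model and the numbers below are my own illustration, not from the interview; the quadratic dependence on supply voltage is what makes voltage/frequency tradeoffs so attractive when explored early.

```python
# First-order CMOS dynamic power model: P = alpha * C * V^2 * f
# All values below are illustrative placeholders, not real design data.

def dynamic_power(alpha, cap_farads, vdd, freq_hz):
    """Switching power in watts for activity factor alpha, switched
    capacitance, supply voltage, and clock frequency."""
    return alpha * cap_farads * vdd ** 2 * freq_hz

# Same workload at nominal and at a scaled supply/frequency operating point
nominal = dynamic_power(alpha=0.2, cap_farads=1e-9, vdd=1.0, freq_hz=1e9)
scaled  = dynamic_power(alpha=0.2, cap_farads=1e-9, vdd=0.8, freq_hz=0.8e9)

print(f"nominal: {nominal*1e3:.1f} mW, scaled: {scaled*1e3:.1f} mW")
# Quadratic V term plus linear f term: ~49% power saved for a 20% slowdown
print(f"savings: {100 * (1 - scaled / nominal):.0f}%")
```

The point of running such what-if arithmetic (or, at scale, emulation with real software) before the architecture is frozen is that these knobs are far cheaper to turn pre-RTL than post-layout.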
Thorough functional verification of the register-transfer level (RTL) is very important to weed out any power-related design bugs. Metric-driven power-aware simulation, simulation acceleration and tools providing powerful low power debug capabilities are deployed to ensure that the design is functional in all the power modes and help shorten the overall verification time. Low power static and exhaustive formal verification complement simulation to find functional design bugs as well as issues with the power intent specification.
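Part of ensuring "the design is functional in all the power modes" amounts to checking that a simulation trace never takes an illegal transition in the design's power state table. A toy sketch of that idea follows; the mode names and the legal-transition table are invented, and real flows derive this information from the power intent (CPF/UPF) specification rather than a hand-written dictionary.

```python
# Toy power-state table checker: flags illegal mode transitions in a trace.
# Mode names and the legal-transition table are invented for illustration.

LEGAL = {
    "OFF":    {"BOOT"},
    "BOOT":   {"ACTIVE"},
    "ACTIVE": {"SLEEP", "OFF"},
    "SLEEP":  {"ACTIVE", "OFF"},
}

def check_trace(trace):
    """Return a list of (index, from_mode, to_mode) for illegal transitions."""
    errors = []
    for i, (src, dst) in enumerate(zip(trace, trace[1:])):
        if dst not in LEGAL.get(src, set()):
            errors.append((i, src, dst))
    return errors

trace = ["OFF", "BOOT", "ACTIVE", "SLEEP", "BOOT", "ACTIVE"]
print(check_trace(trace))  # SLEEP -> BOOT is not in the table: [(3, 'SLEEP', 'BOOT')]
```

Metric-driven verification generalizes this: coverage goals ensure every legal mode and transition is actually exercised, not merely that no illegal one occurs.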
Smart leakage and dynamic power optimization options in logic synthesis and Place-and-Route (P&R) tools use power as a cost function in addition to area and speed. It is important that P&R tools do not undo any intelligent optimizations done for power at the logic synthesis stage. For example, placement-aware logic synthesis tools can swap out single-bit flip-flops and replace them with multi-bit flops, a technique referred to as Multi-Bit Cell Inferencing (MBCI), which can significantly reduce the load on the clock tree, a major power consumer in today’s System-on-Chip (SoC) designs. A Multi-Mode Multi-Corner (MMMC) optimization capability is a must to simultaneously optimize power while meeting timing.

A rich set of power switch and enable chaining options, and integration with a power-analysis and signoff engine, are needed in your P&R tool to help identify the appropriate number and locations of power switches. Power switch analysis, insertion and placement capabilities trade off ramp time for the power domain that is waking up against the rush current and IR drop in the neighboring power domains that were already on. Coupled with this, Graphical User Interface (GUI) capabilities within the P&R system need to allow the designer to interactively specify the switch and enable chaining topology to deliver a power-efficient design that takes the tradeoffs between ramp time, IR drop and rush currents into consideration. All successive refinements of the design need to be verified against the original design intent (the RTL), and low power equivalence checking and static and formal verification tools do just that.
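The clock-tree benefit of MBCI is easy to see with rough arithmetic: merging N single-bit flops into N/k multi-bit flops cuts the number of clock pins by k, and a shared internal clock driver means the per-flop clock load grows far slower than k. The capacitance values below are invented placeholders, not library data, so only the shape of the savings is meaningful.

```python
# Rough model of clock-pin load before/after multi-bit flop merging (MBCI).
# Capacitance values are invented placeholders, not standard-cell library data.

CLK_PIN_CAP_1BIT = 1.0   # relative clock-pin load of a single-bit flop
CLK_PIN_CAP_4BIT = 1.6   # a 4-bit flop shares one internal clock driver

def clock_load(num_bits, bits_per_flop, pin_cap):
    """Total relative clock load for num_bits registers grouped into
    multi-bit flops of bits_per_flop bits each (assumes even division)."""
    return (num_bits // bits_per_flop) * pin_cap

before = clock_load(10_000, 1, CLK_PIN_CAP_1BIT)   # 10,000 clock pins
after  = clock_load(10_000, 4, CLK_PIN_CAP_4BIT)   # 2,500 clock pins
print(f"relative clock load: {before:.0f} -> {after:.0f} "
      f"({100 * (1 - after / before):.0f}% reduction)")
```

This is also why a P&R tool that splits the multi-bit flops back into single-bit cells for placement convenience would, as the quote warns, silently undo the synthesis-stage savings.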
Mixed-signal low power designs pose unique challenges for both implementation and functional verification. Tools must be able to stitch together the digital blocks in an analog schematic environment taking into account the disparate representations of power in the two domains. Verification solutions must be able to check the interfaces of the analog/digital domains that are frequently a source of errors such as missing level shifters. Without employing such tools, the process is manual and error prone with design bugs creeping into silicon.”
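The missing-level-shifter check mentioned above is, at its core, a structural rule: any net that crosses between domains operating at different voltages must pass through a level shifter. A toy sketch of the rule, with invented domains, nets, and voltages:

```python
# Toy domain-crossing rule check: every net driven in one voltage domain and
# received in a different-voltage domain must pass through a level shifter.
# Domains, nets, and voltages are invented for illustration only.

DOMAIN_VOLTS = {"analog": 1.8, "digital": 0.9}

# (net, driver_domain, receiver_domain, has_level_shifter)
crossings = [
    ("adc_out",  "analog",  "digital", True),
    ("dac_ctrl", "digital", "analog",  False),   # missing level shifter
    ("clk_div",  "digital", "digital", False),   # same domain: none needed
]

def missing_shifters(crossings, volts):
    """Nets crossing a voltage boundary without a level shifter."""
    return [net for net, src, dst, shifted in crossings
            if volts[src] != volts[dst] and not shifted]

print(missing_shifters(crossings, DOMAIN_VOLTS))  # -> ['dac_ctrl']
```

Production static checkers apply exactly this kind of rule exhaustively across the netlist and power intent, which is why they catch crossings a manual schematic review misses.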
Since it is important to plan power optimization solutions at the system level, I asked what approach Cadence envisions as the most productive. He had covered this somewhat in his previous answer, but I wanted to focus on the subject more. Krishna was very polite in focusing his answer.
“Electronic System Level (ESL) tools provide the biggest bang for the buck when it comes to power optimization. The sooner you optimize, the more options you have for making excellent trade-offs. A good example of this is a High Level Synthesis tool that generates multiple micro-architectures from a given high-level description of an algorithm in C/C++/SystemC and helps the designer make the trade-off between area, speed and power for the target design. Since it operates at a pre-RTL stage, the resulting power/area/speed trade-offs are very impactful. Furthermore, it is desirable to integrate power estimation and logic synthesis engines within the High Level Synthesis tool, thus ensuring a high degree of correlation with downstream implementation tools.”
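The micro-architecture exploration Krishna describes can be caricatured as a small optimization problem: each candidate implementation of the same algorithm has an area, latency, and power, and the designer picks the cheapest one that meets the performance budget. The candidates and metrics below are entirely invented, standing in for the alternatives (different unrolling or resource-sharing choices) an HLS tool would generate from one C/C++/SystemC description.

```python
# Toy illustration of HLS-style micro-architecture tradeoff selection.
# Candidate names and metrics are invented, not from any tool or design.

candidates = {
    "fully_shared":   {"area": 1.0, "latency_cycles": 64, "power_mw": 5.0},
    "unroll_x4":      {"area": 2.8, "latency_cycles": 16, "power_mw": 11.0},
    "fully_parallel": {"area": 9.5, "latency_cycles":  4, "power_mw": 30.0},
}

def pick(candidates, max_latency, weight_area=1.0, weight_power=1.0):
    """Cheapest candidate (weighted area + power) meeting a latency budget."""
    feasible = {name: m for name, m in candidates.items()
                if m["latency_cycles"] <= max_latency}
    return min(feasible,
               key=lambda n: weight_area * feasible[n]["area"]
                           + weight_power * feasible[n]["power_mw"])

print(pick(candidates, max_latency=20))   # -> 'unroll_x4'
```

Because this selection happens pre-RTL, changing the winner costs a re-synthesis rather than a re-layout, which is the leverage the quote points to.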
I think we will see increased attention to power optimization tools at the system level in the short term. Feedback from designers and architects will help in defining the problem better. And it is my hope that hardware engineers will be able to teach software developers how to use hardware more efficiently. In my professional lifetime we have gone from having to count how many bytes resulted from the code to considering hardware as an unlimited asset. The time might be here to start considering how the software code impacts hardware again.