By John Blyler, Chief Content Officer
A panel of experts from ARM, Cadence, Mentor, NXP, Synopsys and Xilinx debate the reality and causes of the apparently widening verification gap in chip design.
Does a verification gap exist in the design of complex system-on-chips (SoCs)? This is the focus of a panel of experts at DVCon 2014, which will include Janick Bergeron, Fellow at Synopsys; Jim Caravella, VP of Engineering at NXP; Harry Foster, Chief Verification Technologist at Mentor Graphics; John Goodenough, VP at ARM; Bill Grundmann, Fellow at Xilinx; and Mike Stellfox, Fellow at Cadence. JL Gray, a Senior Architect at Cadence, organized the panel. What follows are position statements from the panelists in preparation for this discussion. – JB
Panel Description: “Did We Create the Verification Gap?”
According to industry experts, the “Verification Gap” between what we need to do and what we are actually able to do to verify large designs is growing worse each year. These experts warn that we must improve our verification methods and tools before verification tasks consume the entire project schedule.
But what if the Verification Gap is actually a result of the continued adoption of industry-standard methods? Are we blindly following industry best practices while losing sight of the fact that the actual point of our efforts is to create a product with as few bugs as possible, not simply to find as many bugs as we can?
To determine the reasons behind today’s Verification Gap, panelists will explore how verification teams interact with broader project teams and examine the characteristics of a typical verification effort, including the wall between design and verification, verification’s involvement (or lack thereof) in the design and architecture phase, and the reliance on constrained random in the absence of robust planning and prioritization.
Grundmann: Here are my key points:
- Methodologies and tools for constructing and implementing hardware have improved dramatically, while verification processes have not kept pace. As hardware construction becomes simpler, the trend is to devote fewer resources to building hardware but the same or more resources to verifying it. Design teams with a 3:1 ratio of verification to hardware-design engineers are not unrealistic, and that ratio is trending higher.
- As it gets easier to build hardware, the resourcing of hardware verification on a project is approaching that of software development.
- As of now, it is very easy to quickly construct all sorts of hardware “crap,” but very hard to prove that any of it is what you want.
- It is possible that we can never be thoroughly verification “clean,” and will instead have to deliver some version of the product with a reasonable quality level of verification. This may mean we have to provide a means to make in-field changes to the products through software-like patches.
Stellfox: Most chips today are built from highly configurable, modular IP cores with many embedded CPUs and a large amount of embedded SW content, and I think a big part of the “verification gap” is that most development flows have not been optimized with this in mind. To address the gap, design and verification teams need to focus more on the following:
- IP teams need to develop and deliver their IP in a way that is more optimized for SoC HW and SW integration. While the IP cores need to be high quality, delivering high-quality IP alone is not sufficient, since much of the work today is spent integrating the IP and enabling earlier SW bring-up and validation.
- There needs to be more focus on integrating and verifying the SW modules with HW blocks early and often, starting at the IP level and moving up through subsystem and SoC (see the sketch after this list). After all, the SW APIs largely determine how the HW can be used in a given application, so time might be wasted “over-verifying” designs for use cases that may not be applicable in a specific product.
- Much of the work in developing a chip is about integrating design IPs, VIPs, and SW, but most companies do not have a systematic, automated approach with supporting infrastructure for this type of development work.
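One small illustration of what such reuse-oriented infrastructure can look like, sketched here under assumed names (the DMA block, its register names, and the sequence are all hypothetical): a UVM sequence written against a register model rather than a bus-specific interface, so the same SW-visible programming steps can be replayed unchanged as the IP moves from block-level to subsystem to SoC verification.

```systemverilog
// Hypothetical sketch: an IP-level programming sequence written
// against a UVM register model. The test supplies the real register
// block (IP-level or SoC-level map) without changing this code.
import uvm_pkg::*;
`include "uvm_macros.svh"

class dma_start_seq extends uvm_sequence;
  `uvm_object_utils(dma_start_seq)

  // Handle to the IP's register model, assigned by the test
  uvm_reg_block regmodel;

  function new(string name = "dma_start_seq");
    super.new(name);
  endfunction

  // Helper: look up a register by name and write it, as driver SW would
  task reg_write(string name, uvm_reg_data_t value);
    uvm_status_e status;
    uvm_reg r = regmodel.get_reg_by_name(name);
    r.write(status, value);
  endtask

  virtual task body();
    // Program the block exactly the way driver software would
    // (register names are invented for illustration):
    reg_write("SRC_ADDR", 32'h1000_0000);
    reg_write("DST_ADDR", 32'h2000_0000);
    reg_write("LEN",      32'd256);
    reg_write("CTRL",     32'h1);  // kick off the transfer
  endtask
endclass
```

Because the sequence depends only on the register abstraction, the same HW/SW programming steps exercise the IP whether the map is serviced by a block-level bus agent or by an embedded CPU at SoC level.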
Foster: No, the industry as a whole did not create the verification challenge. To say it did reflects a lack of understanding of the problem. While design grows at a Moore’s Law rate, verification grows at a double-exponential rate. Compounded with the increased complexity due to Moore’s Law are the additional dimensions of hardware-software interaction validation, complex power-management schemes, and other physical effects that now directly affect functional correctness. Emerging solutions, such as constrained-random, formal property checking, and emulation, didn’t emerge because they were just cool ideas. They emerged to address specific problems. Many design teams are looking for a single hammer that they can use to address today’s verification challenges. Unfortunately, we are dealing with an NP-hard problem, which means there will never be a single solution that solves all classes of problems.
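To make the “emerged to address specific problems” point concrete, here is a minimal constrained-random sketch in SystemVerilog: instead of hand-writing each directed test, constraints describe the legal stimulus space and the solver generates many distinct cases. The transaction fields and constraints are illustrative assumptions, not from any particular design.

```systemverilog
// A minimal constrained-random sketch. The bus transaction and its
// constraints are hypothetical; the point is that the solver, not the
// engineer, enumerates legal stimulus.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  len;
  rand bit        is_write;

  constraint c_addr { addr[1:0] == 2'b00; }   // word-aligned only
  constraint c_len  { len inside {[1:16]}; }  // bounded burst length
endclass

module tb;
  initial begin
    bus_txn t = new();
    repeat (10) begin
      if (!t.randomize()) $fatal(1, "randomize failed");
      $display("addr=%h len=%0d write=%0b", t.addr, t.len, t.is_write);
    end
  end
endmodule
```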
Historically, the industry has always addressed complexity through abstraction (e.g., the move from transistors to gates, the move from gates to RTL, etc.). Strategically, the industry will be forced to move up in abstraction to address today’s challenges. However, there is still a lot of work to be done (in terms of research and tool development) to make this shift in design and verification a reality.
Caravella: The verification gap is a broad topic, so I’m not exactly sure what you’re looking for, but here’s a good guess.
Balancing resources and budgets for a product must be done across much more than just verification.
Jasper: [Editor’s Note: Although not part of the panel, Jasper provided an additional perspective on the verification gap.]
- Customers are realizing that UVM is very “heavy” for IP verification. Specifically, writing and debugging a UVM testbench for block- and unit-level IP is a very time-consuming task in and of itself, and it incurs an ongoing overhead in regressions when the UVCs are effectively “turned off” or simply used as passive monitors for system-level verification. Increasingly, we see customers ditching the low-level UVM testbench and exhaustively verifying their IP with formal-based techniques (see the first sketch after this list). In this way, users can focus on system-integration verification and not have to deal with bugs that should have been caught much sooner.
- Speaking of system-level verification: we see customers applying formal at this level as well. In addition to the now-familiar SoC connectivity and register validation flows (see the second sketch after this list), we see formal replacing simulation in architectural design and analysis. In short, even without any RTL or SystemC, customers can feed an architectural spec into formal under the hood to exhaustively verify that a given architecture or protocol is correct by construction, won’t deadlock, etc.
- The need to share coverage data between multiple vendors’ tool chains is increasing, yet companies appear to be ignoring the UCIS interoperability API. This creates a big gap in customers’ verification closure processes, because comparing verification metrics across multi-vendor flows is a challenge, and they are none too happy about it.
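As a rough illustration of the first point, here is a minimal sketch of block-level formal verification replacing a UVM testbench: a small checker module holding assumptions and assertions is bound onto the RTL and handed to a formal tool for exhaustive proof. The FIFO design and all of its signal names are assumptions for illustration, not any particular IP.

```systemverilog
// Hypothetical checker for a hypothetical fifo module: environment
// assumptions plus a safety property, proven exhaustively by formal.
module fifo_props (
  input logic clk, rst_n,
  input logic push, pop,
  input logic full, empty
);
  // Environment assumptions: no push when full, no pop when empty
  assume property (@(posedge clk) disable iff (!rst_n) full  |-> !push);
  assume property (@(posedge clk) disable iff (!rst_n) empty |-> !pop);

  // Safety property: full and empty can never be asserted together
  assert property (@(posedge clk) disable iff (!rst_n) !(full && empty));
endmodule

// Attach the checker to every instance of the fifo RTL without
// editing the design itself:
bind fifo fifo_props u_fifo_props (.*);
```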
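And for the system-level point, a minimal sketch of the SoC connectivity flow expressed as formal properties: each assertion states that a top-level net is wired through to the intended IP port, which a formal tool can prove structurally rather than by simulating traffic. All hierarchy and net names here are hypothetical.

```systemverilog
// Hypothetical connectivity checks over a hypothetical soc_top
// hierarchy; a formal tool proves each wiring claim exhaustively.
module soc_conn_props (input logic clk);
  // The UART RX pad must reach the UART IP's rx input unmodified
  a_uart_rx: assert property (@(posedge clk)
    soc_top.pad_uart_rx == soc_top.u_uart.rx);

  // The DMA interrupt must land on interrupt-controller input 42
  a_dma_irq: assert property (@(posedge clk)
    soc_top.u_dma.irq == soc_top.u_irq_ctrl.irq_in[42]);
endmodule
```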