Published on October 28th, 2010
Chip complexity is driving verification requirements. As process nodes shrink and complexity grows, the dominant problem engineering teams face lies not in creating new designs, but in creating them correctly, on time, and with manageable risk. The ability to develop new products is no longer bounded just by the ability to design them, but even more so by the ability to verify them. As the industry moves beyond 100-million-gate designs, functional verification has become the most critical and difficult phase of the design flow.
Verification is both a complex and an unbounded process. In today’s designs, it is impossible to determine when every single function or use case has been verified. The verification space is growing exponentially and hence demands progressively more resources, in terms of both compute power and engineering staff. One of the most difficult aspects of functional verification is that there is no clear indication of when verification is “done.” In fact, it is well understood that verification is never truly complete; instead, verification teams must ensure that an acceptable level of coverage has been reached as a measure of confidence in the design. The effectiveness of a verification environment is therefore measured by how quickly verification closure can be reached.
The verification challenge is further compounded by the requirement of verifying a combination of internal and external IP at different levels of abstraction: digital, analog and mixed-signal, software, black-box design and verification IP—in a globally distributed design and verification environment.
Cost of Verification
The nonlinear growth in verification complexity has put a tremendous strain on compute infrastructures for chip development. Compute farms used for verification often range from thousands to tens of thousands of servers, with annual operating and depreciation expense in the millions of dollars. Long verification runs and extremely large projects with terabytes of verification data can strain IT infrastructures, which are typically configured for more general corporate requirements. According to IBS, the cost of verification is growing at an exponential rate and already makes up the highest portion of the overall design cost.
Overall Design Cost Breakdown
As designs have grown larger, their functionality has also grown more complicated. Today’s verification teams must create more tests to verify more logic, but the tests themselves are also becoming more complex. An IC’s blocks may perform many different functions, and many ICs include arrays of replicated blocks—often even multiple processing units. The number of external and internal interfaces has increased, thereby increasing the complex interactions that must be considered in the overall verification strategy. Verification of complex power management schemes adds even greater levels of complexity to the verification process. These power management schemes often require specific and involved interactions between portions of the chip as well as intricate handling of voltage islands and dynamic voltage scaling.
Further integration of functions on individual chips means that many designs now include both analog and digital blocks within the same package. These must be verified at the same time to ensure that the entire design works, not just the individual blocks. The increased use of processor-based devices and systems on chips (SoCs) means that hardware and software must be verified together: software development must begin before silicon is available, and silicon must be verified running the actual target software.
As design sizes, design complexity and integration of functions increase, cost and time to debug also increase significantly. This increase includes the time needed for the verification environment to prepare and output debug information, as well as the time consumed by the debug process itself. The time and cost associated with debug is the largest single portion of the entire verification process.
Verification Requirements Faced by Industry Leading Designs
Leading-edge products tend to combine many of the complexity drivers mentioned above on the same IC. For example, today’s leading-edge low power microprocessors have more than 100 million gates combined with complex power management schemes and typically require several billion simulation cycles per week to meet the verification goals. In order to meet these objectives, a high-capacity, high-performance verification solution with advanced low power verification capabilities is required.
An advanced networking SoC, as another example of a leading-edge product, would combine a 40+ million gate design with very complex subsystem interactions and board/software integration. To verify this design, a high-capacity verification solution and a powerful verification methodology are required, both of which must be tightly integrated with a virtual-prototyping environment.
Similarly, a mobile SoC would combine several complexity drivers, for example, 25 voltage islands, complex subsystem interactions, and RF blocks, along with several million lines of software running on the SoC. To sufficiently verify such a device, the verification environment must include an advanced methodology for low power verification and a mixed-signal simulation solution, all integrated with FPGA-based and virtual prototyping.
Today’s Verification Solution for Industry Leaders
Because industry leaders are typically the first to tackle design complexity issues, it is imperative that they deploy a verification solution they can rely on to meet all of the verification requirements dictated by the complexity drivers described previously. Only such a solution enables them to confidently manage risk and effectively control overall verification cost.
In order to meet the verification requirements of industry leaders, a verification solution must offer both superior verification technology and leading-edge innovations that address advanced challenges. This verification solution must provide a scalable architecture to accommodate the performance and capacity needs of leading-edge designs, and to enable technology additions as leading-edge customers encounter never-before-seen challenges. It must also provide a complete portfolio of verification automation applications, such as planning, coverage, constraint solving, and debug, as well as low power verification capabilities, verification IP and an advanced verification methodology. The verification solution must also support multiple languages and methodologies, and offer tightly integrated interfaces with analog/mixed-signal as well as virtual platform capabilities.
Ultimately, industry leaders require leading-edge products that benefit from constant innovation. It is critical for the right verification solution to be supported by a forward-thinking and customer-focused team that embraces a collaborative approach to consistently innovate and add value to the users’ verification process.
Michael Sanie is director of verification product marketing at Synopsys. He has more than 20 years of experience in semiconductor design and design software. Prior to Synopsys, Sanie held executive and senior marketing positions at Calypto, Cadence and Numerical Technologies. Sanie started his career as a design engineer at VLSI Technology and holds four patents in design software. He holds BSEE and MSEE degrees from Purdue University and an MBA from Santa Clara University.