
Specialists and Generalists Needed for Verification

Gabe Moretti, Senior Editor

Verification continues to consume a huge portion of the project schedule. Designs are getting more complex, and with complexity comes what appears to be an emerging trend: a split between generalists and specialists. Generalists manage the verification flow and are knowledgeable about simulation and UVM. Specialists with expertise in formal verification, portable stimulus and emulation are deployed when needed. I talked with four specialists in these technologies:

David Kelf, Vice President of Marketing, OneSpin Solutions,

Harry Foster, Chief Scientist Verification, Mentor Graphics,

Lauro Rizzatti, Verification Consultant, Rizzatti LLC, and

Pranav Ashar, CTO, Real Intent

I asked each of them the following questions:

- Is this a real trend or a short-term aberration?

- If it is a real trend, how do we make complex verification tools and methodologies suitable for mainstream verification engineers?

- Are verification tools too complicated for a generalist to become an expert?

David: Electronics design has always had its share of specialists. A good argument could be made that CAD managers were specialists within the IT department, and that the notion of separate verification teams was driven by emerging specialists in testbench automation. Now we are seeing something else: the breakup of verification experts into specialized groups, some project-based, others operating across different projects. With design complexity comes verification complexity. Formal verification and emulation, for example, were once little-used tools, and then only for the most difficult designs. That has changed with the increase in size, complexity and functionality of modern designs.

Formal verification, in particular, found its way into mainstream flows through "apps," where the entire use model is automated and the product is focused on specific high-value verification functions. Formal is also applied manually through hand-written assertions, a task often left to specialist formal users, creating apparently independent groups within companies that may be assigned to different projects. The emergence of these teams, while providing a valuable function, can limit the proliferation of the technology: they become the keepers of the flame, if you like, and the generalist engineers come to rely on them rather than exploring the technology for themselves. This, in turn, limits the growth of the technology and the realization of its full potential as an alternative to simulation.
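To make the "hand-written assertions" point concrete, here is a minimal sketch of the kind of SystemVerilog assertion a formal specialist might write and prove exhaustively. The handshake signals, the four-cycle bound, and the bind target are illustrative assumptions, not taken from any particular flow.

```systemverilog
// Hypothetical handshake rule: every request must be acknowledged
// within 1 to 4 cycles, except while reset is asserted.
module handshake_props (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic ack
);
  property p_req_gets_ack;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] ack;   // a request implies an ack within 4 cycles
  endproperty

  // A formal tool proves this exhaustively; a simulator checks it dynamically.
  a_req_gets_ack: assert property (p_req_gets_ack)
    else $error("req not acknowledged within 4 cycles");
endmodule

// Typically attached to the design non-intrusively, for example:
//   bind dut_top handshake_props u_props (.clk, .rst_n, .req, .ack);
```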

Harry: It’s true, design is getting more complex. However, as an industry, we have done a remarkable job of keeping up, which we can measure by the growth in demand for design engineers. In fact, between 2007 and 2016 the industry went through about four iterations of Moore’s Law, yet the demand for design engineers has grown at only a 3.6 percent compounded annual growth rate.

Figure 1

During this same period, the demand for verification engineers has grown at a 10.4 percent compounded annual growth rate. In other words, verification complexity is growing at a faster rate than design complexity. This should not be too big a surprise since it is generally accepted in the academic world that design complexity grows at a Moore’s Law rate, while verification complexity grows at a much steeper rate (i.e., double exponential).
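A back-of-the-envelope way to see where the "double exponential" comes from (my framing, not Harry's): if the number of state elements grows exponentially with time per Moore's Law, and the reachable state space grows exponentially with the number of state elements, then the state space, a rough proxy for verification effort, grows double-exponentially:

```latex
% n(t): state elements after t years, doubling roughly every two years (Moore's Law)
n(t) = n_0 \cdot 2^{t/2}
% S(t): potential state space, exponential in the number of state elements
S(t) = 2^{\,n(t)} = 2^{\,n_0 \cdot 2^{t/2}}
```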

One contributing factor to growing verification complexity is the emergence of new layers of verification requirements that did not exist years ago. For example, beyond the traditional functional domain, we have added clock domains, power domains, security domains, safety requirements, software, and then obviously, overall performance requirements.

Figure 2

Each of these new layers of requirements requires specialized domain knowledge. Hence, domain expertise is now a necessity in both the design and verification communities to effectively address emerging new layers of requirements.

A one-size-fits-all approach is no longer sufficient to completely verify an SoC. There is a need for specialized tools and methodologies specifically targeted at each of these new (and continually emerging) layers of requirements. Hence, in addition to domain expertise, verification process specialists are required to address growing verification complexity.

The emergence of verification specialization is not a new trend, although it has perhaps become more obvious due to growing verification complexity. For example, to address the famous floating-point bug of the 1990s, it became apparent that theorem proving and other formal technology would be necessary to fill the gaps of traditional simulation-based verification approaches. These techniques demand full-time dedication, and generalists are unlikely to master them because their focus is spread across so many other tools and methodologies. One could make the same argument about the adoption of constrained-random, coverage-driven testbenches using UVM (requiring object-oriented programming skills, which I do not consider generalist skills), emulation, and FPGA prototyping. These technologies have become indispensable in today’s SoC verification and validation tool box, and to get the most out of the project’s investment, specialists are required.
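As a minimal sketch of the object-oriented, constrained-random style Harry is referring to, the fragment below shows a UVM sequence item with a randomization constraint and a functional-coverage group; the class name, fields, and address window are illustrative assumptions, not from any specific project.

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// Hypothetical bus transaction: fields are randomized under constraints,
// and coverage measures which combinations the random stimulus actually hit.
class bus_txn extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand bit        write;

  // Keep addresses inside an assumed peripheral window.
  constraint addr_range_c { addr inside {[32'h4000_0000 : 32'h4000_FFFF]}; }

  `uvm_object_utils_begin(bus_txn)
    `uvm_field_int(addr,  UVM_ALL_ON)
    `uvm_field_int(data,  UVM_ALL_ON)
    `uvm_field_int(write, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "bus_txn");
    super.new(name);
  endfunction
endclass

// Functional coverage, typically sampled by a monitor component.
covergroup bus_cov with function sample(bus_txn t);
  cp_write : coverpoint t.write;
  cp_page  : coverpoint t.addr[15:12];
endgroup
```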

So the question is: how do we make complex tools and methodologies suitable for mainstream verification engineers? We are addressing this issue today by developing verification apps that solve a specific, narrowly focused problem and require minimal tool and methodology expertise. For example, numerous formal apps have emerged that span a wide spectrum of the design process, from IP development to post-silicon validation. These apps no longer require the user to write assertions or be an expert in formal techniques. In fact, the formal engines are often hidden from the user, who then focuses on "what" they want to verify versus the "how." A few examples include: connectivity check, used during IP integration; register check, used to exhaustively verify control and status register behavior against its CSV or IP-XACT register specification; and security check, used to exhaustively verify that only the paths you specify can reach security- or safety-critical storage elements. Perhaps one of the best-known formal apps is clock-domain crossing (CDC) checking, which is used to identify metastability issues due to the interaction of multiple clock domains.
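To illustrate the "what, not how" point, the fragment below shows the kind of point-to-point property a connectivity-check app might generate internally from a CSV or IP-XACT connection specification, so the user never writes it by hand; the hierarchical paths and signal names are hypothetical.

```systemverilog
// Auto-generated style check: "irq_out of IP block A must drive bit 3 of
// the interrupt controller's irq_in bus." The user supplies only the
// spreadsheet row; the app emits and proves properties like this one.
module conn_check (input logic clk);
  a_irq_connectivity: assert property (@(posedge clk)
    soc_top.u_ip_a.irq_out == soc_top.u_intc.irq_in[3]);
endmodule
```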

Emulation is another area where we are seeing the emergence of verification apps. One example is deterministic ICE, which overcomes the unpredictability of traditional ICE environments by adding 100 percent visibility and repeatability for debugging, and which provides access to other "virtual-based" use models. Another is the DFT emulation app, which accelerates Design for Test (DFT) verification prior to tape-out to minimize the risk of catastrophic failure, while significantly reducing run times when verifying designs after DFT insertion.

In summary, the need for verification specialists today is driven by two demands: (1) specialized domain knowledge driven by new layers of verification requirements, and (2) verification tool and methodology expertise. This is not a bad thing. If I had a brain aneurysm, I would prefer a doctor who has mastered the required skills in endoscopy and other neurosurgical techniques over a general practitioner with a broad set of skills. Don’t get me wrong, both are required.

Lauro: In my mind, it is a trend, but the distinction may blur its contours soon. Take hardware emulation. It has always required specialists for its deployment and, even more so, to exploit its full capacity. As they used to say, it came with a team of application engineers in the box, to keep the time-to-emulation from exceeding the time to first silicon. Today, hardware emulation is still a long way from being a plug-and-play verification tool, but recent developments by emulation vendors are making it easier for generalists to use and deploy. The move from the in-circuit-emulation (ICE) mode, driven by a physical target system, to transaction-based communication, driven by a virtual testbed, gives it the status of a datacenter resource available to all types of verification engineers without specialist intervention. I see that as a huge step forward in the evolution of hardware emulation and its role in the design verification flow.

Pranav: The generalist vs. specialist discussion fits right into the shifting paradigm in which generic verification tools are being replaced by tools that are essentially verification solutions for specific failure modes.

The failure modes addressed in this manner are typically due to intricate phenomena that are hard to specify and model in simulation or general-purpose Assertion-Based Verification (ABV), hard for a simulator or unguided ABV tool to resolve, increasingly likely to occur as SOC size and integration complexity grow, and often insidious or hard to isolate. Such failure modes are a common cause of respins and redesign, with the result that sign-off and bug-hunting for them based on solution-oriented tools has become ubiquitous in the design community.

Good examples are failures caused by untimed paths on an SOC, common sources of which are asynchronous clock-domain crossings, interacting reset domains and Static Timing Analysis (STA) exceptions. It has become common practice to address these scenarios using solution-oriented verification tools.
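As a concrete example of the first failure source Pranav lists, the sketch below shows the canonical two-flop synchronizer for a single-bit signal crossing into a new clock domain; this is the sort of structure (and its correct usage) that CDC sign-off tools check for automatically. Module and signal names are illustrative.

```systemverilog
// Two-flop synchronizer: brings an asynchronous single-bit signal safely
// into the clk_dst domain. Multi-bit buses need different schemes
// (gray coding, handshakes, or asynchronous FIFOs).
module sync_2ff (
  input  logic clk_dst,   // destination-domain clock
  input  logic rst_n,     // destination-domain reset, active low
  input  logic d_async,   // launched from another (asynchronous) clock domain
  output logic q_sync     // safe to consume in the clk_dst domain
);
  logic meta;             // first stage may go metastable; never use it directly

  always_ff @(posedge clk_dst or negedge rst_n) begin
    if (!rst_n) begin
      meta   <= 1'b0;
      q_sync <= 1'b0;
    end else begin
      meta   <= d_async;  // may capture a metastable value
      q_sync <= meta;     // extra cycle lets metastability resolve
    end
  end
endmodule
```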

In the absence of recent advances by EDA companies in developing solution-oriented verification tools, SOC design houses would have been reliant on in-house design verification (DV) specialists to develop and maintain homegrown strategies for complex failure modes. In the new paradigm, the bias has shifted back toward the generalist DV engineer with the heavy lifting being done by an EDA tool. The salutary outcome of this trend for design houses is that the verification of SOCs for these complex failures is now more accessible, more automatic, more robust, and cheaper.

My Conclusions

It is hard to disagree with the comments by my interlocutors. Everything said is true. But I think they have been too kind and simply answered the questions without objecting to their limitations. In fact, the way to simplify verification is to improve the way circuits are designed. What is missing from design methodology is validation of what has been implemented before it is deemed ready for verification. Designers are so pressed for time, due to design complexity and short schedules, that they must find ways to cut corners. They reuse whenever possible and rely on their experience to assume that the circuit does what it is supposed to do. Unfortunately, in most cases where a bug is found during design integration, they have neglected to check that the circuit does not do what it is not supposed to do. That is not always the fault of EDA tools. The most glaring example is the choice by the electronics industry to use Verilog over VHDL. VHDL is a much more robust language, with built-in checks that exclude design errors that can easily be made in Verilog. But VHDL takes longer to write, and design engineers decided that schedule time took precedence over error avoidance.
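A small illustration of the kind of silent behavior I have in mind; the module and signal names are mine. Verilog quietly truncates the assignment below (most tools issue at best a warning), whereas the equivalent VHDL assignment between vectors of different lengths is flagged as an error rather than silently accepted.

```systemverilog
// Width mismatch: Verilog drops the upper 8 bits of wide_result without
// complaint, which can hide a real functional bug until verification
// (or silicon). VHDL's stricter typing refuses the equivalent assignment.
module trunc_example (
  input  logic [15:0] wide_result,
  output logic [7:0]  narrow_out
);
  assign narrow_out = wide_result;  // upper bits silently discarded
endmodule
```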

The issue is always the same, no matter how simple or complex the design is: the individual’s self-assurance that he or she knows what he or she is doing. The way to make designs easier to verify is to create them better. That means the design should be semantically correct, and the implementation of all required features should be completely validated by the designers themselves before the design is handed to a verification engineer.

I do not think that I have just demanded that a design engineer also be a verification engineer. What may be required is a UDM: a Unified Design Methodology. The industry is, maybe unconsciously, already moving in that direction in two ways: the increased use of third-party IP and the growing volume of design rules issued by each foundry. I can see these two trends becoming more pronounced with each successive technology iteration; it is time to stop ignoring them.

