Published in Winter 2011 issue of Chip Design Magazine
Ask almost any engineer who uses a verification tool for an assessment of it and the answer will be the same: big promises, inadequate results, and loads of frustration. Many an electronic-design-automation (EDA) company's R&D team toils away in earnest, working on the next breakthrough technology to tackle the verification challenge. But these so-called breakthroughs always seem to be missing something. That elusive, practical solution to the verification challenge needs to be cost effective while quickly delivering quality results. Ease of use would be ideal as well. To the EDA vendor, this combination may seem like an impossibility. To the verification engineer, however, doing without any one of these aspects is no longer an option—especially when verification continues to devour 70% of an ever-shrinking project cycle.
A bit of chip design history may be needed here to show that EDA has attempted through the years to provide the right solution for the job. Breadboard emulators were all the rage in the early 1980s and are still in use for some applications. When integrated circuits (ICs) had gates from a few hundred to a few thousand, a breadboard emulator could verify and debug a design in its target system environment before silicon was fabricated. Because the design was tested under real operating conditions, functional correctness was assured. Once a chip had more than about 10,000 gates, however, breadboarding became impractical.
Event-driven simulation came next. This tool is still used today at the register transfer level (RTL) because it allows accurate functional and timing verification. Simulators are easy to use. In addition, they provide the best debugging capabilities of any verification solution and can be purchased at commodity pricing when acquired in large quantity. However, they lose their effectiveness on chip designs at the gate level or when design sizes reach into the tens of millions of application-specific-integrated-circuit (ASIC)-equivalent gates. Because of poor runtime performance, they aren't useful for testing a design in its target system either. They can exercise only small fractions of device behavior, which results in functional failures going undetected and—worse yet—costly design respins.
Fortunately, EDA has made great strides in the last 10 years, with more to come. Many technologies have narrowed the gap between engineering goals and the results of the conventional logic-verification blueprint, which uses software simulation driven by hardware-description-language (HDL) testbenches. For example, formal or static verification methods don't require test vectors. While they're effective for what they check, they don't exercise design functionality. Dynamic testing driven by testbenches remains essential because embedded software, such as drivers, real-time operating systems (RTOSs), and custom applications, must be tested in operation.
Test languages and C/C++ libraries of test functions have improved productivity through the sheer number of tests generated. In addition, functional-verification coverage tools have increased confidence in the testbenches produced with hardware verification languages (HVLs). However, neither addresses how to reduce the amount of time needed to apply those tests. Another issue is that they cannot be used when developing embedded software—an increasingly important verification challenge.
That's one of the reasons why hardware-assisted verification has become a popular, must-have solution for system-on-a-chip (SoC) designs. Hardware-assisted verification tools, which include hardware emulation, have been around since the early days of EDA. They have developed a reputation for being expensive, slow, and hard to use. The latest generation of hardware emulation is quickly changing that perception, however. These tools are now easy to use and more cost effective while quickly delivering quality results. Also, these hardware emulators can be used by both the embedded-software team and hardware designers. In fact, hardware emulation is considered a universal verification tool. It's even becoming increasingly popular as a solution to the runtime problems associated with event-based simulation. Hardware emulators have a smaller footprint, thereby saving space, power, and infrastructure costs. They can execute at several megahertz, or speeds close to real time, making them suitable for in-circuit test. Meanwhile, EDA continues to innovate and develop new tools and methodologies to tackle the verification challenge.
Lauro Rizzatti is general manager of EVE USA. He has more than 30 years of experience in the EDA and ATE industries in roles spanning top management, product marketing, technical marketing, and engineering. lauro@eve-team.com, www.eve-team.com