Published in October / November 2007 issue of Chip Design Magazine
Future Verification Appears Uncertain
The EDA market is struggling to solve new verification challenges.
The EDA industry has faced similar challenges in the past, typically by drawing upon the innovations of smaller EDA firms. But according to Rich Faris, Vice President of Marketing and Business Development at Real Intent Inc., a 25-person verification-focused EDA startup, changes in the business models for EDA vendors and a hostile business environment may kill the entrepreneurial goose that's been laying the golden eggs of innovation. As Faris puts it, "There are some interesting trends that verification companies have to overcome to be successful."
THE TECHNICAL CHALLENGE
For at least a decade, verification has been an extraordinarily stable segment of the EDA market. After all, checking functionality at the register transfer level (RTL) was a well-understood problem. Even as the number of gates inside the average chip doubled and redoubled, EDA vendors continued to ride the curve through two familiar means: standardization and performance improvements.
For example, standardization around SystemVerilog has made it possible for both large and small EDA firms to add incremental value to the verification process, states Robert Hum, Vice President and General Manager of the Design Verification and Test Division at Mentor Graphics. "SystemVerilog has established itself as a viable platform, which has been of enormous benefit to the EDA industry," he explains. Similarly, a great deal of development effort has gone into performance improvements to help verification keep pace with the increased demands of smaller geometries. Borgstrom cites the example of FastSPICE, which can run 50 times faster than the original SPICE analog verification tool. That speed allows designers to perform multiple verification runs to test for different problems that might occur in the chip.
According to Smith, however, the linear improvements delivered by standardization and faster tools won't be sufficient to solve the verification challenges of the future. The reason is that the nature of chip design is about to change radically. "The problem is that we're moving away from a design model based upon RTL to a design model based upon ESL [electronic system level]. That means that today's verification techniques, based largely on RTL, are destined to become obsolete."
Figure 1: A virtual platform is depicted here. (courtesy Synopsys)
One culprit is the system-on-a-chip (SoC). By forcing designers to address software as well as hardware, it raises the conceptual design above the RTL. "The industry learned rather quickly that it's entirely possible to design an SoC that is 100% verified at the RTL but won't run the software as intended," explains Faris. Another culprit is the increased use of semiconductor IP to make massive designs more manageable. "IP introduces an additional level of abstraction in the design that makes traditional verification less effective," says Borgstrom.
According to Smith, changes in the larger electronics market also are demanding new verification techniques. He cites the example of the first third-generation (3G) cell phones introduced in Europe. "They were good at warming your ears for the 15 minutes of up time that they supported," he quips. It turns out that the only way to build phones that don't gobble power is to design chips with circuits that consume different levels of power depending upon what they need at any one point in time. "There's no way to represent variable power models in RTL. Therefore, it's essentially impossible to verify that function using standard verification tools," explains Smith.
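The property Smith describes is easy to state abstractly: power draw depends on which operating state the chip is in, so verification must reason about traces of states over time rather than pure logic. The Python sketch below is purely illustrative; the state names, power figures, and energy budget are all invented and bear no relation to any real phone chip.

```python
# Hypothetical sketch: a chip modeled as a set of power states, plus a check
# that a usage scenario stays within an energy budget. All names and numbers
# here are invented for illustration.

POWER_MW = {"sleep": 1, "idle": 20, "radio_rx": 250, "radio_tx": 400}

def energy_mj(trace):
    """Sum energy (millijoules) over a trace of (state, milliseconds) pairs."""
    return sum(POWER_MW[state] * ms / 1000.0 for state, ms in trace)

def within_budget(trace, budget_mj):
    """The verification question: does this scenario fit the energy budget?"""
    return energy_mj(trace) <= budget_mj

# A 10-second scenario: mostly asleep, with brief bursts of radio activity.
scenario = [("sleep", 9000), ("idle", 500), ("radio_rx", 400), ("radio_tx", 100)]
print(energy_mj(scenario))  # 159.0 millijoules for the window
```

In a real flow this kind of power intent is captured in dedicated formats rather than ad hoc scripts, but the sketch shows why an RTL-only view, which has no notion of per-state power draw, cannot express the check.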
As usual, EDA vendors have been hard at work trying to solve these knotty verification problems. Synopsys, for example, has developed what it calls a "virtual platform." That platform allows higher-level functionality to be verified through a simulation of how the end-user device will behave (see Figure 1). "It's a software model of a complete system that includes the elements that would go into your end device," explains Borgstrom.
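The concept behind such a platform can be sketched simply, even though the real product is vastly more elaborate. In the Python toy below, the platform, the device model, and the "firmware" are all invented for illustration; the point is only that software can run against software models of hardware long before silicon exists.

```python
# Toy illustration of the virtual-platform idea: embedded software executes
# against software models of the hardware it will eventually control.
# The device, register map, and program are invented for illustration.

class UartModel:
    """A minimal model of a serial port with one memory-mapped transmit register."""
    TX_REG = 0x00

    def __init__(self):
        self.output = []

    def write(self, addr, value):
        if addr == self.TX_REG:
            self.output.append(chr(value))

class VirtualPlatform:
    """Dispatches bus writes from the 'software' to the device models."""
    def __init__(self):
        self.uart = UartModel()

    def bus_write(self, addr, value):
        self.uart.write(addr, value)

def firmware_hello(platform):
    # The 'embedded software' under test: print a string via the modeled UART.
    for ch in "hello":
        platform.bus_write(UartModel.TX_REG, ord(ch))

vp = VirtualPlatform()
firmware_hello(vp)
print("".join(vp.uart.output))  # prints "hello"
```

Because the platform is pure software, the firmware's behavior can be observed and checked without any physical hardware, which is what lets hardware/software interaction be verified above the RTL.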
Similarly, EDA vendors are creating higher-level tools that enable verification at the IP level. "Mentor has developed `assertion' tools that test whether an IP block is used correctly inside a given design as well as `coverage-based verification,' which can produce consolidated reports about how much of the chip can be verified against the original design spec," he explains.
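Both techniques can be illustrated abstractly. In the Python sketch below, the interface rule (a request must be acknowledged within two cycles), the signal names, and the trace are all invented; real assertion and coverage tooling operates on hardware description languages such as SystemVerilog, not Python.

```python
# Hypothetical sketch of assertion checking plus coverage collection over a
# cycle-by-cycle trace of an IP block's interface. The protocol rule and
# signal names are invented for illustration.

MAX_LATENCY = 2  # assumed rule: every req must see an ack within 2 cycles

def check_req_ack(trace):
    """Return (violations, coverage) for the req/ack rule.
    violations lists cycles whose req was never acknowledged in time;
    coverage counts how often each legal latency was actually observed."""
    violations = []
    coverage = {lat: 0 for lat in range(MAX_LATENCY + 1)}
    for i, cycle in enumerate(trace):
        if cycle.get("req"):
            window = trace[i : i + MAX_LATENCY + 1]
            lat = next((j for j, c in enumerate(window) if c.get("ack")), None)
            if lat is None:
                violations.append(i)   # assertion failure: misused IP block
            else:
                coverage[lat] += 1     # coverage: this latency was exercised
    return violations, coverage

trace = [{"req": 1, "ack": 1}, {}, {"req": 1}, {}, {"ack": 1}, {"req": 1}, {}, {}]
print(check_req_ack(trace))
```

The assertion flags misuse of the block, while the coverage table reports which latencies the tests actually exercised; together they approximate the "how much of the design did we really check?" question the consolidated reports answer.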
Other problems, such as verifying circuitry in a variable-power environment, have yet to be fully solved. "As software and hardware design merge, there will be a need for a wider variety of verification tools as well as a higher-level methodology that defines the entire verification process," says Smith. He points out that the large EDA vendors have failed to come up with such a methodology. As usual, they're depending upon innovation in smaller EDA firms to take up the slack. "The big vendors are always looking to acquire startups working on innovative techniques."
And there's the rub, comments Real Intent's Faris. Startup firms have been the innovation engine of the EDA industry. But it's not clear whether that will remain the case with the demand for new verification tools.
NEW GAME RULES
That engine may sputter and even perhaps die because of changes in the way that the large EDA firms do business in conjunction with a general business climate that's more hostile to small companies. For example, traditional EDA vendors are now in competition with the FPGA chip vendors that bundle their verification software into the price of each chip. "Because FPGA vendors supply software for a very low cost along with the sale of FPGA devices, it appears to be extremely inexpensive, which devalues software by any outside parties that address the problem in non-FPGA design projects," explains Faris.
In addition, the large EDA vendors have vigorously promoted product bundling, which allows a customer to combine different products as needed for a fixed price. This approach encourages designers to view verification software as "free." Once a design firm has purchased the bundle, there is typically little or no additional cost to mix in any tool. "When looking at the purchase of a given point tool," Faris continues, "it appears to the design firm that they're getting that point tool from the big vendor for free, which forces startups to offer their products at a lower price point than otherwise might be the case."
The net effect of all of this "free" verification software is a disincentive for smaller firms to enter the verification space. "You need to have products that are an order of magnitude better than those from other firms. And even then, you can't command a premium price," Faris explains. Indeed, the need for good verification software has been growing. Yet the revenue paid for verification software has remained roughly proportional to the rest of the EDA market, rather than showing the growth spurt that the increased demand would suggest (see Figure 2).
Figure 2: Quarterly revenue is shown according to different EDA market segments. (courtesy EDAC)
To make matters worse, verification startups now face a business climate that's more hostile to small firms. Traditionally, EDA firms have had the option of an IPO to raise capital to fund expansion. Today, however, firms seeking IPO dollars must first undergo a Sarbanes-Oxley audit. That audit can cost as much as $1.5 million even for a small firm, states Andrew Yang, CEO of Apache Design Systems. "The larger vendors are using Sarbanes to intimidate smaller firms so that their only viable growth path is seen as being acquired by one of the big three," he warns.
The result is a business climate that may discourage some EDA entrepreneurs from entering the verification segment. Such reluctance could adversely impact the ability of semiconductor firms to overcome the verification complexity that is inherent in smaller chip geometries. Those small geometries will abound in the plethora of products that are destined for the consumer-electronics market.
While Smith believes that the EDA industry will eventually produce the required tools, he sees a market that's very much in turmoil. "We're still in the middle of the transition from one level of verification to another and it's not at all clear if the transformation might not change the landscape of the EDA market," he says. One thing is certain, though. It will be a long time before anybody thinks of verification as the industry's "sleepy backwater" area again.