WHERE'S THE VALUE IN DFM? D OR M?
The term design-for-manufacturing (DFM) has come to mean many different things of late, as numerous EDA providers jump on and off the so-called DFM bandwagon. In its simplest definition, DFM refers to a suite of tools that enable the creation of a design that can be manufactured. By this definition, all EDA tools can be labeled DFM. After all, I don't know of any tools that create designs with the goal that they shouldn't be manufactured!
While most of the new DFM-tool enhancements have value, it seems that the current emphasis has been on the interface between the final design layout and the manufacturing process. While this link is critical and must be effective, it has very limited influence on the creative process that I typically consider to be "design." By my definition, design involves making decisions based on analysis, experience, and instinct. As a design goes through each transformation and gets closer to the manufacturing process, the design choices that are available become very limited. For example, there are layout rules that must be adhered to in order to avoid shorts and opens. In addition, minor layout corrections must be made to ensure that the mask patterns can be printed. To me, however, making sure that the design can be "made" is not really "designing."
The promise that the term DFM implies--and the one bought into by many VCs--is that tools will become available to enable the early "design" steps. These tools would account for the underlying manufacturing process so that the design gets into volume production sooner and with better yield. In addition, the design should be of higher quality in terms of all the key metrics like timing, power, leakage, and area.
This may be somewhat of a Holy Grail. In order to be able to influence a design and make it less subject to the vagaries of the manufacturing process, however, it's obvious that the process needs to be changed. Specifically, it needs to be abstracted into a form that can be easily digested by the tools that are used early in the design process, such as logic and physical synthesis.
Although many manufacturing effects can be predicted and analyzed, it is often too late in the design process. Typically, they are dependent on the final physical layout. For example, the lithography impact on a particular cell instance's timing won't be fully known until the context of that cell instance is known. This won't happen until the cell instance is placed and routed. Yet the design decisions about the size and type of cell instances to use and their related neighbors have to be made before the cell is placed. Any analysis and fixing to account for lithography-induced delay only has meaning if the fix can be localized and doesn't result in a complete new set of instances.
While it is necessary to check for potential manufacturing showstoppers throughout the design flow, what is really needed are abstract models that are based on actual process behavior. Statistical cell models offer this promise. They can be derived by characterizing cells using statistical device models--SPICE models based on and refined by actual silicon measurements. Statistical cell models can be used to represent any manufacturing variation, whether it's systematic or random, on or off chip, and correlated or uncorrelated.
The process for creating statistical cell models is relatively straightforward and no more complex than traditional cell characterization. The key technical challenges are turnaround time and accuracy. Each major parameter that affects the cell's electrical characteristics needs to be exercised. In addition, its impact must be tabulated. This will result in a 1X to 2X increase in runtime per parameter for systematic variation. Random variation places an even greater burden on runtime, as the impact of each random parameter (for example, voltage threshold) needs to be modeled for each transistor within a cell. On average, there are 25 transistors per cell in a typical library. This could mean at least a 25X increase in runtime.
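The runtime figures quoted above can be combined into a back-of-the-envelope estimate. This sketch is purely illustrative; the count of systematic parameters is an assumption, and the per-transistor figure is the 25-transistor library average cited in the text:

```python
# Rough cost model for statistical cell characterization, relative to one
# nominal characterization pass. All constants are illustrative assumptions
# except where noted.

NOMINAL_RUNS = 1              # baseline nominal characterization
SYSTEMATIC_PARAMS = 4         # assumed count of systematic parameters
RUNS_PER_SYSTEMATIC = 2       # upper end of the 1X-2X extra runtime per parameter
TRANSISTORS_PER_CELL = 25     # average per cell cited for a typical library
RANDOM_PARAMS_PER_DEVICE = 1  # e.g. voltage threshold, modeled per transistor

def relative_runtime():
    """Total characterization runs relative to a single nominal pass."""
    systematic = SYSTEMATIC_PARAMS * RUNS_PER_SYSTEMATIC
    random_var = TRANSISTORS_PER_CELL * RANDOM_PARAMS_PER_DEVICE
    return NOMINAL_RUNS + systematic + random_var

print(relative_runtime())  # 34
```

Even with modest assumptions, the per-transistor random terms dominate the total, which is why they motivate the faster characterization method described next.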
Thankfully, there's a new method for dramatically reducing characterization runtime. By taking an "inside view" of each cell and understanding the cell's electrical internals, nominal characterization runtime can be sped up by an order of magnitude. Accurate statistical cell modeling can be achieved with runtimes that are similar to those taken by existing characterization methods for nominal characterization. Model accuracy can be assured via validation against Monte Carlo simulations.
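The Monte Carlo validation mentioned above can be illustrated with a toy model. Here a cell's delay is assumed to respond linearly to independent per-transistor threshold-voltage shifts, so the statistical model's sigma is the quadrature sum of the individual contributions; the sensitivities, sigma, and nominal delay are invented for illustration, not taken from any real library:

```python
# Toy check of a linear statistical delay model against Monte Carlo sampling.
# All numbers below are illustrative assumptions.
import math
import random

random.seed(0)
SIGMA_VTH = 0.005                 # per-transistor Vth sigma in volts (assumed)
SENS = [2.0, 1.5, 0.8, 0.8, 0.4]  # delay sensitivity per transistor, ps/V (assumed)
NOMINAL_DELAY = 50.0              # nominal cell delay in ps (assumed)

# Statistical model: independent random contributions add in quadrature.
model_sigma = math.sqrt(sum((s * SIGMA_VTH) ** 2 for s in SENS))

# Monte Carlo: sample each transistor's Vth shift and accumulate the delay.
samples = []
for _ in range(20000):
    delay = NOMINAL_DELAY + sum(s * random.gauss(0.0, SIGMA_VTH) for s in SENS)
    samples.append(delay)

mean = sum(samples) / len(samples)
mc_sigma = math.sqrt(sum((d - mean) ** 2 for d in samples) / len(samples))

print(f"model sigma = {model_sigma:.4f} ps, MC sigma = {mc_sigma:.4f} ps")
```

When the two sigmas agree, the cheap closed-form model can stand in for the expensive sampling run; that agreement is the kind of validation the text describes.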
Armed with statistical models, an optimization tool should be able to generate a design that meets a realistic performance/yield tradeoff. Such an approach is better than forcing the design choices into an unrealistic corner, where yield is good but the cost is too high in terms of area, power, and leakage. As the underlying process evolves and matures, the statistical cell models can be easily kept in sync with the process via re-characterization. As a design evolves and more explicit layout and manufacturing information becomes known, statistical parameter terms can be easily disabled and replaced with deterministic counterparts. Consequently, statistical and deterministic models can co-exist to give the best appropriate representation of the process for each step in the design flow.
In conclusion, only by abstracting process variations into statistical cell models and deploying these models throughout the design flow will the true promise of "design" for manufacturing become a reality.
Jim McCanny, CEO & founder of Altos Design Automation
Prior to Altos, Jim was the Timing and Signal Integrity Marketing Group Director at Cadence. He was the VP of Marketing and Business Development at CadMOS when it was acquired by Cadence in 2001. Before CadMOS, Jim was Executive VP at Ultima Interconnect Technology (which, as Celestry, was acquired by Cadence in 2003), Major Account Technical Program Account Manager at EPIC Design Technology (which IPO'ed in 1994), and a Member of Group Technical Staff at Texas Instruments. Jim holds a BS in Math/Computer Science from Manchester, UK and has over 25 years' experience in EDA.
WHERE DOES DFM GO FROM HERE?
DFM has not been the EDA-market-segment panacea that many expected. Last year was especially difficult. Industry pundit John Cooley declared DFM to be on a deathwatch and DFM pioneer Aprio underwent layoffs that reached as far as its CEO.
The fact of the matter is that the DFM picture isn't as bleak as many have painted it. In fact, DFM's future is quite promising. DFM will hit its stride and add unique value with manufacturer-intended DFM vendor tools that target the manufacturers themselves, such as foundries and IDMs. These manufacturer-intended DFM products are used for OPC/RET generation and verification. They've been selling well across the board as more and more IDMs and foundries migrate to 130-nm and below process nodes, which mandate such tool usage.
DFM issues will become increasingly critical as we hit 45 nm, 32 nm, and below. At advanced processes, we're seeing that design-induced systematic yield loss is becoming more dominant than random yield loss. At the moment, most DFM-tool-development efforts are focused on injecting process information into the design flow with IC designers as the targeted users. This approach hasn't translated well into revenue, however. In fact, the result has been dismal. Designers are choosing not to spend their EDA-tool budgets on these designer-intended DFM tools.
There are some major obstacles to the adoption of designer-intended DFM tools. Design closure on timing and power for advanced technology nodes is already tough to attain. It consumes huge amounts of a designer's time. It will be difficult to convince individual users to add an additional DFM layer to their design unless they cannot reach design closure without it or are mandated to use it by their semiconductor suppliers. In addition, most IC designers are apathetic or even resistant to learning how to design with IC processes and fabrication in mind.
Those designers (or their managers) who do see the need for DFM don't want their design flows to be upset with radical changes to the methodology. Unless designers can't get to design closure, we won't see them moving into DFM-oriented EDA flows anytime soon. Of course, these design flows come from the large, full-line EDA vendors.
Given this market reality, the DFM startups are compelled to gamble their limited resources to build various interfaces for major EDA vendors' design flows. This trend significantly increases product development costs, which have to be added on top of basic technology and product R&D costs. One observation regarding market adoption: At 65 nm, no foundry has made designer-intended DFM tool usage mandatory at tapeout signoff. Therefore, designer-intended DFM tools will move along at a slow adoption rate.
The market situation is quite different--and quite optimistic--for manufacturer-intended DFM tools like OPC/PSM, which were widely adopted years ago for process nodes at 0.18 um, 0.13 um, 90 nm, and below. Long before designers started their 65-nm projects (with or without designer-intended DFM tools), process people at foundries and IDMs were already hard at work developing 45- and even 32-nm process nodes. At these processes, there is a tremendous need to address newly identified manufacturing issues in the patterning loop, among other issues. Semiconductor equipment suppliers will have to get on board. After all, they're developing machines for these future processes precisely because of market imperatives.
Finally, manufacturers like foundries and IDMs are willing to deploy manufacturer-intended DFM tools even though foundries are reluctant to turn over process data to DFM-tool developers due to proprietary considerations. These manufacturer-intended DFM tools can offer initial streams of revenue to the DFM-tool developer market segment until designer-intended DFM tools take hold in the marketplace. That adoption of designer-intended DFM tools could be years away.
Dr. Lars Liebmann at IBM has suggested that we push design intent into the manufacturing process as a way to deliver meaningful product yield and recoup the billions of dollars that have been spent to build fabrication plants. Extrapolating from Liebmann's suggestion, we have to note that the historically poor understanding of design intent has started to seriously hurt yield on the manufacturing side. This is precisely where manufacturer-intended DFM can prevent systematic yield loss. One tangible benefit does allow the quicker adoption of manufacturer-intended DFM: it's much easier to measure yield improvement on the manufacturing side than on the design side. The EDA industry has already demonstrated great success in the adoption of OPC/RET applications and flow methodology. Therefore, great market opportunities do exist for DFM startups that provide solutions to push design intent into manufacturing.
Conversely, we see the majority of effort in developing DFM tools to be on pushing manufacturing up into design. Again, there's probably great potential there, but it will take years to get adopted.
So what's the primary question we have to answer here? In my mind, it's whether the designer-centric or manufacturer-centric approach will bridge the gap between IC design and manufacturing for the improvement of chip yield. My answer is that they both will. But the manufacturer-centric approach will yield results and help designers first and for the foreseeable future.
The market will have to decide when it's ready for designer-intended DFM tools at 65, 45, or 32 nm. I can say that there's urgent demand out there for manufacturer-intended DFM solutions. It's this side of the DFM house that's drawing revenue. As a testament to the allure of manufacturer-centric tools, 2006 ended with the acquisition of Brion Technologies, a manufacturer-intended DFM-tool supplier. Brion was purchased by ASML Holding NV, an advanced-equipment vendor, for $270 million in cash. Far from being on a deathwatch, DFM is on the cusp of a boom.
Dr. Chenmin Hu, President, Anchor Semiconductor, Inc.