Chip Design Magazine




Deeper Dive – FinFET Validation Tools

Thursday, November 21st, 2013

By Caroline Hayes, Senior Editor

The industry prepares to embrace the all-encompassing FinFET validation model – a view from the supply chain.

TSMC’s 16nm FinFET reference flow has made headlines recently, and EDA and IP companies are responding with supporting products. It is not a simple support role, however; it demands a rigorous, all-encompassing model.

In response, Apache Design has announced that its RedHawk and Totem tools have completed methodology innovations for the three-dimensional transistor architecture, and TSMC has certified Mentor’s Olympus-SoC place and route system and its Calibre physical verification platform.

The first reaction has to be one of surprise at the intense interest in FinFET. Apache Design’s vice president of product engineering and customer support, Aveek Sarkar, provides the answer: “[FinFET] can manage voltage closely and lower the supply voltage considerably,” he told System Level Design. “Power is a quadratic formula, so to lower voltage from 1V to 0.7V reduces the dynamic power by 50%,” he adds, explaining the appeal of FinFET.
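Sarkar’s arithmetic is easy to verify: dynamic CMOS power follows the textbook relation P ≈ C·V²·f, so dropping the supply from 1V to 0.7V leaves 0.7² ≈ 0.49 of the original dynamic power. A quick sketch of the scaling (the capacitance and frequency values below are hypothetical, chosen only for illustration, and this covers switching power only, not leakage):

```python
def dynamic_power(c_farads, v_volts, f_hertz):
    """Classic CMOS dynamic-power estimate: P = C * V^2 * f."""
    return c_farads * v_volts ** 2 * f_hertz

# Illustrative values only -- not taken from any specific process.
C = 1e-9   # 1 nF of switched capacitance
f = 1e9    # 1 GHz clock

p_1v0 = dynamic_power(C, 1.0, f)   # power at a 1.0 V supply
p_0v7 = dynamic_power(C, 0.7, f)   # power at a 0.7 V supply

# Ratio depends only on the voltage squared: 0.7^2 = 0.49,
# i.e. roughly the 50% reduction Sarkar cites.
print(p_0v7 / p_1v0)
```

The ratio is independent of the particular capacitance and frequency chosen, which is why the voltage term dominates the low-power argument.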

System Level Design asked whether lower supply voltages can outweigh the obstacles FinFET poses to EDA. The transistor has a more complex structure, with more restrictive design rules than planar devices, and poses challenges in extraction. It seems these have not proved to be deterrents, judging by the industry’s activity.

For example, TSMC has given certification to the Mentor Olympus-SoC place and route system, and its Calibre physical verification platform. Arvind Narayanan, product marketing manager, Place & Route division, Mentor Graphics, explains that Olympus-SoC for 16nm FinFET enables efficient double patterning (DP) and timing closure. “It also has comprehensive support for new design rule checks and multi-patterning rules, fin grid alignment for standard cells and macros during placement, and Vt min-area rule and implant layer support during placement,” he adds.

Explaining the Calibre product, Michael Buehler-Garcia, senior director, marketing, Calibre Design Solutions, Mentor Graphics, tells System Level Design that it supports 16nm FinFET advanced design rule definition and litho hotspot pre-filtering. The Calibre SmartFill facility has been enhanced to support the TSMC-specified filling requirements for FinFET transistors, including support for density constraints and multilayer structures needed for FinFET layers.
Significantly, SmartFill also provides double patterning support for back end layers and, says Buehler-Garcia, “goes beyond simple polygons to automatically insert fill cells into a layout based on analysis of the design”.

He continues to point out the new challenges of 16nm FinFET designs. “[They] require careful checking for layout features that cannot be implemented with current lithography systems—so-called litho hotspots. They also require much more complex and accurate fill structures to help ensure planarity and to also help deal with issues in etch, lithography, stress and rapid thermal annealing (RTA) processes”. The value of place and route layout tools will be in implementing fin grid alignment for standard cells and macros during placement, he notes, as well as in Vt min-area rules and implant layer support during placement.

Apache has enhanced its PathFinder tool, which verifies ESD (electrostatic discharge) protection at the SoC level for the technology. Since FinFET lacks snapback protection, diodes have to be used to protect against ESD. However, using diodes brings the drawback of degraded performance due to higher current density. FinFET means that instead of one supply domain, there are now hundreds of voltage islands across the chip, says Sarkar, explaining Apache’s approach. These islands have to be protected individually, and the designer needs to be able to predict what problems will occur on each of them, which means that layout-based SoC sign-off is critical, he concludes. “It is no longer a visual check, but electrical analysis,” he says.

TSMC and Mentor Graphics introduced a fill ECO (Engineering Change Order) flow as part of the N16 reference flow. This enables incremental fill changes, which reduce run time and file size while supporting last minute engineering changes. “By preserving the vast majority of the fill, the ECO flow limits the timing impact of fill to the area around the ECO changes,” says Buehler-Garcia.

Sarkar agrees that FinFET requires more attention to fill and its impact on capacity, as well as the time needed for design and verification. The company works with the foundry during certification to ensure that the tool is ready in terms of capacity, performance and turnaround time. However, he warns that accuracy for the full chip is only possible by simulating the whole chip in the domain analysis. This means examining how much change is needed, and where the voltage is coming from. “Every piece has to be simulated accurately,” he says, predicting that more co-design across different aspects will need to be brought into the design flow. Expanding on the theme, he says that one environment may focus on the package and the chip simultaneously, while another may include the package, the chip and the system. “There will be less individual silo-based analysis and more simulations that look across multiple domains.”

For Buehler-Garcia, the main difference for 16nm FinFET was that new structures brought a new set of requirements that had to be developed and carefully verified throughout the certification process. He describes the collaboration between the foundry and the company as “an evolutionary step, not revolutionary”.

In the next Deeper Dive (December 5) System Level Design will look at the role of double patterning in FinFET processes and how different EDA tools address its IP validation.

The Current State Of Model-Driven Engineering

Wednesday, December 19th, 2012

By John Blyler

Panelists from industry, national laboratories, and the Portland State University systems engineering graduate program recently gathered for an open forum on model-driven engineering.

The goal of the forum—which was hosted in collaboration with PSU, the International Council on Systems Engineering (INCOSE) and IEEE—was to connect systems engineering and IT modeling to domain specialties in electronic/electrical, mechanical and software engineering. Panelists included speakers from Mentor Graphics, ANSYS, CH2M Hill, Pacific Northwest National Labs, SAIC, the Veterans Affairs Resource Center and PSU.

To clarify what is meant by systems engineering (SE), Herman Migliore, director of PSU’s SE program, cited Norm Augustine’s often-quoted definition: systems engineering is the practice of creating means of performing useful functions through the combination of two or more interacting components. This broad definition encompasses all domain-specific SE disciplines, including hardware and software.

Migliore noted that modeling the entire system engineering process, from beginning to end, is made difficult by the challenges of exchanging modeling information between all disciplines. These disciplines include engineering, science, business and even the legal profession, as well as vertical markets such as defense, electronics and software.

“Each discipline and market has its own view of engineering and modeling the system,” said Migliore. The challenge becomes integrating all these differing points of view. That’s why the one model that might unite them all is the Vee-Diagram, which emphasizes the decomposition of the high-level system into component pieces, followed by the integration of the components into a working whole. This approach requires designers to consider test, verification and validation requirements at every phase of the development life cycle.

Next up was James Godfrey from CH2M-Hill, a construction management company that includes semiconductor equipment programming and deployment. To date, many vendors use UML diagrams to engage customers about needed processes that will then be created in software. Unfortunately, UML doesn’t address continuous systems needed for continuing improvement, according to Godfrey. SysML does deal with continuous processes, e.g., pumps, fans and moving waste.

During his work at PSU, Godfrey learned about a collaborative system modeling and simulation (M&S) framework developed at Georgia Tech (see diagram below).

Many in the construction management world question the need for models. Godfrey noted that these users wonder why they can’t continue to use Visio to capture typical construction drawings and specifications. This often leads to redundant entry of information, first into static diagrams and later into dynamic models.

“Reality feeds into models that then can become diagrams,” said Godfrey. All of which should be stored in one data repository.

ANSYS approached the system modeling challenge from a more electronics point-of-view. According to Andy Byers, ANSYS started as a structural analysis company in the nuclear industry, among others. With the acquisition of Ansoft in 2008, ANSYS added electromagnetic modeling. System-level multiphysics and electronic power modeling were added with the purchase of Apache Design a few years later.

Today, most engineers communicate via documents. But many now want models in addition to documentation for the systems they’re building or integrating. Yet models in one engineering domain don’t often translate well to other domains.

“Pictures may be the best way to talk across different engineering disciplines,” observed Byers.

Another factor encouraging model-driven development is that many component companies are now moving up the supply chain (or left-hand, integration side of the Vee-Diagram) to create subsystems, including both embedded hardware and software.

As companies move further up the system supply chain, they are finding that optimization modeling techniques don’t scale across multiple points and physics, noted Byers. Such inefficient optimization leads to overdesign, where designers leave too much margin on the table. This message was a key theme at the recent Ansys-Apache Electronics conference (JB: reference).

But a system-level model must be simple enough for all engineers to use. Today, most analyses are set up and performed by a few experts with PhDs. These experts are becoming a bottleneck, said Byers. “There needs to be a democratization of simulation to the engineering masses.”

Finally, as useful as the Vee-Diagram is for system-level modeling, users must look beyond engineering to other systems, like cost, schedule, and even legal. Focusing on this last point, Byers related a story concerning the exchange of models in the automotive industry between an OEM and its Tier 1 (subsystem) and Tier 2 (component) vendors. In order to avoid intellectual property (IP) and gross negligence issues, the OEM lawyers wanted to embed a legal model into the engineering one. The success of this approach was unclear.

Switching perspectives, Ryan Slaugh spoke about the challenges of hardware-software integration from the standpoint of the Pacific Northwest National Labs (PNNL). With its changing mission, PNNL is facing a problem that is commonplace among electronics companies—deciding when research projects are ready for commercialization. “We are trying to cross the chasm of death from R&D to successful product development,” said Slaugh.

To determine the maturity of an R&D project, PNNL uses a Technology Readiness Level (TRL) process. This helps grade projects to tell when they might be ready to become products. For example, a project with high confidence, which is one that re-uses known good hardware and software, has a low score. Once in the product stage, systems engineering techniques are applied to the life cycle to lower the risk of failure.

How are complex modeling approaches taught to students? What is needed to help college students get used to modeling? These questions were addressed by William “Ike” Eisenhauser, an affiliate professor at PSU and director of…

Simple modeling approaches make great communication tools, especially for non-technical professionals. But in essence, all models are wrong, noted Eisenhauser. “Yet some can be useful.”

Eisenhauser presented a brief overview of different kinds of models:

  1. Simple representation: e.g., solar system ball-and-string model in high school.
  2. Math model: Describes a situation (y=function of x).
  3. State diagram: Moving from math to device representation.
  4. Engineering flowcharts (non-math models): Communicate to others to help make decisions.
  5. Behavior models: More complex, intended to describe why a system behaves as it does. These models help to predict change.
  6. Discrete models: Sometimes mistaken for the actual system. They demonstrate implementation, e.g., balls moving in a physical model.

The greatest challenge with modeling is teaching that models are just tools, not playthings. “Modelers must learn when to stop using models,” cautioned Eisenhauser. “This is a critical lesson for engineers.”

The problem is that students go into modeling because they want to create cool models. It is an analogous problem to physics majors who go into physics to build light sabers, not to help mankind with issues of global importance, said Eisenhauser.

That’s why it is important to teach engineers the objectives of modeling and knowing when to stop.

How does modeling fit into the role of systems engineering? Unfortunately, SE remains a text-intensive discipline. Documentation matters in detailing complex systems. There is an ongoing need to reduce text editing in SE modeling. That’s where system-modeling approaches such as SysML can help.

The educational problem that Eisenhauser and others in PSU’s SE program face is how to provide a useful SE modeling tool. All such tools—even SysML—require more than one 8-week course to learn. Any such tool will need to be taught across several classes.

Is SysML the best tool for SE modeling in a university course? That’s an ongoing challenge in modeling education, namely, how to discern the popular software-of-the-day from truly useful and market-acceptable tools, said Eisenhauser.

The final speaker was Bill Chown, from Mentor Graphics. He spoke about Model Driven Development (MDD), a contemporary approach in which the model is the design and implementation is directly derived from the model.

The challenges facing system designers are well known, from increasing complexity to the convergence of multiple engineering disciplines and the associated problem of optimizing a comprehensive system design.

The design team itself is a dynamic entity, comprising an architect or systems engineer, the hardware or software component designer, and the system integrator who puts it all together, noted Chown. Further, each of these professionals may only be involved in the design for their portion of the life cycle, such as from concept through design and into domain-specific areas.

What types of models are used through the lifecycle? Chown listed three categories:

  1. Platform-independent models, which include function, architecture, interfaces and interactions, and which can demonstrate that requirements are understood and met.
  2. Platform-dependent models, such as hardware architectures with virtual prototypes or software architectures with partitions and data, which can be used to determine resources and performance goals and for hardware-software co-design before physical implementation.
  3. Platform-specific models, for implementation, verification, test and deliverables.

Models can and should drive implementation. For example, software models can generate code once configured to an RTOS. Hardware flows have emerged for C-to-RTL synthesis and UML-to-SystemC simulation and validation. Test languages also can be generated directly from models.

Model-driven design has evolved to cover the full system or product life cycle, from requirements to prototype and then production.