
Posts Tagged ‘Accellera’


Blog Review – Monday, September 14, 2015

Monday, September 14th, 2015

MonolithIC 3D identifies IoT drive; Trio advise Accellera Work Group on SoC definition; Shape-shifters on the catwalks; Weathering the storms of design challenges; IP subsystems ahead of OIP; AMD SoCs find their calling; Cadence keeps lines of communication open

Ahead of the IEEE S3S Conference, Zvi Or-Bach of MonolithIC 3D looks at the wafer demands for cheaper IoT development. His illustrated preview of some of the papers gives some insight into the discussions that lie ahead.

The clock is ticking for further technology contributions to the Accellera Portable Stimulus Working Group. Ahead of the September 16 deadline, Tom Fitzpatrick, Mentor Graphics, adds some background to the announcement that Mentor, Cadence and Breker have joined the group, offering their portable test and stimulus expertise to the definition of an SoC verification standard with IP and re-use opportunities.

Sportswear designer Chromat held its Spring/Summer 2016 runway show at MADE Fashion Week, with models wearing responsive garments that change shape based on the wearer’s body temperature, adrenaline or stress levels. The experimental Adrenaline Dress, says Ayse Ildeniz, was powered by Intel’s Curie Module. There was also the Aeros Sports Bra, which responds to changes in perspiration, respiration and body temperature to help regulate the wearer’s temperature.

Design challenges that can make a real difference are highlighted by Brian Fuller, ARM, as he profiles the winning project in the Inveneo solar power Micro Data Center Design Challenge 2015. The Micro Weather Station is explained inside and out and makes fascinating reading.

An interesting preview of his talk about the concept of IP subsystems at TSMC’s OIP (Open Innovation Platform) is given by Navraj Nandra, Synopsys. He uses examples of wearable and automotive technology to show the role of the foundry, as well as design and integration challenges.

Trying to figure out where the smart set goes if it’s not into a smartphone, automotive design or consumer device, Chris Ciufo, eecatalog, champions the power of AMD’s G-Series SoCs for “everything else”, especially thin-client computing and some arresting digital signage.

All relationships rely on good communication, so Christine Young, Cadence, explains how to simplify the design flow between schematic and layout engineers. Instead of checking versions, she recommends some of the sound advice given by Karim Khalfan, director of application engineering at ClioSoft.

Caroline Hayes, Senior Editor

Blog Review – Tuesday, August 18, 2015

Monday, August 17th, 2015

Where will the future of embedded software lead; Manufacturing success; DDR memory IP – a personal view; Untangling the IoT protocols; The battle of virtual prototyping; Accellera SI update; Smart buildings; SoC crisis management

The rise of phones, GPS, tablets and cars means embedded software keeps increasing in complexity, muses Colin Walls, Mentor Graphics. He traces the evolution of hardware and software from simple systems to ones that have to work harder and smarter.

Managing to avoid sounding smug, Falan Yinug reports on the SIA (Semiconductor Industry Association) paper confirming the semiconductor industry is the USA’s most innovative manufacturing industry, and looks at its role in the economy.

Less about being a woman in technology and more about the nitty gritty of DDR controller memory IP, Anne Hughes, DDR IP Engineering, Cadence, talks to Christine Young.

Fitting protocols like CoAp and IPSO smart objects in the IoT structure can be daunting, but Pratulsharma, ARM, has written an illustrated blog that can lead readers through the wilderness.

Clearly taken with the Battlebots TV show, Tom De Schutter, Synopsys, considers how to minimise design risks so that designs avoid destruction, or at least behave as intended.

A considered view of the Accellera Systems Initiative is given by Gabe Moretti, Chip Design Magazine. He elaborates on what the UVM standardization will mean for the wider EDA industry.

Where the IoT is used, and how, for smart buildings, is examined by Rob Sheppard, Intel.

Alarming in his honesty, Gadge Panesar, Ultrasoc, says no-one knows how SoCs operate and urges others to be as honest as he is and seek help – with some analytics and IP independence.

Blog Review – Mon. April 28 2014

Monday, April 28th, 2014

The first PHYs; compute shaders; Accellera Day online; Security and privacy
By Caroline Hayes, Senior Editor.

Mentor Graphics’ Dennis Brophy initially questioned the need for an online Accellera Day, but soon retracted and is even offering to keep blog visitors informed as more are posted.

If you are interested in compute shaders, Sylvek at ARM explains clearly and concisely what they are and how to add one to an application.

Another informative blog is the first of three from Corrie Callenbach, Cadence, directing us to Kevin Yee’s video: Take command of MIPI PHYs. The presenter takes us through the first of three PHY specifications introduced by MIPI.

Intel’s Mayura Garg points blog visitors to Michael Fey’s presentation at ISS2014. Fey, Executive Vice President, General Manager of Corporate Products and Intel Security CTO, focuses on Security and Privacy in the Information Age – the blog’s own ‘scary video’.

Accellera Systems Initiative Continues to Grow

Thursday, October 17th, 2013

By Gabe Moretti

The convergence of system, software and semiconductor design activities to meet the increasing challenges of creating complex system-on-chips (SoCs) has brought to the forefront the need for a single organization to create new EDA and IP standards.

As one of the founders of Accellera, and the one responsible for its name, it gives me great pleasure to see how the consortium has grown and widened its interests.  Through the mergers and acquisitions of the Open SystemC Initiative (OSCI), Virtual Sockets Interface Alliance (VSIA), The SPIRIT Consortium, and now assets of OCP-IP, Accellera is the leading standards organization that develops language-based standards used by system, semiconductor, IP and EDA companies.

As its name implies (Accellera echoes the Italian for “accelerate”), its activities target EDA tools and methods with the aim of fostering efficiency and portability.

Created to develop standards for design and verification languages and methods, Accellera has grown by merging with or acquiring other consortia, expanding its role to Electronic System Level standards and IP standards.  It now has forty-one member companies from industries such as EDA, IP, semiconductors, and electronics systems.  As a result of its wider activities, even its name has grown and is now “Accellera Systems Initiative”.

In addition to the corporate members, Accellera has formed three User Communities to educate engineers and increase the use of standards.  The Communities are OCP, SystemC, and UVM.  The first deals with IP standards and issues, the second supports the SystemC modeling and verification language, while the third works on the Universal Verification Methodology.

Accellera has 17 active Technical Committees.  Their work to date has resulted in 7 IEEE standards.  Accellera sponsors a yearly conference, DVCon, generally held in February, and also collaborates with engineering conferences in Europe and Japan.  With the growth of electronics activities in nations like India and China, Accellera is considering a more active presence in those countries as well.

Accellera Systems Initiative has taken over OCP-IP

Tuesday, October 15th, 2013

By Gabe Moretti

Accellera has been taking over multiple standards organizations in the industry for several years, and this is only the latest such move.  The acquisition includes the current OCP 3.0 standard and supporting infrastructure for reuse of IP blocks used in semiconductor design. OCP-IP and Accellera have been working closely together for many years, but OCP-IP lost corporate and member financial support steadily over the past five years and membership virtually flatlined. Combining the organizations may be the best way to continue to address interoperability of IP design reuse and jumpstart adoption.

“Our acquisition of OCP assets benefits the worldwide electronic design community by leveraging our technical strengths in developing and delivering standards,” said Shishpal Rawat, Accellera Chair. “With its broad and diverse member base, OCP-IP will complement Accellera’s current portfolio and uniquely position us to further develop standards for the system-level design needs of the electronics industry.”

OCP-IP was originally started by Sonics, Inc. in December 2001 as a means to proliferate its network-on-chip approach.  Sonics CTO Drew Wingard has been a primary driver of the organization.  It has long been perceived as the primary marketing tool of the company, and it will be interesting to see how the company (which has been on and off the IPO trail several times since its founding) fares without being the “big dog” in the discussion.

A comprehensive list of FAQs about the asset acquisition is available.

Verification Choices: Formal, Simulation, Emulation

Thursday, July 21st, 2016

Gabe Moretti, Senior Editor

Lately there have been articles and panels about the best type of tools to use to verify a design.  Most of the discussion has been centered on the choice between simulation and emulation, but, of course, formal techniques should also be considered.  I did not include FPGA-based verification in this article because I felt it to be a choice equivalent to emulation, but at a different price point.

I invited a few representatives of EDA companies to answer questions about the topic.  The respondents are:

Steve Bailey, Director of Emerging Technologies at Mentor Graphics

Dave Kelf, Vice President of Marketing at OneSpin Solutions

Frank Schirrmeister, Senior Product Management Director at Cadence

Seena Shankar, Technical Marketing Manager at Silvaco

Vigyan Singhal, President and CEO at Oski Technology

Lauro Rizzatti, Verification Consultant

A Search for the Best Technique

I first wanted an opinion of what each technology does better.  Of course the question is ambiguous because the choice of tool, as Lauro Rizzatti points out, depends on the characteristics of the design to be verified.  “As much as I favor emulation, when design complexity does not stand in the way, simulation and formal are superior choices for design verification. Design debugging in simulation is unmatched by emulation. Not only interactive, flexible and versatile, simulation also supports four-state and timing analysis.
However, design complexity growth is here to stay, and the curve will only get more challenging into the future. And, we not only have to deal with complexity measured in more transistors or gates in hardware, but also measured in more code in embedded software. Tasked to address this trend, both simulation and formal would hit the wall. This is where emulation comes in to rule the day.  Performance is not the only criterion to measure the viability of a verification engine.”

Vigyan Singhal wrote: “Both formal and emulation are becoming increasingly popular. Why use a chain saw (emulation) when you can use a scalpel (formal)? Every bug that is truly a block-level bug (and most bugs are) is most cost effective to discover with formal. True system-level bugs, like bandwidth or performance for representative traffic patterns, are best left for emulation.  Too often, we make the mistake of not using formal early enough in the design flow.”

Seena Shankar provided a different point of view. “Simulation gives full visibility to the RTL and testbench. Earlier in the development cycle, it is easier to fix bugs and rerun a simulation. But we are definitely gated by the number of cycles that can be run. A basic test exercising a couple of functional operations could take up to 12 hours for a design with 100 million gates.

Emulation takes longer to set up because all RTL components need to be in place before a test run can begin. The upside is that millions of operations can be run in minutes. However, debug is difficult and time consuming compared to simulation.  Formal verification needs a different kind of expertise. It is only effective for smaller blocks but can really find corner-case bugs through assumptions and constraints provided to the tool.”

Steve Bailey concluded that: “It may seem that simulation is being used less today. But, it is all relative. The total number of verification cycles is growing exponentially. More simulation cycles are being performed today even though hardware acceleration and formal cycles are taking relatively larger pieces of the overall verification pie. Formal is growing in appeal as a complementary engine. Because of its comprehensive verification nature, it can significantly bend the cost curve for high-valued (difficult/challenging) verification tasks and objectives. The size and complexity of designs today require the application of all verification engines to the challenges of verifying and validating (pre-silicon) the hardware design and enabling early SW development. The use of hardware acceleration continues to shift-left and be used earlier in the verification and validation flow, causing emulation and FPGA prototyping to evolve into full-fledged verification engines (not just ICE validation engines).”

If I had my choice I would like to use formal tools to develop an executable specification as early as possible in the design, making sure that all functional characteristics of the intended product will be implemented and that the execution parameters will be respected.  I agree that the choice between simulation and emulation depends on the size of the block being verified, and I also think that hardware/software co-simulation will most often require the use of an emulation/acceleration device.

Limitations to Cooperation Among the Techniques

Since all three techniques have value in some circumstance, can designers easily move from one to another?

Frank Schirrmeister provided a very exhaustive response to the question, including a good figure.

“The following figure shows some of the connections that exist today. The limitations of cooperation between the engines are often of a less technical nature. Instead, they tend to result from the gaps between different disciplines in terms of cross knowledge between them.

Figure 1: Techniques Relationships (Courtesy of Cadence)

Some example integrations include:

- Simulation acceleration combining RTL simulation and emulation. The technical challenges have mostly been overcome using transactors to connect testbenches, often at the transaction level, running on simulation hosts to the hardware holding the design under test (DUT) and executing at higher speed. This allows users to combine the expressiveness of simulated testbenches, to increase verification efficiency, with the speed of synthesizable DUTs in emulation (a minimal sketch of this testbench/transactor split appears after this list).

- At this point, we have even enabled hot-swap between simulation and emulation. For example, we can run gate-level netlists without timing in emulation at faster speeds. This allows users to reach a point of interest late in the execution that would take hours or days to reach in simulation. Once the point of interest is reached, users can switch (hot swap) back into simulation, adding back the timing, and continue the gate-level timing simulation.

- Emulation and FPGA-based prototyping can share a common front-end, such as in the Cadence System Development Suite, to allow faster bring-up using multi-fabric compilation.

- Formal and simulation also combine nicely for assertions, X-propagation, etc., and, when assertions are synthesizable and can be mapped into emulation, formal techniques are linked even with hardware-based execution.
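To make the first point concrete, here is a minimal, vendor-neutral sketch in C++ of the testbench/transactor split described above. Everything in it (the DutTransactor interface, the register address, the back-end class) is invented for illustration; real flows use SystemVerilog or SystemC transactors and a co-emulation channel rather than this toy in-memory model.

#include <cstdint>
#include <iostream>
#include <map>

struct BusTransaction {          // transaction-level stimulus: no pin wiggling
    uint32_t addr;
    uint32_t data;
    bool     write;
};

class DutTransactor {            // the boundary between testbench and engine
public:
    virtual ~DutTransactor() = default;
    virtual uint32_t execute(const BusTransaction& t) = 0;
};

// Simulation-side back-end; an emulation back-end would instead forward each
// transaction over the co-emulation channel to the DUT running in hardware.
class SimulationTransactor : public DutTransactor {
public:
    uint32_t execute(const BusTransaction& t) override {
        if (t.write) { mem_[t.addr] = t.data; return 0; }
        auto it = mem_.find(t.addr);
        return it == mem_.end() ? 0 : it->second;
    }
private:
    std::map<uint32_t, uint32_t> mem_;   // stand-in for a simulated register file
};

// The test is written once against the interface and does not change when
// the DUT moves between engines.
bool run_register_test(DutTransactor& dut) {
    dut.execute({0x10, 0xCAFEF00Du, true});
    return dut.execute({0x10, 0, false}) == 0xCAFEF00Du;
}

int main() {
    SimulationTransactor dut;
    std::cout << (run_register_test(dut) ? "PASS" : "FAIL") << std::endl;
    return 0;
}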

Vigyan Singhal noted that: “Interchangeability of databases and poorly architected testbenches are limitations. There is still no unified coverage database standard enabling integration of results between formal, simulation and emulation. Often, formal or simulation testbenches are not architected for reuse, even though they can almost always be. All constraints in formal testbenches should be simulatable and emulatable; if checkers and bus functional models (BFMs) are separated in simulation, checkers can sometimes be used in formal and in emulation.”

Dave Kelf concluded that: “The real question here is: How do we describe requirements and design specs in machine-readable forms, use this information to produce a verification plan, translate them into test structures for different tools, and extract coverage information that can be checked against the verification plan? It is this top-down, closed-loop environment that is generally accepted as ideal, but we have yet to see it realized in the industry. We are limited fundamentally by the ability to create a machine-readable specification.”

Portable Stimulus

Accellera has formed a study group to explore the possibility of developing a portable stimulus methodology.  The group is very active and progress is being made in that direction.  Since the group has yet to publish a first proposal, it was difficult to ask any specific questions, although I thought that a judgement on the desirability of such an effort was important.

Frank Schirrmeister wrote: “At the highest level, the portable stimulus project allows designers to create tests to verify SoC integration, including items like low-power scenarios and cache coherency. By keeping the tests as software routines executing on processors that are available in the design anyway, the stimulus becomes portable between the different dynamic engines, specifically simulation, emulation, and FPGA prototyping. The difference in usage with the same stimulus then really lies in execution speed – regressions can run on faster engines with less debug – and on debug insight once a bug is encountered.”
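As a rough illustration of that idea (and not the working group’s eventual format), the C++ fragment below writes one stimulus routine against a two-function register shim; the same routine can then be compiled for the SoC’s own processor in emulation or FPGA prototyping, or linked against a host-side model in simulation. The register map, bit fields and function names are all invented for this sketch.

#include <cstdint>

// Thin portability layer: the only part that differs per engine.
uint32_t reg_read(uint32_t addr);
void     reg_write(uint32_t addr, uint32_t value);

// Hypothetical DMA register map, for illustration only.
constexpr uint32_t DMA_SRC  = 0x40000000u;
constexpr uint32_t DMA_DST  = 0x40000004u;
constexpr uint32_t DMA_CTRL = 0x40000008u;
constexpr uint32_t DMA_STAT = 0x4000000Cu;

// The portable scenario: program a transfer, then poll for completion.
int dma_copy_scenario() {
    reg_write(DMA_SRC,  0x20000000u);
    reg_write(DMA_DST,  0x20010000u);
    reg_write(DMA_CTRL, 1u);                       // start (assumed bit 0)
    while ((reg_read(DMA_STAT) & 1u) == 0u) { }    // busy-wait; no OS assumed
    return (reg_read(DMA_STAT) & 2u) ? -1 : 0;     // assumed bit 1 = error flag
}

// Bare-metal back-end used when the routine runs on the design's processor;
// a simulation build would replace these with calls into the testbench.
uint32_t reg_read(uint32_t addr) {
    return *reinterpret_cast<volatile uint32_t*>(addr);
}
void reg_write(uint32_t addr, uint32_t value) {
    *reinterpret_cast<volatile uint32_t*>(addr) = value;
}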

Dave Kelf also has a positive opinion about the effort. “Portable Stimulus is an excellent effort to abstract the key part of the UVM test structures such that they may be applied to both simulation and emulation. This is a worthy effort in the right direction, but it is just scraping the surface. The industry needs to bring assertions into this process, and consider how this stimulus may be better derived from high-level specifications.”

SystemVerilog

The language SystemVerilog is considered by some to be the best language to use for SoC development.  Yet the language has limitations, according to some of the respondents.

Seena Shankar answered the question “Is SystemVerilog the best we can do for system verification?” as follows: “Sort of. SystemVerilog encapsulates the best features from software and hardware paradigms for verification. It is a standard that is very easy to follow but may not be the best in performance. If the performance hit is managed with a combination of SystemC/C++, Verilog or other verification languages, the solution might be limited in terms of portability across projects or simulators.”

Dave Kelf wrote: “One of the most misnamed languages is SystemVerilog. Possibly the only thing this language was not designed to do was any kind of system specification. The name was produced in a misguided attempt to compete or compare with SystemC, and that was clearly a mistake. Now it is possible to use SystemVerilog at the system level, but it is clear that a C derived language is far more effective.
What is required is a format that allows untimed algorithmic design with enough information for it to be synthesized, virtual platforms that provide a hardware/software test capability at an acceptable level of performance, and general system structures to be analyzed and specified. C++ is the only language close to this requirement.”

And Frank Schirrmeister observed: “SystemVerilog and technologies like universal verification methodology (UVM) work well at the IP and sub-system level, but seem to run out of steam when extended to full system-on-chip (SoC) verification. That’s where the portable stimulus project comes in, extending what is available in UVM to the SoC level and allowing vertical re-use from IP to the SoC. This approach overcomes the issues for which UVM falls short at the SoC level.”

Conclusion

Both design engineers and verification engineers are still waiting for help from EDA companies.  They have to deal with differing methodologies and imperfect languages while tackling ever more complex designs.  It is not surprising, then, that verification is the most expensive portion of a development project.  Designers must be careful to ensure that what they write is verifiable, while verification engineers need to not only understand the requirements and architecture of the design, but also be familiar with the characteristics of the language used by developers to describe both the architecture and the functionality of the intended product.  I believe that one way to improve the situation is for both EDA companies and system companies to approach a new design not just as a piece of silicon but as a product that integrates hardware, software, mechanical, and physical characteristics.  Then both development and verification plans can choose the most appropriate tools that can co-exist and provide coherent results.

Two Tiers EDA Industry

Thursday, June 16th, 2016

Gabe Moretti, Senior Editor

Talking to Lucio Lanza, you must be open to ideas that appear strange and wrong at first sight.  I had just such a talk with him during DAC.  I enjoy talking to Lucio because I too have strange ideas, certainly not as powerful as his, but strange enough to keep my brain flexible.

So we were talking about the industry when suddenly Lucio said: “You know the EDA industry needs to divide itself in two: design and manufacturing are different things.”

The statement does not make much sense from an historical perspective, and in fact it is contrary to how EDA does business today, but you must think about it from the point of view of today and the future.  The industry was born and grew under the idea that a company would want to develop its own product totally in house, growing knowledge and experience not only of its own market, but also of semiconductor capabilities.  The EDA industry provides a service that replaces what companies would otherwise have to do internally when designing and developing an IC or a PCB, supplying all the required tools that would otherwise have been developed in house.  But with the IoT as the prime factor for growth, dealing with the vagaries of optimizing a design for a given process is something most companies are either unprepared to do or find too costly given the sale price of the finished product.  I think that a majority of IoT products will not be sensitive to a specific process’s characteristics.

The Obstacles

So why not change, as Lucio forecasts?  The problem is design methodology.  Unfortunately, given the design flow supported today, a team is supposed to take the design through synthesis before it can analyze the design for physical characteristics.  This approach is based on the assumption that the design team is actively engaged in the layout phase of the die.  But product developers should not, in general, be concerned with how the die is laid out.  A designer should have the tools to predict leakage, power consumption, noise, and thermal behavior at the system level.  The tools need to be accurate, but not precise.  It should be possible to predict the physical behavior of the design given the characteristics of the final product and of the chosen process.  Only the few companies producing a leading-edge product that will sell in large volume need to be fully involved in the post-synthesis work, and the number of these companies continues to shrink in direct proportion to the cost of using the process.

EDA startups should not look at post-synthesis markets.  They should target system-level design and verification.  The EDA industry must start thinking in terms of the products its customers are developing, not the silicon used to implement them.  A profound change in both the technological and business approach to our market is needed if we want to grow.  But change is difficult, almost always uncomfortable, and new problems require not just new tools, but new thinking.

Software development and debug must be supported by a true hardware/software co-design and co-development system.  At present there are co-verification tools, but true co-development is still not possible, at least not within the EDA industry.

As I have said many times before, “chips don’t float,” so tier one of the new EDA must also provide packaging tools, printed circuit board (PCB) design tools, and mechanical design tools to create the product.  In other words, we must develop true system-level design and not be so myopic as to believe that our goal is Electronic System Level support.  The electronic part is a partial solution that does not yield a product, just a piece of a product.

The Pioneers

I know of a company, eSilicon, that has already taken a business approach similar to what Lucio is thinking about.  The company had always exhibited at DAC, but since its new business approach it was not there this year.  Most customers of eSilicon do not go to DAC; they go to shows and conferences that deal with their end products’ markets.  The business approach of the company, as described to me by Mike Gianfagna, VP of Marketing at eSilicon, is to partner with a customer to implement a product, not a design.  eSilicon provides the EDA know-how and the relationship with the chosen foundry, while the customer provides the knowledge of the end market.  When the product is ready, both companies share in the revenue following a previously agreed formula.  This apparently small change in the business model takes EDA out of the service business and into the full electronic industry opportunity.  It also relieves companies from the burden of understanding and working the transformation of a design into silicon.

Figure 2: Idealized eSilicon Flow (Courtesy of eSilicon)

What eSilicon offers is not what Lucio has in mind, but it comes very close in most aspects, especially in its business approach to the development of a product, not just a die.

Existing Structure

Not surprisingly, there are consortia that already provide structure to help the development of a two-tier EDA industry.  The newly renamed ESDA can help define and form the new industry, while its marketing agreement with SEMICO can foster a closer discourse with the IP industry.  Accellera Systems Initiative, or simply Accellera, already specializes in design and verification issues, and also focuses on IP standards, thus fitting one of the two tiers perfectly.  The SI2 consortium, on the other hand, focuses mostly on post-synthesis and fabrication issues, providing support for the second tier.  Accellera, therefore, provides standards and methodology for the first tier, SI2 for the second tier, while ESDA straddles both.

The Future

In the past, using the latest process was a demonstration that a company was not only a leader in its market, but an electronics technology leader.  This is no longer the case.  A company can develop and sell a leading product using a 90 or 65 nm process, for example, and still be considered a leader in its own market.  Most IoT products will be price sensitive, so minimizing both development and production costs will be imperative.

Having a partner that will provide the know-how to transform the description of the electronic circuit into a layout ready to manufacture will diminish development costs since the company no longer has to employ designers that are solely dedicated to post synthesis analysis, layout and TCAD.

EDA companies that target these markets will see their market size shrink significantly but the customers’ knowledge of the requirements and technological characteristics of the tools will significantly improve.

The most significant impact will be that the revenue available to EDA will increase, since EDA companies will be able to get revenue from every unit sold of a specific product.

Complexity of Mixed-signal Designs

Thursday, August 28th, 2014

Gabe Moretti, Senior Editor

The major constituent of system complexity today is the integration of computing with mechanical and human interfaces.  Both of these are analog in nature, so designing mixed-signal systems is a necessity.  The entire semiconductor chain is impacted by this requirement.  EDA tools must support mixed-signal development and the semiconductor foundries must adapt to using different processes to build one IC.

Impact on Semiconductor Foundries

Jonah McLeod, Director of Corporate Marketing Communications at Kilopass Technology, was well informed about the foundries’ situation when ARM processors became widely used in mixed-signal designs.  He told me: “Starting in 2012 with the appearance of smart meters, chip vendors such as NXP began integrating the 32-bit ARM Cortex processor with an analog/mixed-signal metrology engine for Smart Metering with two current inputs and a voltage input.

This integration had significant impact on both foundries and analog chip suppliers. The latter had been fabricating mixed-signal chips on process nodes of 180nm and larger, many with their own dedicated fabs. With this integration, they had to incorporate digital logic with their analog designs.

Large semiconductor IDMs like NXP, TI and ST had an advantage over dedicated analog chip companies like Linear and Analog Devices. The former had both logic and analog design expertise they could bring to bear building these SoCs, and they had the fabrication expertise to build digital mixed-signal processes in smaller process geometries.

Long exempt from the pressure to chase smaller process geometries aggressively, the dedicated analog chip companies had a stark choice. They could invest the large capital expenditure required to build smaller-geometry fabs or they could go fab-lite and outsource smaller process geometry designs to the major fabs. This was a major boost for the foundries that needed designs to fill fabs abandoned by digital designs chasing first 40nm and now 28nm. As a result, foundries now have 55nm processes tailored for power management ICs (PMICs) and display driver ICs, among others. Analog expertise still counts in this new world order, but the competitive advantage goes to analog/mixed-signal design teams able to leverage smaller process geometries to achieve cost savings over competitors.”

As form factors in handheld and IoT devices become increasingly smaller, integrating all the components of a system on one IC becomes a necessity.  Thus fabricating mixed-signal chips with smaller geometries processes grows significantly in importance.

Mladen Nizic, Product Marketing Director at Cadence, noted that requirements on foundries are directly connected to new requirements for EDA tools.  He observed that: “Advanced process nodes typically introduce more parametric variation, Layout Dependent Effects (LDE), increased impact of parasitics, aging and reliability issues, layout restrictions and other manufacturing effects affecting device performance, making it much harder for designers to predict circuit performance in silicon. To cope with these challenges, designers need an automated flow to understand the impact of manufacturing effects early, sensitivity analysis to identify the most critical devices, rapid analog prototyping to explore layout configurations quickly, a constraint-driven methodology to guide layout creation, and in-design extraction and analysis to enable correct-by-construction design. Moreover, digitally-assisted analog has become a common approach to achieving analog performance, leading to an increased need for an integrated mixed-signal design flow.”

Marco Casale-Rossi, Senior Staff Product Marketing Manager, Design Group, Synopsys, points out that there is still much life remaining in the 180nm process.  “I’ll give you the 180 nanometer example: when 180 nanometers was introduced as an emerging technology node back in 1999, it offered single-poly, 6 aluminum metals, digital CMOS transistors and embedded SRAM memory only. Today, 180 nanometers is used for state-of-the-art BCD (Bipolar-CMOS-DMOS) processes, e.g. for smart power, automotive, security, MCU applications; it features, as I said, bipolar, CMOS and DMOS devices, double-poly, triple-well, four aluminum layers, integrating a broad catalogue of memories such as DRAM, EEPROM, FLASH, OTP, and more.  Power supplies span from 1V to several tens or even hundreds of volts; analog & mixed-signal manufacturing processes at established technology nodes are as complex as the latest and greatest digital manufacturing processes at emerging technology nodes, only the metrics of complexity are different.”

EDA Tools and Design Flow

Mixed-signal designs require a more complex flow than strictly digital designs.  They often incorporate multiple analog, RF, mixed-signal, memory and logic blocks operating at high performance and in different power domains.  For these reasons, engineers designing a mixed-signal IC need different tools throughout the development process.  Architecture, development, testing, and place and route are all impacted.

Mladen observed that “mixed-signal chip architects must explore different configurations with all concerned in a design team to avoid costly, late iterations. Designers consider many factors like block placement relative to each other, IO locations, power grid and sensitive analog routes, and noise avoidance to arrive at an optimal chip architecture.”

Standards organizations, particularly Accellera and the IEEE, have developed versions of widely used hardware description languages like Verilog and VHDL that provide support for mixed-signal descriptions.  VHDL-AMS and Verilog-AMS continue to be supported, and working groups are making sure that the needs of designers are met.

Mladen points out that “recent extensions in standardization efforts for Real Number Modeling (RNM) enable effective abstraction of analog for simulation and functional verification almost at digital speeds.  Cadence provides tools for automating analog behavioral and RNM model generation and validation. In the last couple of years, adoption of RNM has been on the rise, driven by the verification challenges of complex mixed-signal designs.”
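Real number modeling is normally written with Verilog-AMS wreal or SystemVerilog real nettypes; purely as an illustration of the underlying idea (replacing a SPICE-level block with a discrete-time model that carries real values on an event-driven simulator), here is a small SystemC sketch. The filter coefficient, port names and sampling scheme are arbitrary choices for this example, not anyone’s methodology.

#include <systemc.h>
#include <iostream>

// A first-order low-pass "analog" block abstracted to real numbers,
// evaluated once per sampling clock edge by the digital simulator.
SC_MODULE(LowPassRNM) {
    sc_in<bool>    clk;    // sampling clock provided by the digital testbench
    sc_in<double>  vin;    // analog input carried as a real value
    sc_out<double> vout;   // filtered output, one value per clock edge

    double state = 0.0;
    const double alpha = 0.1;   // smoothing factor, chosen for illustration

    void step() {
        // y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        state += alpha * (vin.read() - state);
        vout.write(state);
    }

    SC_CTOR(LowPassRNM) {
        SC_METHOD(step);
        sensitive << clk.pos();
    }
};

int sc_main(int, char*[]) {
    sc_clock clk("clk", 10, SC_NS);
    sc_signal<double> in_sig, out_sig;
    LowPassRNM filt("filt");
    filt.clk(clk);
    filt.vin(in_sig);
    filt.vout(out_sig);
    in_sig.write(1.0);                 // apply a unit step at the input
    sc_start(200, SC_NS);
    std::cout << "vout after 200 ns: " << out_sig.read() << std::endl;
    return 0;
}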

Design verification is practically always the costliest part of development.  This is partly due to the lack of effective correct-by-construction tools and partly to the increasing complexity of designs, which are often the product of multiple company design teams as well as the use of third-party IP.

Steve Smith, Sr. Marketing Director, Analog/Mixed-signal Verification at Synopsys, pointed out that: “The need for exhaustive verification of mixed-signal SoCs means that verification teams need to perform mixed-signal simulation as part of their automated design regression test processes. To achieve this requirement, mixed-signal verification is increasingly adopting techniques that have been well proven in the purely digital design arena. These techniques include automated stimulus generation, coverage- and assertion-driven verification combined with low-power verification extended to support analog and mixed-signal functionality.  As analog/mixed-signal design circuits are developed, design teams will selectively replace high-level models with the SPICE netlist and utilize the high performance and capacity of a FastSPICE simulator coupled to a high-performance digital simulator. This combination provides acceleration for mixed-signal simulation with SPICE-like accuracy to adequately verify the full design. Another benefit of using FastSPICE in this context is post-layout simulation for design signoff within the same verification testbench environment.”

He continued by saying that: “An adjacent aspect of the tighter integration between analog and digital relates to power management – as mixed-signal designs require multiple power domains, low-power verification is becoming more critical. As a result of these growing challenges, design teams are extending proven digital verification methodologies to mixed-signal design.  Accurate and efficient low-power and multiple power domain verification require both knowledge of the overall system’s power intent and careful tracking of signals crossing these power domains. Mixed-signal verification tools are available to perform a comprehensive set of static (rule-based) and dynamic (simulation run-time) circuit checks to quickly identify electric rule violations and power management design errors. With this technology, mixed-signal designers can identify violations such as missing level shifters, leakage paths or power-up checks at the SoC level and avoid significant design errors before tape-out. During detailed mixed-signal verification, these run-time checks can be orchestrated alongside other functional and parametric checks to ensure thorough verification coverage.”

Another significant challenge to designers is placing and routing the design.  Mladen described the way Cadence supports this task.  “Analog designers use digital logic to calibrate and tune their high-performance circuits. This is called the digitally-assisted analog approach. There are often tens of thousands of gates integrated with analog in a mixed-signal block, both at the periphery, making it ready for integration into the SoC, and embedded inside the hierarchy of the block. Challenges in realizing these kinds of designs are:

- time and effort needed for iteration among analog and digital designers,

- black-boxing among analog and digital domains with no transparency during the iterations,

- sharing data and constraints among analog and digital designers,

- performing ECO particularly late in the design cycle,

- applying static timing analysis across timing paths spanning gates embedded in hierarchy of mixed-signal block(s).

Cadence has addressed these challenges by integrating the Virtuoso custom and Encounter digital platforms on a common OpenAccess database, enabling data and constraint sharing without translation and full transparency between the analog and digital parts of the layout in both platforms.”

Casale-Rossi has described how Synopsys addresses the problem.  “A&M/S has always had placement (e.g. symmetry, rotation) and routing (e.g. shielding, length/resistance matching) special requirements. With Synopsys’ Custom Designer – IC Compiler round-trip, and with our Galaxy Custom Router, we are providing our partners and customers with an integrated environment for digital and analog & mixed-signal design implementation that helps address the challenges.”

Conclusion

The bottom line is that EDA tool providers, standards-developing organizations and semiconductor foundries have significant further work to do.  IC complexity will increase, and with it the number of mixed-signal designs.  Mixed-signal third-party IP is by nature directly connected to a specific foundry and process, since it must be developed and verified at the transistor level.  Thus the complexity of integrating IP development and fabrication will limit the number of IP providers to those companies big enough to obtain the attention of the foundry.

Newer Processes Raise ESL Issues

Wednesday, August 13th, 2014

Gabe Moretti, Senior Editor

In June I wrote about how EDA changed its traditional flow in order to support advanced semiconductor manufacturing.  Although the changes are significant and meaningful, I do not think they are enough to sustain the increase in productivity required by financial demands.  What is necessary, in my opinion, is better support for system-level developers.

Leaving the solution to design and integration problems to a later stage of the development process creates more complexity since the network impacted is much larger.  Each node in the architecture is now a collection of components and primitive electronic elements that dilute and thus hide the intended functional architecture.

Front End Design Issues

Changes in the way front-end design is done are being implemented.  Anand Iyer, Calypto’s Director of Product Marketing, focused on the need to plan power at the system level.  He observed that: “Addressing DFP issues needs to be done in the front-end tools, as the RTL logic structure and architecture choices determine 80% of the power. Designers need to minimize the activity/clock frequency across their designs since this is the only metric to control dynamic power. They can achieve this in many ways: (1) reducing activity permanently from their design, or (2) reducing activity temporarily during the active mode of the design.”  Anand went on to cover the two points: “The first point requires a sequential analysis of the entire design to identify opportunities where we can save power. These opportunities need to be evaluated against possible timing and area impact. We need automation when it comes to large and complex designs. PowerPro can help designers optimize their designs for activity.”

As for the other point he said: “The second issue requires understanding the interaction of hardware and software. Techniques like power gating and DVFS fall under this category.”

Anand also recognized that high-level synthesis can be used to achieve low-power designs.  Starting from C++ or SystemC, architects can produce alternative microarchitectures and see the power impact of their choices (with physically aware RTL power analysis).  This is hugely powerful for enabling exploration, because doing this only at RTL is time consuming, and it is unrealistic to actually try multiple implementations of a complex design.  Plus, the RTL low-power techniques are automatically considered and automatically implemented once you have selected the best architecture that meets your power, performance, and cost constraints.
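To ground that, below is a hedged example of the style of C++ such a flow typically starts from: an untimed, fixed-size FIR filter. Nothing here is tied to a particular HLS tool, and the tap count and types are illustrative; the point is only that the same source admits several hardware implementations (loops unrolled into parallel multipliers, or folded onto one multiplier over several cycles), which is exactly the power/performance/area exploration described above.

#include <array>
#include <cstdint>

constexpr int TAPS = 8;   // illustrative filter size

// Untimed, synthesizable-style C++: no pointers to dynamic memory,
// fixed trip-count loops, fixed-width integer types.
int32_t fir(int16_t sample,
            const std::array<int16_t, TAPS>& coeff,
            std::array<int16_t, TAPS>& shift_reg) {
    // Shift in the new sample (a loop an HLS tool might fully unroll).
    for (int i = TAPS - 1; i > 0; --i) shift_reg[i] = shift_reg[i - 1];
    shift_reg[0] = sample;

    // Multiply-accumulate; unrolling vs. folding this loop is the main
    // area/power/performance knob the architect explores.
    int32_t acc = 0;
    for (int i = 0; i < TAPS; ++i) acc += int32_t(coeff[i]) * shift_reg[i];
    return acc;
}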

Steve Carlson, Director of Marketing at Cadence, pointed out that about a decade ago design teams had their choice of about four active process nodes when planning their designs.  He noted that: “In 2014 there are ten or more active choices for design teams to consider.  This means that the solution space for product design has become a lot richer.  It also means that design teams need a more fine-grained approach to planning and vendor/node selection.  It follows that the assumptions made during the planning process need to be tested as early and as often, and with as much accuracy, as possible at each stage. The power/performance and area trade-offs create end-product differentiation.  One area that can certainly be improved is the connection to trade-offs between hardware architecture and software.  Getting more accurate insight into power profiles can enable trade-offs at the architectural and micro-architectural levels.

Perhaps less obvious is the need for process-accurate early physical planning (i.e., planning that understands design rules for coloring, etc.).”

As shown in the following figure, designers have to be aware that parts of the design come from different suppliers, and thus Steve states that: “It is essential for the front-end physical planning/prototyping stages of design to be process-aware to prevent costly surprises down the implementation road.”

Simulation and Verification

One of the major recent changes in IC design is the growing number of mixed-signal designs.  They present new design and verification challenges, particularly when new advanced processes are targeted for manufacturing.  On the standards development side, Accellera has responded by releasing a new version of its Verilog-AMS.  It is a mature standard, originally released in 2000 and built on top of the Verilog subset of IEEE 1800-2012 SystemVerilog.  The standard defines how analog behavior interacts with event-based functionality, providing a bridge between the analog and digital worlds. To model continuous-time behavior, Verilog-AMS is defined to be applicable to both electrical and non-electrical system descriptions.  It supports conservative and signal-flow descriptions and can also be used to describe discrete (digital) systems and the resulting mixed-signal interactions.

The revised standard, Verilog-AMS 2.4, includes extensions to benefit verification, behavioral modeling and compact modeling. There are also several clarifications and over 20 errata fixes that improve the overall quality of the standard. Resources on how best to use the standard and a sample library with power and domain application examples are available from Accellera.

Scott Little, chair of the Verilog AMS WG stated: “This revision adds several features that users have been requesting for some time, such as supply sensitive connect modules, an analog event type to enable efficient electrical-to-real conversion and current checker modules.”

The standard continues to be refined and extended to meet the expanding needs of various user communities. The Verilog-AMS WG is currently exploring options to align Verilog-AMS with SystemVerilog in the form of a dot standard to IEEE 1800. In addition, work is underway to focus on new features and enhancements requested by the community to improve mixed-signal design and verification.

Clearly another aspect of verification that has grown significantly in the past few years is the availability of Verification IP modules.  Together with the new version of the UVM 1.2 (Universal Verification Methodology) standard just released by Accellera, they represent a significant increment in the verification power available to designers.

Jonah McLeod, Director of Corporate Marketing Communications at Kilopass, is also concerned about analog issues.  He said: “Accelerating SPICE has to be the major tool development of this generation of tools. The biggest problem designers face in complex SoCs is getting corner cases to converge. This can be time consuming and imprecise with current-generation tools.  Start-ups claiming Monte Carlo SPICE acceleration, like Solido Design Automation and CLK Design Automation, are attempting to solve the problem. Both promise to achieve SPICE-level accuracy on complex circuits within a couple of percentage points in a fraction of the time.”

One area of verification that is not often covered is its relationship with manufacturing test.  Thomas L. Anderson, Vice President of Marketing at Breker Verification Systems, told me that: “The enormous complexity of a deep submicron (32, 28, 20, 14 nm) SoC has a profound impact on manufacturing test. Today, many test engineers treat the SoC as a black box, applying stimulus and checking results only at the chip I/O pins. Some write a few simple C tests to download into the SoC’s embedded processors and run as part of the manufacturing test process. Such simple tests do not validate the chip well, and many companies are seeing returns with defects missed by the tester. Test time limitations typically prohibit the download and run of an operating system and user applications, but clearly a better test is needed. The answer is available today: automatically generated C test cases that run on “bare metal” (no operating system) while stressing every aspect of the SoC. These run realistic user scenarios in multi-threaded, multi-processor mode within the SoC while coordinating with the I/O pins. These test cases validate far more functionality and performance before the SoC ever leaves the factory, greatly reducing return rates while improving the customer experience.”
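The flavor of such a test can be sketched as follows; this is a hand-written, hypothetical example rather than anything produced by Breker’s tools, and every address and bit field in it is invented. Each processor calls the same entry point with its core ID, the cores coordinate through a shared memory flag, and the result is reported on an I/O register the tester can observe (production code would also need memory barriers and cache maintenance, omitted here).

#include <cstdint>

namespace {
// Invented memory map for a hypothetical accelerator, GPIO and sync word.
volatile uint32_t* const ACCEL_CTRL = reinterpret_cast<volatile uint32_t*>(0x50000000u);
volatile uint32_t* const ACCEL_STAT = reinterpret_cast<volatile uint32_t*>(0x50000004u);
volatile uint32_t* const GPIO_OUT   = reinterpret_cast<volatile uint32_t*>(0x50010000u);
volatile uint32_t* const SYNC_FLAG  = reinterpret_cast<volatile uint32_t*>(0x2000FF00u);
}

// Entry point called on every core with its hardware core ID; no OS assumed.
extern "C" void test_main(uint32_t core_id) {
    if (core_id == 0) {
        *ACCEL_CTRL = 1u;                      // core 0 kicks off the accelerator
        while ((*ACCEL_STAT & 1u) == 0u) { }   // wait for the assumed done bit
        *SYNC_FLAG = 1u;                       // release the other core
    } else {
        while (*SYNC_FLAG == 0u) { }           // core 1 waits, then checks the result
        uint32_t ok = ((*ACCEL_STAT & 2u) == 0u);  // assumed bit 1 = error flag
        *GPIO_OUT = ok ? 0xA5u : 0x5Au;        // signal pass/fail on pins the tester watches
    }
}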

Blog Review – Mon. July 14 2014

Monday, July 14th, 2014

Accellera prepares UVM; Shades of OpenGL ES; Healthy heart in 3D; Webinar for SoC-IoT; Smart watch tear-down. By Caroline Hayes, Senior Editor.

An informative update on Universal Verification Methodology (UVM) 1.2 is set out by Dennis Brophy, Mentor Graphics, on the announcement by Accellera of the update. Ahead of the final review process, which will end October 31, the author sets out what the standard may mean for current and future projects.

The addition of Compute Shaders to the OpenGL ES mobile API is one of the most notable changes, says Tim Hartley, ARM. He explains what these APIs do and where to use them for maximum effectiveness.

Dassault Systemes was part of The Living Heart Project, producing a video with the BBC to advertise the world’s first realistic 3D simulation model of a human heart, developed with Simulia software. The blog, by Alyssa, adds some background and context to how it can be used.

A webinar on Tuesday July 22, covering the SoC verification challenges in the IoT will be hosted by ARM and Cadence. Brian Fuller flags up why presenters in ‘SoC Verification Challenges in the IoT Age’ will help those migrating from 8- and 16bit systems, with a focus on using an ARM Cortex-M0 processor.

Inside the Galaxy Gear, the truly wearable smart watch, is an ARM Cortex-M4 powered STMicroelectronics device. Chris Ciufo cannot pretend to be taken off-guard by the ABI Research teardown.

