
Posts Tagged ‘RTL’

Cadence Launches New Verification Solutions

Tuesday, March 14th, 2017

Gabe Moretti, Senior Editor

During this year’s DVCon U.S., Cadence introduced two new verification solutions: the Xcelium Parallel Simulator and the Protium S1 FPGA-Based Prototyping Platform, which incorporates innovative implementation algorithms to boost engineering productivity.

Xcelium Parallel Simulator

The new simulation engine is based on innovative multi-core parallel computing technology, enabling systems-on-chip (SoCs) to get to market faster. On average, customers can achieve 2X improved single-core performance and more than 5X improved multi-core performance versus previous-generation Cadence simulators. The Xcelium simulator is production proven, having been deployed to early adopters across mobile, graphics, server, consumer, internet of things (IoT) and automotive projects.

The Xcelium simulator offers the following benefits aimed at accelerating system development:

  • Multi-core simulation improves runtime while also reducing project schedules: The third-generation Xcelium simulator is built on technology acquired from Rocketick. It speeds runtime by an average of 3X for register-transfer-level (RTL) design simulation, 5X for gate-level simulation and 10X for parallel design-for-test (DFT) simulation, potentially saving weeks to months on project schedules (see the arithmetic sketch after this list).
  • Broad applicability: The simulator supports modern design styles and IEEE standards, enabling engineers to realize performance gains without recoding.
  • Easy to use: The simulator’s compilation and elaboration flow assigns the design and verification testbench code to the ideal engines and automatically selects the optimal number of cores for fast execution speed.
  • Incorporates several new patent-pending technologies to improve productivity: New features that speed overall SoC verification time include SystemVerilog testbench coverage for faster verification closure and parallel multi-core build.
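
Taken at face value, those quoted averages translate directly into schedule arithmetic. Below is a back-of-the-envelope Python sketch that divides hypothetical baseline regression runtimes by the 3X/5X/10X figures cited above; the baseline hours are placeholders for illustration, not Cadence data.

```python
# Back-of-the-envelope projection using the average speedups Cadence quotes
# for the Xcelium simulator: 3X for RTL, 5X for gate-level, 10X for DFT.
# The baseline hours below are hypothetical placeholders, not Cadence data.

QUOTED_SPEEDUPS = {"rtl": 3.0, "gate": 5.0, "dft": 10.0}

def projected_hours(baseline_hours: dict) -> dict:
    """Divide each workload's baseline runtime by its quoted speedup."""
    return {kind: hours / QUOTED_SPEEDUPS[kind]
            for kind, hours in baseline_hours.items()}

if __name__ == "__main__":
    baseline = {"rtl": 120.0, "gate": 200.0, "dft": 80.0}  # hypothetical hours
    for kind, hours in projected_hours(baseline).items():
        print(f"{kind}: {baseline[kind]:.0f}h -> {hours:.0f}h")
```

On those placeholder numbers, a 400-hour regression mix drops to 88 hours, which is the kind of compression behind the weeks-to-months claim.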

“Verification is often the primary cost and schedule challenge associated with getting new, high-quality products to market,” said Dr. Anirudh Devgan, senior vice president and general manager of the Digital & Signoff Group and the System & Verification Group at Cadence. “The Xcelium simulator combined with JasperGold Apps, the Palladium Z1 Enterprise Emulation Platform and the Protium S1 FPGA-Based Prototyping Platform offers customers the strongest verification suite on the market.”

The new Xcelium simulator further extends the innovation within the Cadence Verification Suite and supports the company’s System Design Enablement (SDE) strategy, which enables system and semiconductor companies to create complete, differentiated end products more efficiently. The Verification Suite is comprised of best-in-class core engines, verification fabric technologies and solutions that increase design quality and throughput, fulfilling verification requirements for a wide variety of applications and vertical segments.

Protium S1

The Protium S1 platform provides front-end congruency with the Cadence Palladium Z1 Enterprise Emulation Platform. By using Xilinx Virtex UltraScale FPGA technology, the new Cadence platform features 6X higher design capacity and an average 2X performance improvement over the previous-generation platform. The Protium S1 platform has already been deployed by early adopters in the networking, consumer and storage markets.

Protium S1 is fully compatible with the Palladium Z1 emulator

To increase designer productivity, the Protium S1 platform offers the following benefits:

  • Ultra-fast prototype bring-up: The platform’s advanced memory modeling and implementation capabilities allow designers to reduce prototype bring-up from months to days, thus enabling them to start firmware development much earlier.
  • Ease of use and adoption: The platform shares a common compile flow with the Palladium Z1 platform, which enables up to 80 percent re-use of the existing verification environment and provides front-end congruency between the two platforms.
  • Innovative software debug capabilities: The platform offers firmware and software productivity-enhancing features including memory backdoor access, waveforms across partitions, force and release, and runtime clock control.

“The rising need for early software development with reduced overall project schedules has been the key driver for the delivery of more advanced emulation and FPGA-based prototyping platforms,” said Dr. Anirudh Devgan, senior vice president and general manager of the Digital & Signoff Group and the System & Verification Group at Cadence. “The Protium S1 platform offers software development teams the required hardware and software components, a fully integrated implementation flow with fast bring-up and advanced debug capabilities so they can deliver the most compelling end products, months earlier.”

The Protium S1 platform further extends the innovation within the Cadence Verification Suite and supports the company’s System Design Enablement (SDE) strategy, which enables system and semiconductor companies to create complete, differentiated end products more efficiently. The Verification Suite is comprised of best-in-class core engines, verification fabric technologies and solutions that increase design quality and throughput, fulfilling verification requirements for a wide variety of applications and vertical segments.

Blog Review – Monday Sept. 15, 2014

Monday, September 15th, 2014

Video has a CAST of one; RTL clean-up as a simple chore; Detroit spins out roadmap; it’s all about ‘I’
By Caroline Hayes, Senior Editor.

The affable Warren Savage, IP eXtreme, opens a new season of five-minute chats and interviews CAST COO Nickos Zervas, who explains the role of Greece and a duty to customers.

Using the analogy of the latest household gadget, Graham Bell, Vice President of Marketing at Real Intent, explains how the company’s autoformal tool cleans up the RTL code in a design.

Invigorated by the Intelligent Transportation Society event in Detroit, John Day of Mentor Graphics steers his way through the road ahead for the automotive industry.

It sounds like the way to annoy Kris Flautner is to ask “What does the I in IoT stand for?” Apparently he is asked this a lot, but he patiently and clearly explains both the Internet’s role and also the challenges for connectivity and security, ahead of ARM TechCon 2014.

System Integration Requires a Shared Viewpoint

Tuesday, December 17th, 2013

By John Blyler

Qualcomm uses Dassault Systemes’ dashboarding tool in the design of its Hexagon DSP chip to incorporate multiple design metrics from key EDA tools.

The EDA tool market has long talked about its need to expand beyond the creation of silicon-based systems-on-chip (SoCs) to provide packages that integrate the larger hardware and software system. Specifically, the major tool vendors have emphasized the need to move beyond EDA-centric issues like electronic system level (ESL) design, functional verification, design-for-yield, or any similar so-called crisis issues. The goal has been to move beyond chip creation to system integration, dealing with both hardware and software at the chip, board, and end-user product levels.

“It begins with a shift from design creation to integration in the electronic systems industry,” states Cadence’s EDA Vision 360 report. EDA tool companies have had to expand their coverage into the larger system market, thanks to changes in the semiconductor supply chain.

Regardless of the drivers, the expansion from creation tools to integration tools for the larger system has not been an easy move, for a variety of technical and cultural reasons. Consider but one aspect of the problem: how to provide higher-level integration when your customer uses a variety of internal and competitive tools? For example, IDMs like Intel and Samsung, as well as fabless chip companies like Apple, use a variety of EDA tools for synthesis, place and route (P&R), timing and power closure, etc. Further, many use a mix of internal tools that have been tailored to their own needs.

To become a system integrator – at least from the chip-design viewpoint – tool providers will need a mechanism to gather, analyze and display useful data metrics from a variety of EDA packages. One of the few companies that comes close to offering such an application is not an EDA company at all, but a higher-level product lifecycle management (PLM) provider.

Qualcomm recently shared its challenges in integrating the metrics from a mix of chip-design tools. The problem was how to put together all of the disjointed design pieces for development of its Hexagon DSP-based multithreaded CPU architecture. With a global design team (San Diego, India and Austin), the company had to communicate all of the traditional design metrics like timing and area, along with secondary metrics like power and signal integrity. Adding to this technical complexity was the diversity of professionals who needed access to these metrics, from system architects and RTL coders to logical and physical designers.

The answer was simply to use dashboards to display data and metrics in such a way as to quickly show trends and trouble spots. Good dashboards highlight the metrics data in a graphical analysis format while also providing a transition from high-level to detailed low-level views. This abstraction-level zoom-in/zoom-out capability helps designers quickly spot trouble areas and then probe down into the details.

Dwight Galbi, Principal Manager, Qualcomm

Dashboarding is nothing new. “Qualcomm has many internal dashboards,” explained Dwight Galbi, Principal Manager of Qualcomm’s physical design team, at a recent Dassault Systemes Customer Forum. “We have dashboards that cover some of the (design metrics) … but not one that incorporated all of them.” What was needed was a dashboard that provided design metrics from a variety of EDA tools throughout the chip-design process.

That’s where Dassault Systemes’ dashboarding tool, Pinpoint, came into play. In his presentation, Galbi listed the mix of lifecycle tools (albeit from one vendor, i.e., Synopsys) used in his recent DSP project. The list included Design Compiler for synthesis, IC Compiler and Talus for P&R, and PrimeTime for signoff.

“The beauty here is that these are four different tools but you can incorporate all of the reports into the same web-based server,” said Galbi. Equally important (though not mentioned by Galbi) is that the tool provides a graphical visualization of physical design, timing paths, etc., without needing to reload the entire design block. This saves both time and money, since the user doesn’t need to activate a license from the EDA tool vendors.
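
As a rough illustration of the pattern Galbi describes (several tools’ text reports funneled into one web-based server), here is a minimal Python sketch that scrapes one headline metric from each report into a single JSON record a dashboard could serve. The file names, report formats and regular expressions are hypothetical; Pinpoint’s actual ingestion mechanism is proprietary and not described in the talk.

```python
# Hypothetical sketch: pull one headline metric from each EDA tool's text
# report into a single JSON record for a web dashboard. File names and
# regexes are invented for illustration; real report formats differ.
import json
import re
from pathlib import Path

# metric name -> (hypothetical report file, pattern capturing the value)
SOURCES = {
    "synthesis_area_um2": ("synth_area.rpt", r"Total cell area:\s+([\d.]+)"),
    "route_drc_violations": ("pnr_drc.rpt", r"Total violations:\s+(\d+)"),
    "signoff_wns_ns": ("signoff_timing.rpt", r"slack \(VIOLATED\)\s+(-?[\d.]+)"),
}

def collect(report_dir: Path) -> dict:
    """Scan each report for its metric; missing files or patterns give None."""
    metrics = {}
    for name, (filename, pattern) in SOURCES.items():
        path = report_dir / filename
        if not path.exists():
            metrics[name] = None
            continue
        match = re.search(pattern, path.read_text())
        metrics[name] = float(match.group(1)) if match else None
    return metrics

if __name__ == "__main__":
    # A dashboard backend would serve this JSON to the browser.
    print(json.dumps(collect(Path("reports")), indent=2))
```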

Further, using a dashboard can provide a way for geographically dispersed teams to communicate via a common view of the design. This is a key requirement for any system integration. For example, a chip’s register-transfer-level (RTL) code is often developed by teams in different geographic locations. Complicating the geographic challenges is the need to incorporate third-party IP and reused internal design blocks with the various RTL designs before the implementation process even begins. This is a problem because the physical layout and design team requires the synthesized RTL code (with all the IP), design planning and place-and-route (P&R) data to decide whether the primary chip design constraints can be met.

Getting the detailed RTL design team to work with the physical layout-design teams as soon as possible encourages communication and successful design practices. It helps mitigate the problems of siloed design activities. Also, a dashboard approach incorporates the essential data metrics from several different EDA tools into one place. This single, global view increases the likelihood of a successful SoC design, as well as of integrating that design – and the team – with the next level of system development.

What is the current state of ESL tools?

Friday, September 27th, 2013

By Gabe Moretti

In preparing for this panel I had a conversation with Gary Smith. As usual, Gary was quite helpful and discussed the most likely growth path for the ESL market as well as the underlying factors that will catalyze that growth. His contributions will be the subject of a follow-up article later this month. The bottom line is that ESL is dynamic and adapting to the requirements of the development of complex systems, both as ICs and as complete systems.

This month’s panelists were Brett Cline from Forte Design Systems, Jon McDonald from Mentor Graphics, Bill Neifert from Carbon Design Systems, and Frank Schirrmeister of Cadence.

Their pictures and short biographies follow the body of the article.

SLD: Is ESL (Electronic System Level) terminology obsolete in the context of complex integrated systems?

Frank Schirrmeister: No, the term ESL is still very valid. The “S” in ESL can stand for either “Software” or “System”. The scope of what a system represents is important, as well as the engines used to perform system- and software-related tasks.

A complex system can be a chip as well as a component in its system environment. Most often various types of software are executing on processing engines within the chip, and communicate through complex interconnect and peripherals to other components at the board level.

In the past, ESL has often been defined as “everything happening before RTL”. In reality, ESL overlaps heavily with verification, so one may suggest extending ESL to “everything happening until RTL is signed off and verified.” The reason for that becomes clear when considering a chip-development flow.

The flow generally encompasses the development hierarchy from IP blocks through sub-systems to the actual SoC, together with the software hierarchy running on top of the chip. The key takeaway here is that after about 60% of the project time has elapsed, all relevant system and software dependencies have to be identified and verified; otherwise the chip risks either not accurately executing the software or not working correctly within its system environment. The traditional pre-RTL and RTL-based decisions are blending and overlapping more and more, so the handover to silicon implementation, i.e. RTL being fully verified, is really the time at which all ESL questions have to be answered.

Brett Cline: ESL isn’t obsolete because it was insufficiently defined in the first place. Certainly, ESL can be cleverly used to encompass whatever one’s needs are and that is exactly what has been done.

ESL can be anything: the complete definition of an airplane’s electronic systems, a telephone system, an SoC, or even an FPGA with software. Why not? They are all electronic systems. For hardware design, Forte has used ESL to describe the abstraction level above RTL. We’ve also called the same level of abstraction behavioral level.

SLD: Is there a demand for heterogeneous system development tools?

Jon McDonald: There is a strong need for each tool to have the ability to interact with representations from other design spaces. Engineering disciplines have been compartmentalized for a long time to address significant issues in system complexity. Each discipline needs development tools focused on reducing the complexity of that domain. Forcing heterogeneous system development tools reduces the capabilities of the tools across the board, while enabling the tools in each discipline to communicate with tools in connected areas has the potential to tremendously increase the value of the analysis in each design discipline without compromising the effectiveness of the individual tools. There is a need for heterogeneous communication of information between system development tools; there is no need, and I believe it would be ineffective, to attempt to create heterogeneous system development tools.

Bill Neifert: Like it or not, software can have a significant impact on the behavior of the system. Far too often, hardware decisions are made with no real data from the software team and software teams are forced to live with decisions the hardware team made without their input.

A heterogeneous tool providing value to both the hardware and software team enables much earlier design collaboration and the capability to deliver an overall system product much better suited to the needs of the market. This is reflected in the way engineering teams are built today. I used to regularly attend customer meetings where the hardware and software engineers were meeting each other for the first time. That doesn’t seem to happen much anymore as the makeup of the engineering teams themselves becomes more heterogeneous. Tools must evolve to keep up with the changing demographics of their target user base.

SLD: How could EDA companies expand their role to integrate non-electronic components in a system?

Brett Cline: Integrating non-electronic components could help model the system more accurately prior to construction, with obvious benefits. There are plenty of tools and technologies available to help model the non-electronic portions of a system. At some point, the tool flow becomes extremely complex and modeling the entire system becomes prohibitively expensive or difficult. Should EDA companies choose to tackle this problem, getting the tool flow and integration right will be paramount to success.

Jon McDonald: By creating standard interfaces for passing relevant information into the EDA tool domains, we allow the individual tools to leverage the expertise and accuracy present in other domains.  Similar to the previous question, ‘Is ESL terminology obsolete?’ I think EDA terminology is obsolete.  EDA companies, Mentor specifically, have dramatically expanded their product portfolio to address disciplines outside the traditional EDA tool space. In my experience, Mentor is maintaining the capabilities of each of the tools in its intended domain and leveraging the communication of domain specific knowledge by passing appropriate information to tools in other design disciplines.
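
A minimal sketch of what such a standard interface might look like, assuming a simple JSON interchange record: one tool domain emits metrics, another ingests them, and neither needs to understand the other’s internals. The schema and field names below are hypothetical; McDonald does not name a specific format.

```python
# Hypothetical tool-neutral interchange record for passing domain-specific
# results between design disciplines. The schema is invented for illustration.
import json
from dataclasses import dataclass, asdict

@dataclass
class DomainMetric:
    producer: str   # tool that generated the value (e.g., a power estimator)
    domain: str     # design discipline the value belongs to
    name: str       # metric name, e.g., "avg_power"
    value: float
    unit: str

def to_interchange(metrics: list) -> str:
    """Serialize metrics to JSON so another tool domain can ingest them."""
    return json.dumps([asdict(m) for m in metrics], indent=2)

if __name__ == "__main__":
    handoff = [DomainMetric("power_estimator", "power", "avg_power", 1.8, "W")]
    print(to_interchange(handoff))  # a consumer in another domain parses this
```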

SLD: One often hears “we do not know this market” as a justification to stick to electronics. If so, how is EDA to grow in the ESL segment?

Bill Neifert: These two questions seem interrelated, so I’ll answer them together. If EDA is going to be successful selling to non-hardware folks, it needs to stop treating them like hardware folks.

Hardware users pay large amounts of money for software that has to be configurable enough to meet the needs of the next generation of ever-growing hardware designs. Software users don’t tend to need this configurability (they’re not changing the hardware, just using it) but they need something easy to use and at a much lower price point. Far too often, however, EDA vendors try to sell software designers the same tools they sell to hardware engineers, and this creates a fundamental value mismatch. EDA won’t be successful selling to software teams until it can create a need for software teams to use EDA solutions, and until the EDA vendor is capable of providing them at a cost software designers can afford.

For example, in the virtual prototype space, the software user wants something fast and free on a per-use basis. The hardware user needs something accurate and configurable to enable design exploration. Some vendors address this by selling two different products, one aimed at software designers and another at hardware engineers, eliminating a lot of the collaboration value inherent to virtual prototypes. Carbon has a single tool that can be used by the hardware team to optimize its systems and then automatically creates the system used by the software team.

The software team gets value because it has a fast model suited for its needs at a software team price point.  The hardware team gets value because the software team is now running on the same representation of the hardware that it’s using for design.

This approach works to expand the market because we’re enabling the hardware team to expand the market for us. EDA companies with their expensive direct sales forces aren’t built well to address the price points required by software teams. By enabling the hardware team to deliver a valuable solution to the software team, we’re getting them to do our sales work for us, to the benefit of all involved.

Fundamentally, this is a business model challenge as much as it is a technical one. Trying to apply the EDA business model and cost structure to address software users’ needs will not work and has not worked. The cost of selling and developing leading-edge EDA tools is expensive and reflected in the cost of the tools.

Software teams are much larger than hardware teams, so they are a potentially lucrative market that could be tapped, even at a much lower price point. They must be sold to in a much more cost efficient way, however. By selling a product with a built-in distribution mechanism to enable the hardware team to easily support the needs of the software team, EDA companies can continue selling to their primary hardware users while enabling the much larger software community.

Frank Schirrmeister: The fact is that the core customers of EDA – the semiconductor houses – have taken on huge amounts of additional expertise over the last two decades as part of their development work. Where a set of drivers and a partnership with operating-system vendors may have been enough 15 years ago, today the same vendors have to provide chips with reference ports of Android, Linux, Chrome OS and Windows Mobile just to win the socket. We all have to learn and deal with the aspects of those adjacent markets as they increasingly become a simple “must deal with” for existing EDA customers.


Brett Cline is vice president of Marketing and Sales for Forte Design Systems. Before joining Forte in 1999, he was director of Marketing at Summit Design, where he managed both the verification product line, including HDL Score, and marketing communications. Cline joined Summit through its acquisition of Simulation Technologies in 1997. He has held positions in development, applications, and technical marketing at Cadence Design Systems and General Electric. Cline holds a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Mass.


Jon McDonald is a Senior Technical Marketing Engineer at Mentor Graphics. He received a BS in Electrical and Computer Engineering from Carnegie Mellon and an MS in Computer Science from Polytechnic University. He has been active in digital design, language-based design and architectural modeling for over 15 years. Prior to joining Mentor Graphics, Mr. McDonald held senior technical engineering positions with Summit Design, Viewlogic Systems and HHB Systems.


Bill Neifert is CTO and vice president of Business Development at Carbon Design Systems. A Carbon cofounder, he has 13 years of electronics engineering experience with more than 10 years in EDA, including C-Level Design and QuickTurn Systems. Neifert has designed high-performance verification and system-integration solutions, and developed an architecture and coding style for high-performance RTL simulation in C/C++. He has a Bachelor of Science degree and a Master of Science degree in Computer Engineering from Boston University.