Chip Design Magazine

Posts Tagged ‘Forte Design’

Blog Review Mon. July 21, 2014

Monday, July 21st, 2014

Plane talking at the Farnborough Air Show from Dassault Systemes; Synopsys goes vertical; a Forte Q&A; new views at ARM; interface integration at Mentor Graphics. By Caroline Hayes, Senior Editor.

Inspired by the vertical takeoff of the iconic Harrier jump jet, Michael Posner looks at the vertical HAPS daughter board launched by Synopsys. He lists the real estate and connector benefits of thinking “laterally”.

When the acquisition of Forte by Cadence was announced, many questions were asked, and Richard Goering of Cadence asks them of former Forte CEO Sean Dart. Now senior group director for R&D in the Cadence System-Level Design Group, Dart has some interesting insights into high-level synthesis and the advantages of being acquired by Cadence.

Dassault Systemes is flying high, with its own chalet at this year’s Farnborough International Airshow in the UK. Amongst the aircraft, the company had dedicated meeting spaces and a 3DEXPERIENCE Playground, reports Aurelien. There are some great images on this blog, amongst the news items.

New boy on the ARM block, Varun Sarwal, dives straight into the blogosphere with news of the Juno 64-bit development platform. He explains the architecture well, covering both hardware and software overviews.

Welcoming a guest blogger at Mentor Graphics, Phil Brumby examines user interface design and relates how the company has finished integrating its own Nucleus RTOS and development tools with the Embedded Wizard UI tool from TARA Systems for MCU- or MPU-based hardware systems, such as the glucose meter illustrated on the blog.

High Level Synthesis (HLS) Splits EDA Market

Friday, February 14th, 2014

Recent acquisitions and spin-offs by the major electronic design automation companies reveal key differences in the design of complex chips.

Last week, Cadence Design Systems announced the acquisition of Forte Design. This announcement brought renewed interest to the high-level synthesis (HLS) of semiconductor chips. But the acquisition also raises questions about emerging changes in the electronic design automation (EDA) industry. Before looking at these wide-ranging changes, let’s see how this acquisition may affect the immediate HLS market.

At first glance, it seems that Cadence has acquired a redundant tool. Both Forte’s Cynthesizer and Cadence’s C-to-Silicon are SystemC-based applications that help chip designers create complex system-on-chip (SoC) designs from higher levels of abstraction. “High-level synthesis (HLS) tools synthesize C/C++/SystemC code targeting hardware implementation, after the hardware-software trade-offs and partitioning activities have been performed upstream in the design flow,” explained Gary Dare, General Manager at Space Codesign, a provider of front-end architectural EDA design tools.
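As a rough, vendor-neutral illustration of what such HLS input looks like, here is an untimed C++ FIR filter written in the loop-and-array style that HLS tools translate into RTL. The function name, tap count and data widths are invented for this sketch; they are not tied to Cynthesizer or C-to-Silicon.

```cpp
#include <array>
#include <cstdint>

// Hypothetical HLS-style source: an untimed, fixed-size FIR filter.
// An HLS tool would unroll or pipeline these loops and map the arrays
// onto registers or memories during synthesis.
constexpr int TAPS = 4;

std::int32_t fir_step(std::array<std::int32_t, TAPS>& delay_line,
                      const std::array<std::int32_t, TAPS>& coeffs,
                      std::int32_t sample) {
    // Shift-register behavior: move older samples down the delay line.
    for (int i = TAPS - 1; i > 0; --i)
        delay_line[i] = delay_line[i - 1];
    delay_line[0] = sample;

    // Multiply-accumulate across the taps; in hardware this becomes a
    // datapath of multipliers feeding an adder tree.
    std::int32_t acc = 0;
    for (int i = 0; i < TAPS; ++i)
        acc += coeffs[i] * delay_line[i];
    return acc;
}
```

The point of the style is that the designer describes behavior only; scheduling, resource sharing and clocking decisions are left to the tool.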

Although both Cadence’s and Forte’s HLS tools are based on SystemC, they are not identical in function.

Forte’s strength lies in the optimization of data path design, i.e., the flow of data on a chip. This strength comes from Forte’s previous acquisition of the Arithmetic IP libraries, which focus on mathematical expressions and related data types, e.g., floating-point calculations.

How do the data bits from arithmetic computations move through an SoC? That’s where the C-to-Silicon tool takes over. “Forte’s arithmetic and data focus will complement Cadence’s C-to-Silicon strength in the control logic synthesis domain,” notes Craig Cochran, VP of Corporate Marketing at Cadence. The control plane serves to route and control the flow of information and arithmetic computations from the data plane world.

Aside from complementary data and control plane synthesis, the primary difference between the two tools is that C-to-Silicon was built on top of a register-transfer level (RTL) compiler, thus allowing chip designers to synthesize from the high-level SystemC level down to the hardware-specific gate level.

The emphasis on the SystemC support for both tools is important. “Assuming that Cadence keeps the Forte Design team, it will be possible to enhance C-to-Silicon with better SystemC support based on Cynthesizer technology,” observed Nikolaos Kavvadias, CEO, Ajax Compilers. “However, for the following 2 or 3 years both tools will need to be offered.”

From a long-term perspective, Cadence’s acquisition of Forte’s tools should enhance their position in classic high-level synthesis (HLS). “Within 2013, Cadence acquired Tensilica’s and Evatronix’ IP businesses,” notes Kavvadias. “Both moves make sense if Cadence envisions selling the platform and the tools to specialize (e.g. add hardware accelerators), develop and test at a high level.”

These last two process areas – design and verification – are key strategies in Cadence’s recent push into the IP market. Several acquisitions beyond Tensilica and Evatronix over the last few years have strengthened the company’s portfolio of design and verification IP. Further, the acquisition of Forte’s HLS tool should give Cadence greater opportunities to drive the SystemC design and verification standards.

Enablement versus Realization

Does this acquisition of another HLS company support Cadence’s long-term EDA360 vision? When first introduced several years ago, the vision acknowledged the need for EDA tools to do more than automate the chip development process. It shifted focus to the development of a hardware and software system in which the hardware development was driven by the needs of the software application.

“Today, the company is looking beyond the classic definition of EDA – which emphasizes automation – to the enablement of the full system including hardware, software and IP on chips and boards to interconnections and verification of the complete system,” explains Cochran. “And this fits into that system (HLS) context.”

The system design enablement approach was first introduced by Cadence during last month’s earnings report. The company has not yet detailed how the “enablement” approach relates to its previous “realization” vision. But Cochran explains it this way: “Enablement goes beyond automation. Enablement includes our content contribution to our customer’s design in the form of licensable IP and software. The software comes in many forms, from the drivers and applications that run on the Tensilica (IP) processors to other embedded software and codecs.”

This change in semantics may reflect the change in the way EDA tool companies interface with the larger semiconductor supply chain. According to Cochran and others, design teams from larger chip companies are relying more on HLS tools for architectural development and verification of larger and larger chips. In these ever-growing SoC designs, RTL synthesis has become a bottleneck. This means that chip designers must synthesize much larger portions of their chips in a way that reduces human error and subsequent debug and verification activities. That’s the advantage offered by maturing high-level synthesis tools.

Cadence believes that SystemC is the right language for HLS development. But what is the alternative?

HLS Market Fragments

The other major high-level synthesis technology in the EDA market relies on ANSI-C and C++ implementations that involve proprietary libraries and data types, explained Cochran. “These proprietary libraries and data types are needed to define the synthesis approach in terms of mathematical functions, communication between IP blocks and to represent concurrency.” The ANSI-C approach appeals to designers writing software algorithms rather than designing chip hardware.
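To make the point concrete without reproducing any vendor’s actual API, here is a tiny, invented bit-accurate integer type of the kind such proprietary libraries provide. Real HLS libraries offer far richer types (signed variants, fixed-point, saturation modes); this sketch only shows why a C++ template can model an arbitrary-width hardware datapath.

```cpp
#include <cstdint>

// Hypothetical sketch of a bit-accurate unsigned integer type, modeled
// loosely on the idea behind proprietary HLS libraries (the name UIntW
// is invented). Arithmetic wraps modulo 2^W, exactly as a W-bit
// hardware adder would.
template <int W>
struct UIntW {
    static_assert(W > 0 && W <= 32, "supports 1..32 bits");
    std::uint32_t value;

    explicit UIntW(std::uint32_t v) : value(v & mask()) {}

    static constexpr std::uint32_t mask() {
        // All-ones mask for W bits; special-case W == 32 to avoid
        // an out-of-range shift.
        return (W == 32) ? 0xFFFFFFFFu : ((1u << W) - 1u);
    }

    // Addition truncates to W bits, mirroring a W-bit datapath.
    UIntW operator+(UIntW other) const {
        return UIntW(value + other.value);
    }
};
```

For example, with a 6-bit type, 60 + 10 wraps to 6, just as it would in a 6-bit adder; a plain C `unsigned` would silently give 70 and synthesize a wider datapath than intended.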

Kavvadias agrees, but adds this perspective. “Given Synopsys’s recent acquisition of Target Compiler Technologies (TCT), it appears that the big three have different HLS market orientations: Cadence with a SystemC to ASIC/FPGA end-to-end flow, Synopsys moving on to application-specific instruction-set processor (ASIP) synthesis technology, while Mentor has offloaded its HLS business.”

“Further, Synopsys now has two totally distinct ASIP synthesis technologies, LISATek’s Processor Designer and TCT’s IP Designer,” notes Kavvadias. “They are based on different formalisms (LISA and nML) and have different code and model generation approaches. In order to appeal to ASIP synthesis tool users, Cadence will have to focus on the XPRES toolset. But I’m not sure this will happen.”

A few years ago, Mentor Graphics spun out its HLS technology to Calypto. But Mentor still owns a stake in the spin-off company. That’s why long-time EDA analyst Gary Smith believes that the Forte acquisition puts Cadence and Mentor-Calypto’s Catapult C way ahead of Synopsys’s Synfora Synphony C Compiler. “The Synopsys HLS tool pretty much only does algorithmic mapping to RTL, whereas the Forte and Mentor-Calypto tools can do algorithmic mapping, control logic, data paths, registers, memory interfaces, etc. — a whole design.”

What Does the Future Hold?

Forte’s focus on data path synthesis and associated arithmetic IP should mean few integration issues with Cadence’s existing HLS tool, C-to-Silicon. However, Kavvadias notes that the acquisition makes floating-point IP increasingly important. “It is relevant to algorithmists (e.g. using MATLAB or NumPy/SciPy) wishing to push-button algorithms to hardware.” The efficient implementation of floating-point functions is not a trivial task.

Kavvadias predicts that, “if CDNS buys a matrix-processor oriented IP portfolio, then their next step is definitely a MATLAB- or Python-to-hardware HLS tool and maybe the MATLAB/Python platform beyond that.” Matrix processors are popular in digital signal processing (DSP) applications that require massive multiply-accumulate (MAC) data operations.
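As a minimal sketch of why such workloads suit matrix processors, consider a matrix-vector product: its inner loop is one multiply-accumulate per element, which is exactly the operation DSP hardware dedicates fused MAC units to. The code below is illustrative only, not any vendor’s kernel.

```cpp
#include <array>

// Illustrative matrix-vector product. Each iteration of the inner loop
// is a single multiply-accumulate (MAC); a matrix processor executes
// many of these per cycle in dedicated MAC hardware.
template <int ROWS, int COLS>
std::array<double, ROWS>
matvec(const std::array<std::array<double, COLS>, ROWS>& m,
       const std::array<double, COLS>& v) {
    std::array<double, ROWS> out{};
    for (int r = 0; r < ROWS; ++r) {
        double acc = 0.0;
        for (int c = 0; c < COLS; ++c)
            acc += m[r][c] * v[c];  // the MAC operation
        out[r] = acc;
    }
    return out;
}
```

A ROWS×COLS product costs ROWS·COLS MACs, which is why MAC throughput, rather than raw clock rate, dominates DSP performance comparisons.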

Today’s sensor and mobile designs require the selection of the most energy-efficient platforms available. In turn, this mandates early, high-level power trade-off studies – a perfect fit for high-level synthesis tools.

What is the current state of ESL tools?

Friday, September 27th, 2013

By Gabe Moretti

In preparing for this panel I had a conversation with Gary Smith. As usual, Gary was quite helpful and discussed the most likely growth path for the ESL market as well as the underlying factors that will catalyze that growth. His contributions will be the subject of a follow-up article later this month. The bottom line is that ESL is dynamic and adapting to the requirements of the development of complex systems, both as ICs and as complete systems.

This month’s panelists were Brett Cline from Forte Design Systems, Jon McDonald from Mentor Graphics, Bill Neifert from Carbon Design Systems, and Frank Schirrmeister of Cadence.

Their pictures and short biographies follow the body of the article.

SLD: Is ESL (Electronic System Level) terminology obsolete in the context of complex integrated systems?

Frank Schirrmeister: No, the term ESL is still very valid. The “S” in ESL can stand for either “Software” or “System”. The scope of what a system represents is important, as well as the engines used to perform system and software related tasks.

A complex system can be a chip as well as a component in its system environment. Most often various types of software are executing on processing engines within the chip, and communicate through complex interconnect and peripherals to other components at the board level.

In the past, ESL has often been defined as “everything happening before RTL”. In reality, ESL overlaps heavily with verification, so one may suggest extending the definition to “everything happening until RTL is signed off and verified.” The reason for that becomes clear when considering a chip-development flow.

The flow generally encompasses the development hierarchy from IP blocks through sub-systems and the actual SoC together with the software hierarchy running on top of the chip. The key takeaway here is that after about 60% of the project’s time has elapsed, all relevant system and software dependencies have to be identified and verified; otherwise the chip is at risk of either not accurately executing the software or not working correctly within its system environment. The traditional pre-RTL and RTL-based decisions are blending and overlapping more and more, so the handover to silicon implementation, i.e. RTL being fully verified, is really the time at which all ESL questions have to be answered.

Brett Cline: ESL isn’t obsolete because it was insufficiently defined in the first place. Certainly, ESL can be cleverly used to encompass whatever one’s needs are and that is exactly what has been done.

ESL can be anything from the complete definition of an airplane’s electronic systems, a telephone system, an SoC, or even an FPGA with software. Why not? They are all electronic systems. For hardware design, Forte has used ESL to describe the abstraction level above RTL. We’ve also called the same level of abstraction the behavioral level.

SLD: Is there a demand for heterogeneous system development tools?

Jon McDonald: There is a strong need for each tool to have the ability to interact with representations from other design spaces. Engineering disciplines have been compartmentalized for a long time to address significant issues in system complexity. Each discipline needs development tools focused on reducing the complexity of that domain. Forcing heterogeneous system development tools results in reducing the capabilities of the tools across the board, while enabling the tools in each discipline to communicate with tools in connected areas has the potential to tremendously increase the value of the analysis in each design discipline without compromising the effectiveness of the individual tools. There is a need for heterogeneous communication of information between system development tools. There is not a need, and I believe it would be ineffective, to attempt to create heterogeneous system development tools.

Bill Neifert: Like it or not, software can have a significant impact on the behavior of the system. Far too often, hardware decisions are made with no real data from the software team and software teams are forced to live with decisions the hardware team made without their input.

A heterogeneous tool providing value to both the hardware and software team enables much earlier design collaboration and the capability to deliver an overall system product much better suited to the needs of the market. This is reflected in the way engineering teams are built today. I used to regularly attend customer meetings where the hardware and software engineers were meeting each other for the first time. That doesn’t seem to happen much anymore as the makeup of the engineering teams themselves becomes more heterogeneous. Tools must evolve to keep up with the changing demographics of their target user base.

SLD: How could EDA companies expand their role to integrate non-electronic components in a system?

Brett Cline: Integrating non-electronic components could help model the system more accurately prior to construction, with obvious benefits. There are plenty of tools and technologies available to help model the non-electronic portions of a system. At some point, the tool flow becomes extremely complex and modeling the entire system becomes prohibitively expensive or difficult. Should EDA companies choose to tackle this problem, getting the tool flow and integration right will be paramount to success.

Jon McDonald: By creating standard interfaces for passing relevant information into the EDA tool domains, we allow the individual tools to leverage the expertise and accuracy present in other domains. Similar to the previous question, ‘Is ESL terminology obsolete?’, I think EDA terminology is obsolete. EDA companies, Mentor specifically, have dramatically expanded their product portfolios to address disciplines outside the traditional EDA tool space. In my experience, Mentor is maintaining the capabilities of each of the tools in its intended domain and leveraging the communication of domain-specific knowledge by passing appropriate information to tools in other design disciplines.

SLD: One often hears “we do not know this market” as a justification to stick to electronics.  If so how is EDA to grow in the ESL segment?

Bill Neifert: These two questions seem interrelated, so I’ll answer them together. If EDA is going to be successful selling to non-hardware folks, it needs to stop treating them like hardware folks.

Hardware users pay large amounts of money for software that has to be configurable enough to meet the needs of the next generation of ever-growing hardware designs. Software users don’t tend to need this configurability (they’re not changing the hardware, just using it), but they need something easy to use and at a much lower price point. Far too often, however, EDA vendors try to sell software designers the same tools that they sell to hardware engineers, and this creates a fundamental value mismatch. EDA won’t be successful selling to software teams until it can create a need for software teams to use EDA solutions and until vendors are capable of providing them at a cost software designers can afford.

For example, in the virtual prototype space, the software user wants something fast and free on a per-use basis. The hardware user needs something accurate and configurable to enable design exploration. Some vendors address this by selling two different products, one aimed at software designers and another at hardware engineers, eliminating a lot of the collaboration value inherent to virtual prototypes. Carbon has a single tool that can be used by the hardware team to optimize its systems and then automatically creates the system used by the software team.

The software team gets value because it has a fast model suited for its needs at a software team price point.  The hardware team gets value because the software team is now running on the same representation of the hardware that it’s using for design.

This approach works to expand the market because we’re enabling the hardware team to expand the market for us. EDA companies with their expensive direct sales forces aren’t built well to address the price points required by software teams. By enabling the hardware team to deliver a valuable solution to the software team, we’re getting them to do our sales work for us, to the benefit of all involved.

Fundamentally, this is a business model challenge as much as it is a technical one. Trying to apply the EDA business model and cost structure to address software users’ needs will not work and has not worked. The cost of selling and developing leading-edge EDA tools is expensive and reflected in the cost of the tools.

Software teams are much larger than hardware teams, so they are a potentially lucrative market that could be tapped, even at a much lower price point. They must be sold to in a much more cost efficient way, however. By selling a product with a built-in distribution mechanism to enable the hardware team to easily support the needs of the software team, EDA companies can continue selling to their primary hardware users while enabling the much larger software community.

Frank Schirrmeister: The fact is that the core customers of EDA – the semiconductor houses – have taken on huge amounts of additional expertise over the last two decades as part of their developments. Where a set of drivers and a partnership with operating system vendors may have been enough 15 years ago, today the same vendors have to provide chips with reference ports of Android, Linux, Chrome OS and Windows Mobile just to win the socket. We all have to learn and deal with the aspects of those adjacent markets as they increasingly become a simple “must deal with” for existing EDA customers.

Brett Cline is vice president of Marketing and Sales for Forte Design Systems. Before joining Forte in 1999, he was director of Marketing at Summit Design, where he managed both the verification product line, including HDL Score, and marketing communications. Cline joined Summit through its acquisition of Simulation Technologies in 1997. He has held positions in development, applications, and technical marketing at Cadence Design Systems and General Electric. Cline holds a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Mass.

Jon McDonald is a Senior Technical Marketing Engineer at Mentor Graphics. He received a BS in Electrical and Computer Engineering from Carnegie Mellon and an MS in Computer Science from Polytechnic University. He has been active in digital design, language-based design and architectural modeling for over 15 years. Prior to joining Mentor Graphics, Mr. McDonald held senior technical engineering positions with Summit Design, Viewlogic Systems and HHB Systems.

Bill Neifert is CTO and vice president of Business Development at Carbon Design Systems. A Carbon cofounder, he has 13 years of electronics engineering experience with more than 10 years in EDA, including C-Level Design and Quickturn Design Systems. Neifert has designed high-performance verification and system-integration solutions, and developed an architecture and coding style for high-performance RTL simulation in C/C++. He has a Bachelor of Science degree and a Master of Science degree in Computer Engineering from Boston University.