
Archive for April, 2015

Cadence Introduces Indago Debug Platform

Tuesday, April 28th, 2015

Gabe Moretti, Senior Editor

Cadence Design Systems, Inc. announced the Cadence Indago Debug Platform, a new debugging solution that the company claims reduces the time to identify bugs in a design by up to 50 percent compared to traditional signal- or transaction-level debug methods.

Cadence chose to put the accent on the second syllable.  Personally I think that the accent on the first syllable would give more verve to the name, while the accent on the second makes it more laid back.  But I do need to find some drawbacks in the product!

The patented root-cause analysis technology in the Indago Debug Platform filters out data that is not relevant, letting engineers go from the symptom of a single bug to the cause of all related bugs. Current debug methodologies require multiple simulation iterations to incrementally extract the data points that ultimately point to the source of a bug. The technology in the Indago Debug Platform can significantly reduce the time needed to resolve schedule-impacting bugs, using a common set of resources that supports a suite of commercial and user-created apps to automate the analysis of data from multiple verification engines and multiple vendors.
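Cadence has not published how its root-cause engine works, but the general idea the platform is built on, recording execution data once and tracing a failure backwards offline instead of re-simulating repeatedly, can be sketched in a few lines. The sketch below is purely conceptual: the Assignment record, the trace format, and the trace_root_cause helper are invented for illustration and do not reflect the Indago data model.

```python
# Conceptual sketch of record-once, analyze-offline root-cause tracing.
# Nothing here reflects the actual Indago implementation; the Assignment
# record and the trace format are invented for illustration only.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Assignment:
    signal: str          # signal that was written
    time: int            # simulation time of the write
    sources: List[str]   # signals read to compute the new value
    location: str        # source file and line of the driving statement

def trace_root_cause(trace: List[Assignment], failing_signal: str) -> List[Assignment]:
    """Walk backwards from the last write to a failing signal,
    following data dependencies, instead of re-running simulation."""
    last_write: Dict[str, Assignment] = {}
    for a in trace:                      # index the most recent driver of each signal
        last_write[a.signal] = a

    chain, frontier, seen = [], [failing_signal], set()
    while frontier:
        sig = frontier.pop()
        if sig in seen or sig not in last_write:
            continue
        seen.add(sig)
        a = last_write[sig]
        chain.append(a)
        frontier.extend(a.sources)       # keep walking toward the original cause
    return chain                         # candidate statements, failure first

# Hypothetical example: a bad value on 'ack' is traced back through 'req' to 'cfg_mode'.
trace = [
    Assignment("cfg_mode", 0, [], "init.sv:12"),
    Assignment("req", 40, ["cfg_mode"], "fsm.sv:88"),
    Assignment("ack", 55, ["req"], "fsm.sv:101"),
]
for step in trace_root_cause(trace, "ack"):
    print(step.location, step.signal)
```

The point is that once the dependency data has been captured, walking from a bad observation back to its originating statement becomes a traversal over recorded data rather than another simulation run.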

“Leading-edge verification projects create terabytes of data every day, making debug a big data problem for semiconductor and system companies,” said Andy Eliopoulos, vice president, research and development, Advanced Verification Solutions at Cadence.  “With the Indago Debug Platform, Cadence helps solve this problem by automating the process of finding the root cause for a bug. For the first time, engineers can have a collaborative environment across multiple verification engines that both reduces the time to solve the discovered bug and uncovers the root cause of other bugs that may be buried in the data.”

In addition to the Indago Debug Platform, Cadence also announced three debugging apps that plug into the platform and can be used with other verification tools to provide an integrated and synchronized debug solution for testbench, verification IP (VIP), and hardware/software debug for system-on-chip (SoC) designs.

The Indago Debug Platform and debugging apps are part of the comprehensive Cadence System Development Suite and are currently available for early adopters. General availability is expected by June 2015.

With a unified debug platform and the debug apps, the Indago Debug Platform enables multiple engineering specialists from design, testbench, embedded software and protocol verification to operate as a team to resolve SoC bugs. The three debug apps are:

- Indago DebugAnalyzer:  Extends root-cause analysis from e testbenches (IEEE 1647) to SystemVerilog (IEEE 1800) and increases performance by up to 10X

- Indago Embedded Software Debug: Resolves bugs associated with embedded software applications by synchronizing software and hardware source code debug

- Indago Protocol Debug:  Visualizes advanced protocols such as DDR4, ARM AMBA AXI and ACE using Cadence VIP.

Synopsys to Acquire Codenomicon

Wednesday, April 22nd, 2015

Gabe Moretti, Senior Editor

After what many thought was a diversion of focus when Synopsys acquired Coverity, the company is making another bold move with the announced acquisition of Codenomicon.

Based in Finland, Codenomicon is well-known and highly respected in the global software security world with a focus on software embedded in chips and devices.

The official Synopsys release states: “The additional talent, technology and products will expand Synopsys’ presence in the software security market segment and extend the Coverity quality and security platform to help software developers throughout various organizations quickly find and fix security vulnerabilities and protect applications from security attacks.”

A fine thought, and certainly true.  But looking at the security problems in the IoT architecture, both those already found and those yet to be written about, I think that Synopsys should not minimize the impact that the technologists at Codenomicon will have on the EDA market.

“Businesses are increasingly concerned about the security of their applications and protecting customer data. Adding the Internet of Things to the mix increases the complexity of security even further. During the past 15 months, the world was hit by major security breaches such as Heartbleed, Shellshock, etc.,” said Chi-Foon Chan, president and co-CEO of Synopsys. “By combining the Coverity platform with the Codenomicon product suite, Synopsys will expand its reach to provide a more robust software security solution with a full set of tools to help ensure the integrity, privacy and safety of an organization’s most critical software applications.”

Codenomicon’s customer base includes some of the world’s leading organizations in telecommunications, finance, manufacturing, software development, healthcare, automotive and government agencies.  As part of Synopsys, Codenomicon’s solutions will deliver a more comprehensive security offering for the software development lifecycle, adding its Defensics tool for file and protocol fuzz testing, and its AppCheck tool for software composition analysis and vulnerability assessment, to the embedded software used in electronics systems.

The Codenomicon Defensics tool, which was used to discover the Heartbleed bug, automatically tests the target system for unknown vulnerabilities, helping developers find and fix them before a product goes to market. It is a systematic solution for making systems more robust, hardening them against cyber-attacks, and mitigating the risk of 0-day vulnerabilities. The Defensics tool also helps expose failed cryptographic checks, privacy leaks and authentication bypass weaknesses. It is heavily used by buyers of Internet-enabled products to validate and verify that procured products meet their stringent security and robustness requirements.
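The release does not describe how Defensics generates its test cases, but the core idea of protocol fuzz testing, systematically corrupting valid inputs and watching the target for crashes, hangs or bad error handling, can be illustrated with a small mutation-based sketch. The mutation strategy and the toy_parser stand-in below are invented for illustration and have nothing to do with Codenomicon's actual algorithms.

```python
# Minimal illustration of mutation-based fuzzing: take a valid message,
# apply small corruptions, and watch the target for failures.
# The mutations and the toy parser are invented for illustration only.

import random

def mutate(message: bytes, rng: random.Random) -> bytes:
    """Corrupt a valid message: flip a byte, truncate it, or pad it."""
    data = bytearray(message)
    choice = rng.randrange(3)
    if choice == 0 and data:                     # flip one random byte
        i = rng.randrange(len(data))
        data[i] ^= 0xFF
    elif choice == 1:                            # truncate the message
        data = data[: rng.randrange(len(data) + 1)]
    else:                                        # append junk bytes
        data += bytes(rng.randrange(1, 64))
    return bytes(data)

def toy_parser(packet: bytes) -> None:
    """Stand-in for the system under test: a naive length-prefixed parser.
    Robust code would reject malformed input gracefully; this one blows up,
    which is exactly what the fuzzer is looking for."""
    length = packet[0]                           # declared payload length
    payload = packet[1 : 1 + length]
    if len(payload) != length:
        raise ValueError("declared length exceeds packet size")

def fuzz(seed_message: bytes, iterations: int = 1000) -> None:
    rng = random.Random(0)
    for i in range(iterations):
        case = mutate(seed_message, rng)
        try:
            toy_parser(case)
        except Exception as exc:                 # a real harness also watches for hangs and crashes
            print(f"iteration {i}: failure on input {case!r}: {exc}")
            return

fuzz(bytes([4]) + b"ping")
```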

The Codenomicon AppCheck tool adds software composition analysis (SCA) capabilities to the Coverity platform, helping customers reduce risks in third-party and open source components. When using the AppCheck tool, customers are able to obtain a software bill of materials (BOM) for their application portfolios, and identify components with known vulnerabilities.
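Synopsys does not detail how AppCheck performs its matching, but the heart of software composition analysis, comparing a product's bill of materials against a feed of known-vulnerable component versions, is straightforward to illustrate. The BOM entries, the Advisory record, and the exact-version matching below are simplifications invented for this sketch; a real tool would track version ranges and query a curated feed such as the NVD.

```python
# Toy software composition analysis: match a bill of materials against a
# list of known-vulnerable component versions. The data model is invented
# and does not describe AppCheck's actual matching logic or feeds.

from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Component:
    name: str
    version: str

@dataclass(frozen=True)
class Advisory:
    name: str
    affected_versions: frozenset  # exact versions only, for simplicity
    advisory_id: str

def scan(bom: List[Component], advisories: List[Advisory]) -> List[str]:
    """Report every BOM entry that matches a known advisory."""
    findings = []
    for component in bom:
        for adv in advisories:
            if component.name == adv.name and component.version in adv.affected_versions:
                findings.append(f"{component.name} {component.version}: {adv.advisory_id}")
    return findings

# Hypothetical inputs: a small firmware BOM and two advisories.
bom = [Component("openssl", "1.0.1f"), Component("zlib", "1.2.8")]
advisories = [
    Advisory("openssl", frozenset({"1.0.1e", "1.0.1f"}), "CVE-2014-0160"),  # Heartbleed
    Advisory("busybox", frozenset({"1.21.0"}), "EXAMPLE-0001"),
]
for finding in scan(bom, advisories):
    print(finding)
```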

Intel Acquisition of Altera: the Worth of Rumors

Monday, April 13th, 2015

Gabe Moretti, Senior Editor

In the last two weeks John Cooley, the owner of the Deep Chip column, has spent a significant number of words commenting on the Wall Street Journal story about Intel’s offer to acquire Altera.  At the end of those two weeks I had learned, thanks to John, a full list of possible short-term negative events that can result from an acquisition.  I also learned from the usual anonymous contributor that at least two projects using FPGA devices to increase search speed in a data center did not meet expected goals.  This second contribution was in response to the reason the Wall Street Journal article gave for Intel’s interest in Altera.  The writer of the Deep Chip piece, who signs himself or herself “Been There, Done That,” states that the main reason for the Intel acquisition is the expansion of the FPGA market driven by the need to use FPGA devices in data centers to increase search speed.

By the way, I really wish John would rely less on these anonymous contributions because, more frequently than is comfortable, such pieces turn out to have ulterior motives.  “Caveat lector,” reader beware, is the approach to take with these contributions.   The anonymous contribution published this week regarding the supposed technological reason for the acquisition of Altera by Intel is particularly off the mark.

“Been There, Done That” starts from a premise that is hardly defensible.  In my job I talk to both financial analysts and editors.  We all know the names of the few analysts who truly follow our industry: none of them is connected with this particular story.  Financial editors generally have only a superficial knowledge of the industry.  The Deep Chip contributor took the Wall Street Journal’s explanation for the acquisition as the real reason for the move.  Wrong, really wrong.  Unlike the Wall Street Journal editor, Intel has people who do not just rely on a Google search to assess the potential of the acquisition, both from a technological and a financial point of view.  FPGA in the data center is likely to have been a topic found with a Google search, knowing how good Intel is at pointing the financial community in the wrong direction when it wants to.  I can assure you that the real reason for the acquisition, if the acquisition does indeed take place, may never be known.  Reasons will be given, but the real, fundamental one may never be known outside a small group at Intel.  Please do not be so naïve as to assume that every acquisition offer has the goal of actually ending in an acquisition.  Do you remember how much money Carl Icahn made with the “failed” acquisition of Mentor by Cadence?

The anonymous Deep Chip contributor cites a published paper from Microsoft personnel stating that their project of using FPGA devices to increase search speed only doubled the speed. “Been There, Done That” states that “my engineers and I” have identified 14 failures in such an approach.  The best I can deduce is that that particular engineering team also used FPGAs in an attempt to increase the search speed of a system and found that an ASIC solution worked better in their particular case.  I ask myself why an entire engineering team has the time to analyze a Wall Street Journal article instead of doing productive work.  Is it to derail the acquisition, to ridicule Intel, or maybe the Wall Street Journal?

The stated reasons for the projected failure of the acquisition rest on one Microsoft project and the experience of another engineering team.  I am sure that the writer is extremely familiar with the project he or she led, but probably not intimately familiar with the purpose of the Microsoft project.  Sometimes collateral results are as important as, or more important than, the stated purpose of a project.  Let me point out that the Microsoft project was not meant to deliver a sellable product and that it may have been only one of a number of parallel projects.

So “Been There, Done That” fails to prove that the assumed technical reason for the acquisition is misguided, since it is likely that the reason does not exist.  Unless, of course, he or she works for Intel, in which case good luck hiding under a pseudonym.  I do not presume to know the reason or reasons Intel has to acquire Altera, but I can point out that the FPGA market is healthy and likely to grow, that there might be technological synergy between FPGA and IC engineering (in either or both directions), and that it may be more efficient to acquire experienced human resources through an acquisition than in the open market.

Design Automation Is More Than EDA

Wednesday, April 8th, 2015

Gabe Moretti, Senior Editor

In a couple of months the EDA industry will hold its yearly flagship conference, the Design Automation Conference (DAC).  Just a few days ago Vic Kulkarni, SVP & GM, RTL Power Business at the ANSYS-Apache Business Unit, told me that the industry should be called the Design Automation Industry, not EDA.  In reality, both the focus of the EDA industry and the contents of DAC are mostly IC design.  That activity is now only a portion of system design, so the conference does not live up to its name and the industry does not support system design in its entirety.

For almost its entire life the EDA industry has focused only on hardware, and it has done that well.  That was certainly enough when products were designed with the view that an electronic system’s purpose was to execute application software with the help of an operating system and associated device drivers.

A few years ago the EDA industry realized that the roles of hardware and software had changed.  Software was being used in place of hardware to implement system tasks and to “personalize” systems that were not end-user programmable.  It then became necessary to verify these systems, and new markets, those of virtual prototyping and software/hardware verification, grew and continue to grow.  Yet the name remains Electronic Design Automation, and most of its popular descriptions, such as the one in Wikipedia, deal only with hardware design and development.

The Internet of Things (IoT), where intelligent electronic products are interconnected to form a distributed and powerful data acquisition, analysis, and control system, is considered a major growth area for the electronics industry and thus for EDA.  A small number of companies, Ansys and Mathworks for example, realized some time ago that a system is much more than “just” electronics and software, and these companies now have an advantage in the IoT market segment.

By developing and verifying only the electronic portion of a product, traditional EDA companies run the risk of fooling designers into thinking that the product is as efficient and secure as it can be.  In fact, even virtual prototyping tools cannot make an absolute statement regarding the robustness of the software portion of the design.  All that can be said using these tools is that the interface between hardware and software has been sufficiently verified and that the software appears to work correctly when used as intended.  But a number of organizations and individuals have pointed to critical security issues in existing real-time systems that are the precursors of IoT.  The recent attention to security in automotive applications is an example.

The use of MEMS in IoT applications should push EDA leaders to ask why MCAD is still considered a discipline separate from EDA.  The interface and concurrent execution of the electro-mechanical system is as vital as the EDA portion.  Finally, the EDA industry has another weakness when it comes to providing total support for IoT products.  Nature is analog, yet the analog portion of the EDA industry lags significantly behind the development achieved in the digital portion.  Digital tools offer only approximations that have been, and still are, good enough for simpler systems.  Leading-edge processes are clearly showing that such approximations yield results that are no longer acceptable, and product miniaturization has introduced a new set of issues that are almost entirely in the analog domain.

The good news is that the industry has always managed to catch up in response to customers’ demand.  The hope offered by IoT growth is not simply revenue growth due to new licenses, but a maturing of the industry to finally provide support for true system design and verification.