
Posts Tagged ‘verification’

Portable Stimulus

Thursday, March 23rd, 2017

Gabe Moretti, Senior Editor

Portable Stimulus (PS) is not a new sex toy, and it is not an Executable Specification either.  So what is it?  It is a method, or rather it will be one once the work is finished, for defining test inputs independently of the verification tool used.

As the complexity of a system increases, the cost of its functional verification increases at an even faster pace.  Verification engineers must consider not only the intended scenarios but also the erroneous ones, and increased complexity multiplies the number of unwanted scenarios.  To perform all the required tests, engineers use different tools, including logic simulators, accelerators and emulators, and FPGA prototyping systems.  Porting a test from one tool to another is time consuming and error prone, for a simple reason: not only does each class of tools use a different syntax, in some cases it also uses different semantics.

The Accellera Systems Initiative, commonly known simply as Accellera, is working on a solution.  It formed a Working Group to develop a way to define tests that is independent of the tool used to perform the verification.  The group, made up of engineers rather than marketing professionals, chose as its name exactly what it is supposed to deliver: a Portable Stimulus, since verification tests are made up of stimuli applied to the device under test (DUT), and those stimuli will be portable among verification tools.

Adnan Hamid, CEO of Breker, gave me a demo at DVCon US this year.  Breker's product is trying to solve the same problem, and the standard being developed will be similar in that it is based on the same concept.  Both take the form of a descriptive language, Breker's based on SystemC and PS based on SystemVerilog, but the approach is the same: the verification team develops a directed network in which each node represents a test.  The Accellera work must, of course, be vendor independent, so its task is more complex.  The figure below may give you an idea of the complexity.

Once the working group is finished, and it expects to be finished no later than the end of 2017, each EDA vendor can develop a generator that translates a test described in the PS language into the string of commands and stimuli required to actually perform that test with the tool in question.
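To make the idea concrete, here is a minimal, purely illustrative sketch in Python of the concept described above: an abstract scenario expressed as a directed graph of test actions, walked once and then translated into tool-specific commands by per-tool generators.  The names used (Scenario, SimulatorBackend, EmulatorBackend, the action labels) are invented for illustration and are not part of the Accellera language under development.

# Illustrative sketch only: a toy model of the Portable Stimulus idea,
# not the actual Accellera language or any vendor's implementation.
from collections import defaultdict

class Scenario:
    """A directed graph of abstract test actions (nodes) and legal orderings (edges)."""
    def __init__(self):
        self.edges = defaultdict(list)

    def add_step(self, action, next_action):
        self.edges[action].append(next_action)

    def walk(self, start):
        """Yield one legal sequence of actions, following the first successor each time."""
        action = start
        while True:
            yield action
            successors = self.edges.get(action)
            if not successors:
                return
            action = successors[0]  # a real tool would randomize or cover all paths

class SimulatorBackend:
    """Translate abstract actions into commands for a logic simulator."""
    def emit(self, action):
        return f"sim> run_test {action}"

class EmulatorBackend:
    """Translate the same abstract actions into commands for an emulator."""
    def emit(self, action):
        return f"emu> load_and_execute {action}"

# The same abstract scenario drives both engines.
scenario = Scenario()
scenario.add_step("reset_dut", "configure_dma")
scenario.add_step("configure_dma", "start_transfer")
scenario.add_step("start_transfer", "check_result")

for backend in (SimulatorBackend(), EmulatorBackend()):
    for action in scenario.walk("reset_dut"):
        print(backend.emit(action))

The point of the sketch is only the separation of concerns: the scenario is written once, and each engine gets its own generator, which is what makes the stimulus portable.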

The approach, of course, is such that the product of the Accellera work can later be easily submitted to the IEEE for standardization, since it will follow the IEEE requirements for standards.

My question is: what about Formal Verification?  I believe that it would be possible to derive assertions from the PS language.  If this can be done, it would be a wonderful result for the industry.  An IP vendor, for example, would then be able to provide a single definition of the tests used to verify the IP, and the customer would be able to use it readily no matter which tool is appropriate at the time of acceptance and integration of the IP.
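As a thought experiment only, and with no claim that the working group plans anything of the kind, the ordering edges in the toy scenario graph above could be mechanically turned into simple trace-checking assertions; this is roughly what "deriving assertions" from a portable stimulus description might look like in miniature.  The function name and trace format below are again invented for illustration.

# Speculative sketch: turning scenario-graph edges into ordering assertions.
def derive_ordering_assertions(edges):
    """For every edge (a -> b), produce a check that b never appears before a."""
    assertions = []
    for a, successors in edges.items():
        for b in successors:
            def check(trace, a=a, b=b):
                # b is only legal once a has already been observed in the trace
                seen_a = False
                for event in trace:
                    if event == a:
                        seen_a = True
                    elif event == b and not seen_a:
                        return False
                return True
            assertions.append((f"{b} must follow {a}", check))
    return assertions

edges = {
    "reset_dut": ["configure_dma"],
    "configure_dma": ["start_transfer"],
    "start_transfer": ["check_result"],
}

trace = ["reset_dut", "configure_dma", "start_transfer", "check_result"]
for name, check in derive_ordering_assertions(edges):
    print(name, "->", "pass" if check(trace) else "FAIL")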

DVCon Is a Must Attend Conference for Verification and Design Engineers

Monday, February 13th, 2017

Gabe Moretti, Senior Editor

Dennis Brophy, Chair of this year’s DVCon, said that “The 2017 Design and Verification Conference and Exhibition U.S. will offer attendees a comprehensive selection of 39 papers, 9 tutorials, 19 posters, 2 panels, a special session on Functional Verification Industry Trends, and a keynote address by Anirudh Devgan, Senior Vice President and General Manager of the Digital & Signoff Group and System & Verification Group at Cadence.”

Sponsored by Accellera Systems Initiative, DVCon U.S. will be held February 27-March 2, 2017 at the DoubleTree Hotel in San Jose, California.

DVCon was initially sponsored by VHDL International and Open Verilog International, but its growth really started after the two organizations merged to form Accellera, which was renamed Accellera Systems Initiative during its merger with the Open SystemC Initiative.  It seems that every EDA organization nowadays needs the word “systems” in its name, as if each were afraid of being seen as less important unless everyone knows it is aware of the existence of systems.

DVCon is now a truly international conference, holding events not only in the US but also in Europe, India, and, for the first time this year, China.

The aim of Accellera is to build and serve a community of professionals who focus on the development and verification of hardware systems, with the awareness that software’s role in system design is growing rapidly.  Dennis Brophy observed that “Coming together as a community is fostered by the DVCon Expo. The bigger and better exposition will run from Monday evening to Wednesday evening. See the program for specific opening and closing times. The Expo is a great place to catch up with commercial vendors and learn the latest in product developments. It is also great to connect with colleagues and exchange and share information and ideas. Join us for the DVCon U.S. 2017 “Booth Crawl” where after visiting select exhibitors you will be automatically entered for a lucky draw.”

Although the titles of the papers presented in the technical conference focus on EDA technologies, the papers themselves deal with application areas as diverse as automotive and communications, the use of MEMS and FPGAs in system design, and cloud computing and RF.

“DVCon has long been the technical and social highlight of the year for design and verification engineers,” stated Tom Fitzpatrick, DVCon U.S. 2017 Technical Program Chair.  “Through the hard work of a large team of dedicated reviewers, we have chosen the best of over one hundred submitted abstracts from deeply knowledgeable colleagues within the industry to help attendees learn how to improve their verification efforts. We also have two days of extended tutorials where attendees can get an in-depth look at the cutting edge of verification, not to mention the Exhibit Floor where over 30 companies will be demonstrating their latest tools and technologies.  In between, there are plenty of opportunities to network, relax and take advantage of a fun and welcoming atmosphere where attendees can reconnect with old friends and former colleagues or make new friends and contacts. The value of DVCon goes well beyond the wealth of information found in the Proceedings. Being there makes all the difference.”

Dennis Brophy added that on Monday, February 27th, there will be a presentation covering the Portable Stimulus work being done under Accellera sponsorship.  The working group has made significant progress toward defining what it is and how it works.  The goal is to have the Accellera Board authorize a ballot to make the result an industry standard and then to take it to the IEEE to complete the standardization work.

As happened last year, the exhibit space quickly filled with vendors who understood the advantage of talking with technologists who specialize in the verification and design of complex systems.

Devgan’s keynote, “Tomorrow’s Verification Today,” will review the latest trends that are redefining verification from IP to system level, with an increasingly application-specific set of demands changing the landscape for hardware and software development. Over the past decade, verification complexity and the demands on engineering teams have continued to rise rapidly, yet the supporting automation tools and flows have improved only incrementally, resulting in a verification gap. It is time to redefine how verification should be approached to accelerate innovation in the next decade.  The keynote will be delivered on Tuesday, February 28.

On the same day, from 1:30-2:30pm in the Oak/Fir Ballroom, the conference will offer a special session with Harry Foster, Chief Scientist for Mentor Graphics’ Design Verification Technology Division.  Foster has been asked to present “Trends in Functional Verification: A 2016 Industry Study,” based on the Wilson Research Group’s 2016 study, whose findings provide invaluable insight into the state of today’s electronics industry.

There will be two full days of in-depth tutorials: Accellera Day, with three tutorials, on Monday, and sponsored tutorials on Thursday.  There are also many technical papers and posters, and two intriguing panels.

There will be plenty of networking opportunities, especially during the exhibition.  There will be a booth crawl on Monday, February 27 from 5:00-7:00pm and receptions both Tuesday and Wednesday in the exhibit hall.  Exhibits will be open Tuesday from 5:00-7:00pm and Wednesday and Thursday from 2:30-6:00pm.

The awards for Best Paper and Best Poster will be presented at the beginning of the reception on Wednesday.  For the complete DVCon U.S. 2017 schedule, including a list of sessions, tutorials, sponsored luncheons and events, visit www.dvcon.org.

Cadence Introduces Palladium Z1 Enterprise Emulation Platform

Thursday, November 19th, 2015

Gabe Moretti, Senior Editor

One would think that the number of customers for very large and expensive emulation systems is shrinking, and thus that the motivation to launch new ones would be small.  Clearly that opinion is not correct.

Earlier this week Cadence Design Systems, Inc. launched the Palladium Z1 emulation platform, which the company calls the industry’s first datacenter-class emulation system.  According to Cadence, the new platform delivers up to 5X greater emulation throughput than the previous generation, with an average 2.5X greater workload efficiency.  The Palladium Z1 platform executes up to 2,304 parallel jobs and scales up to 9.2 billion gates, addressing the growing market requirement for emulation technology that can be efficiently utilized across global design teams to verify increasingly complex system-on-chip (SoC) designs.

The Palladium Z1 enterprise emulation platform features a rack-based blade architecture, a 92 percent smaller footprint, and 8X better gate density than the Palladium XP II platform, according to Frank Schirrmeister, Group Director for Product Marketing, System Development Suite at Cadence.  To optimize the utilization of the emulation resource, the Palladium Z1 platform offers a virtual target relocation capability and payload allocation into available resources at run time, avoiding re-compiles.  With its massively parallel processor-based architecture, the Palladium Z1 platform offers 4X better user granularity than its nearest competitor, according to Schirrmeister.

Additional key features and benefits of the Palladium Z1 platform include:

  • Less than one-third the power consumption per emulation cycle of the Palladium XP II platform, enabled by an up to 44 percent reduction in power density, an average of 2.5X better system utilization and number of parallel users, 5X better job queue turnaround time, compile times of up to 140 million gates per hour on a single workstation, and superior debug depth and upload speeds
  • Full virtualization of the external interfaces using a unique virtual target relocation capability. This enables remote access of fully accurate real world devices as well as virtualized peripherals like Virtual JTAG. Pre-integrated Emulation Development Kits are available for USB and PCI-Express interfaces, providing modeling accuracy, high performance and remote access. Combined with virtualization of the databases using Virtual Verification Machine capabilities, it allows for efficient offline access of emulation runs by multiple users
  • The industry’s highest versatility with over a dozen use models, including In-Circuit Emulation running software loads, Simulation Acceleration that allows hot swapping between simulation and emulation, Dynamic Power Analysis using Cadence Joules RTL power estimation, IEEE 1801 and Si2 CPF based Power Verification, Gate-level acceleration and emulation, and OS bring-up for ARM-based SoCs running at 50X the performance of pure standard emulation
  • Seamless integration within the Cadence System Development Suite. This includes Incisive® verification platform for simulation acceleration, Incisive vManager for verification planning and unified metrics tracking, Indago Debug Analyzer and Embedded Software Apps for advanced hardware/software debug, Accelerated and Assertion-Based Verification IP, Protium FPGA-based prototyping platform with common compiler, and Perspec System Verifier for multi-engine system use-case testing.

“We continue to see customer demand for doubling of available emulation capacity every two years, driven by short project schedules amid growing verification complexity and requirements on quality, hardware-software integration, and power consumption,” said Daryn Lau, vice president and general manager, Hardware and System Verification, Cadence. “With Palladium Z1 platform as one of the pillars of the System Development Suite, design teams can finally utilize emulation as a compute resource in the datacenter akin to blade server-based compute farms for simulation, and improve schedules while enabling more verification automation to address the growing impact of verification on end product delivery.”

DVCon is the Primary Design and Verification Conference

Friday, February 20th, 2015

Gabe Moretti, Senior Editor

DVCon United States opens on March 2nd and ends on March 5th.  If you have anything to do with developing ICs and have not yet made plans to attend, you should.  The growth of this conference has been remarkable.

In February 2000 VHDL International (VI) and Open Verilog International (OVI) agreed to merge and form Accellera.  That year DVCon, which until 2003 was called HDLCon, took place with the format it had kept for the previous 12 years.  Started in 1988 as the co-location of the Verilog Users Group and the VHDL International Users Forum (VIUF), DVCon has been successful since its inception.

The name DVCon derives from Design and Verification Conference, and its focus was, and in part still is, the development, use, and improvement of Hardware Description Languages.  This year’s conference is the 27th and offers an expanded technical program.  In spite of the consolidation occurring in the industry, the exhibit space has remained practically the same as last year.  Although this year there will be one fewer tutorial than the previous year, the breadth of topics is larger.

The merger of OVI and VI produced significant changes, both for DVCon and for Accellera.  In 2001 DVCon boasted a more efficient organization, both for its technical program and for its exhibits.  The pool of papers offered for acceptance increased in scope as professionals outside of the Verilog and VHDL user communities became interested in presenting at the conference.

As Accellera grew and widened the scope of the technical subjects it handled, DVCon increased the technical segments it covered.  First SystemC and shortly thereafter SystemVerilog provided interesting papers and tutorials.  The verification side of the conference was enlivened by a focus on UVM (Universal Verification Methodology), TLM (Transaction Level Modeling), testbench construction, and various approaches to testing, including formal techniques.

As the percentage of analog circuitry in SoCs increased, mixed-language and mixed-signal design and verification also became topics, both in papers and in tutorials.  In fact, one can make a list of the most relevant issues facing electronics engineers simply by reading the current conference program.

Yatin Trivedi, DVCon General Chair, succinctly described the aim of DVCon: “DVCon continues to focus on serving the Design and Verification community. DVCon is a conference sponsored by Accellera, in order to promote the adoption of its standards and standards-based methodologies. From the days of exclusive focus on Verilog and VHDL, we have come a long way in including SystemVerilog, SystemC, UPF and UVM. As semiconductor IP became significant to our community of designers and verification engineers, the program has expanded its range of topics. However, our focus remains on design and verification.”

EDA in 2015: Something New and Something Old

Monday, December 1st, 2014

Gabe Moretti, Senior Editor

Every year I like to contact EDA companies to ask what they expect in the coming year, in this case 2015.  When I started working on this project I visualized one article that would offer the opinions of EDA industry leaders from many companies covering their expectations.  As work progressed I found, as I should have expected, that the responses to my questions were much broader and deeper than could possibly be covered in one article.  Doing so would have produced an article so long that it would have exceeded the time most engineers have to read a magazine, even a digital one.

So I decided to publish three articles in addition to this introductory blog.  The decision is based mostly on the amount of feedback I received, and in part on how stand-alone the input was.  It turns out that both Cadence and Mentor provided me with material that can be treated as a contributed article, while the rest of the companies submitted contributions that could be grouped into one article, albeit one significantly longer than normal.  The articles will be published during this week, one article a day.

Designers, architects, and verification engineers will find worthwhile material in more than one article.  One subject that is receiving attention lately and that is not covered directly in the articles is Agile IC Methodology.  In truth, Chi-Ping Hsu of Cadence touches on the issue in his article, but not in the terms of the conversation going on under the auspices of Sonics, Inc.  I am sure that I will write about Agile IC Methodology in 2015, so this subject will receive its due attention.

Verification and mixed-signal design are the subjects that have received the greatest attention.  But it is important to acknowledge the underlying drivers of that attention: hardware/software co-design and the use of third-party IP.  These are the true technology drivers.  From a market point of view, automotive looms large.  This market has been developing for a few years and has now reached the point where it can approach its full potential.  Distributed intelligence and “thing to thing” communication and cooperation are within the grasp of product developers.  The automobile is the first working implementation of the Internet of Things (IoT).  IoT is on everyone’s mind in our industry, and the intelligent automobile, even if such a product does not really use the internet architecture in most instances, is often used as an example.  The IoT will certainly be a significant driver of our industry, but its growth ramp in 2015 will still be linear as we continue to work out what the hierarchical architecture should really look like.  At this point anything that could possibly generate data is seen as a good prospect, but soon the market will discover that much of the data may be interesting but is not necessary, and in fact would just clutter one’s life.  As usual, customers’ demand will inject sense into the market.

In a time when all festivities seem to start two months before they actually occur, let me be one of the first to wish all of you a productive and peaceful 2015.

From Data to Information to Knowledge

Monday, November 17th, 2014

Gabe Moretti, Senior Editor

During my conversation with Lucio Lanza last month we exchanged observations on how computing power, both at the hardware and the system levels, had progressed since 1968, the year we both started working full time in the electronics field, and on how much is left to do in order to achieve knowledge-based computing.

Observing what computers do, we recognized that they are now very good at collecting and processing data.  In fact, the concept of the IoT is based on the capability to collect and process a variety of data types in a distributed manner.  The plan is, of course, to turn the data into information.

There are certainly examples of computer systems that process data and generate information.  Financial systems produce profit and loss reports from income and expense data, for example, and logic simulators report the behavior of a system from input and output values.  But information is not knowledge.

Knowledge requires understanding, and computers do not understand information.  Achieving knowledge requires correlating and synthesizing information, sometimes from disparate sources, in order to generate understanding of that information and thus abstract knowledge.  Knowledge is also a derivative of previous knowledge and cannot always be generated simply by digesting the information presented.  The human brain associates the information to be processed with knowledge already available, in a learning process that assigns the proper value and quality to the information presented, in order to construct the value judgments and associative analysis that generate new knowledge.  What follows are some of my thoughts since that dialogue.

As an aside, I note that unfortunately many education systems give students information but not the tools to generate knowledge.  Students are taught to give the correct answer to a question, not to derive the correct answer from a set of information and their own existing knowledge.

We have not been able to recreate the knowledge-generating process successfully with a computer, but nothing says that it will not be possible in the future.  As computing power increases and new computer architectures are created, I know we will be able to automate the generation of knowledge.  As far as EDA is concerned, I will offer just one example.

A knowledge-based EDA tool would develop an SoC from a set of specifications.  Clearly, if the specifications are erroneous the resulting SoC will behave differently than expected, but even that eventuality would help improve the next design, because it would provide information to those who develop a knowledge-based verification system.

When we achieve this goal, humanity will finally be free to dedicate its intellectual and emotional resources to addressing those problems that derive directly from what humans are, and to preventing most of them instead of having to solve them.

At this moment I still see a lot of indeterminism with respect to the most popular topic in our field: the IoT.  Are we on the right track in the development of the IoT?  Are we creating the right environment to learn to handle information in a knowledge-generating manner?  To even attempt such a task we need to solve not just technical problems, but financial, behavioral, and political issues as well.  The communication links needed to arrive at a solution are either non-existent or weak.  Just think of the existing debate regarding the “free internet”.  Financial requirements demand a redefinition of the internet, from a communication mechanism akin to a utility to a special-purpose medium used in selective ways defined by price.  How would a hierarchical internet defined by financial parameters modify the IoT architecture?  Politicians and some business interests do not seem to care, and engineers will have to patch things up later.  In this case we do not have knowledge.