Archive for February, 2016

DVCon U.S. 2016 Is Around the Corner

Thursday, February 18th, 2016

Gabe Moretti, Senior Editor

Within the EDA industry, the Design & Verification Conference and Exhibition (DVCon) has created one of the most successful technical communities of the 21st century.  Started as a conference dealing with two design languages, Verilog and VHDL, DVCon has grown to cover all aspects of design and verification.  What began as a Silicon Valley event is now held on three continents: North America, Asia, and Europe.  Both DVCon Europe and DVCon India have shown significant growth, and plans are well under way to offer a DVCon in China as well.  As Yatin Trivedi, General Chair of this year’s DVCon U.S., says, “DVCon continues to be the premier conference for design and verification engineers of all experience levels. Compared to larger and more general conferences, DVCon affords attendees a concentrated menu of technical sessions – tutorials, papers, poster sessions and panels – focused on design and verification hot topics. In addition to participation in high quality technical sessions, DVCon attendees have the opportunity to take part in the many informal, but often intense, technical discussions that pop up around the conference venue among more than 800 design and verification engineers and engineering managers. This networking opportunity among peers is possibly the greatest benefit to DVCon attendees.”

Professionals attend DVCon to learn and to share as a community, not just to show off their research achievements.  The conference is focused on providing its attendees with the opportunity to learn by offering two days of tutorials as well as frequent networking opportunities.  The technical program offers engineers examples of how today’s problems have been solved under demanding development schedules and budgets.  Ambar Sarkar, Program Chair, offers this advice on the DVCon U.S. 2016 web site: “Find what your peers are working on and interact with the thought leaders in our industry. Learn where the trends are and become a thought leader yourself.”

Born of the need to verify digital designs, verification technology must now handle heterogeneous systems that include analog, software, MEMS, and communication hardware and protocols.  Adapting to these new requirements is a challenge the industry has not yet fully met.

At the same time, methods and tools for mixed-signal or system-level design still need maturing.  The concept of system-level design is being revolutionized as architectures like those required for IoT applications demand heterogeneous systems.

Attendees at DVCon U.S. will find ample opportunity to consider, debate, and compare both the requirements and the solutions that affect near-term projects.

Tutorials and Papers

As part of its mission to provide a learning venue for designers and verification engineers, DVCon U.S. offers two full days of tutorials.  The 12 tutorial sessions are divided between Monday and Thursday, separate from the rest of the technical program, so that they do not conflict with the paper sessions and force attendees to make difficult attendance choices.

Accellera has a unique approach to putting together its technical program.  To paraphrase this year’s Program Chair, Ambar Sarkar: DVCon U.S. lets the industry set the agenda rather than asking for papers on preselected topics.  He told me that the basic question is: “Can a practicing engineer get new ideas and try to use them in his or her upcoming project?”  For this reason, the call for papers asks only for abstracts, and those that do not meet the request are eliminated.  After a further selection, the authors of the chosen abstracts are asked to submit full papers.  Those papers are then grouped by common subject area into sessions, and the sessions that emerge automatically reflect the latest trends in the industry.

The paper presentations during Tuesday and Wednesday take the majority of the conference’s time and form the technical backbone of the event.

Of the 127 papers submitted, 36 were chosen to be presented in full.  There will be 13 sessions covering the following areas: UVM, Design and Modeling, Low Power, SystemVerilog, Fault Analysis, Emulation, Mixed-Signal, Resource Management, and Formal Techniques.  Each session offers three to four individual papers.

Posters

Poster presentations are selected in the same manner as papers.  A poster presentation is less formal, but it gives the author the opportunity to interact with a small audience, so the learning process can be bilateral.  There have been occasions in the past when an abstract submitted as a poster was switched to an oral presentation with the consent of the author.  Such a switch is possible because the submission and selection processes are similar, so the poster has already been judged to present an approach that will be useful to attendees.

Keynote

This year’s keynote will be delivered by Wally Rhines, the 2015 recipient of the Phil Kaufman Award.  Wally is well known in the EDA industry for both his insight and his track record as the Chairman and CEO of Mentor Graphics.  The title of his address is Design Verification Challenges: Past, Present, and Future.  Dr. Rhines will review the history of each major phase of verification evolution and then concentrate on the challenges of newly emerging problems. While functional verification still dominates the effort, new requirements for security and safety are becoming more important and will ultimately involve challenges that could be more difficult than those we have faced in the past.

Panels: One Good and One Suspect

There are two panels on the conference schedule.  One panel, “Emulation + Static Verification Will Replace Simulation,” scheduled for Wednesday, March 2nd at 1:30 in the afternoon, looks at near-future verification methods.  The use of both emulation and static verification has been increasing significantly.  Maybe the verification paradigm of the future is to invest in high-end targeted static verification tools to get the design to a very high quality level, followed by very high-speed emulation or FPGA prototyping for system-level functional verification.  Where does that leave RTL simulation?  Between a rock and a hard place!  Gate-level simulation is already marginalized to doing basic sanity checks.  Maybe RTL simulation will follow.  Or will it?

The other panel, scheduled for 8:30 in the morning of the same day, concerns me a lot.  The title is “Redefining ESL,” and the description of the panel is taken from a blog post that Brian Bailey, the panel moderator, published on September 24, 2015.  You can read it here: http://semiengineering.com/what-esl-is-really-about/.

In the blog post Brian holds the point of view that ESL is not a design flow but a verification flow, and that it will not take off until the industry recognizes that.  Only now are we beginning to define what ESL verification means, but is it too little, too late?  There are a few problems with the panels committee accepting this panel.  To begin with, ESL is an outdated concept.  Today’s systems include much more than digital design.  Modern SoCs, even small ones like those found in IoT applications, include analog, firmware, and MEMS blocks.  All of these are outside the ESL definition and fall within the System Level Design (SLD) market.

Brian’s statement that ESL would not have been made viable by the introduction of High Level Synthesis (HLS) tools is simply false.  ESL verification became a valuable tool only when designers began to use HLS products to automatically derive RTL models from ESL descriptions in SystemVerilog or C/C++, even if HLS covered mostly algorithms expressed in languages other than Verilog, VHDL, or SystemC.

The Search for the Executable Specification

Monday, February 8th, 2016

Gabe Moretti, Senior Editor

One of the Holy Grails of EDA is to find a way to specify a design such that the specification itself can be synthesized into an architecture, assuring that the design meets its requirements by construction.  Work in the area has been going on for a long time, VHDL being the most significant language developed with the support of the US Department of Defense.  Unfortunately, the language was immediately understood and used as a design implementation tool, and it was thus overtaken in popularity by Verilog, which is a true design implementation language.

The EDA community has continued to look for a way to write a specification in a language that is unambiguous, that can be modeled, and that can be synthesized.  Judging by the results, failure has been consistent.  Even great language architects have not found a solution.  The latest attempt, Rosetta, has been almost totally ignored.  The problem, of course, is that no one wants to learn another language just to write a specification, or to build a tool for a language that may never be used, or be used so infrequently as to be an economic failure.

But I think the answer is under our noses.  It is being talked about, but the concept is so foreign to the design and marketing communities that it has not been recognized.  It all started with the discussion about using Agile methods for hardware development.  There are many reasons for not applying the Agile methods used in software development literally to hardware development, but surprisingly there is one element that not only fits beautifully but also solves the problem of representing a design specification in an executable manner.  And that is the test suite.

Of course I am not suggesting that Marketing learn to write tests, so ambiguity could still creep in at the beginning; but any ambiguity would become obvious much sooner in the development project and would be more easily resolved once a first-level architectural implementation was tested.

A properly constructed test suite does in fact represent a specification of functional behavior.  It defines a series of transformations that the design must execute in order to produce desired, measurable states.  By describing a set of input values and their sequence in time, as well as the expected outputs, one in fact describes a functional specification.
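
To make the idea concrete, here is a minimal sketch of what such an executable specification might look like in SystemVerilog.  The module dut_adder, its ports, and the chosen values are hypothetical, used only to illustrate how each stimulus/expected-response pair states one requirement the design must satisfy.

   // Minimal sketch: a directed test that doubles as an executable functional
   // specification for a hypothetical 8-bit adder (dut_adder and its ports are
   // assumptions made for illustration only).
   module adder_spec_tb;
     logic [7:0] a, b;
     logic [8:0] sum;

     dut_adder dut (.a(a), .b(b), .sum(sum));  // hypothetical device under test

     // One specification clause: "given these inputs, produce this output".
     task automatic check(input logic [7:0] ia, ib, input logic [8:0] expected);
       a = ia;
       b = ib;
       #1;  // let the combinational outputs settle
       assert (sum == expected)
         else $error("spec violated: %0d + %0d gave %0d, expected %0d",
                     ia, ib, sum, expected);
     endtask

     initial begin
       check(8'd0,   8'd0,  9'd0);    // additive identity
       check(8'd255, 8'd1,  9'd256);  // the carry out must be preserved
       check(8'd100, 8'd27, 9'd127);  // an arbitrary functional point
       $display("All specification checks executed.");
       $finish;
     end
   endmodule

When the product definition changes, the corresponding check is edited, and the test file remains the single, executable statement of what the design must do.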

In addition, such a test sequence follows the Agile intent of continuous refinement: as more is known about the behavior of the desired product, the test suite can be expanded accordingly and the resulting design incrementally modified.  The result is that at all stages of development the design is testable in a consistent manner.  Even better, if the results of a test differ from what was intended, the specification itself can be changed and the test file(s) updated.

A Universal Test Methodology (UTM)

A UTM would offer a methodology for grouping test suites that address both the functional and the physical characteristics of a design into a manageable database of stimuli and responses that, taken together, constitute the entire design specification, including timing, power, manufacturability, and so on.
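
As an illustration only, and not a proposal for an actual standard, a SystemVerilog sketch of such a database might tag every specification entry with the aspect it covers; all of the names below are assumptions.

   // Minimal sketch of a UTM-style specification database (the names and the
   // list of aspects are assumptions for illustration, not a proposed standard).
   package utm_sketch_pkg;

     typedef enum {FUNCTIONAL, TIMING, POWER, MANUFACTURABILITY} spec_aspect_e;

     typedef struct {
       spec_aspect_e aspect;       // which aspect of the specification this entry covers
       string        description;  // human-readable requirement text
       string        stimulus;     // reference to a stimulus file or sequence
       string        expected;     // reference to the expected response or limits
     } spec_entry_t;

     // The specification is the collection of entries; tools can filter it by
     // aspect and hand each subset to the appropriate analysis engine.
     spec_entry_t spec_db[$];

     function automatic void add_entry(spec_entry_t e);
       spec_db.push_back(e);
     endfunction

   endpackage

Simulation, timing analysis, and power estimation tools could then filter the same database for the entries relevant to them, keeping a single source of truth for the whole specification.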

In an age of big data, tools are available to manage all the relevant information about a given design, and the data can be presented in a number of ways appropriate to the expected reader.

The design community is already developing expertise in porting tests among the specific tools required to analyze and verify different aspects of a design, so the UTM does not represent an obstacle to timely implementation.  And here again the Agile methodology of sequential refinements applies: let’s start using what we know, then expand and refine it to meet the requirements.