
Archive for March, 2017

Portable Stimulus

Thursday, March 23rd, 2017

Gabe Moretti, Senior Editor

Portable Stimulus (PS) is not a new sex toy, and it is not an Executable Specification either.  So what is it?  It is a method, or rather it will be once the work is finished, for defining inputs independently of the verification tool used.

As the complexity of a system increases, the cost of its functional verification increases at a more rapid pace.  Verification engineers must consider not only the intended scenarios but also erroneous ones, and increased complexity increases the number of unwanted scenarios.  To perform all the required tests, engineers use different tools, including logic simulators, accelerators and emulators, and FPGA prototyping tools as well.  Porting a test from one tool to another is a very time-consuming job, which is also prone to errors.  The reason is simple: not only does each class of tools use a different syntax, in some cases it also uses different semantics.

The Accellera Systems Initiative, known commonly as simply Accellera, is working on a solution.  It formed a Working Group to develop a way to define tests that is independent of the tool used to perform the verification.  The group, made up of engineers rather than marketing professionals, chose as its name what it is supposed to deliver: a Portable Stimulus, since verification tests are made up of stimuli applied to the device under test (DUT), and those stimuli will be portable among verification tools.

Adnan Hamid, CEO of Breker, gave me a demo at DVCon US this year.  Their product is trying to solve the same problem, but the standard being developed will only be similar, that is, based on the same concept.  Both will be descriptive languages, Breker's based on SystemC and PS based on SystemVerilog, but the approach is the same: the verification team develops a directed network where each node represents a test.  The Accellera work must, of course, be vendor independent, so their work is more complex.  The figure below may give you an idea of the complexity.
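To make the graph-of-tests idea concrete, here is a minimal sketch in Python rather than in the PS language itself, whose syntax is still being defined.  The verification intent is captured as a directed graph whose nodes are test actions, and every path through the graph is one complete scenario.  The action names are hypothetical and purely illustrative.

```python
# A minimal sketch of the idea behind a Portable Stimulus description: verification
# intent as a directed graph whose nodes are test actions, and every root-to-leaf
# path through the graph is one complete scenario.  Action names are hypothetical.
graph = {
    "config_dma":      ["start_transfer"],
    "start_transfer":  ["check_interrupt", "poll_status"],
    "check_interrupt": ["read_back"],
    "poll_status":     ["read_back"],
    "read_back":       [],
}

def walk(node, path=()):
    """Enumerate every root-to-leaf path; each path is one test scenario."""
    path = path + (node,)
    if not graph[node]:            # leaf node: a complete scenario
        yield path
    for successor in graph[node]:
        yield from walk(successor, path)

for scenario in walk("config_dma"):
    print(" -> ".join(scenario))
```

Even this toy graph yields two scenarios (one through the interrupt check, one through status polling); a real description would add constraints, data, and far more nodes.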

Once the working group is finished, and it expects to be finished no later than the end of 2017, each EDA vendor could then develop a generator that translates a test described in the PS language into the appropriate string of commands and stimuli required to actually perform the test with the tool in question.
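A rough sketch of that generator idea, again in Python: the same abstract scenario is rendered into different back-end forms.  The target names and emitted strings here are invented stand-ins for whatever UVM sequences, emulator scripts, or prototype commands a real vendor generator would produce.

```python
# Sketch of the generator idea: one abstract scenario rendered into different
# back-end formats.  The target names and emitted strings are invented; a real
# vendor generator would emit UVM sequences, emulator scripts, and so on.
scenario = ["config_dma", "start_transfer", "check_interrupt", "read_back"]

def emit(scenario, target):
    """Translate an abstract scenario into commands for a given (hypothetical) tool."""
    if target == "simulation":     # stand-in for a simulator-side directed sequence
        return [f'do_action("{step}");' for step in scenario]
    if target == "emulation":      # stand-in for a transactor command script
        return [f"XCTR {step.upper()}" for step in scenario]
    raise ValueError(f"unknown target: {target}")

for target in ("simulation", "emulation"):
    print(target, emit(scenario, target))
```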

The approach, of course, is such that the product of the Accellera work can then be easily submitted to the IEEE, since it will obey the IEEE requirements for standardization.

My question is: What about Formal Verification?  I believe that it would be possible to derive assertions from the PS language.  If this can be done it would be a wonderful result for the industry.  An IP vendor, for example, will then be able to provide only one definition of the test used to verify the IP, and the customer will be able to readily use it no matter which tool is appropriate at the time of acceptance and integration of the IP.
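As a toy illustration of how assertions might fall out of such a description (my speculation, not anything the working group has defined), each edge of a scenario graph could be read as a liveness-style property: after an action completes, one of its legal successors must eventually be observed.  The sketch below uses the same hypothetical action names as before.

```python
# Toy illustration only: reading each edge of a scenario graph as a liveness-style
# property ("after action a, one of its legal successors must eventually occur").
# The graph and the property wording are hypothetical.
def derive_properties(graph):
    return [f"after '{src}': eventually one of {successors}"
            for src, successors in graph.items() if successors]

toy_graph = {
    "config_dma":     ["start_transfer"],
    "start_transfer": ["read_back"],
    "read_back":      [],
}
print(*derive_properties(toy_graph), sep="\n")
```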

A Brief History of Verification

Thursday, March 2nd, 2017

Gabe Moretti, Senior Editor

This year’s DVCon is turning out to be very interesting.  It may be something that Jim Hogan said.  Jim insisted, in more than one panel, that we are at the cusp of a major change: an opportunity for startups with a greater probability of revolutionary success than at any time in the last few years, in the area of verification and, I would add, circuit design.

His observations got me thinking, reviewing my experiences in the various aspects of verification I have been involved in.  What follows is my existential history of verification.

Logic is What Counts

When I wrote my first logic simulator in 1968, I only had to deal with two-state Boolean logic.  A node was either on or off; everything else about the circuit was not important.  It was not very long before things got more complex.  Boolean logic acquired a third state when I and other verification engineers met the floating gate.  Shortly thereafter, memories became a real physical presence in designs and four-state logic became the norm: we had to know whether a bit was initialized in a deterministic manner or not.  From then on, verification engineers’ major problem was the size of the circuit to be simulated and how efficient we could make the simulator so that run times were as short as possible.  And, by the way, this is still going on today, although for more complex reasons.
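For readers who have not met it, the four-state logic mentioned above adds X (unknown) and Z (floating, or high impedance) to 0 and 1.  A small sketch of the usual resolution rule for AND: a 0 on either input dominates, and any other combination involving X or Z resolves to X.

```python
# Four-state logic in miniature: 0, 1, X (unknown) and Z (floating).  The AND
# resolution below mirrors common simulator behavior: a 0 on either input forces
# the result to 0, 1 AND 1 is 1, and anything else involving X or Z is X.
def and4(a, b):
    if a == "0" or b == "0":
        return "0"
    if a == "1" and b == "1":
        return "1"
    return "X"   # X or Z on either input leaves the result unknown

for a in "01XZ":
    for b in "01XZ":
        print(f"{a} & {b} = {and4(a, b)}")
```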

Enter Physics

As the size of transistors got smaller and smaller, verification engineers found that physics grew in importance.  We could no longer assume that a circuit was well behaved in the physical domain, but had to worry about power, electromagnetic effects, and thermal characteristics.  Verification engineers had to understand physics, or, better yet, physics professionals had to join the ranks of verification engineers.

Today, thanks in part to logic synthesis, Boolean logic issues are well understood, and development work is focused on the physical characteristics of the circuit and, just to make things interesting, its immediate environment.  Today’s complex problems deal with the management of clock networks and their interaction with the power distribution and consumption that determine the functional state of the circuit.  How power transmission affects nearby circuitry, how the current of electrons warms the device, and how temperature affects the probability that a transistor can maintain the state it is supposed to hold are today’s issues.

Behavioral Scientists Find a New Job

There has been much talk at this year’s DVCon about safety and security, issues that are really behavioral and not about logic or physics.  During a lunch panel on Tuesday, the importance of safety and security in complex systems, often employing diverse and distributed circuitry, was identified as the next important aspect of a system to verify.  Defense, automotive, and mobile applications were used as examples.  More than one panelist, with Jim Hogan again assuming a leading role, spoke about the desirability of self-correcting systems.  If a safety failure or a security breach occurred, the most desirable thing would be for the circuitry to prevent the occurrence or fix itself, thus limiting the damage and the cost of repairs.  Verification engineers will have to deal with behavioral science in order to understand the danger, build a knowledge base of past incidents, and develop circuitry that will identify and prevent the faulty behavior in the future.  This, of course, requires that we can define what good behavior is and what would signify a deviation from it.

Can We Do Better?

Now that I no longer develop verification tools, I think that both hardware and software developers have gotten lazy.  Transistors are now assumed to be plentiful, to the point that no one ever asks himself or herself “will this fit?”, since the answer is “of course, and with transistors to spare”.  We design and develop complex circuits not just because they have to be, but because they can be.

The amount of data produced by the various verification tools is so large that it is becoming impossible for a human to correlate all of it into a meaningful indication of the wellness and effectiveness of the design.  Tools to manage big data are required, especially when decisions must be made in the field during actual deployment.

And as transistors grow smaller and smaller in order to provide more of them every couple of years, we are unconsciously marching toward the boundary of Einstein’s space-time environment, getting closer and closer to quantum physics.  What will happen when, one day, we are able to determine the state of a transistor or the trajectory of its electrons, but not both at the same time?

I believe that it is time to go back in time and ask ourselves “Is there another implementation that is simpler and uses fewer transistors?” the same way I used to ask “How can I make my program fit in 8K bytes of memory?”  Verification engineers must develop an efficiency measurement tool that warns design engineers when they are not being efficient.  Not for the sake of elegance, but for the sake of simplicity.  Simplicity in circuit design means fewer physical problems and fewer behavioral problems.

The best business opportunity of this approach, of course, is in the area of IP.  Design, develop, and market the most efficient implementation of a function and, assuming you have some understanding of business management, you will be a successful entrepreneur.