Gabe Moretti, Senior Editor
This year’s DVCon is turning out to be very interesting, and Jim Hogan may be partly responsible. He insisted, in more than one panel, that we are at the cusp of a major change: an opportunity for startups, with a greater probability of revolutionary success than at any time in the last few years, in the area of verification and, I would add, circuit design.
His observations got me thinking, reviewing my experiences in the various aspects of verification I have been involved in. What follows is my existential history of verification.
Logic is What Counts
When I wrote my first logic simulator in 1968, I only had to deal with two-state Boolean logic. A node was either on or off; everything else about the circuit was not important. It was not long before things got more complex. Boolean logic acquired a third state when I and other verification engineers met the floating gate. Shortly thereafter, memories became a real physical quantity in designs and four-state logic became the norm: we had to know whether a bit had been initialized in a deterministic manner or not. From then on, the verification engineer’s major problem was the size of the circuit to be simulated and how efficient we could make the simulator so that run times were as short as possible. And, by the way, this is still going on today, although for more complex reasons.
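The flavor of four-state simulation can be conveyed with a small sketch (in Python, a language that of course did not exist in 1968). The state names 0, 1, X (unknown), and Z (floating) follow the convention later standardized in languages such as Verilog; this is an illustration, not any particular simulator’s implementation.

```python
# A minimal four-state logic sketch: '0', '1', 'X' (unknown/uninitialized),
# and 'Z' (floating). Illustrative only; truth tables follow the usual
# Verilog-style convention.

def and4(a, b):
    """Four-state AND: a 0 on either input dominates everything else."""
    if a == '0' or b == '0':
        return '0'          # 0 forces the output low regardless of X/Z
    if a == '1' and b == '1':
        return '1'
    return 'X'              # any X or Z on a non-controlled input -> unknown

def resolve(drivers):
    """Resolve multiple drivers on one net: no active driver -> Z,
    agreeing drivers keep their value, conflicting drivers -> X."""
    active = {d for d in drivers if d != 'Z'}
    if not active:
        return 'Z'
    if len(active) == 1:
        return active.pop()
    return 'X'

print(and4('0', 'X'))            # '0': zero dominates an unknown input
print(and4('1', 'Z'))            # 'X': a floating input leaves the result unknown
print(resolve(['Z', '1', 'Z']))  # '1': the single active driver wins
print(resolve(['0', '1']))       # 'X': bus contention
```

The `resolve` function is exactly why the floating state had to exist: without Z, a tri-state bus with no active driver could not be distinguished from a driven net.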
As transistors got smaller and smaller, verification engineers found that physics grew in importance. We could no longer assume that a circuit was well behaved in the physical domain, but had to worry about power, electromagnetic effects, and thermal characteristics. Verification engineers had to understand physics or, better yet, physics professionals had to join the ranks of verification engineers.
Today, thanks in part to logic synthesis, Boolean logic issues are well understood, and development work is focused on the physical characteristics of the circuit and, just to make things interesting, its immediate environment. Today’s complex problems deal with clock network management and its interaction with the power distribution and consumption that determine the functional state of the circuit. How power transmission affects nearby circuitry, how the current of electrons warms the device, and how temperature affects the probability that a transistor can maintain the state it is supposed to hold are today’s issues.
Behavioral Scientists Find a New Job
There has been much talk at this year’s DVCon about safety and security, issues that are really behavioral and not about logic or physics. During a lunch panel on Tuesday, the importance of safety and security in complex systems, often employing diverse and distributed circuitry, was identified as the next important aspect of a system to verify. Defense, automotive, and mobile applications were used as examples. More than one panelist, with Jim Hogan again assuming a leading role, spoke about the desirability of self-correcting systems. If a safety failure or a security breach occurred, the most desirable outcome would be for the circuitry to prevent the occurrence or fix itself, thus limiting the damage and the cost of repairs. Verification engineers will have to deal with behavioral science in order to understand the danger, build a knowledge base of past incidents, and develop circuitry that will identify and prevent the faulty behavior in the future. This, of course, requires that we can define what good behavior is and what would signify a deviation from it.
Can We Do Better?
Now that I no longer develop verification tools, I think that both hardware and software developers have gotten lazy. Transistors are now assumed to be plentiful, to the point that no one ever asks “will this fit?” since the answer is “of course, and with transistors to spare.” We design and develop complex circuits not because they have to be complex but because they can be.
The amount of data produced by the various verification tools is so large that it is becoming impossible for a human to correlate all of it into a meaningful indication of the wellness and effectiveness of the design. Tools to manage big data are required, especially when decisions must be made in the field during actual deployment.
And as transistors grow smaller and smaller in order to provide more of them every couple of years, we are unconsciously marching past the boundary of classical physics, closer and closer to quantum physics. What will happen when, one day, we can determine the state of a transistor or the trajectory of its electrons, but not both at the same time?
I believe that it is time to go back and ask ourselves “Is there another implementation that is simpler and uses fewer transistors?” the same way I used to ask “How can I make my program fit in 8K bytes of memory?” Verification engineers must develop an efficiency measurement tool that warns design engineers when they are not being efficient. Not for the sake of elegance, but for the sake of simplicity. Simplicity in circuit design means fewer physical problems and fewer behavioral problems.
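What might such an efficiency warning look like? Here is one hypothetical sketch: compare a design’s gate count against the best-known implementation of the same function. The reference figures and the 25% tolerance are invented for illustration; a real tool would draw its baselines from a characterized IP catalog.

```python
# A hypothetical sketch of the efficiency warning described above.
# The reference gate counts and the tolerance are assumptions made
# for illustration, not real data.

REFERENCE_GATES = {
    "adder_32": 1200,          # assumed best-known gate count
    "multiplier_16x16": 9000,  # assumed best-known gate count
}

def efficiency_warning(function, gate_count, tolerance=0.25):
    """Warn when an implementation exceeds the best-known gate count
    for the same function by more than the given tolerance."""
    best = REFERENCE_GATES.get(function)
    if best is None:
        return f"{function}: no reference implementation available"
    ratio = gate_count / best
    if ratio > 1.0 + tolerance:
        return (f"{function}: {gate_count} gates is {ratio:.2f}x the "
                f"best known ({best}); consider a simpler implementation")
    return f"{function}: within tolerance of best known"

print(efficiency_warning("adder_32", 1800))         # flagged: 1.5x the baseline
print(efficiency_warning("multiplier_16x16", 9400)) # within tolerance
```

The metric itself is the easy part; the hard part, as with any such tool, is assembling trustworthy baselines for each function.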
The best business opportunity of this approach, of course, is in the area of IP. Design, develop, and market the most efficient implementation of a function and, assuming you have some understanding of business management, you will be a successful entrepreneur.