
A Brief History of Verification

March 2nd, 2017

Gabe Moretti, Senior Editor

This year’s DVCon is turning out to be very interesting.  Much of that may be due to something Jim Hogan said.  Jim insisted, in more than one panel, that we are at the cusp of a major change: an opportunity for startups, with a greater probability of revolutionary success than at any time in the last few years, in the area of verification and, I would add, circuit design.

His observations got me thinking and reviewing my own experiences in the various aspects of verification I have been involved in.  What follows is my existential history of verification.

Logic is What Counts

When I wrote my first logic simulator in 1968 I only had to deal with two-state Boolean logic.  A node was either on or off.  Everything else about the circuit was not important.  It was not very long after that that things got more complex.  Boolean logic soon had three states, as I and other verification engineers met the floating gate.  Shortly thereafter memories became a real physical quantity in designs and four-state logic became the norm.  We had to know whether a bit was initialized in a deterministic manner or not.  From then on the verification engineer’s major problem was the size of the circuit to be simulated and how efficient we could make the simulator so that run times were as short as possible.  And, by the way, this is still going on today, although for more complex reasons.
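To make that progression concrete, here is a minimal sketch, purely for illustration and not drawn from any actual simulator, of the four-state value system that became the norm: a resolved AND over 0, 1, X (unknown) and Z (floating), where an unknown or floating input makes the output unknown unless a driven 0 forces it low.

#include <iostream>

// The four signal states that became standard in logic simulation:
// 0, 1, X (unknown/uninitialized), and Z (floating / high impedance).
enum class Logic { Zero, One, X, Z };

// Four-state AND: a driven 0 dominates, 1 AND 1 is 1, and any unknown
// or floating input makes the result unknown.
Logic and4(Logic a, Logic b) {
    if (a == Logic::Zero || b == Logic::Zero) return Logic::Zero;
    if (a == Logic::One && b == Logic::One)   return Logic::One;
    return Logic::X;                 // X or Z on either input
}

const char* name(Logic v) {
    switch (v) {
        case Logic::Zero: return "0";
        case Logic::One:  return "1";
        case Logic::X:    return "X";
        default:          return "Z";
    }
}

int main() {
    const Logic vals[] = { Logic::Zero, Logic::One, Logic::X, Logic::Z };
    for (Logic a : vals)
        for (Logic b : vals)
            std::cout << name(a) << " AND " << name(b) << " = "
                      << name(and4(a, b)) << '\n';
    return 0;
}

A table this small hides the real problem, of course: the challenge was never the value system but the sheer number of nodes that had to be evaluated on every event.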

Enter Physics

As transistors got smaller and smaller, verification engineers found that physics grew in importance.  We could no longer assume that a circuit was well behaved in the physical domain, but had to worry about power, electromagnetic effects, and thermal characteristics.  Verification engineers had to understand physics or, better yet, physics professionals had to join the ranks of verification engineers.

Today, thanks in part to logic synthesis, Boolean logic issues are well understood, and development work is focused on the physical characteristics of the circuit and, just to make things interesting, its immediate environment.  Today’s complex problems deal with clock network management and its interaction with the power distribution and consumption that determine the functional state of the circuit.  How power distribution affects the nearby circuitry, how the current of electrons warms the device, and how temperature impacts the probability that a transistor can maintain the state it is supposed to have are today’s issues.

Behavioral Scientists Find a New Job

There has been much talk at this year’s DVCon about safety and security, issues that are really behavioral and not about logic or physics.  During a lunch panel on Tuesday the importance of safety and security in complex systems, often employing diverse and distributed circuitry, was indicated as the next important aspect of a system to verify.  Defense, automotive, and mobile applications were used as examples.  More than one panelist, with Jim Hogan again assuming a leading role, spoke about the desirability of self-correcting systems.  If a safety failure or a security breach occurred, the most desirable thing would be to have the circuitry prevent the occurrence or fix itself, thus limiting the damage and the cost of repairs.  Verification engineers will have to deal with behavioral science in order to understand the danger, build a knowledge base of past incidents, and develop circuitry that will identify and prevent the faulty behavior in the future.  This, of course, requires that we can define what good behavior is and what would signify a deviation from it.

Can We Do Better?

Now that I no longer develop verification tools, I think that both hardware and software developers have gotten lazy.  Transistors are now assumed to be so plentiful that no one ever asks himself or herself “will this fit?” since the answer is “of course, and with transistors to spare.”  We design and develop complex circuits not because they have to be complex but because they can be.

The amount of data produced by the various verification tools is so large that it is becoming impossible for a human to correlate all of it into a meaningful indication of the wellness and effectiveness of the design.  Tools to manage big data are required, especially when decisions must be made in the field during actual deployment.

And as transistors grow smaller and smaller in order to provide more of them every couple of years, we are unconsciously marching toward the boundary of Einstein’s space-time and getting closer and closer to quantum physics.  What will happen when, one day, we can determine the state of a transistor or the trajectory of its electrons, but not both at the same time?

I believe that it is time to go back in time and ask ourselves “Is there another implementation that is simpler and uses fewer transistors?” the same way I used to ask “How can I make my program fit in 8K bytes of memory?”  Verification engineers must develop an efficiency measurement tool that warns design engineers when they are not efficient.  Not for the sake of elegance, but for the sake of simplicity.  Simplicity in circuit design means fewer physical problems and fewer behavioral problems.

The best business opportunity for this approach, of course, is in the area of IP.  Design, develop, and market the most efficient implementation of a function and, assuming you have some understanding of business management, you will be a successful entrepreneur.

Grant Pierce Named BoD Chair of the ESD Alliance

February 21st, 2017

Gabe Moretti, Senior Editor

The ESD Alliance (ESDA) elected Grant Pierce, CEO of Sonics, as its Chairman of the Board a few weeks ago.  Grant is only the second Chair who is not a high-level executive of one of the big three EDA companies, and the first since the organization, formerly EDAC, renamed itself.  During the EDAC days it was customary for the CEOs of Cadence, Mentor and Synopsys to pass the title among themselves in an orderly manner.  The organization then reflected the mission of the EDA industry to support the development of hardware-intensive silicon chips following Moore’s Law.

Things have changed since then, and the consortium responded by first appointing a new executive director, Bob Smith, then changing its name and its mission.  I talked with Grant to understand his view from the top.

Grant Pierce, Sonics CEO

Grant pointed out that: “We are trying to better reflect what has happened in the marketplace, both in terms of how our customers have developed further in the world of system on chip and what we have seen in the development of the EDA world, where today the IP offerings in the market, both those from independent companies and those from EDA companies, are critical and integral to the whole ecosystem for building today’s modern chips.”

Grant pointed out that ESDA has expanded its focus and has embraced not only hardware design and development but also software.  That does not mean, Grant noted, that the EDA companies are losing importance; instead they are gaining a seat at the table with the software and the system design community in order to expand the scope of their businesses.

From my point of view, I interjected, I see the desired change implemented very slowly, still reacting to and not anticipating new demands.  So what do you think can happen in the next twelve months?

“From an ESDA point of view you are going to see us broadening the membership,” answered Grant.  “We are looking to see how we can expand the focus of the organization through its working groups to zero in on new topics that are broader than the ones that are currently there, like expanding beyond what is a common operating system to support, for example.  I think you will see at a minimum two fronts, one opening on the software side while at the same time continuing work on the PPA (Power, Performance, Area) issues of chip design.  This involves a level of participation from parties that have not interacted with this organization before.”

Grant believes that there should be more emphasis on the needs of small companies, those where innovation is taking place.  ESDA needs to seek the best opportunity to invigorate those companies.  “At the same time we must try to get system companies involved in an appropriate fashion, at least to the degree that they represent the software that is embedded in a system” concluded Grant.

We briefly speculated on what the RISC-V movement might mean to ESDA.  Grant does not see much value in ESDA focusing on a specific instruction set, although he conceded that there might be value if RISC-V joined ESDA.  I agree with the first part of his judgement, but I do not see any benefit to either party, or the industry for that matter, from RISC-V joining ESDA.

From my point of view ESDA has a big hurdle to overcome.  For a few years, before Bob Smith was named executive director, EDAC was somewhat stagnant, and now it must catch up with market reality and fully address the complete system issue.  Not just hardware/software, but analog/digital, and the increased use of FPGA and MEMS.

For sure, representing an IP company gives Grant an opportunity to stress a different point of view within ESDA than the traditional EDA view.  The IP industry would not even exist without a system approach to design and it has changed the way architects think when first sketching a product on the back of an envelope.

DVCon Is a Must Attend Conference for Verification and Design Engineers

February 13th, 2017

Gabe Moretti, Senior Editor

Dennis Brophy, Chair of this year’s DVCon, said: “The 2017 Design and Verification Conference and Exhibition U.S. will offer attendees a comprehensive selection of 39 papers, 9 tutorials, 19 posters, 2 panels, a special session on Functional Verification Industry Trends, and a keynote address by Anirudh Devgan, Senior Vice President and General Manager of the Digital & Signoff Group and System & Verification Group at Cadence.”

Sponsored by Accellera Systems Initiative, DVCon U.S. will be held February 27-March 2, 2017 at the DoubleTree Hotel in San Jose, California.

DVCon was initially sponsored by VHDL International and Open Verilog International, but its growth really started after the two organizations merged to form Accellera, which was renamed Accellera Systems Initiative during its merger with the Open SystemC Initiative.  It seems that every EDA organization nowadays needs the word “systems” in its name, as if afraid of being seen as less important without making very sure that everyone knows they are aware of the existence of systems.

DVCon is now a truly international conference holding events not only in the US but also in Europe, India and, for the first time this year, in China.

The aim of Accellera is to build and serve a community of professionals who focus on the development and verification of hardware systems, with the awareness that software’s role in system design is growing rapidly.  Dennis Brophy observed that “Coming together as a community is fostered by the DVCon Expo. The bigger and better exposition will run from Monday evening to Wednesday evening. See the program for specific opening and closing times. The Expo is a great place to catch up with commercial vendors and learn the latest in product developments. It is also great to connect with colleagues and exchange and share information and ideas. Join us for the DVCon U.S. 2017 “Booth Crawl” where after visiting select exhibitors you will be automatically entered for a lucky draw.”

Although the titles of the papers presented in the technical conference focus on EDA technologies, their impact reaches application areas as diverse as automotive and communications, the use of MEMS and FPGAs in system design, cloud computing, and RF.

“DVCon has long been the technical and social highlight of the year for design and verification engineers,” stated Tom Fitzpatrick, DVCon U.S. 2017 Technical Program Chair.  “Through the hard work of a large team of dedicated reviewers, we have chosen the best of over one hundred submitted abstracts from deeply knowledgeable colleagues within the industry to help attendees learn how to improve their verification efforts. We also have two days of extended tutorials where attendees can get an in-depth look at the cutting edge of verification, not to mention the Exhibit Floor where over 30 companies will be demonstrating their latest tools and technologies.  In between, there are plenty of opportunities to network, relax and take advantage of a fun and welcoming atmosphere where attendees can reconnect with old friends and former colleagues or make new friends and contacts. The value of DVCon goes well beyond the wealth of information found in the Proceedings. Being there makes all the difference.”

Dennis Brophy added that on Monday, February 27th, there will be a presentation covering the Portable Stimulus work being done under Accellera sponsorship.  The working group has made significant progress toward defining what it is and how it works.  The goal is to have the Board of Accellera authorize a ballot to make the result an industry standard and to further take it to the IEEE to complete the standardization work.

As happened last year, the exhibit space was quickly filled by vendors who understood the advantage of talking with technologists who specialize in the verification and design of complex systems.

Devgan’s keynote, “Tomorrow’s Verification Today,” will be delivered on Tuesday, February 28.  In it, Dr. Devgan will review the latest trends that are redefining verification from IP to system level, with an increasingly application-specific set of demands changing the landscape for hardware and software development.  Over the past decade, verification complexity and the demands on engineering teams have continued to rise rapidly.  However, the supporting automation tools and flows have improved only incrementally, resulting in a verification gap.  It is time to redefine how verification should be approached to accelerate innovation in the next decade.

On the same day, from 1:30-2:30pm in the Oak/Fir Ballroom, the conference will offer a special session with Harry Foster, Chief Scientist for Mentor Graphics’ Design Verification Technology Division.  Foster has been asked to present “Trends in Functional Verification: A 2016 Industry Study,” based on the Wilson Research Group’s 2016 study.  The findings from the 2016 study provide invaluable insight into the state of today’s electronics industry.

The conference includes two full days of in-depth tutorials: Accellera Day, with three tutorials, on Monday, and sponsored tutorials on Thursday.  There are also many technical papers and posters and two intriguing panels.

There will be plenty of networking opportunities, especially during the exhibition.  There will be a booth crawl on Monday, February 27 from 5:00-7:00pm and receptions both Tuesday and Wednesday in the exhibit hall.  Exhibits will be open Tuesday from 5:00-7:00pm and Wednesday and Thursday from 2:30-6:00pm.

The awards for Best Paper and Best Poster will be presented at the beginning of the reception on Wednesday.  For the complete DVCon U.S. 2017 schedule, including a list of sessions, tutorials, sponsored luncheons and events, visit www.dvcon.org.

EDA in the year 2017 – Part 2

January 26th, 2017

Gabe Moretti, Senior Editor

The first part of this article, published last week, covered design methods and standards in EDA, together with industry predictions that impact all of our industry.  This part will cover automotive, design verification and FPGAs.  I found it interesting that David Kelf, VP of Marketing at OneSpin Solutions, thought that machine learning will begin to penetrate the EDA industry as well.  He stated: “Machine learning hit a renaissance and is finding its way into a number of market segments. Why should design automation be any different?  2017 will be the start of machine learning creating a new breed of design automation tool, equipped with this technology and able to configure itself for specific designs and operations to perform them more efficiently. By adapting algorithms to suit the input code, many interesting things will be possible.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence, touched on an issue that has been talked about more recently: security.  He noted that “In 2016, IoT bot-net attacks brought down large swaths of the Internet – the first time the security impact of IoT was felt by many. Private and nation-state attacks compromised personal/corporate/government email throughout the year.”

In 2017, we have the potential for security concerns to start a retreat from always-on social media and a growing value on private time and information. I don’t see a silver bullet for security on our horizon. Instead, I anticipate an increasing focus for products to include security managers (like their safety counterparts) on the design team and to consider safety from the initial concept through the design/production cycle.

Automotive

The automotive industry has increased its use of electronics year over year for a long time.  At this point an automobile is a true intelligent system, at least as far as what the driver and passengers can see and hear: the “infotainment system.”  Late model cars also offer collision avoidance and stay-in-lane functions, but more is coming.

Here is what Wally Rhines thinks: “Automotive and aerospace designers have traditionally been driven by mechanical design.  Now the differentiation and capability of cars and planes is increasingly being driven by electronics.  Ask your children what features they want to see in a new car.  The answer will be in-vehicle infotainment.  If you are concerned about safety, the designers of automobiles are even more concerned.  They have to deal with new regulations like ISO 26262, as well as other capabilities, in addition to environmental requirements and the basic task of “sensor fusion” as we attach more and more visual, radar, laser and other sensors to the car.  There is no way to reliably design vehicles and aircraft without virtual simulation of electrical behavior.

In addition, total system simulation has become a requirement.  How do you know that the wire bundle will fit through the hole in the door frame?  EDA tools can tell you the answer, but only after seeking out the data from the mechanical design.  Wiring in a car or plane is a three-dimensional problem.  EDA tools traditionally worry about two-dimensional routing problems.  The world is changing.  We are going to see the basic EDA technology for designing integrated circuits be applied to the design of systems. Companies that can remain at the leading edge of IC design will be able to apply that technology to systems.”

David Kelf, VP of Marketing at OneSpin Solutions, observed: “OneSpin called it last year and I’ll do it again –– Automotive will be the “killer app” of 2017. With new players entering the market all the time, we will see impressive designs featured in advanced cars, which themselves will move toward a driverless future.  All automotive designs currently being designed for safety will need to be built to be as secure as possible. The ISO 26262 committee is working on security as well as safety, and I predict security will feature in the standard in 2017. Tools to help predict vulnerabilities will become more important. Formal, of course, is the perfect platform for this capability. Watch for advanced security features in formal.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence noted: “In 2016, autonomous vehicle technology reached an inflection point. We started seeing more examples of private companies operating SAE 3 in America and abroad (Singapore, Pittsburgh, San Francisco).  We also saw active participation by the US and world governments to help guide tech companies in the proliferation and safety of the technology (ex. US DOT V2V/V2I standard guidelines, and creating federal ADAS guidelines to prevent state-level differences). Probably the most unique example was also the first drone delivery by a major retailer, something which was hinted at 3 years prior and seemingly just a flight of fancy then.

Looking ahead to 2017, both the breadth and depth are expected to expand, including the first operation of SAE level 4/5 in limited use on public streets outside the US, and on private roads inside US. Outside of ride sharing and city driving, I expect to see the increasing spread of ADAS technology to long distance trucking and non-urban transportation. To enable this, additional investments from traditional vehicle OEM’s partnering with both software and silicon companies will be needed to enable high-levels of autonomous functions. To help bring these to reality, I also expect the release of new standards to guide both the functional safety and reliability of automotive semiconductors. Even though the pace of government standards can lag, for ADAS technology to reach its true potential, it will require both standards and innovation.”

FPGA

The IoT market is expected to provide a significant opportunity for the electronics industry to grow revenue and open new markets.  I think the use of FPGAs in IoT devices will increase the role of these devices in system design.

I asked Geoff Tate, CEO of FlexLogix, his opinions on the subject.  He offered four points that he expects to become reality in 2017:

1. the first customer chip will be fabricated using embedded FPGA from an IP supplier

2. the first customer announcements will be made of customers adopting embedded FPGA from an IP supplier

3. embedded FPGAs will be proven in silicon running at 1GHz+

4. the number of customers doing chip design using embedded FPGA will go from a handful to dozens.

Zibi Zalewski, Hardware Division General Manager at Aldec also addressed the FPGA subject.

“I believe FPGA devices are an important technology player to mention when talking about what to expect in 2017. With the growth of embedded electronics driven by Automotive, Embedded Vision and/or IoT markets, FPGA technology becomes a core element, particularly in products that require low power and re-programmability.

Features of FPGAs such as pipelining and the ability to execute and easily scale parallel instances of the implemented function allow for the use of FPGAs for more than just the traditionally understood embedded markets. FPGA computing power usage is exploding in High Performance Computing (HPC), where FPGA devices are used to accelerate different scientific algorithms and big data processing and to complement CPU-based data centers and clouds. We can’t talk about FPGAs these days without mentioning SoC FPGAs, which merge a microprocessor (quite often an ARM) with reprogrammable space. Thanks to such configurations, it is possible to combine the software and hardware worlds in one device with the benefits of both.

All those activities have led to solid growth in FPGA engineering, which is pushing further growth of FPGA development and verification tools. This includes not only typical solutions in simulation and implementation; we should also observe solid growth in tools and services simplifying the usage of FPGAs for those who don’t even know this technology, such as high-level synthesis or engineering services to port C/C++ sources into FPGA-implementable code. The demand for development environments like compilers supporting both software and hardware platforms will only grow, with the main goal focused on ease of use by a wide group of engineers who were not even considering the FPGA platform for their target application.

At the other end of the FPGA rainbow are the fast-growing, largest FPGAs offered by both Xilinx and Intel/Altera. ASIC design emulation and prototyping will push harder and harder on the so-called big-box emulators, offering higher performance and a significantly lower price per gate and so becoming more affordable for even smaller SoC projects. This is especially true when partnered with high-quality design mapping software that handles multi-FPGA partitioning, interconnections, clocks and memories.”
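Zalewski’s point about porting C/C++ sources into FPGA-implementable code is easier to appreciate with an example.  What follows is a minimal, hedged sketch of the restricted C++ style that typical high-level synthesis tools accept; the PIPELINE pragma follows the convention used by Xilinx Vivado/Vitis HLS and is shown only as an illustration of how a designer asks the tool to exploit the pipelining he mentions.

// A dot-product kernel in the restricted C++ subset that high-level
// synthesis (HLS) tools can map to FPGA logic: fixed trip count,
// static arrays, no dynamic memory.
constexpr int N = 64;

int dot_product(const int a[N], const int b[N]) {
    int acc = 0;
    for (int i = 0; i < N; ++i) {
#pragma HLS PIPELINE II=1   // Vivado/Vitis HLS-style hint: start a new
                            // loop iteration every clock cycle.
        acc += a[i] * b[i];
    }
    return acc;
}

The same function compiles and runs as ordinary software, which is exactly the appeal: the algorithm can be verified in C++ first and then handed to the tool to be turned into parallel, pipelined hardware.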

Design Verification

There are many methods to verify a design, and companies will quite often use more than one on the same design.  Each method (simulation, formal analysis, and emulation) has its strong points.

For many years, logic simulation was the principal tool available, although hardware acceleration of logic simulation was also an option.

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence, submitted a thorough analysis of verification issues.  He wrote: “From a verification perspective, we will see further market specialization in 2017 – mobile, server, automotive (especially ADAS) and aero/defense markets will further create specific requirements for tools and flows, including ISO 26262 TCL1 documentation and support for other standards. The Internet of Things (IoT) with its specific security and low power requirements really runs across application domains.  Continuing the trend in 2016, verification flows will continue to become more application-specific in 2017, often centered on specific processor architectures. For instance, verification solutions optimized for mobile applications have different requirements than for servers and automotive applications or even aerospace and defense designs. As application-specific requirements grow stronger and stronger, this trend is likely to continue going forward, but cross-impact will also happen (like mobile and multimedia on infotainment in automotive).

Traditionally ecosystems have been centered on processor architectures. Mobile and Server are key examples, with their respective leading architectures holding the lion’s share of their respective markets. The IoT is mixing this up a little as more processor architectures can play and offer unique advantages, with configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even OpenSource hardware architectures are looking like they will be very relevant, judging from the recent momentum, which eerily reminds me of the early Linux days. It’s one of the most entertaining spaces to watch in 2017 and for years to come.

Verification will become a whole lot smarter. The core engines themselves continue to compete on performance and capacity. Differentiation further moves in how smart applications run on top of the core engines and how smart they are used in conjunction.

For the dynamic engines in software-based simulation, the race towards increased speed and parallel execution will accelerate together with flows and methodologies for automotive safety and digital mixed-signal applications.

In the hardware emulation world, differentiation for the two basic ways of emulating – processor-based and FPGA-based – will be more and more determined by how the engines are used. Specifically, the various use models for core emulation, like verification acceleration, low power verification, dynamic power analysis, and post-silicon validation—often driven by the ever growing software content—will extend further, with more virtualization joining real-world connections. Yes, there will also be competition on performance, which clearly varies between processor-based and FPGA-based architectures—depending on design size and how much debug is enabled—as well as the versatility of use models, which determines the ROI of emulation.

FPGA-based prototypes address the designer’s performance needs for software development, using the same core FPGA fabrics. Therefore, differentiation moves into the software stacks on top, and the congruency between emulation and FPGA-based prototyping using multi-fabric compilation allows mapping both into emulation and FPGA-based prototyping.

All this is complemented by smart connections into formal techniques and cross-engine verification planning, debug and software-driven verification (i.e. software becoming the test bench at the SoC level). Based on standardization driven by the Portable Stimulus working group in Accellera, verification reuse between engines and cross-engine optimization will gain further importance.

Besides horizontal integration between engines—virtual prototyping, simulation, formal, emulation and FPGA-based prototyping—the vertical integration between abstraction levels will become more critical in 2017 as well. For low power specifically, activity data created from RTL execution in emulation can be connected to power information extracted from .lib technology files using gate-level representations or power estimation from RTL. This allows designers to estimate hardware-based power consumption in the context of software using deep cycles over longer timeframes that are emulated.”

Anyone who knows Frank will not be surprised by the length of the answer.
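The low-power flow Frank describes is worth unpacking, because the arithmetic behind it is simple even if the data volumes are not.  Here is a much-simplified, hedged sketch of the idea: per-cell toggle counts captured from an emulation run are combined with per-toggle switching energy of the kind a .lib technology file provides.  The cell names, energy numbers and toggle counts below are invented purely for illustration.

#include <cstdio>
#include <map>
#include <string>

int main() {
    // Switching energy per output toggle, as it might be derived from a
    // .lib technology file (values invented for illustration).
    std::map<std::string, double> energy_pj = {
        {"NAND2_X1", 0.8}, {"DFF_X1", 2.5}, {"INV_X1", 0.4}
    };

    // Toggle counts per cell type over an emulated window, as they might
    // be extracted from emulation activity data (also invented).
    struct Activity { std::string cell; long toggles; };
    const Activity trace[] = {
        {"NAND2_X1", 1200000}, {"DFF_X1", 500000}, {"INV_X1", 2000000}
    };

    const double window_s = 1.0e-3;   // 1 ms of emulated time
    double total_pj = 0.0;
    for (const auto& a : trace)
        total_pj += a.toggles * energy_pj[a.cell];

    // Average dynamic power = switched energy / time window.
    std::printf("Estimated dynamic power: %.3f mW\n",
                (total_pj * 1e-12) / window_s * 1e3);
    return 0;
}

The real flow does this over millions of cells and billions of emulated cycles, which is precisely why it only becomes practical when the activity data comes from an emulator rather than a software simulator.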

Wally Rhines, Chairman and CEO of Mentor Graphics, was less verbose.  He said: “Total system simulation has become a requirement.  How do you know that the wire bundle will fit through the hole in the door frame?  EDA tools can tell you the answer, but only after seeking out the data from the mechanical design.  Wiring in a car or plane is a three-dimensional problem.  EDA tools traditionally worry about two-dimensional routing problems.  The world is changing.  We are going to see the basic EDA technology for designing integrated circuits be applied to the design of systems. Companies that can remain at the leading edge of IC design will be able to apply that technology to systems.

This will create a new market for EDA.  It will be larger than the traditional IC design market for EDA.  But it will be based upon the basic simulation, verification and analysis tools of IC design EDA.  Sometime in the near future, designers of complex systems will be able to make tradeoffs early in the design cycle by using virtual simulation.  That know-how will come from integrated circuit design.  It’s no longer feasible to build prototypes of systems and test them for design problems.  That approach is going away.  In its place will be virtual prototyping.  This will be made possible by basic EDA technology.  Next year will be a year of rapid progress in that direction.  I’m excited by the possibilities as we move into the next generation of electronic design automation.”

The increasing size of chips has made emulation a more popular tool than in the past.  Lauro Rizzatti, Principal at Lauro Rizzatti LLC, is a pioneer in emulation and continues to be thought of as a leading expert in the method.  He noted: “Expect new use models for hardware emulation in 2017 that will support traditional market segments such as processor, graphics, networking and storage, and emerging markets currently underserved by emulation –– safety and security, along with automotive and IoT.

Chips will continue to be bigger and more complex, and include an ever-increasing amount of embedded software. Project groups will increasingly turn to hardware emulation because it’s the only verification tool that can debug the interaction between the embedded software and the underlying hardware. It is also the only tool capable of estimating power consumption in a realistic environment, when the chip design is booting an OS and processing software apps. More to the point, hardware emulation can thoroughly test the integrity of a design after the insertion of DFT logic, since it can verify gate-level netlists of any size, a virtually impossible task with logic simulators.

Finally, its move to the data center solidifies its position as a foundational verification tool that offers a reasonable cost of ownership.”

Formal verification tools, sometimes referred to as “static analysis tools,” have seen their use increase year over year once vendors found human interface methods that did not require a highly trained user.  Roger Sabbagh, VP of Application Engineering at Oski Technology, pointed out: “The world is changing at an ever-increasing pace and formal verification is one area of EDA that is leading the way. As we stand on the brink of 2017, I can only imagine what great new technologies we will experience in the coming year. Perhaps it’s having a package delivered to our house by a flying drone or riding in a driverless car or eating food created by a 3-D printer. But one thing I do know is that in the coming year, more people will have the critical features of their architectural design proven by formal verification. That’s right. System-level requirements, such as coherency, absence of deadlock, security and safety will increasingly be formally verified at the architectural design level. Traditionally, we relied on RTL verification to test these requirements, but the coverage and confidence gained at that level is insufficient. Moreover, bugs may be found very late in the design cycle where they risk generating a lot of churn. The complexity of today’s systems of systems on a chip dictates that a new approach be taken. Oski is now deploying architectural formal verification with design architects very early in the design process, before any RTL code is developed, and it’s exciting to see the benefits it brings. I’m sure we will be hearing a lot more about this in the coming year and beyond!”
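To give a flavor of what “absence of deadlock” means when it is proven rather than simulated, here is a toy, hedged illustration; it is not Oski’s methodology and is vastly simpler than any real architectural model.  It exhaustively enumerates the reachable states of a two-agent handshake (the arbiter rule is invented for this sketch) and reports whether any reachable state has no legal next move, which is the essence of explicit-state model checking.

#include <cstdio>
#include <queue>
#include <set>
#include <utility>
#include <vector>

// Toy model: two agents each cycle IDLE -> REQ -> GRANT -> IDLE, but an
// agent may only take the grant while the other agent does not hold it.
// A state is the pair of agent phases.
enum Phase { IDLE, REQ, GRANT };
using State = std::pair<int, int>;

// All legal successor states of a given state (the transition relation).
std::vector<State> successors(const State& s) {
    std::vector<State> out;
    auto step = [&](int me, int other, bool agent_a) {
        int n = -1;
        if (me == IDLE) n = REQ;
        else if (me == REQ && other != GRANT) n = GRANT;  // arbiter rule
        else if (me == GRANT) n = IDLE;
        if (n >= 0) out.push_back(agent_a ? State{n, other} : State{other, n});
    };
    step(s.first, s.second, true);    // agent A moves
    step(s.second, s.first, false);   // agent B moves
    return out;
}

int main() {
    // Exhaustive reachability from the initial state: if every reachable
    // state has at least one successor, the model is deadlock-free.
    const State init{IDLE, IDLE};
    std::set<State> seen{init};
    std::queue<State> work;
    work.push(init);
    bool deadlock = false;

    while (!work.empty()) {
        State s = work.front();
        work.pop();
        std::vector<State> next = successors(s);
        if (next.empty()) { deadlock = true; break; }   // nobody can move
        for (const State& t : next)
            if (seen.insert(t).second) work.push(t);
    }
    std::printf("%s (%zu reachable states explored)\n",
                deadlock ? "Deadlock reachable" : "Proven deadlock-free",
                seen.size());
    return 0;
}

The point of the exercise is the word “every”: unlike a simulation run, the search visits all reachable states, so a pass is a proof for the model rather than a sampling of it.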

Finally, David Kelf, VP of Marketing at OneSpin Solutions, observed: “We will see tight integrations between simulation and formal that will drive adoption among simulation engineers in greater numbers than before. The integration will include the tightening of coverage models, joint debug and functionality where the formal method can pick up from simulation and even emulation with key scenarios for bug hunting.”


Conclusion

The two combined articles are indeed quite long.  But the EDA industry is serving a multi-faceted set of customers with varying and complex requirements.  To do it justice, length is unavoidable.

EDA in the year 2017 – Part 1

January 12th, 2017

Gabe Moretti, Senior Editor

The EDA industry’s performance is dependent on two other major economies: one technological and one financial.  EDA provides the tools and methods that leverage the growth of the semiconductor industry, and generally begins to receive its financial rewards a couple of years after the introduction of a new product on the market.  It takes that long for the product to prove itself and achieve general distribution.

David Fried from Coventor addressed the most important topics that may impact the foundry business in 2017.  He made two points.

“Someone is going to commit to Extreme Ultra-Violet (EUV) for specific layers at 7nm, and prove it.  I expect EUV will be used to combine 3-4 masks currently using 193i in a multi-patterning scheme (“cut” levels or Via levels) for simplicity (reduced processing), but won’t actually leverage a pattern-fidelity advantage for improved chip area density.

The real density benefit won’t come until 5nm, when the entire set of 2D design rules can be adjusted for pervasive deployment of EUV.  This initial deployment of EUV will be a “surgical substitution” for cost improvement at very specific levels, but will be crucial for the future of EUV to prove out additional high-volume manufacturing challenges before broader deployment.  I am expecting this year to be the year that the wishy-washy predictions of who will use EUV at which technology for which levels will finally crystallize with proof.

7nm foundry technology is probably going to look mostly evolutionary relative to 10nm and 14nm. But 5nm is where the novel concepts are going to emerge (nanowires, alternate channel materials, tunnel FETs, stacked devices, etc.) and in order for that to happen, someone is going to prove out a product-like scaling of these devices in a real silicon demonstration (not just single-device research).  The year 2017 is when we’ll need to see something like an SRAM array, with real electrical results, to believe that one of these novel device concepts can be developed in time for a 5nm production schedule.”

Rob Knoth, Product Marketing Director, Digital and Signoff Group at Cadence, offered the following observation.  “This past year, major IDM and pure-play foundries began to slow the rate at which new process nodes are planned to be released. This was one of the main drivers for the relentless semiconductor-based advances we’ve seen the past 50 years.

Going forward, fabs and equipment makers will continue to push the boundaries of process technology, and the major semiconductor companies will continue to fill those fabs. While it may be slowing, Moore’s Law is not “dead.” However, there will be increased selection about who jumps to the “next node,” and greater emphasis will be placed on the ability of the design engineer and their tools/flows/methods to innovate and deliver value to the product. The importance for an integrated design flow to make a difference in product power/performance/area (PPA) and schedule/cost will increase.

The role that engineering innovation and semiconductors play in making the world a better place doesn’t get a holiday or have an expiration date.

The semiconductor market, in turn, depends on the general state of the worldwide economy.  This is determined mostly by consumer sentiment: when consumers buy, all industries benefit, from industrial to financial.  It does not take much of a negative inflection in consumer demand to diminish the requirement for electronics-based products, and thus for semiconductor parts.  That in turn has a negative effect on the EDA industry.

While companies that sell multi-year licenses can smooth the impact, new licenses, both multi-year and yearly, become more difficult to sell and result in lower revenue.

The electronics industry will evolve to deal with the increased complexity of designs.  Complex chips are the only vehicle that can make advanced fabrication nodes profitable.  It makes no sense to decrease feature dimensions and power requirements at the cost of increased noise and leakage just for technology’s sake.  As unit costs increase, only additional functionality can justify new projects.  Such designs will require new methodologies, new versions of existing tools, and new industry organization to improve the use of the development/fabrication chain.

Michael Wishart, CEO of Efabless, believes that in 2017 we will begin to see full-fledged community design, driven by the need for customized silicon to serve emerging smart hardware products. ICs will be created by a community of unaffiliated designers on affordable, re-purposed 180nm nodes and will incorporate low-cost (including open source) processors and on-demand analog IP. An online marketplace to connect demand with the community will be a must.

Design Methods

I asked Lucio Lanza of Lanza techVentures what factors would become important in 2017 regarding EDA.  As usual his answer was short and to the point.  “Cloud, machine learning, security and IoT will become the prevailing opportunities for design automation in 2017. Design technology must progress quickly to meet the needs of these emerging markets, requiring as much as possible from the design automation industry. Design automation needs to willingly and quickly take up the challenge at maximum speed for success. It’s our responsibility, as it’s always been.”

Bob Smith, Executive Director of the ESD Alliance, thinks that in 2017 the semiconductor design ecosystem will continue evolving from a chip-centric (integration of transistors) focus to a system-centric (integration of functional blocks) worldview. While SoCs and other complex semiconductor devices remain critical building blocks and Moore’s Law a key driver, the emphasis is shifting to system design via the extensive use of IP. New opportunities for automation will open up with the need to rapidly configure and validate system-level designs based on extensive use of IP.  Industry organizations like the Electronic System Design Alliance have a mission to work across the entire design ecosystem as the electronic design market makes the transition to system-level design.

Wally Rhines, Chairman and CEO of Mentor Graphics, addressed the required changes in design as follows: “EDA is changing.  Most of the effort in the EDA industry in the last two decades has focused on the automation of integrated circuit design. Virtually all aspects of IC design are now automated with the use of computers.  But system design is in the infancy of an evolution toward virtual design automation. While EDA has now given us the ability to do first-pass functional integrated circuit designs, we are far from providing the same capability to system designers.

What’s needed is the design of “systems of systems”.  That capability is coming.  And it is sooner than you might think.  Designers of planes, trains and automobiles hunger for virtual simulation of their designs long before they build the physical prototypes for each sub-system.  In the past, this has been impossible.  Models were inadequate.  Simulation was limited to mechanical or thermal analysis.  The world has changed.  During 2017, we will see the adoption of EDA by companies that have never before considered EDA as part of their methodology.”

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence, offered the following observation.  “IoT that spans across application domains will further grow, especially in the industrial domain. Dubbed in Germany as “Industrie 4.0”, industrial applications are probably the strongest IoT driver. Value shifts will accelerate from pure semiconductor value to systemic value in IoT applications. The edge node sensor itself may not contribute to profits greatly, but the systemic value of combining the edge node with a hub accumulating data and sending it through networks to cloud servers in which machine learning and big data analysis happens allows for cross monetization. The value definitely is in the system. Interesting shifts lie ahead in this area from a connectivity perspective. 5G is supposed to broadly hit in 2020, with early deployments in 2017. There are already discussions going on regarding how the connectivity within the “trifecta” of IoT/Hub/Server is going to change, with more IoT devices bypassing the aggregation at the hub and directly accessing the network. Look for further growth in the area that Cadence calls System Design Enablement, together with some customer names you would have previously not expected to create chips themselves.

Traditionally ecosystems have been centered on processor architectures. Mobile and Server are key examples, with their respective leading architectures holding the lion’s share of their respective markets. The IoT is mixing this up a little as more processor architectures can play and offer unique advantages, with configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even OpenSource hardware architectures are looking like they will be very relevant, judging from the recent momentum, which eerily reminds me of the early Linux days. It’s definitely one of the most entertaining spaces to watch in 2017 and for years to come.”

Standards

Standards have played a key role in EDA.  Without them designers would be locked to one vendor for all of the required tools, and given the number of necessary tools very few EDA companies would be able to offer all that is required to complete, verify, and transfer to manufacturing a design.  Michiel Ligthart, President and COO at Verific, sees two standards, in particular, playing a key role in 2017.  “Watch for quite a bit of activity on the EDA standards front in 2017. First in line is the UVM standard (IEEE 1800.2), approved by the Working Group in December 2016. The IEEE may ratify it as early as February. Another one to watch is the next installment of SystemVerilog, mainly a “clarifications and corrections” release, that will be voted on in early 2017 with an IEEE release just before the end of the year. In the meantime, we are all looking at Accellera’s Portable Stimulus group to see what it will come up with in 2017.”

In regard to the Portable Stimulus activity, Adnan Hamid, CEO of Breker Verification Systems, goes into more detail.  “While it’s been a long time coming, Portable Stimulus is now an important component of many design verification flows and that will increase significantly in 2017. The ability to specify verification intent and behaviors reusable across target platforms, coupled with the flexibility in choosing vendor solutions, is an appealing prospect to a wide range of engineering groups and the appeal is growing. While much of the momentum is rooted in Accellera’s Portable Stimulus Working Group, verification engineers deserve credit for recognizing its value to their productivity and effectiveness. Count on 2017 to be a big year for both its technological evolution and its standardization as it joins the ranks of SystemVerilog, UVM and others.”

Conclusion

Given the number of contributions received, it would be overwhelming to present all of them in one article.  Therefore the remaining topics will be covered in a follow-on article next week.

EDA has not been successful at keeping its leaders

January 4th, 2017

Gabe Moretti, Senior Editor

I have often wondered why, when a larger EDA company acquires a smaller one, the acquired CEO ends up leaving in a relatively short time and either joining a new start-up or a venture capital firm.  It seemed to me that that CEO thought enough of the buyer, when accepting to be acquired, to predict that his or her employees and product(s) would prosper in the new environment.  So, why leave?  It could not just be a matter of strong contrasting personalities.  I think I found the answer over the Christmas break.

I read the book “Skunk Works” by Ben R. Rich.  The book is a factual history of development projects carried out while Ben was there, first as an employee and eventually as its leader.  During his years at the Skunk Works Mr. Rich was part of the exceptional successes of the U-2 and SR-71 spy planes and of the F-117A stealth fighter.  All those projects were run independently of corporate overseers, used a comparatively small dedicated team, and modified the project when necessary to achieve the established goal.

Two major points made in the book apply both to the EDA industry and to industry in general.  First, “Leaders are natural born: managers must be trained,” and second, “There is no substitute for astute managerial skill on any project.”

Many start-up CEOs are born leaders and do not fit well within an organization where projects are managed in a bureaucratic manner using a rigid reporting structure.  An ex-CEO will soon find such a work environment counter-productive.  Successful projects need to react quickly to changing realities and parameters.  Often in the life of a project the team discovers new opportunities or new obstacles that come to light because of the work being done.  The time spent explaining and justifying the new alternative will impact the success of the project, especially if the value of the presented alternative is not fully understood by top executives or if the new managers do not understand the new corporate politics.

I think that the best use of an acquired CEO is to allow him or her to continue to be an entrepreneur within the acquiring company.  This does not mean using his or her talent to continue to lead the just-acquired team.  He or she can look for new opportunities within his or her area of expertise and possibly build a new team that will produce a new product.  In this way the acquiring company increases its ROI from the acquisition, even at the cost of increased compensation to both the CEO and the new team at the successful completion of their work.

In general Synopsys has managed to retain acquired CEOs, while Cadence has not.

The behavior in the EDA industry, with very few well-known exceptions, has been to seek a quick reward through an acquisition that will satisfy financially both the venture capitalists and the original start-up team.  Once the acquisition price is monetized, many people leave the industry seeking to capitalize on their financial gains in other ways.  Thus the EDA industry must grow through the entrance of new people with new ideas but little if any experience in the industry.  The result is many brilliant academic ideas that end up as failed start-ups.  Individuals with brilliant ideas are not usually good leaders or managers, and good managers do not generally possess the creativity to conceive a breakthrough product.

In its history the EDA industry has paid the price of creating both leaders and excellent managers, but has yet to find a way to retain them.  Of course there are a few exceptions, nothing is ever black and white, but the exceptions are few.  It will be interesting to see, after a couple of years, how Siemens will have handled the Mentor Graphics acquisition.  Will Mentor’s creativity improve?  Will the successful team remain?  Will they use the additional resources in an entrepreneurial manner, or either leave or adjust to a more relaxed big-company life?

DVCon is a Worldwide Conference

December 21st, 2016

Gabe Moretti, Senior Editor

The DVCon conference now has solid traditions not only in the USA but also in Europe and India, and next year it will start flowering in China.

DVCon U.S.

The conference will be held February 27 – March 2, 2017 at the DoubleTree in San Jose, California. Early registration is open and the program is available online at https://dvcon.org. “DVCon U.S. 2017 planning is taking shape,” commented Dennis Brophy, DVCon U.S. General Chair. “We look forward to a compelling and in-depth technical program full of engaging content that practicing design and verification engineers, managers and EDA tool suppliers have come to depend on from DVCon.” The four-day program offers attendees an Expo, two exciting standards-focused panels and numerous informative papers, tutorials and posters to choose from. Accellera Day starts the conference on Monday and will devote the entire morning to a tutorial on Accellera’s emerging Portable Stimulus standard titled “Creating Portable Stimulus Models with the Upcoming Accellera Standard,” with two afternoon tutorials: “SystemC Design and Verification – Solidifying the Abstraction above RTL” and “Introducing IEEE P1800.2 – The Next Step for UVM.”

DVCon India

This important technical event was held in Bangalore in September, with almost 440 attendees over two days. There were local start-ups participating and exhibiting for the first time, further demonstrating the local focus and interest in each conference. “DVCon India rightly promotes the four C’s: connect, contribute, collaborate, and celebrate,” stated Gaurav Jalan, DVCon India General Chair. The two-day event was inaugurated with a traditional lamp-lighting ceremony and welcome remarks by Jalan. Dr. Walden Rhines, Chairman and CEO of Mentor Graphics, and Professor Kamakoti Veezhinathan, Indian Institute of Technology Madras, delivered the keynotes.

DVCon Europe

As in previous years the conference was held in Munich, Germany. Held in October, it enjoyed an increase in attendance of 20% over the previous year. Attendees of the two-day conference included representatives from 93 companies and organizations from 25 countries. Insightful keynotes were delivered by Hobson Bullman, General Manager of ARM’s Technology Services Group, and Jürgen Weyer, Vice President of Automotive Sales for EMEA at NXP Semiconductors. Bob Smith, Executive Director of the ESD Alliance, gave the keynote at the gala dinner. “It’s fantastic to see this event continuing to do so well, meeting a clear need for a European forum that provides practical, detailed information on state-of-the-art development methodologies,” noted Oliver Bell, DVCon Europe General Chair. “This year’s conference was particularly exciting with three dynamic keynote speeches, overwhelming tutorial and paper submissions, and a vibrant exhibition. Now that DVCon Europe is established as the must-attend event in Europe for engineers to upgrade their skills, we are looking forward to an even larger event in 2017.”

DVCon China

During 2017 DVCon will premiere as a one-day event in Shanghai on April 17. The steering committee is in the process of analyzing a number of excellent paper abstract submissions for its inaugural program. “Ideas, networking, technical discussions, learning opportunities and exciting exhibits of new products and services. This is what DVCon China will offer to attendees,” stated Andy Liu Hongliang, DVCon China General Chair. “Many hot areas of ASIC design and verification such as UVM, Low Power, IP Reuse, Formal, Mixed-Signal, System Design and Debug Strategies will be distributed throughout the whole conference with lectures, discussions, presentations and demos.”

Siemens Acquisition of Mentor Graphics is Good for EDA

November 15th, 2016

Gabe Moretti, Senior Editor

Although Mentor has been the subject of take-over attempts in the past, the business specifics of the transactions have never been favorable to Mentor.  The acquisition by Siemens, instead, is a favorable occurrence for the third largest EDA company.  This time both companies obtain positive results from the affair.

Siemens acquires Mentor following the direction set forth in 2014, when its Vision 2020 strategy was first discussed in public.  The six-year plan describes the steps the company should take to better position itself for the kind of world it envisions in 2020.

The Vision 2020 document calls for operational consolidation and optimization during the years 2016 and 2017.  It also selects three of its business divisions as critical to corporate growth, calling them the E-A-D system: Electrification, Automation, and Digitalization.

Although it is possible that Mentor technology and products may be strategic in Electrification, they are of significant importance in the other two areas: Digitalization and Automation.  Digitalization, for example, includes vehicle automation, including smart cars and vehicle-to-vehicle communication.  Mentor already has an important presence in the automotive industry and can help Siemens in the transition from state-of-the-art car management by electronic systems to the new systems required by the self-driving automobile, and to the complete integration of those components into an intelligent system that includes vehicle-to-vehicle communication.

Mentor also has experience in industrial robots.  What is, in my mind, more remarkable is that the PCB and cabling portions of Mentor, often minimized in an EDA industry dominated by the obsession with building ICs, are the parts that implement and integrate the systems in the products designed and built by third parties.

With its presence in the PCB and cabling markets, Mentor can bring additional customers to Siemens, as well as insight into future marketing requirements and limitations that will serve extremely well in designing the next generation of industrial robots.

Of course, Mentor will also find an increased internal market, as other divisions of Siemens that are not part of the E-A-D triumvirate will utilize its products and services.

Siemens describes itself as an employee-oriented company, so present Mentor employees should not have to fear aggressive cost cutting and consolidation.  Will Mentor change? Of course: it will adapt gradually to the new requirements and opportunities the Siemens environment will create and demand, but the key word is “gradually.”  Contrary to the acquisition of ARM by SoftBank, where the acquiring company had no previous activity in ARM’s business, Siemens has been active in EDA internally, both in its research lab and through strong connections with university programs that originated a number of European EDA startups.  Siemens executives have an understanding of what EDA is all about and what it takes to be successful in EDA.  The result, I expect, is little interference and second-guessing, which translates into continued success for Mentor for as long as they are capable of it.

Will other EDA companies benefit from this acquisition? I think they will.  First of all, it attracts more attention to our industry from the financial community, but it is also likely to increase competition among the “big 3,” forcing Cadence and Synopsys to focus more on key markets while diversifying into related ones such as optical, security, and software development.  In addition, I see no reason why an EDA company could not enter into a business partnership with some of its customers to explore new revenue-generating business models.

ARM TechCon is the Model for Future Successful Conferences

October 26th, 2016

Gabe Moretti, Senior Editor

It has become abundantly clear that corporate- and consortium-sponsored conferences are gaining in both popularity and usefulness over generic conferences like DAC and DesignCon.  The reason, in my opinion, is how development has changed.  The industry has moved from the ASIC era to the integrated-system era.

Instead of designing an entire system, engineers now integrate subsystems.  This has been made possible by the introduction of IP licensing and the growth of the IP industry.  From a fledgling and challenging design opportunity in the early 1990s, the use of IP is now a routine practice that embraces both hardware and software modules.

Now both IP vendors and EDA tool providers can offer their customers a complete ecosystem, both in capability and in range of functions.  The result is that conferences like ARM TechCon provide greater utility to working engineers than the exhibit areas of DAC or DesignCon.

Only specialized conferences like DVCon, held by Accellera on three continents, continue to grow, because attendees benefit from the focused topics offered.  An engineer concerned with issues covering the integration of design and verification functions finds interesting content at DVCon, while at a generic conference the same engineer would have to work from advance conference documentation to create his or her own program, at times dealing with conflicting schedules.

ARM holds its own specific program within DAC, so conference attendees can take advantage of a focused curriculum.  The problem is that other companies that enhance that environment by collaborating with ARM, for example, cannot provide equally focused support, since their attention must be directed toward all the possibilities available within the conference.

A design engineer attending DAC finds a plethora of activities that are of no interest, or only marginal interest, and has a harder time navigating the conference just to follow what he or she wishes to see and hear.

Professionals dealing with layout and fabrication issues, for example, would find a conference organized by a fab company, dealing with its own fabrication environment, challenges, and guidelines, more interesting than a series of academic papers presented at DAC.  I believe that DAC's sponsoring organizations need to take into consideration the changed reality of IC and system design, not just in the material presented, but in the format in which it is presented.

The significant increase in the size and popularity of Synopsys' ARC Day, for example, is another indication that such workshops are more valuable than generic conferences.  The same can be said for Accellera's DVCon conferences, now held in the US, Europe, India, and China.  Although design and verification issues are global, they have different flavors in certain important parts of the planet.

ARM IP users find at ARM TechCon everything they need to successfully complete a design.  Design, verification, and software integration issues are all covered with a depth and breadth that is not available anywhere else.

Kilopass Unveils Vertical Layered Thyristor (VLT) Technology for DRAMs

October 19th, 2016

Gabe Moretti, Senior Editor

Kilopass Technology, Inc., is a leader in embedded non-volatile memory (NVM) intellectual property (IP).  Its patented one-time programmable (OTP) NVM technologies scale to advanced CMOS process geometries, are portable to every major foundry and integrated device manufacturer (IDM), and meet market demands for increased integration, higher densities, lower cost, low-power management, better reliability, and improved security.  The company has just announced a new device that potentially allows it to diversify into new markets.

According to Charlie Cheng, Kilopass' CEO, VLT eliminates the need for DRAM refresh, is compatible with existing process technologies, and offers other significant benefits including lower power and better area efficiency.  When asked the reason for this additional corporate direction, Cheng replied: “Kilopass built its reputation as the leader in one-time programmable memories.  As the next step on our roadmap, we examined many possible devices that would not need new materials or complex process flows and found this vertical thyristor to be very compelling.  We look forward to commercializing VLT DRAM in early 2018.”

VLT Overview

Kilopass' VLT is based on thyristor technology, a structure that is electrically equivalent to a cross-coupled pair of bipolar transistors forming a latch.  The latch lends itself to memory applications since it stores values and, unlike current capacitor-based DRAM technology, does not require refresh.  The thyristor was first invented in the 1950s, and several attempts have been made to use it for the SRAM market without success.  Kilopass' VLT meets DRAM requirements by implementing the thyristor structure vertically.

Since VLT does not require complex performance- and power-consuming refresh cycles, a VLT-based DDR4 DRAM lowers standby power by 10X when compared to conventional DRAM at the same process node. Furthermore, VLT requires fewer processing steps and is designed to be built using existing processing equipment, materials and flows.

The VLT bitcell operation and silicon measurements were completed in 2015 and shown to have excellent correlation with Kilopass' proprietary ultra-fast TCAD simulator, which is one hundred thousand times faster than a traditional TCAD simulator.  The TCAD simulator enables Kilopass to predict the manufacturing windows for key process parameters and optimize the design for any given manufacturing process.  A full macro-level test chip was taped out in May, and initial silicon testing is underway.

Industry Perspective

The $50B DRAM market is being driven by strong demand in the server/cloud computing market, as mobile phone and tablet growth slows and computing moves increasingly to the cloud.  The outlook for DRAM growth remains strong: in a report published in 2015, IC Insights forecast a DRAM CAGR of 9% over the period from 2014 to 2019, which means DRAM is growing faster than the total IC market.

Servers and server farms consume a tremendous amount of energy with memory being a major contributor. In an ideal world, the current generation of 20 nanometer (nm) DRAM would migrate to sub-20nm processes to deliver even lower power.

Current DRAM technology is based on the one-transistor, one-capacitor (1T1C) bitcell, which is difficult to scale: smaller transistors exhibit more leakage, and the smaller capacitor structure has less capacitance, forcing shorter intervals between refreshes.  Up to 20% of a 16Gb DDR DRAM's raw bandwidth can be lost to the increased frequency of refresh cycles, a negative for multi-core/multi-thread server CPUs that must squeeze out every bit of performance to remain competitive.  The DRAM industry is in a quandary, trying to increase memory performance while reducing power consumption, a tough challenge given the physics at play with the current 1T1C technology.  To address the need for lower power consumption, a new DRAM technology and architecture is needed.
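To see where a figure like that comes from, the rough sketch below estimates refresh overhead as the ratio of the time a refresh command occupies the device (tRFC) to the average interval between refresh commands (tREFI).  The timing values are illustrative assumptions in the spirit of DDR4-class devices, not figures published by Kilopass or any DRAM vendor.

    # Rough estimate of DRAM bandwidth lost to refresh.
    # Overhead ~= tRFC / tREFI: the fraction of time the device is busy
    # refreshing instead of serving reads and writes.  All timing values
    # below are assumed, DDR4-style examples for illustration only.

    def refresh_overhead(t_rfc_ns, t_refi_ns):
        """Fraction of device time consumed by refresh commands."""
        return t_rfc_ns / t_refi_ns

    scenarios = {
        "8Gb die, normal temperature": (350.0, 7800.0),
        "8Gb die, high temperature (refresh rate doubled)": (350.0, 3900.0),
        "Hypothetical scaled 16Gb die (longer tRFC, more frequent refresh)": (550.0, 3900.0),
    }

    for name, (t_rfc, t_refi) in scenarios.items():
        pct = 100.0 * refresh_overhead(t_rfc, t_refi)
        print(f"{name}: ~{pct:.1f}% of raw bandwidth lost to refresh")

As tRFC grows with density and tREFI shrinks to compensate for leakier cells, the overhead climbs from a few percent toward the double-digit figures cited above, which is exactly the scaling pressure a refresh-free cell such as VLT is meant to relieve.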

Kilopass stated that its initial target markets include “PCs” and servers.  I am of the old school and associate the term “PC” with personal computers, but Kilopass uses it to mean portable computing devices, so it is talking about a different market.  Kilopass expects to have test silicon by early 2017 that will confirm the performance and manufacturability of the new VLT DRAM technology.  Kilopass has two primary reasons to announce the technology over a year in advance of product delivery.  First, the company is in the IP business, so it is giving itself time to look for licensees.  Second, it believes the DRAM market has been stuck at 20nm; adoption of new technology takes time, even though VLT has been shown to be manufacturable, so this is the right time to alert the market that alternative solutions exist and allow time for their investigation.  Market penetration of a new technology is not always assured, and wide acceptance almost always requires a second source, especially with something as new as the VLT device.  Memories play a critical role in cloud computing but a far smaller one in PCs, since power consumption in PCs is not a widespread issue.
