
Gallium Nitride for Power Applications

May 15th, 2017

Gabe Moretti, Senior Editor

Gallium nitride (GaN) has long been used in LED devices and other optoelectronic devices with high power and high frequency requirements.  It can operate at much higher temperatures and voltages than gallium arsenide (GaAs), for example, and is therefore also used in military and space applications.

A few days ago X-FAB and Exagan announced that they have produced GaN-on-Silicon devices on 200-mm wafers.

X-FAB, headquartered in Germany, is a leading analog/mixed-signal and MEMS foundry group manufacturing silicon wafers for automotive, industrial, consumer, medical and other applications.  The company operates one fab in the USA, in Lubbock, Texas, where it fabricates mixed-signal devices.  It also operates other fabs in Europe and Asia.

Founded in 2014 with support from CEA-Leti and Soitec, Exagan is based in Grenoble, France.  Its mission is to accelerate the power electronics industry’s transition from silicon-based technology to GaN-on-silicon technology, enabling smaller and more efficient electrical converters.

The two companies have demonstrated mass-production capability to manufacture highly efficient high-voltage power devices on 200-mm GaN-on-silicon wafers using X-FAB’s standard CMOS production facility in Dresden, Germany.

Exagan and X-FAB have successfully resolved many of the challenges related to material stress, defectivity and process integration while using standard fabrication equipment and process recipes. Combined with the use of 200-mm wafers, this will significantly lower the cost of mass producing GaN-on-silicon devices. By enabling greater power integration than silicon ICs, GaN devices can improve the efficiency and reduce the cost of electrical converters, which will accelerate their adoption in applications including electrical vehicle charging stations, servers, automobiles and industrial systems.

The industry’s previous work with GaN had been limited to 100-mm and 150-mm wafers due to the challenges of layering GaN films on silicon substrates. Exagan’s G-Stack technology enables GaN-on-silicon devices to be manufactured more cost effectively on 200-mm substrates by depositing a unique stack of GaN and strain-management layers that relieves the stress between GaN and silicon layers. The resulting devices have been shown to exhibit high breakdown voltage, low vertical leakage and high-temperature operation.

The new GaN-on-silicon devices have been built using substrates fabricated at Exagan’s 200-mm epi-manufacturing facility in Grenoble, France. These epi wafers meet the physical and electrical specifications to produce Exagan’s 650-volt G-FET devices as well as the tight requirements for compatibility with CMOS manufacturing lines.

IP Fingerprinting Initiative from the ESD Alliance

May 11th, 2017

Gabe Moretti, Senior Editor

I had occasion to discuss with Bob Smith, Executive Director of the ESD Alliance, and Warren Savage, General Manager IP at Silvaco, my March article “Determining a Fair Royalty Value for IP” (http://chipdesignmag.com/sld/blog/2017/03/13/determining-a-fair-royalty-value-for-ip/) as we addressed the IP Fingerprinting Initiative of the ESD Alliance.

I stated that there really was not a standard way to determine the amount of royalty that could be charged for an IP.

Bob Smith responded: “Royalties should be based on value provided. Value comes in many forms, such as how much of the functionality of the end product is provided by the IP, the risk and time-to-market reduction, and design and verification cost savings. There is no simple formula for IP royalties. In fact, they can be quite complicated.”

Warren added: “Business models used for licensing royalties are always a negotiation between the buyer and seller with each party striving to optimize the best outcome for their business. In some cases, the customer may be willing to pay more for royalties in exchange for lowering the upfront licensing costs. A different customer may be willing to invest more upfront to drive down the cost of royalties. Calculation of the actual royalty amounts may be based on a percentage of the unit cost or a fixed price, and each may have sliding scales based on cumulative volumes. Both parties need to derive the value that fits their own business model. The IP user needs to arrive at a price for the IP that supports the ROI model for the end product. The IP supplier needs to ensure that it receives sufficient value to offset its investment in IP development, verification and support. It is able then to participate in the success of the buyer’s product based (at least in part) on the value of the IP provided.”
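
Warren’s trade-off between upfront fees and royalty rates can be made concrete with a toy calculation.  The sketch below is not any party’s actual pricing model; it is a hypothetical illustration, assuming a per-unit royalty with volume-based sliding-scale tiers on top of an upfront license fee, and all of the tier breakpoints, rates, and prices are invented for the example.

```python
# Hypothetical illustration of the royalty trade-off Warren describes:
# an upfront license fee plus a per-unit royalty with sliding-scale tiers.
# The tier breakpoints and rates below are invented for the example.

def royalty_total(units_shipped, unit_price, tiers):
    """Total royalty for a cumulative volume, given (volume_ceiling, rate) tiers."""
    total = 0.0
    previous_ceiling = 0
    for ceiling, rate in tiers:
        if units_shipped <= previous_ceiling:
            break
        units_in_tier = min(units_shipped, ceiling) - previous_ceiling
        total += units_in_tier * unit_price * rate
        previous_ceiling = ceiling
    return total

# Two hypothetical deals for the same IP: low upfront fee with higher royalty
# rates, versus high upfront fee with lower royalty rates.
tiers_low_upfront  = [(100_000, 0.03), (1_000_000, 0.02), (float("inf"), 0.01)]
tiers_high_upfront = [(100_000, 0.015), (1_000_000, 0.01), (float("inf"), 0.005)]

for volume in (50_000, 500_000, 5_000_000):
    deal_a = 100_000 + royalty_total(volume, unit_price=10.0, tiers=tiers_low_upfront)
    deal_b = 400_000 + royalty_total(volume, unit_price=10.0, tiers=tiers_high_upfront)
    print(f"{volume:>9} units: low-upfront deal ${deal_a:,.0f}  high-upfront deal ${deal_b:,.0f}")
```

Run over a few volumes, the low-upfront deal is cheaper for the buyer at small volumes and the high-upfront deal wins at very large volumes, which is exactly the negotiation Warren describes.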

Since it seems impossible to have a standard function to determine royalties, is there an intrinsic value for an IP?

Warren remarked: “An IP has zero intrinsic value in of itself. The value is completely dependent on the application in which it is used, the ability of the IP to offset development costs and risks and the contributions it makes to the operation and success of the target product. For example, an IP that is developed and ends up sitting on the shelf has no value at all. In fact, its value is negative given the resources and costs spent on developing it. Size doesn’t matter. An IP that has hundreds of thousands of gates may command a higher price because the IP supplier needs to sell it for that price to recoup its investment in creating it.  A small IP block may also command a high price because it may contain technology that is extremely valuable to the customer’s product and differentiates it significantly from the competition. The best way to think about intrinsic value is to think of it in the context of value delivered to the customer. If there is no apparent difference in this regard between an IP product from two or more suppliers, then the marketplace sets the price and the lowest cost supplier wins.”

In terms of the IP Fingerprinting Initiative of the ESD Alliance, I was curious to understand how the owner of the IP could protect against illegal uses.

Warren said: “This is the great problem we have in the IP industry today. Approximately 99 percent of IP is delivered to customers in source code form and IP companies rely on the good faith of their customers to use it within the scope of the license agreement. However, there is a fundamental problem. Engineers rarely know the usage terms and restrictions of the agreement their company has with the IP supplier, so it is quite easy for a semiconductor company to be in violation and not even know it. New technologies are coming into play, such as the IP fingerprinting scheme that the ESD Alliance is promoting. Fingerprinting is a non-invasive approach that protects both IP suppliers and their customers from ‘accidental reuse.’”

Bob Smith added: “IP suppliers can utilize The Core Store (www.the-core-store.com) at no charge to showcase their products and register “fingerprints” of their technology. Semiconductor companies can use this registry to detect IP usage within their chips by means of “DNA analysis” software available through Silvaco.”
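
Neither Silvaco nor the ESD Alliance has published the internals of the fingerprinting and “DNA analysis” scheme, so the following is only a conceptual sketch of the general idea: hash normalized fragments of delivered source code into a registry, then check an unknown design against that registry.  The normalization, window size, and all the toy data are assumptions made for illustration.

```python
# Conceptual sketch of source-code fingerprinting (not the actual
# Silvaco/ESD Alliance scheme): hash normalized code fragments into a
# registry, then look for matching fragments in a design under inspection.

import hashlib

def fingerprints(source_text, window=4):
    """Yield hashes of overlapping windows of normalized source lines."""
    lines = [" ".join(line.split()) for line in source_text.splitlines()]
    lines = [line for line in lines if line]          # drop blank lines
    for i in range(max(len(lines) - window + 1, 1)):
        chunk = "\n".join(lines[i:i + window])
        yield hashlib.sha256(chunk.encode()).hexdigest()

# "Registered" IP and a design that silently reuses part of it (toy data).
registered_ip = """
module fifo(input clk, input rst, input push, input pop);
  reg [3:0] count;
  always @(posedge clk) begin
    if (rst) count <= 0;
    else if (push && !pop) count <= count + 1;
  end
endmodule
"""
suspect_design = registered_ip + "\n// unrelated glue logic\n"

registry = set(fingerprints(registered_ip))
matches = sum(1 for fp in fingerprints(suspect_design) if fp in registry)
print(f"{matches} fragment(s) of the suspect design match the registered IP")
```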

Memory Subsystem Solutions Announced

May 9th, 2017

Gabe Moretti, Senior Editor

Integrating a Network on Chip (NoC) with a memory controller provides increased system throughput by decreasing the latency of data transfer to and from local storage.  Sonics, Inc. and Northwest Logic have announced their partnership to deliver high throughput memory subsystem solutions for complex System-On-Chip (SOC) designs.  The subsystem is focused on SOC Designs for Machine Learning, Computer Vision, UHD Video Processing, and Enterprise SSD Applications.   The companies’ partnership, which is being driven by a mutual customer SOC design win, integrates Sonics’ flagship interconnect fabric, SonicsGN NoC, and Sonics’ MemMax memory scheduler with Northwest Logic’s family of HBM2, DDRx, LPDDRx memory controllers.

“As DRAM data rates increase, the number of pipelined outstanding transactions required to achieve full throughput grows dramatically,” said Drew Wingard, CTO of Sonics. “This requires more intelligent transaction scheduling that considers both the bandwidth and latency requirements of pending requests and the page and bank states of DRAM. Without careful coordination between the NoC, memory scheduler, and memory controller, the subsystem will miss data transfer deadlines and suffer performance degradation at several points along the memory subsystem transaction path. Our partnership with Northwest Logic ensures that MemMax’s scheduling decisions produce a transaction stream enabling Northwest Logic’s controllers to efficiently map into memory commands that fully leverage the customer’s chosen DRAM technology.”
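
Wingard’s point about coordinating scheduling with DRAM state can be illustrated with a toy scheduler.  The sketch below is not MemMax or any Sonics/Northwest Logic algorithm; it is a minimal, hypothetical example that prefers requests hitting an already open row (avoiding a precharge/activate) while still bounding the latency of older requests, which is the bandwidth-versus-latency balance the quote describes.

```python
# Toy memory-request scheduler illustrating the coordination Wingard
# describes: prefer requests that hit an already open DRAM row (no
# precharge/activate needed), but do not starve older requests.
# This is NOT the MemMax algorithm, just a minimal sketch.

from dataclasses import dataclass

@dataclass
class Request:
    bank: int
    row: int
    age: int   # cycles the request has been waiting

MAX_AGE = 50   # beyond this, service the request regardless of row state

def pick_next(pending, open_rows):
    """Choose the next request given the currently open row per bank."""
    overdue = [r for r in pending if r.age >= MAX_AGE]
    if overdue:                       # latency bound comes first
        return max(overdue, key=lambda r: r.age)
    hits = [r for r in pending if open_rows.get(r.bank) == r.row]
    if hits:                          # then row-buffer hits for bandwidth
        return max(hits, key=lambda r: r.age)
    return max(pending, key=lambda r: r.age)   # otherwise oldest request

open_rows = {0: 7, 1: 3}              # bank -> currently open row
pending = [Request(bank=0, row=7, age=4),
           Request(bank=1, row=9, age=12),
           Request(bank=0, row=2, age=55)]

print(pick_next(pending, open_rows))  # the overdue request to bank 0, row 2
```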

“Northwest Logic and Sonics share an uncompromising commitment to customer success,” said Brian Daellenbach, President of Northwest Logic. “We are seeing a significant uptick in demand for high throughput memory subsystems that address data-intensive applications and markets such as Machine Learning, Computer Vision, UHD Video Processing, and Enterprise SSD. Our memory controllers have a strong industry reputation for delivering high-performance, high quality, and ease-of-use. Our partnership with Sonics enables us to provide our mutual customers with a complete memory subsystem solution that also takes into account the need for high performance NoCs that support multi-channel memory subsystem architectures and integrate all of the cores in the system.”

The Sonics-Northwest Logic high throughput memory subsystem solutions were developed for a mutual customer and are now being integrated into designs by other customers as well.

ESDA CEO Outlook Panel, not a Cause for Celebration

April 25th, 2017

Gabe Moretti, Senior Editor

About two weeks ago the ESD Alliance held its 2017 CEO Outlook Panel.  The panel used to be a yearly event for EDAC, the precursor of the ESD Alliance, but had not been held for several years.

I did not have the opportunity to attend the panel in person and was looking forward to reading the entire transcript on the ESDA site.  But there is no transcript.  There is a picture of the panelists with Ed Sperling as the moderator, and a brief video showing introductory remarks from each of the panelists.

Unfortunately, the introductory remarks could have been delivered by a set of industry cheerleaders, since there was really no depth in them.  You can guess the contents: greater need for silicon, IoT and IIoT (Industrial IoT) are the future.  Lip-Bu Tan did say something interesting by addressing data architecture within the IIoT.  Too many articles have been written about IoT hardware architecture, and I have found almost nothing that talks about the data architecture.  Lip-Bu emphasized the need for local computational and decision-making capability, thus limiting the need to transfer large amounts of data up the hierarchy of the system.  Given the complex connectivity of an IoT system, where everything is potentially connected to everything else, security would be a major concern should the need to transfer a large amount of information arise.  The smaller the amount of data transferred, the higher the security.

What Lip-Bu did not say, but is implied, is the need for distributed intelligence in the system, with application-specific hardware playing a greater role.  It is back to the future.  ASICs will once again step to the forefront, replacing software-based systems running on general-purpose hardware.

Wally Rhines noted that our industry economics are back at the levels of six years ago in spite of the significant consolidation of our customer base.  It is called recovery, and our industry has had less of a recovery than most other industries.

The consolidation issue points out the real problem of the EDA industry.  We sell tools used in the design and development of a product, not its manufacturing.  Manufacturing volume means nothing to the EDA bottom line.  Fewer “makers” means fewer “tools needed”.

Mentor is now part of one of its customers, so does such consolidation matter to Wally?  I hear of course that Mentor will continue to do business in an independent manner, and that will be true unless and until a conflict of interest ensues.  Will Mentor sell its best tools to companies directly competing with Siemens in a specific, competitive, market?

Simon Segars of ARM was also part of the panel.  If he said something important as a result of now working for SoftBank, you could not have guessed it from his introductory remarks either.  He just repeated Aart de Geus’s observation about the world needing more silicon.  It has been clear since the acquisition that SoftBank’s other captive businesses need more silicon and more IP cores; that, after all, was the reason for the acquisition.  ARM will expertly create whatever SoftBank needs and will market what it creates to the outside world.  The other way around will not happen, at least not in any important way.

Speaking as the scientist that he is, Aart pointed out that as algorithms’ complexity increases, the need for computational capability increases as well, hence the need for more silicon.  It is an assumption that increased silicon production means increased EDA revenue.  This is a fallacy, since EDA revenues are realized at the front end of the project and do not grow with product volume.  The amount of EDA products used in wafer production and testing is small in comparison to the tools used to design and test before production.

There is also a significant difference in the revenue generated by different types of silicon.  A 90 nm device requires less up-front investment in tools than a 10 nm device, and there will be many more of the former than of the latter.

I think that reviving the CEO Panel is a good thing.  It shows, at least, that accepted leaders of the EDA industry are willing to appear in public and deliver statements without fear of sounding irrelevant.  And the lack of a full transcript, given the high level of professionalism now in the ESDA, must mean that nothing worthy of greater analysis was said during the panel.

Cadence Builds a Winged Horse for Verification

April 17th, 2017

Gabe Moretti, Senior Editor

Cadence reached back into Greek mythology to name its new physical verification engine for digital designs: Pegasus.  The hope is, of course, that the new product will prove itself more than a popular myth and will soar to new heights of efficiency.

The company describes Pegasus as: “a massively parallel, cloud-ready physical verification signoff solution that enables engineers to deliver advanced-node ICs to market faster. The new solution is part of the full-flow Cadence® digital design and signoff suite and provides up to 10X faster design rule check (DRC) performance on hundreds of CPUs while also reducing turnaround time from days to hours versus the previous-generation Cadence solution.”

The major benefits offered by Pegasus, according to Dr. Anirudh Devgan, executive vice president and general manager of the Digital & Signoff Group and the System & Verification Group at Cadence, are:

  • Massively parallel architecture: The solution incorporates a massively parallel architecture that provides unprecedented speed and capacity, enabling designers to easily run on hundreds of CPUs to speed up tapeout times.
  • Reduced full-chip physical verification runtimes: The solution’s gigascale processing offers near-linear scalability that has been demonstrated on up to 960 CPUs, allowing customers to dramatically reduce DRC signoff runtimes.
  • Low transition cost: Using existing foundry-certified rule decks, customers achieve 100 percent accurate results with a minimal learning curve.
  • Flexible cloud-ready platform: The solution offers native cloud support that provides an elastic and flexible compute environment for customers facing aggressive time-to-market deadlines.
  • Efficient use of CPU resources: The solution’s data flow architecture enables customers to optimize CPU usage, regardless of machine configurations and physical location, providing maximum flexibility to run on a wide range of hardware, achieving the fastest DRC signoff.
  • Native compatibility with Cadence digital and custom design flows: The Pegasus Verification System integrates seamlessly with the Virtuoso custom design platform, delivering instantaneous DRC signoff checks to guide designers to a correct-by-construction flow that improves layout productivity. An integration with the Innovus Implementation System enables customers to run the Pegasus Verification System during multiple stages of the flow for a wide range of checks—signoff DRC and multi-patterning decomposition, color-balancing to improve yield, timing-aware metal fill to reduce timing closure iterations, incremental DRC and metal fill during engineering change orders (ECOs) that improve turnaround time, and full-chip DRC.

Anirudh concluded that: “The Pegasus Verification System’s innovative architecture and native cloud-ready processing provides an elastic and flexible computing environment, which can enable our customers to complete full-chip signoff DRC on advanced-node designs in a matter of hours, speeding time to market.”
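
Cadence has not published how Pegasus actually partitions work across hundreds of CPUs, so the following is only a generic illustration of the kind of data-parallel decomposition the “massively parallel architecture” bullet above alludes to: tile the layout and check each tile’s rules on a separate worker process.  The tile structure, the single toy rule, and the shapes are all invented for the example.

```python
# Generic illustration of data-parallel DRC-style checking: tile the
# layout and farm each tile's checks out to a pool of workers.
# This is not how Pegasus works internally; Cadence has not published
# its architecture. Shapes and the single rule here are toy data.

from multiprocessing import Pool

# Each rectangle is (x0, y0, x1, y1); the one toy "rule" is a minimum width.
MIN_WIDTH = 3

def check_tile(tile):
    """Return minimum-width violations for the shapes assigned to one tile."""
    tile_id, shapes = tile
    violations = []
    for x0, y0, x1, y1 in shapes:
        if min(x1 - x0, y1 - y0) < MIN_WIDTH:
            violations.append((tile_id, (x0, y0, x1, y1)))
    return violations

if __name__ == "__main__":
    # A toy layout pre-partitioned into tiles (real tools partition
    # geometrically and handle shapes that straddle tile boundaries).
    tiles = [
        ("tile_0", [(0, 0, 10, 10), (12, 0, 13, 10)]),   # second shape too narrow
        ("tile_1", [(0, 0, 5, 5)]),
        ("tile_2", [(0, 0, 8, 2)]),                       # too narrow
    ]
    with Pool(processes=3) as pool:
        results = pool.map(check_tile, tiles)
    for tile_violations in results:
        for tile_id, shape in tile_violations:
            print(f"min-width violation in {tile_id}: {shape}")
```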

Industrial IoT, a Silicon Valley Opportunity

April 11th, 2017

Gabe Moretti, Senior Editor

I read a white paper written by Brian Derrick, VP of Corporate Marketing at Mentor, titled Industrial IoT (IIOT) – Where Is Silicon Valley?  It is an interesting discussion of the IIoT market, pointing out that most of the leading companies in the market are not located in Silicon Valley.  In fact, Brian lists only Applied Materials as having a measurable market share in IIoT (1.4% in 2015), and HP and Avago as sensor providers.  Amazon and Google are listed as Cloud Service Providers; Cisco, ProSoft, and CalAmp as Intelligent Gateway Providers; and Sierra Wireless as a Machine to Machine Communication Hardware supplier.

It does not make sense to list the EDA companies in the valley that supply the tools used by many of the IIoT vendors to design their products.  Unfortunately, it is the service nature of EDA that allows analysts to overlook the significant contribution of our industry to the electronics marketplace.

There is actually a company in Silicon Valley that in my opinion offers a good example of what IIoT is: eSilicon.  The company started as a traditional IP provider, but over the last three years it has developed into a turnkey supplier supporting a customer from design to manufacturing of ICs, with integrated analysis tools and order, billing and WIP reports, all integrated in a system it calls STAR.

A customer can submit a design that uses an eSilicon IP, analyze the physical characteristics of the design, choose a foundry, receive a quote, place an order, evaluate first silicon, and go into production, all in the STAR system.  This combines design, analysis, ordering, billing, and manufacturing operations, significantly increasing reliability through integration.  The development chain, which usually requires dealing with many corporate contributors and often more than one accounting system, has been simplified through integration not just of engineering software tools, but of accounting tools as well.
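
eSilicon has not published STAR’s internals, so the sketch below is only a hypothetical way to model the kind of linear design-to-production workflow described above.  The stage names and ordering are taken from the paragraph, not from the actual system, and the class is invented for illustration.

```python
# Hypothetical model of a linear design-to-production workflow like the
# one the article attributes to STAR. Stage names follow the text above;
# the class itself is invented for illustration, not eSilicon's API.

STAGES = ["design_submitted", "physical_analysis", "foundry_selected",
          "quote_received", "order_placed", "first_silicon_evaluated",
          "in_production"]

class Engagement:
    def __init__(self, customer):
        self.customer = customer
        self.stage_index = 0
        self.history = [STAGES[0]]

    @property
    def stage(self):
        return STAGES[self.stage_index]

    def advance(self):
        """Move to the next stage; a single record tracks the whole flow,
        which is the integration benefit the article points to."""
        if self.stage_index == len(STAGES) - 1:
            raise RuntimeError("engagement already in production")
        self.stage_index += 1
        self.history.append(self.stage)

job = Engagement("example_customer")
while job.stage != "in_production":
    job.advance()
print(" -> ".join(job.history))
```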

I think that we will regret the use of the term “Internet” when describing communication capabilities between and among “Things”.  The internet is not just hardware, it is a protocol.  A significant amount of communication in the IoT architecture takes place using Bluetooth and Wi-Fi hardware and software, not the internet.  In fact, I venture that soon we might find that the internet protocol is the wrong protocol to use.  We need networks that can be switched from public to private, and in fact an entire hierarchy of connectivity that offers better security, faster communication, and flexibility of protocol utilization.

I find that the distinction between real-time and batch processing is disappearing because people are too used to real time.  But real-time connectivity is open to more security breaches than batch processing.  On the factory floor, for example, a machine can perform thousands of operations without being connected to the internet all the time.  Status reports and production statistics, for example, can be collected at specific times, and only at those times does the machine need to be connected to the internet.  For the machine to continuously tell a central control unit that all is normal is redundant.  All we should care about is whether something is not normal.
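
To make the point concrete, here is a minimal sketch of “report by exception”: the controller stays offline, accumulates statistics locally, and only initiates a connection when a reading is abnormal or a scheduled reporting window arrives.  The thresholds, readings, and the report() stub are invented assumptions, not any vendor’s protocol.

```python
# Minimal sketch of "report by exception": the machine collects statistics
# locally and connects only at scheduled times or when a reading is abnormal.
# Thresholds, readings, and the report() stub are invented for illustration.

NORMAL_RANGE = (20.0, 80.0)        # acceptable spindle temperature, degrees C
REPORT_EVERY = 100                 # operations between scheduled uploads

def report(payload):
    print("connecting to central control:", payload)

def run_shift(readings):
    batch = []
    for i, temperature in enumerate(readings, start=1):
        batch.append(temperature)
        abnormal = not (NORMAL_RANGE[0] <= temperature <= NORMAL_RANGE[1])
        if abnormal:
            report({"alert": "temperature out of range", "value": temperature})
        if i % REPORT_EVERY == 0:
            report({"operations": i, "avg_temp": sum(batch) / len(batch)})
            batch.clear()

# 250 normal operations with one abnormal spike: three connections in total
# (two scheduled summaries plus one alert) instead of 250 live updates.
readings = [45.0] * 250
readings[57] = 95.0
run_shift(readings)
```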

The bottom line is that there are many opportunities for Silicon Valley corporations to become participants in IIoT, and, of course, start-ups, a specialty of the Valley, can find a niche in the market.

Portable Stimulus

March 23rd, 2017

Gabe Moretti, Senior Editor

Portable Stimulus (PS) is not a new sex toy, and it is not an Executable Specification either.  So what is it?  It is a method, or rather it will be once the work is finished, for defining verification inputs independently of the tool used.

As the complexity of a system increases, the cost of its functional verification increases at a more rapid pace.   Verification engineers must consider not only wanted scenarios but also erroneous ones.  Increased complexity increases the number of unwanted scenarios.  To perform all the required tests, engineers use different tools, including logic simulators, accelerators and emulators, and FPGA prototyping tools as well.  Porting a test from one tool to another is a very time consuming job, which is also prone to errors.  The reason is simple: not only does each class of tools use a different syntax, in some cases it also uses different semantics.

The Accellera Systems Initiative, commonly known simply as Accellera, is working on a solution.  It formed a Working Group to develop a way to define tests that is independent of the tool used to perform the verification.  The group, made up of engineers and not of marketing professionals, chose as its name what it is supposed to deliver, Portable Stimulus: the verification tests are made up of stimuli to the device under test (DUT), and the stimuli will be portable among verification tools.

Adnan Hamid, CEO of Breker, gave me a demo at DVCon US this year.  Their product is trying to solve the same problem, and the standard being developed will be similar in that it is based on the same concept.  Both will be descriptive languages, Breker’s based on SystemC and PS based on SystemVerilog, but the approach is the same: the verification team develops a directed network where each node represents a test.  The Accellera work must, of course, be vendor independent, so their work is more complex.

Once the working group is finished, and they expect to be finished no later than the end of 2017, each EDA vendor could then develop a generator that will translate the test described in PS language into the appropriate string of commands and stimuli required to actually perform the test with the tool in question.
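
The PS language itself is still being defined, so the sketch below is purely conceptual: the test intent is described once as a directed network of actions, and different hypothetical back-end generators walk the same description to emit commands for different verification engines.  The graph format and both “generators” are invented; the real Accellera language is SystemVerilog-based and far richer than this.

```python
# Conceptual sketch of the Portable Stimulus idea: describe the test intent
# once as a directed graph of actions, then let tool-specific generators
# turn the same description into commands for different verification engines.
# The graph format and both "generators" are invented for illustration.

# Each action points to the actions that may follow it.
test_graph = {
    "reset_dut":      ["configure_dma"],
    "configure_dma":  ["start_transfer"],
    "start_transfer": ["check_result"],
    "check_result":   [],
}

def walk(graph, start):
    """Yield actions along one path through the graph (linear here)."""
    node = start
    while node is not None:
        yield node
        successors = graph[node]
        node = successors[0] if successors else None

def generate_for_simulator(graph):
    return [f"sim> run_task {action}" for action in walk(graph, "reset_dut")]

def generate_for_emulator(graph):
    return [f"emu_exec --step {action}" for action in walk(graph, "reset_dut")]

print("\n".join(generate_for_simulator(test_graph)))
print("\n".join(generate_for_emulator(test_graph)))
```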

The approach, of course, is such that the product of the Accellera work can later be submitted to the IEEE, since it will obey the IEEE requirements for standardization.

My question is: What about Formal Verification?  I believe that it would be possible to derive assertions from the PS language.  If this can be done, it would be a wonderful result for the industry.  An IP vendor, for example, would then be able to provide only one definition of the tests used to verify the IP, and the customer would be able to readily use it no matter which tool is appropriate at the time of acceptance and integration of the IP.

A Brief History of Verification

March 2nd, 2017

Gabe Moretti, Senior Editor

This year’s DVCon is turning out to be very interesting.  It may be something that Jim Hogan said.  Jim insisted, in more than one panel, that we are at the cusp of a major change, an opportunity for startups with a greater probability of revolutionary success than at any time in the last few years, in the area of verification and, I would add, circuit design.

His observations got me thinking, reviewing my experiences in the various aspects of verification I have been involved in.  What follows is my existential history of verification.

Logic is What Counts

When I wrote my first logic simulator in 1968, I only had to deal with two-state Boolean logic.  A node was either on or off.  Everything else about the circuit was not important.  It was not very long after that that things got more complex.  Boolean logic now had three states: I and other verification engineers met the floating gate.  And then, shortly thereafter, memories became a real physical quantity in designs and four-state logic became the norm.  We had to know whether a bit was initialized in a deterministic manner or not.  From then on, verification engineers’ major problem was the size of the circuit to be simulated and how efficient we could make the simulator so that run times were as short as possible.  And by the way, this is still going on today, although for more complex reasons.
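
For readers who have not met four-state logic, here is a minimal sketch of the idea in ordinary code: besides 0 and 1, a signal can be X (unknown, such as an uninitialized memory bit) or Z (floating), and the gate functions must propagate that uncertainty.  The representation below is my own illustration, not any simulator’s internals.

```python
# Minimal sketch of four-state logic: 0, 1, X (unknown) and Z (floating).
# A floating input is treated as unknown by the gate; a 0 on an AND input
# still forces the output to 0 even if the other input is unknown.

def and4(a, b):
    if a == "0" or b == "0":
        return "0"                 # a controlling 0 resolves the output
    if a == "1" and b == "1":
        return "1"
    return "X"                     # any X or Z input leaves the result unknown

for a in "01XZ":
    for b in "01XZ":
        print(f"{a} AND {b} = {and4(a, b)}")
```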

Enter Physics

As transistors got smaller and smaller, verification engineers found that physics grew in importance.  We could no longer assume that a circuit was well behaved in the physical domain, but had to worry about power, electromagnetic effects, and thermal characteristics.  Verification engineers had to understand physics, or, better yet, physics professionals had to join the ranks of verification engineers.

Today, thanks in part to logic synthesis, Boolean logic issues are well understood, and development work is focused on the physical characteristics of the circuit and, just to make things interesting, its immediate environment.  Today’s complex problems deal with the management of clock networks and their interaction with the power distribution and consumption that determine the functional state of the circuit.  How power transmission affects nearby circuitry, how the current of electrons warms the device, and how temperature impacts the probability that a transistor can maintain the state it is supposed to hold are today’s issues.

Behavioral Scientists Find a New Job

There has been much talk at this year’s DVCon about safety and security, issues that are really behavioral and not about logic or physics.  During a lunch panel on Tuesday, the importance of safety and security in complex systems, often employing diverse and distributed circuitry, was indicated as the next important aspect of a system to verify.  Defense, automotive, and mobile applications were used as examples.  More than one panelist, with Jim Hogan again assuming a leading role, spoke about the desirability of self-correcting systems.  If a safety failure or a security breach occurred, the most desirable thing would be to have the circuitry prevent the occurrence or fix itself, thus limiting the damage and the cost of repairs.   Verification engineers will have to deal with behavioral science in order to understand the danger, build a knowledge base of past incidents, and develop circuitry that will identify and prevent the faulty behavior in the future.  This, of course, requires that we can define what good behavior is and what would signify a deviation from it.

Can We Do Better?

Now that I no longer develop verification tools, I think that both hardware and software developers have gotten lazy.  Transistors are now assumed to exist to the point that no one ever asks himself or herself “will this fit?” since the answer is “of course, and with transistors to spare”.  We design and develop complex circuits not just because they have to be but because they can be.

The amount of data produced by the various verification tools is so large that it is becoming impossible for a human to correlate all of it into a meaningful indication of the wellness and effectiveness of the design.  Tools to manage big data are required, especially when decisions must be made in the field during actual deployment.

And as transistors grow smaller and smaller in order to provide more of them every couple of years, we are unconsciously marching toward the boundary of Einstein’s space-time environment, getting closer and closer to quantum physics.  What will happen when one day we are able to determine the state of a transistor or the trajectory of its electrons, but not both at the same time?

I believe that it is time to go back in time and ask ourselves “Is there another implementation that is simpler and uses fewer transistors?” the same way I used to ask “How can I make my program fit in 8K bytes of memory?”  Verification engineers must develop an efficiency measurement tool that warns design engineers when they are not efficient.  Not for the sake of elegance, but for the sake of simplicity.  Simplicity in circuit design means fewer physical problems and fewer behavioral problems.

The best business opportunity of this approach, of course, is in the area of IP.  Design, develop, and market the most efficient implementation of a function and, assuming you have some understanding of business management, you will be a successful entrepreneur.

Grant Pierce Named BoD Chair of the ESD Alliance

February 21st, 2017

Gabe Moretti, Senior Editor

The ESD Alliance (ESDA) elected Grant Pierce (CEO of Sonics) as its Chairman of the Board a few weeks ago.  Grant is only the second Chair who is not a high-level executive of one of the big three EDA companies, and the first since the organization, formerly EDAC, renamed itself.  During the EDAC days it was customary for the CEOs of Cadence, Mentor and Synopsys to pass the title among themselves in an orderly manner.  The organization then reflected the mission of the EDA industry: to support the development of hardware-intensive silicon chips following Moore’s Law.

Things have changed since then, and the consortium responded by first appointing a new executive director, Bob Smith, then changing its name and its mission.  I talked with Grant to understand his view from the top.

Grant Pierce, Sonics CEO

Grant pointed out that: “We are trying to better reflect what has happened in the marketplace, both in terms of how our customers have developed further in the world of system on chip and what we have seen in the development of the EDA world, where today the IP offerings in the market, both those from independent companies and those from EDA companies, are critical and integral to the whole ecosystem for building today’s modern chips.”

Grant pointed out that ESDA has expanded its focus and has embraced not only hardware design and development but also software.  That does not mean, Grant noted, that the EDA companies are losing importance; instead they are gaining a seat at the table with the software and system design communities in order to expand the scope of their businesses.

From my point of view, I interjected, I see the desired change implemented very slowly, still reacting to and not anticipating new demands.  So what do you think can happen in the next twelve months?

“From an ESDA point of view you are going to see us broadening the membership,” answered Grant.  “We are looking to see how we can expand the focus of the organization through its working groups to zero in on new topics that are broader than the ones that are currently there, like expanding beyond what is a common operating system to support, for example.  I think you will see at a minimum two fronts, one opening on the software side while at the same time continuing work on the PPA (Power, Performance, Area) issues of chip design.  This involves a level of participation from parties that have not interacted with this organization before.”

Grant believes that there should be more emphasis on the needs of small companies, those where innovation is taking place.  ESDA needs to seek the best opportunity to invigorate those companies.  “At the same time we must try to get system companies involved in an appropriate fashion, at least to the degree that they represent the software that is embedded in a system,” concluded Grant.

We briefly speculated on what the RISC-V movement might mean to ESDA.  Grant does not see much value in ESDA focusing on a specific instruction set, although he conceded that there might be value if RISC-V joined ESDA.  I agree with the first part of his judgement, but I do not see any benefit to either party, or to the industry for that matter, from RISC-V joining ESDA.

From my point of view ESDA has a big hurdle to overcome.  For a few years, before Bob Smith was named executive director, EDAC was somewhat stagnant, and now it must catch up with market reality and fully address the complete system issue.  Not just hardware/software, but analog/digital, and the increased use of FPGA and MEMS.

For sure, representing an IP company gives Grant an opportunity to stress a different point of view within ESDA than the traditional EDA view.  The IP industry would not even exist without a system approach to design and it has changed the way architects think when first sketching a product on the back of an envelope.

DVCon Is a Must Attend Conference for Verification and Design Engineers

February 13th, 2017

Gabe Moretti, Senior Editor

Dennis Brophy, Chair of this year’s DVCon, said, “The 2017 Design and Verification Conference and Exhibition U.S. will offer attendees a comprehensive selection of 39 papers, 9 tutorials, 19 posters, 2 panels, a special session on Functional Verification Industry Trends, and a keynote address by Anirudh Devgan, Senior Vice President and General Manager of the Digital & Signoff Group and System & Verification Group at Cadence.”

Sponsored by Accellera Systems Initiative, DVCon U.S. will be held February 27-March 2, 2017 at the DoubleTree Hotel in San Jose, California.

DVCon was initially sponsored by VHDL International and Open Verilog International, but its growth really started after the two organizations merged to form Accellera, which was renamed Accellera Systems Initiative during its merger with the Open SystemC Initiative.  It seems that every EDA organization nowadays needs the word “systems” in its name, as if afraid of being seen as less important without making very sure that everyone knows they are aware of the existence of systems.

DVCon is now a truly international conference holding events not only in the US but also in Europe, India and, for the first time this year, in China.

The aim of Accellera is to build and serve a community of professionals who focus on the development and verification of hardware systems, with the awareness that software’s role in system design is growing rapidly.  Dennis Brophy observed that “Coming together as a community is fostered by the DVCon Expo. The bigger and better exposition will run from Monday evening to Wednesday evening. See the program for specific opening and closing times. The Expo is a great place to catch up with commercial vendors and learn the latest in product developments. It is also great to connect with colleagues and exchange and share information and ideas. Join us for the DVCon U.S. 2017 ‘Booth Crawl’ where after visiting select exhibitors you will be automatically entered for a lucky draw.”

Although the titles of the papers presented in the technical conference focus on EDA technologies, the papers themselves deal with application areas as diverse as automotive and communications, the use of MEMS and FPGAs in system design, and cloud computing and RF.

“DVCon has long been the technical and social highlight of the year for design and verification engineers,” stated Tom Fitzpatrick, DVCon U.S. 2017 Technical Program Chair.  “Through the hard work of a large team of dedicated reviewers, we have chosen the best of over one hundred submitted abstracts from deeply knowledgeable colleagues within the industry to help attendees learn how to improve their verification efforts. We also have two days of extended tutorials where attendees can get an in-depth look at the cutting edge of verification, not to mention the Exhibit Floor where over 30 companies will be demonstrating their latest tools and technologies.  In between, there are plenty of opportunities to network, relax and take advantage of a fun and welcoming atmosphere where attendees can reconnect with old friends and former colleagues or make new friends and contacts. The value of DVCon goes well beyond the wealth of information found in the Proceedings. Being there makes all the difference.”

Dennis Brophy added that on Monday, February 27th, there will be a presentation covering the Portable Stimulus work being done under Accellera sponsorship.  The working group has made significant progress toward defining what it is and how it works.  The goal is to have the Board of Accellera authorize a ballot to make the result an industry standard and then take it to the IEEE to complete the standardization work.

As happened last year, the exhibit space was quickly filled by vendors who understood the advantage of talking with technologists who specialize in verification and design of complex systems.

Devgan’s keynote, “Tomorrow’s Verification Today,” will review the latest trends that are redefining verification from IP to system level, with an increasingly application-specific set of demands changing the landscape for hardware and software development.  Over the past decade, verification complexity and the demands on engineering teams have continued to rise rapidly.  However, the supporting automation tools and flows have improved only incrementally, resulting in a verification gap.  It is time to redefine how verification should be approached to accelerate innovation in the next decade.  The keynote will be delivered on Tuesday, February 28.

On the same day, from 1:30-2:30pm in the Oak/Fir Ballroom, the conference will offer a special session with Harry Foster, Chief Scientist for Mentor Graphics’ Design Verification Technology Division.  Foster has been asked to present “Trends in Functional Verification: A 2016 Industry Study,” based on the Wilson Research Group’s 2016 study.  The findings from the 2016 study provide invaluable insight into the state of today’s electronics industry.

The conference offers two full days of in-depth tutorials: Accellera Day with three tutorials on Monday, and sponsored tutorials on Thursday.  There are also many technical papers and posters and two intriguing panels.

There will be plenty of networking opportunities, especially during the exhibition.  There will be a booth crawl on Monday, February 27 from 5:00-7:00pm and receptions both Tuesday and Wednesday in the exhibit hall.  Exhibits will be open Tuesday from 5:00-7:00pm and Wednesday and Thursday from 2:30-6:00pm.

The awards for Best Paper and Best Poster will be presented at the beginning of the reception on Wednesday.  For the complete DVCon U.S. 2017 schedule, including a list of sessions, tutorials, sponsored luncheons and events, visit www.dvcon.org.
