
Posts Tagged ‘IP’


IP Fingerprinting Initiative from the ESD Alliance

Thursday, May 11th, 2017

Gabe Moretti, Senior Editor

I had occasion to discuss my March article “Determining a Fair Royalty Value for IP” (http://chipdesignmag.com/sld/blog/2017/03/13/determining-a-fair-royalty-value-for-ip/) with Bob Smith, Executive Director of the ESD Alliance, and Warren Savage, General Manager of IP at Silvaco, as we addressed the ESD Alliance’s IP Fingerprinting Initiative.

I stated that there really was not a standard way to determine the amount of royalty that could be charged for an IP.

Bob Smith responded: “Royalties should be based on value provided. Value comes in many forms, such as how much of the functionality of the end product is provided by the IP, the risk and time-to-market reduction, and design and verification cost savings. There is no simple formula for IP royalties. In fact, they can be quite complicated.”

Warren added: “Business models used for licensing royalties are always a negotiation between the buyer and seller with each party striving to optimize the best outcome for their business. In some cases, the customer may be willing to pay more for royalties in exchange for lowering the upfront licensing costs. A different customer may be willing to invest more upfront to drive down the cost of royalties. Calculation of the actual royalty amounts may be based on a percentage of the unit cost or a fixed price, and each may have sliding scales based on cumulative volumes. Both parties need to derive the value that fits their own business model. The IP user needs to arrive at a price for the IP that supports the ROI model for the end product. The IP supplier needs to ensure that it receives sufficient value to offset its investment in IP development, verification and support. It is able then to participate in the success of the buyer’s product based (at least in part) on the value of the IP provided.”
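
Warren’s sliding-scale description is easy to make concrete.  Below is a minimal sketch of a tiered, percentage-of-unit-cost royalty calculation; the tier boundaries and rates are invented for illustration and do not come from any actual agreement.

    # Hypothetical tiered royalty: percentage-of-unit-cost rates that
    # slide downward as cumulative volume grows. All numbers invented.
    TIERS = [
        (100_000, 0.05),        # first 100k units at 5% of unit cost
        (1_000_000, 0.03),      # next 900k units at 3%
        (float("inf"), 0.015),  # beyond 1M units, 1.5%
    ]

    def royalty_due(units_shipped: int, unit_cost: float) -> float:
        """Total royalty owed, applying each volume tier in order."""
        total, prev_ceiling, remaining = 0.0, 0, units_shipped
        for ceiling, rate in TIERS:
            tier_units = min(remaining, ceiling - prev_ceiling)
            if tier_units <= 0:
                break
            total += tier_units * unit_cost * rate
            remaining -= tier_units
            prev_ceiling = ceiling
        return total

    # 100k @ 5% + 150k @ 3% on a $12.00 unit cost = $114,000
    print(f"${royalty_due(250_000, 12.00):,.2f}")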

Since it seems impossible to have a standard function to determine royalties, is there an intrinsic value for an IP?

Warren remarked: “An IP has zero intrinsic value in and of itself. The value is completely dependent on the application in which it is used, the ability of the IP to offset development costs and risks, and the contributions it makes to the operation and success of the target product. For example, an IP that is developed and ends up sitting on the shelf has no value at all. In fact, its value is negative given the resources and costs spent on developing it. Size doesn’t matter. An IP that has hundreds of thousands of gates may command a higher price because the IP supplier needs to sell it at that price to recoup its investment in creating it.  A small IP block may also command a high price because it may contain technology that is extremely valuable to the customer’s product and differentiates it significantly from the competition. The best way to think about intrinsic value is to think of it in the context of value delivered to the customer. If there is no apparent difference in this regard between an IP product from two or more suppliers, then the marketplace sets the price and the lowest cost supplier wins.”

Regarding the ESD Alliance’s IP Fingerprinting Initiative, I was curious to understand how the owner of an IP could protect against illegal use.

Warren said: “This is the great problem we have in the IP industry today. Approximately 99 percent of IP is delivered to customers in source code form, and IP companies rely on the good faith of their customers to use it within the scope of the license agreement. However, there is a fundamental problem. Engineers rarely know the usage terms and restrictions of the agreement their company has with the IP supplier, so it is quite easy for a semiconductor company to be in violation and not even know it. New technologies are coming into play, such as the IP fingerprinting scheme that the ESD Alliance is promoting. Fingerprinting is a non-invasive approach that protects both IP suppliers and their customers from ‘accidental reuse.’”

Bob Smith added: “IP suppliers can utilize The Core Store (www.the-core-store.com) at no charge to showcase their products and register “fingerprints” of their technology. Semiconductor companies can use this registry to detect IP usage within their chips by means of “DNA analysis” software available through Silvaco.”
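
Neither Silvaco nor the ESD Alliance has published the mechanics of the fingerprints themselves, but the general idea of a content fingerprint can be sketched: hash a normalized form of the IP source so that cosmetic edits do not change the fingerprint.  The normalization rules and names below are my own assumptions, not Silvaco’s actual “DNA analysis” method.

    # Toy content fingerprint for source-code IP: strip comments,
    # collapse whitespace, then hash. Illustrative only.
    import hashlib
    import re

    def fingerprint(rtl_source: str) -> str:
        lines = []
        for line in rtl_source.splitlines():
            line = re.sub(r"//.*", "", line)          # drop line comments
            line = re.sub(r"\s+", " ", line).strip()  # collapse whitespace
            if line:
                lines.append(line)
        return hashlib.sha256("\n".join(lines).encode("utf-8")).hexdigest()

    a = "module adder(input a, b, output s);\n  assign s = a ^ b; // sum\nendmodule"
    b = "module adder(input a, b, output s);\n    assign s = a ^ b;\nendmodule\n"
    assert fingerprint(a) == fingerprint(b)  # cosmetic differences ignored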

Industrial IoT, a Silicon Valley Opportunity

Tuesday, April 11th, 2017

Gabe Moretti, Senior Editor

I read a white paper by Brian Derrick, VP of Corporate Marketing at Mentor, titled Industrial IoT (IIoT) – Where Is Silicon Valley?  It is an interesting discussion of the IIoT market, pointing out that most of the leading companies in the market are not located in Silicon Valley.  In fact, Brian lists only Applied Materials as having a measurable market share in IIoT (1.4% in 2015), with HP and Avago as sensor providers.  Amazon and Google are listed as Cloud Service Providers; Cisco, ProSoft, and CalAmp as Intelligent Gateway Providers; and Sierra Wireless as a Machine to Machine Communication Hardware supplier.

It would not make sense for the paper to list the EDA companies in the valley that supply the tools used by many of the IIoT vendors to design their products.  Unfortunately, it is this service nature of EDA that allows analysts to overlook the significant contribution of our industry to the electronics marketplace.

There is actually a company in Silicon Valley that, in my opinion, offers a good example of what IIoT is: eSilicon.  The company started as a traditional IP provider, but over the last three years it has developed into a turn-key supplier supporting a customer from design through manufacturing of an IC, with integrated analysis tools plus order, billing, and WIP reports, all integrated in a system it calls STAR.

A customer can submit a design that uses an eSilicon IP, analyze the physical characteristics of the design, choose a foundry, receive a quote, place an order, evaluate first silicon, and go into production, all in the STAR system.  This combines design, analysis, ordering, billing, and manufacturing operations, significantly increasing reliability through integration.  The development chain, which usually requires dealing with many corporate contributors and often more than one accounting system, has been simplified through the integration not just of engineering software tools, but of accounting tools as well.

I think that we will regret the use of the term “Internet” to describe communication capabilities between and among “Things”.  The Internet is not just hardware; it is a protocol.  A significant amount of communication in the IoT architecture takes place over Bluetooth and WiFi hardware and software, not the internet.  In fact, I venture that soon we may find that the internet protocol is the wrong protocol to use.  We need networks that can be switched from public to private and, in fact, an entire hierarchy of connectivity that offers better security, faster communication, and flexibility of protocol utilization.

I find that the distinction between real-time and batch processing is disappearing because people have grown too used to real time.  But real-time connectivity is open to more security breaches than batch processing.  On the factory floor, for example, a machine can perform thousands of operations without being connected to the internet all the time.  Status reports and production statistics, for example, can be collected at specific times, and only at those times does the machine need to be connected to the internet.  For the machine to continuously tell a central control unit that all is normal is redundant.  All we should care about is whether something is not normal.
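
As a sketch of the alternative described here, consider a controller that buffers routine readings for a scheduled batch upload and connects outside that schedule only when something is not normal.  The threshold, period, and upload stub are invented for illustration.

    # Report-by-exception sketch: buffer routine statistics, connect on
    # a schedule for batch upload, connect immediately only on anomaly.
    import time

    TEMP_LIMIT_C = 80.0      # assumed anomaly threshold
    BATCH_PERIOD_S = 3600.0  # routine connection once per hour

    def upload(records: list) -> None:
        print(f"connecting... uploading {len(records)} records")

    def run(read_sensor, duration_s: float) -> None:
        buffer, last_upload = [], time.monotonic()
        deadline = time.monotonic() + duration_s
        while time.monotonic() < deadline:
            temp_c = read_sensor()
            buffer.append({"t": time.time(), "temp_c": temp_c})
            anomaly = temp_c > TEMP_LIMIT_C
            if anomaly or time.monotonic() - last_upload >= BATCH_PERIOD_S:
                upload(buffer)  # exception or scheduled batch report
                buffer, last_upload = [], time.monotonic()
            time.sleep(1.0)

    # run(lambda: 72.0, duration_s=10.0)  # ten quiet readings, no upload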

The bottom line is that there are many opportunities for Silicon Valley corporations to become participants in IIoT, and, of course, start-ups, a specialty of the Valley, can find their niche in the market.

A Brief History of Verification

Thursday, March 2nd, 2017

Gabe Moretti, Senior Editor

This year’s DVCon is turning out to be very interesting.  It may be due to something that Jim Hogan said.  Jim insisted, in more than one panel, that we are at the cusp of a major change: an opportunity for startups with a greater probability of revolutionary success than at any time in the last few years, in the area of verification and, I would add, circuit design.

His observations got me thinking, reviewing my experiences with the various aspects of verification I have been involved in.  What follows is my existential history of verification.

Logic is What Counts

When I wrote my first logic simulator in 1968, I only had to deal with two-state Boolean logic.  A node was either on or off.  Everything else about the circuit was unimportant.  It was not long before things got more complex.  Boolean logic acquired a third state as I and other verification engineers met the floating gate.  Shortly thereafter memories became a real physical quantity in designs, and four-state logic became the norm.  We had to know whether or not a bit was initialized in a deterministic manner.  From then on, the verification engineer’s major problem was the size of the circuit to be simulated and how efficient we could make the simulator so that run times were as short as possible.  And by the way, this is still going on today, although for more complex reasons.
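
For readers who never wrote one, the heart of a four-state simulator is a set of resolution tables over 0, 1, X (unknown), and Z (floating).  Here is a minimal sketch of a four-state AND using the rules Verilog later standardized (a 0 on either input dominates; any remaining X or Z yields unknown):

    # Four-state AND: '0', '1', 'X' (unknown), 'Z' (floating).
    def and4(a: str, b: str) -> str:
        if a == "0" or b == "0":
            return "0"              # 0 forces the output low
        if a == "1" and b == "1":
            return "1"
        return "X"                  # 1&X, 1&Z, X&X, X&Z, Z&Z: unknown

    for a in "01XZ":                # print the full resolution table
        print(" ".join(and4(a, b) for b in "01XZ"))
    # 0 0 0 0
    # 0 1 X X
    # 0 X X X
    # 0 X X X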

Enter Physics

As transistors got smaller and smaller, verification engineers found that physics grew in importance.  We could no longer assume that a circuit was well behaved in the physical domain, but had to worry about power, electromagnetic effects, and thermal characteristics.  Verification engineers had to understand physics or, better yet, physics professionals had to join the ranks of verification engineers.

Today, thanks in part to logic synthesis, Boolean logic issues are well understood, and development work is focused on the physical characteristics of the circuit and, just to make things interesting, its immediate environment.  Today’s complex problems deal with the management of clock networks and their interaction with the power distribution and consumption that determine the functional state of the circuit.  How power transmission affects nearby circuitry, how the current of electrons warms the device, and how temperature impacts the probability that a transistor can maintain the state it is supposed to hold are today’s issues.
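
The interaction described here is a feedback loop: switching power warms the die, heat increases leakage current, and leakage adds still more power.  A toy electrothermal iteration makes the coupling visible; every coefficient below is invented for illustration, not calibrated foundry data.

    # Toy electrothermal feedback: leakage grows with temperature,
    # which raises temperature further. Iterate to a fixed point.
    import math

    AMBIENT_C = 25.0   # ambient temperature (assumed)
    THETA_JA = 2.0     # °C per watt, junction-to-ambient (assumed)
    P_DYNAMIC = 5.0    # watts of activity-dependent power (assumed)

    def leakage_w(temp_c: float) -> float:
        # doubles roughly every 21 °C -- an invented, plausible shape
        return 0.5 * math.exp((temp_c - 25.0) / 30.0)

    temp_c = AMBIENT_C
    for _ in range(20):  # converges to a self-consistent operating point
        temp_c = AMBIENT_C + THETA_JA * (P_DYNAMIC + leakage_w(temp_c))
    print(f"steady state: {temp_c:.1f} °C, leakage {leakage_w(temp_c):.2f} W")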

Behavioral Scientists Find a New Job

There has been much talk at this year’s DVCon about safety and security, issues that are really behavioral and not about logic or physics.  During a lunch panel on Tuesday, the importance of safety and security in complex systems, often employing diverse and distributed circuitry, was identified as the next important aspect of a system to verify.  Defense, automotive, and mobile applications were used as examples.  More than one panelist, with Jim Hogan again assuming a leading role, spoke about the desirability of self-correcting systems.  If a safety failure or a security breach occurred, the most desirable thing would be for the circuitry to prevent the occurrence or fix itself, thus limiting the damage and the cost of repairs.  Verification engineers will have to deal with behavioral science in order to understand the dangers, build a knowledge base of past incidents, and develop circuitry that will identify and prevent faulty behavior in the future.  This, of course, requires that we can define what good behavior is and what would signify a deviation from it.

Can We Do Better?

Now that I no longer develop verification tools, I think that both hardware and software developers have gotten lazy.  Transistors are now assumed to be plentiful, to the point that no one ever asks himself or herself “will this fit?”, since the answer is “of course, and with transistors to spare.”  We design and develop complex circuits not just because they have to be complex, but because they can be.

The amount of data produced by the various verification tools is so large that it is becoming impossible for a human to correlate all of it into a meaningful indication of the wellness and effectiveness of the design.  Tools to manage big data are required, especially when decisions must be made in the field during actual deployment.

And as transistors grow smaller and smaller in order to provide more of them every couple of years, we are unconsciously marching toward the boundary of Einstein’s space-time environment, getting closer and closer to quantum physics.  What will happen when one day we are able to determine the state of a transistor or the trajectory of its electrons, but not both at the same time?

I believe that it is time to go back in time and ask ourselves “Is there another implementation that is simpler and uses fewer transistors?” the same way I used to ask “How can I make my program fit in 8K bytes of memory?”  Verification engineers must develop an efficiency measurement tool that warns design engineers when they are not being efficient.  Not for the sake of elegance, but for the sake of simplicity.  Simplicity in circuit design means fewer physical problems and fewer behavioral problems.

The best business opportunity of this approach, of course, is in the area of IP.  Design, develop, and market the most efficient implementation of a function and, assuming you have some understanding of business management, you will be a successful entrepreneur.

Lucio Lanza Joins ESDA Board

Friday, April 22nd, 2016

Gabe Moretti, Senior Editor

In a major change to its board structure, the Electronic System Design Alliance (ESDA) has elected Dr. Lucio Lanza to its Board of Directors.  Previously the Board members were all officers, and in most cases CEOs, of member companies, but broadening the outlook of the Board to encompass ESDA’s new mission requires a new perspective on the electronic industry.

Dr. Lanza, the 2014 Phil Kaufman award winner, has spent all of his professional career in the electronic industry and is now a leading Silicon Valley venture capitalist and industry observer.  I asked Lucio the reason for his decision to join the ESDA board.  “It is a very important time for the next generation of electronic products. In the next few years we are going to have more products created than in the history of mankind.  The responsibility of organizations like ESDA to help people create those products is pretty significant.  So we need to make sure that we get well organized and we all cooperate.”

As is typical of Lucio, he looks at the entire scope of the problem and defines organizations like ESDA as principal facilitators of the upcoming major change in the industry.  When Lucio looks at the purpose of what was EDAC, he points out that the EDA industry had traditionally concentrated on enabling the electronic industry to step from one process node to the next while keeping development costs practically flat.  This has been EDA’s own Moore’s Law.  It has been a success, but the challenges are different now, at least for the vast majority of companies and ventures that are looking at developing IoT products.  One of the signals that the EDA industry has recognized this fact is the change of the EDAC name to ESDA.

Much has already been written about the new name, ESDA, and how there might have been better choices.  Personally, I am glad that at the time we did not name Accellera the Electronic Standards Development Association, or EDAC would really have had a much harder task renaming itself.  But the aim was to make a statement that electronic products are now much more than an orderly collection of silicon transistors, and that engineers require more than traditional EDA tools to develop them efficiently.  So the word “system” provides the best description of the problem to be solved.  The method of developing hardware has changed with the use of IP blocks, and the use of software has increased significantly.  The way Lucio explains it makes it so clear that I now appreciate what motivated the name change.

Lucio points out: “The traditional EDA tools are no longer sufficient to fulfill EDA’s own Moore’s Law.  In the last few years, what engineers needed to design SoCs was the availability of IP.  The issue became ‘Is the right IP there?’ because up to 80% of the chip is not new development but new or modified IP.  Somehow we ended up designing new chips by assembling IP and designing only a minor portion from scratch.”

In looking at the state of development today, Lucio found that many developers are spending ten times more on software than on hardware development and debug.  He continued by pointing out that this is the problem to be addressed.  His message is that EDA, understood as the provider of all development tools and methods, must find a way to bring the IP vendors and the providers of software modules and tools to realize that everyone will benefit from well-planned coordination on the supply side of the equation.

My next question dealt with how ESDA could contact a software tool company and convince it that it is in the common interest to “design together.”  “There are two steps here,” said Lucio.  “The first step, which I cannot say I am an expert in, so I am very, very humble, is to find out what the environment is today.  Are there companies that are already trying to do that?  If so, is there something we can do to help these people acquire visibility?”  The follow-on question is whether there is a way for potential users to encourage these companies and others to strengthen and expand such an approach.  “Organizations like ESDA must become the leaders in organizing and supporting this work.”

The just-announced agreement between ESDA and Semico, says Lucio, is a way to show that we are all after the same goals.  “If we cooperate, the efficiency of the industry will increase and we will all benefit, not just as businesses, but as a society.”

Custom Compiler Shortens Layout of FinFET Circuits

Wednesday, March 30th, 2016

Gabe Moretti, Senior Editor

Synopsys has made a break from constraint-driven layout and introduced a new layout system that, according to the company, allows engineers to work in a visual manner. “Legacy custom design tools have not kept pace with the exponential growth in design complexity,” said Antun Domic, executive vice president and general manager of the Design Group at Synopsys. “In particular, the growing number and complexity of FinFET design rules pose significant challenges for layout designers. Custom Compiler’s innovative assistants enable designers to address the most difficult layout challenges while significantly improving FinFET design productivity.”

The new tool is especially efficient for FinFET layout, since it allows engineers to stack transistors in a visual manner while preserving connectivity, saving hours of work.  Developing visually assisted automation technologies that speed up common design tasks reduces iterations and enables reuse.

Custom Compiler Assistants are productivity aids that leverage the graphical use model familiar to layout designers while eliminating the need to write complicated code and constraints. With Custom Compiler, routine and repetitive tasks are handled automatically without extra setup. Custom Compiler provides four types of assistants: Layout, In-Design, Template and Co-Design.

  • Layout Assistants speed layout with visually-guided automation of placement and routing. The router is ideal for connecting FinFET arrays or large-M-factor transistors. It automatically clones connections and creates pin taps. The user simply guides the router with the mouse and it fills in the details automatically. The placer uses an innovative approach to device placement. It allows the user to make successive refinements, offering placement choices but leaving the layout designer in full control of the results—without requiring any up-front textual constraint entry.
  • In-Design Assistants reduce costly design iterations by catching physical and electrical errors before signoff verification. Custom Compiler includes a built-in design rule checking (DRC) engine, which is extremely fast and can be active all the time. In addition to the DRC engine, electromigration checking, and resistance and capacitance extraction are all natively implemented in Custom Compiler. Custom Compiler’s extraction is based on Synopsys’ StarRC engine.
  • Template Assistants help designers reuse existing know-how by making it easy to apply previous layout decisions to new designs. Template Assistants actually learn from the work done with the Layout Assistant’s placer and router. They intelligently recognize circuits that are similar to ones that were already completed and enable users to apply the same placement and routing pattern as a template to the new circuits. Custom Compiler comes pre-loaded with a set of built-in templates for commonly used circuits, such as current mirrors, level shifters and differential pairs.  Users can add more templates that are developed for their design styles as they are created and verified.
  • Co-Design Assistants combine the IC Compiler place and route system and Custom Compiler into a unified solution for custom and digital implementation. Users can freely move back and forth between Custom Compiler and IC Compiler, using the commands of each to successively refine their designs. With the Co-Design Assistants, IC Compiler users can perform full custom edits to their digital designs at any stage of implementation. Likewise, Custom Compiler users can use IC Compiler to implement digital blocks in their custom designs. The lossless, multi-roundtrip capability of the Co-Design Assistants ensures that all changes are synchronized across both the digital and custom databases.

Although Custom Compiler takes advantage of knowledge developed from existing Synopsys tools, it is not only a new product but a new approach to layout that fits well with FinFET use.  “As the leader in analog/mixed-signal semiconductor IP, our team has been exposed to FinFET related design challenges very early in the foundry process development cycle,” said Joachim Kunkel, executive vice president and general manager of the Solutions Group at Synopsys. “We asked the Custom Compiler development team to focus on improving FinFET layout productivity because we saw large increases in the layout effort across a wide range of IP development projects, from standard cells to high-performance SerDes. Custom Compiler’s Layout Assistants allowed us to implement a novel layout methodology that reduces the time of many layout tasks from hours to minutes.”

Synopsys’ Relaunched ARC Is Not The Answer

Wednesday, October 14th, 2015

Gabe Moretti, Senior Editor

During the month of September, Synopsys spent considerable marketing resources relaunching its ARC processor family by leveraging the IoT.  First, on September 10, it published a release announcing two additional versions of the ARC EM family of deeply embedded DSP cores.  Then on September 15 the company held a free one-day ARC Processor Summit in Santa Clara, and on September 22 it issued another press release about its involvement in IoT, again mentioning the embARC Open Software Platform and the ARC Access Program.  It is not clear that ARC will fare any better in the market after this effort than it did in the past.

Background

In the 1990s a company called ARC International Ltd. designed and developed a RISC processor called the Argonaut RISC Core.  Its architecture has roots in the Super FX chip for the Super Nintendo Entertainment System.  In 2009 Virage Logic purchased ARC International.  Virage specialized in embedded test systems and was itself acquired by Synopsys in 2010.  This is how Synopsys became the owner of the ARC architecture, although it was mainly interested in the embedded test technology.

Since that acquisition, ARC has seen various developments that produced five product families, all within the DesignWare group.  Financial success of the ARC family has been modest, especially when compared with the much more popular product families in the company.  The EM family, where the two new products reside, is one of the five.  During this year’s DVCon, at the beginning of March, I interviewed Joachim Kunkel, Sr. Vice President and General Manager of the Solutions Group at Synopsys, who is responsible, among other things, for the IP products.  We talked about the ARC family and how Synopsys had not yet found a way to use this core efficiently.  We agreed that IoT applications could benefit from such an IP, especially if well integrated with other DesignWare pieces and security software.

The Implementation

I think that the ARC family will never play a significant part in Synopsys’ revenue generation, even after this latest marketing effort.

It seems clear to me that the IoT strategy is built on more viable corporate resources than just the ARC processor.  The two new cores, the EM9D and EM11D, implement an enhanced version of the ARCv2DSP instruction set architecture, combining RISC and DSP processing with support for an XY memory system to boost digital signal processing performance while minimizing power consumption.  Synopsys claims that the cores are three to five times more efficient than the two previous similar cores, but the press release specifically avoids comparison with similar devices from other vendors.

When I read the data sheets of devices from possible competitors, I appreciate the wisdom of avoiding a direct comparison.  Although the engineering work behind the two new cores seems quite good, there is only so much that can be done with an aging architecture.  ARC becomes valuable only if sold as part of a sub-system that integrates other Synopsys IP and the security products owned by the company.

It is also clear that those other resources will generate more revenue for Synopsys when integrated with DSP processors from ARM, Intel, and maybe Apple or even Cadence.  ARC has been neglected for too long to be competitive by itself, especially in the IoT market.  ARC is best used at the terminals or data-acquisition nodes.  Such nodes are highly specialized, small, and above all very price sensitive.  A variation of a few cents makes the difference between adoption and rejection.  This is not a market Synopsys is comfortable with.  Synopsys prefers markets it can control by offering the best solution at a price it finds acceptable.

Conclusion

The ARC world will remain small.  Synopsys’ mark on the IoT may well be substantial, but certainly not because of ARC.

Gary Smith’s DAC presentation: The Changing Landscape

Friday, June 26th, 2015

Gabe Moretti, Senior Editor

Now that a couple of weeks have passed, I have had time to think about the contents of Gary Smith’s Sunday evening presentation.  Gary touched on many subjects, but two in particular gave me reason to ponder.  The first was his comment on the IP industry, and the second his view of Synopsys’ expansion.

The IP Industry

Gary stated that revenue for IP products will first level off and then diminish in the period up to 2019.  This statement generated an incredulous response from the audience.  In the time set aside for questions, Gary stated that the reason for the decline was market saturation for products like Synopsys DesignWare.

He said that small IP modules are now commodities that are free, or practically free, and that the only source of IP revenue will be IP subsystems.  By itself the statement is true, but I think that Gary missed the larger picture.  IP subsystems are growing in sophistication and now contain both hardware and firmware modules.  Their complexity will increase, not decrease, and with it the price they can command.  The projected expansion of IoT products also means a growing use of IP subsystems, and thus a very large growth in the number of licenses sold.

In addition, the sophistication of the subsystems requires more powerful development tools to integrate the IP and debug the final system.  ARM, for example, has just introduced a very sophisticated development environment.  The new IP tooling suite comprises Socrates DE, CoreSight Creator and CoreLink Creator.  Additionally, CoreLink Creator will easily configure and help implement the new CoreLink NIC-450 Network Interconnect, the follow-on to the widely adopted CoreLink NIC-400.  See my article at: http://chipdesignmag.com/sld/blog/2015/06/04/new-arm-ip-tooling-suite-reduces-significantly-soc-integration-time/.

How should the revenue generated by this new tool, and by those competitors are likely to bring to market, be counted?  Clearly the revenue would not exist if the IP were not purchased and used.  So, I think, tools specifically used for IP should be counted in the IP market, not in the general EDA tools market.  In addition, the firmware sold with or for IP should be counted not in the embedded software column but in the IP column.  IP revenue will grow; it is just a matter of how it is counted.

Synopsys’s Growth

Gary observed that Synopsys was growing through acquisitions in niche markets with the intent to dominate those markets.  I think this view is too narrow.  To be sure, Synopsys acquires companies that have a significant opportunity to grow, but the reason for the acquisitions is not “just” diversification.  If one steps back and looks at the system level, and not just at the electronic hardware level, one finds, or at least I find, that Synopsys is looking at the requirements of systems in the near future and is obtaining the tools that will let it satisfy them as a company.  Security is an important concern, especially with respect to software attacks.  The acquisition of Codenomicon addresses the robustness of developed software.  The earlier acquisition of Coverity points fundamentally in the same direction.  Gary is correct that both acquisitions take Synopsys into the larger market of software development outside of EDA, but they also strengthen the corporate position within EDA.  The same can be said of the Optical Solutions Group, grown entirely through the acquisition of non-EDA companies.  My point is that Synopsys is becoming a true system company, not “just” an EDA company.

HoT Love IP Event at DAC

Tuesday, May 19th, 2015

Gabe Moretti, Senior Editor

Jim Hogan, a man with many visions, both technical and social, has created an organization, Heart of Technology (HoT), that provides a vehicle for high-tech companies and individuals to give back to their community by supporting worthy causes.  On the occasion of DAC, on Monday evening, June 8, from 7 to 11:30 p.m., HoT will hold a Love IP party and, through the generosity of sponsors and even attendees, raise money to support the San Jose State University Guardian Program.

The location of the party is Jillian’s at 145 4th Street in the city.

Jim Hogan said: “We thought it was an excellent organization to support. Serving 35 to 50 youth emancipated from foster care, wards of the court, and homelessness, our goal is to bring awareness to this vital program and raise funds to support these young scholars as they transition to become the next scientist, engineer, health care professional or even Silicon Valley entrepreneur. We appreciate your help to make this happen.”

As many as 23 corporations in our industry have stepped forward to sponsor the event, too many to list them all.  But if you would like to see their names, go to http://heartoftechnology.org/love-ip-party-at-dac-2015/love-ip-party-sponsors/.

If you make a donation of $50 or more, you will receive immediate entrance to the party and a special event t-shirt. You will also be entered into a drawing for a $100 gift card.  Otherwise, get invited by one of the sponsors or register for the IP track at DAC.  At the party you will find:

  • Fabulous food and open bar with 17 microbrews on tap
  • Performances by The Sonics and Groovy Love Band
  • A billiard tournament hosted by Atrenta
  • Heads or Tails Game with the Big Kahuna
  • A Summer of Love Costume Contest
  • A silent auction featuring sport and rock and roll memorabilia

This party, in a city as attractive but as expensive as San Francisco, is the best deal in town that Monday evening.  Visit the website for further information: http://heartoftechnology.org/love-ip-party-at-dac-2015/

You Ought To See This Webinar

Friday, January 16th, 2015

Gabe Moretti, Senior Editor

A little over a month ago I wrote a blog about eSilicon’s IP MarketPlace.  On Wednesday, January 21st, eSilicon will present a webinar on the product.  To register, go to: https://esilicon.clickwebinar.com/Try_IP_Before_You_Buy/register.  In case you cannot attend, you can always see the webinar at: http://www.esilicon.com/resources/webinars-and-technical-presentations.  Jack Harding, eSilicon CEO, has just published a blog on his company’s web site that provides the rationale for a product like IP MarketPlace.  I want to quote a couple of fragments from it.

Jack writes: “One of my employees recently passed me the article ‘When Marketing is Strategy,’ authored by Niraj Dawar in the December 2013 issue of the Harvard Business Review. The essence of the article is that, ‘The strategic question that drives business today is not “What else can we make?” but “What else can we do for our customers?” Customers and the market—not the factory or the product—now stand at the core of the business.’”

And later on he says: “eSilicon finds itself in a familiar place today. True to our original thesis and Mr. Dawar’s crisp description of the phenomenon, we are now leading another strategic shift “downstream” to help shape our customers’ “criteria of purchase.” Namely, we are deploying internet-based tools to redefine the manner in which the semiconductor development world accesses technical and commercial information in a format and structure that they did not request. Just as a fabless ASIC model was not requested.”

Mr. Dawar’s concept is not new, just often forgotten by companies that pay the price for their superficiality.  It is just another way of saying: “A company is not in business to make money.  It is in business to give its customers what they need, and it makes money as a consequence.”  I learned that while getting my B.A. in Business Administration.  Some professionals sporting an MBA after their names ought to review their course material!

The IP MarketPlace is the third such product deployed by eSilicon.  At a time when the hottest three-letter acronym in electronics is “IoT” (Internet of Things), eSilicon is leading the use of the “I” part of the acronym.  The IP MarketPlace, in fact, will play an important part in facilitating the creation of “Things.”  Efficiency is the best tool to decrease development costs, and the Internet-based tools from eSilicon are the best tools I know to decrease the time and cost of defining, pricing, documenting, and scheduling a chip project.  The best part is that using IP MarketPlace is free and simple.  I did it, and I am a computer scientist, not an EE.

IP Components Are EDA Tools

Friday, January 9th, 2015

Gabe Moretti, Senior Editor

It has been just about 25 years since the first IP product was licensed, and yet there are still questions about the true nature of the IP industry.  A few years ago EDAC started providing figures for IP revenue, and that created a debate that to some extent continues today.  Is an IP company an EDA company?  Some say yes, and some strongly object and prefer to define an IP company as a fabless semiconductor company.  Is there a correct definition of the industry that creates IP?  And which IP are we talking about?  EDAC looks only at hardware IP, but of course there are many software IP products available.  What is IP?  The name itself is not very specific.  IP stands for Intellectual Property, but that covers anything that can be copyrighted, patented, or otherwise claimed as property that cannot be freely copied, sold, or used without express permission from its creator.  It was not the intention of the creator of the term to cover all those items, but then marketing is a difficult, if imprecise, job.

So to make things easier, let’s discuss only the IP components representing hardware that are used by hardware designers in the design and development of hardware.  Are the producers and vendors of such products EDA companies?  To be sure, some EDA companies develop and sell IP.  Cadence, Mentor, and Synopsys call themselves EDA companies, and all generate revenue from licensing IP products.  ARM, the leading IP company, sells development software for its products that is just as sophisticated as the tools sold by EDA companies; so is ARM an EDA company?

I would like to look at IP in a different light, a point of view I share with Lucio Lanza.  IP components are used by designers in the design and development of electronic products.  The EDA industry’s purpose is to develop and market tools used by designers to design and develop electronic products.  Ergo, IP is an EDA tool.  In fact, engineers do not just integrate IP components into their designs.  They use IP in making tradeoff judgments regarding architecture, performance, development cost, and ultimately the price of the product they are working on.  IP is indeed an EDA tool, so EDAC is correct in counting its revenue as EDA industry revenue.

There is also a practical aspect to the argument.  One cannot separate IP revenue from the overall revenues of Cadence, Mentor, or Synopsys, to take just the big three into consideration, unless those companies decide to account for IP revenues as a separate profit center.  And why should they, if IP is the same as EDA tools?
