Posts Tagged ‘IoT’


EDA in the year 2017 – Part 1

Thursday, January 12th, 2017

Gabe Moretti, Senior Editor

The EDA industry's performance depends on two other major forces: one technological and one financial.  EDA provides the tools and methods that leverage the growth of the semiconductor industry, and it generally begins to receive its financial rewards a couple of years after a new product is introduced on the market; it takes that long for a product to prove itself and achieve general distribution.

David Fried from Coventor addressed the most important topics that may impact the foundry business in 2017.  He made two points.

“Someone is going to commit to Extreme Ultra-Violet (EUV) for specific layers at 7nm, and prove it.  I expect EUV will be used to combine 3-4 masks currently using 193i in a multi-patterning scheme (“cut” levels or Via levels) for simplicity (reduced processing), but won’t actually leverage a pattern-fidelity advantage for improved chip area density.

The real density benefit won’t come until 5nm, when the entire set of 2D design rules can be adjusted for pervasive deployment of EUV.  This initial deployment of EUV will be a “surgical substitution” for cost improvement at very specific levels, but will be crucial for the future of EUV to prove out additional high-volume manufacturing challenges before broader deployment.  I am expecting this year to be the year that the wishy-washy predictions of who will use EUV at which technology for which levels will finally crystallize with proof.

7nm foundry technology is probably going to look mostly evolutionary relative to 10nm and 14nm. But 5nm is where the novel concepts are going to emerge (nanowires, alternate channel materials, tunnel FETs, stacked devices, etc.) and in order for that to happen, someone is going to prove out a product-like scaling of these devices in a real silicon demonstration (not just single-device research).  The year 2017 is when we’ll need to see something like an SRAM array, with real electrical results, to believe that one of these novel device concepts can be developed in time for a 5nm production schedule.”

Rob Knoth, Product Marketing Director, Digital and Signoff Group at Cadence offered the following observation.  “This past year, major IDM and pure-play foundries began to slow the rate at which new process nodes are planned to be released. That cadence was one of the main drivers for the relentless semiconductor-based advances we’ve seen over the past 50 years.

Going forward, fabs and equipment makers will continue to push the boundaries of process technology, and the major semiconductor companies will continue to fill those fabs. While it may be slowing, Moore’s Law is not “dead.” However, there will be increased selection about who jumps to the “next node,” and greater emphasis will be placed on the ability of the design engineer and their tools/flows/methods to innovate and deliver value to the product. The importance for an integrated design flow to make a difference in product power/performance/area (PPA) and schedule/cost will increase.

The role that engineering innovation and semiconductors play in making the world a better place doesn’t get a holiday or have an expiration date.

The semiconductor market, in turn, depends on the general state of the worldwide economy.  This is determined mostly by consumer sentiment: when consumers buy, all industries benefit, from industrial to financial.  It does not take much negative inflection in consumer demand to diminish the requirement for electronics-based products, and thus for semiconductor parts.  That in turn has a negative effect on the EDA industry.

While companies that sell multi-year licenses can smooth the impact, new licenses, both multi-year and yearly, become more difficult to sell and result in lower revenue.

The electronic industry will evolve to deal with the increased complexity of designs.  Complex chips are the only vehicle that can make advanced fabrication nodes profitable.  It makes no sense to decrease feature dimensions and power requirements at the cost of increased noise and leakage just for technology’s sake.  As unit costs increase, only additional functionality can justify new projects.  Such designs will require new methodologies, new versions of existing tools, and new industry organization to improve the use of the development/fabrication chain.

Michael Wishart, CEO of Efabless believes that in 2017 we will begin to see full-fledged community design, driven by the need for customized silicon to serve emerging smart hardware products. ICs will be created by a community of unaffiliated designers on affordable, re-purposed 180nm nodes and incorporate low cost, including open source, processors and on-demand analog IP. An online marketplace to connect demand with the community will be a must.

Design Methods

I asked Lucio Lanza of Lanza techVentures what factors would become important in 2017 regarding EDA.  As usual his answer was short and to the point.  “Cloud, machine learning, security and IoT will become the prevailing opportunities for design automation in 2017. Design technology must progress quickly to meet the needs of these emerging markets, requiring as much as possible from the design automation industry. Design automation needs to willingly and quickly take up the challenge at maximum speed for success. It’s our responsibility, as it’s always been.”

Bob Smith, Executive Director of the ESD Alliance, thinks that in 2017 the semiconductor design ecosystem will continue evolving from a chip-centric focus (integration of transistors) to a system-centric worldview (integration of functional blocks). While SoCs and other complex semiconductor devices remain critical building blocks and Moore’s Law a key driver, the emphasis is shifting to system design via the extensive use of IP. New opportunities for automation will open up with the need to rapidly configure and validate system-level designs based on extensive use of IP.  Industry organizations like the Electronic System Design Alliance have a mission to work across the entire design ecosystem as the electronic design market makes the transition to system-level design.

Wally Rhines, Chairman and CEO of Mentor Graphics addressed the required changes in design as follows: “EDA is changing.  Most of the industry’s effort in the last two decades has focused on the automation of integrated circuit design. Virtually all aspects of IC design are now automated with the use of computers.  But system design is in the infancy of an evolution toward virtual design automation. While EDA has given us the ability to do first-pass functional integrated circuit designs, we are far from providing the same capability to system designers.

What’s needed is the design of “systems of systems”.  That capability is coming.  And it is sooner than you might think.  Designers of planes, trains and automobiles hunger for virtual simulation of their designs long before they build the physical prototypes for each sub-system.  In the past, this has been impossible.  Models were inadequate.  Simulation was limited to mechanical or thermal analysis.  The world has changed.  During 2017, we will see the adoption of EDA by companies that have never before considered EDA as part of their methodology.”

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence offered the following observation.  “IoT that spans across application domains will further grow, especially in the industrial domain. Dubbed “Industrie 4.0” in Germany, industrial applications are probably the strongest IoT driver. Value shifts will accelerate from pure semiconductor value to systemic value in IoT applications. The edge node sensor itself may not contribute greatly to profits, but the systemic value of combining the edge node with a hub accumulating data and sending it through networks to cloud servers, where machine learning and big data analysis happen, allows for cross monetization. The value definitely is in the system. Interesting shifts lie ahead in this area from a connectivity perspective. 5G is supposed to broadly hit in 2020, with early deployments in 2017. There are already discussions going on regarding how the connectivity within the “trifecta” of IoT/Hub/Server is going to change, with more IoT devices bypassing aggregation at the hub and directly accessing the network. Look for further growth in the area that Cadence calls System Design Enablement, together with some customer names you would previously not have expected to create chips themselves.

Traditionally, ecosystems have been centered on processor architectures. Mobile and server are key examples, with their respective leading architectures holding the lion’s share of their markets. The IoT is mixing this up a little, as more processor architectures can play and offer unique advantages, including configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even open-source hardware architectures look like they will be very relevant, judging from recent momentum that eerily reminds me of the early Linux days. It’s definitely one of the most entertaining spaces to watch in 2017 and for years to come.”

Standards

Standards have played a key role in EDA.  Without them designers would be locked in to one vendor for all of the required tools, and given the number of necessary tools, very few EDA companies would be able to offer everything required to complete, verify, and hand off a design to manufacturing.  Michiel Ligthart, President and COO at Verific, sees two standards in particular playing a key role in 2017.  “Watch for quite a bit of activity on the EDA standards front in 2017. First in line is the UVM standard (IEEE 1800.2), approved by the Working Group in December 2016. The IEEE may ratify it as early as February. Another one to watch is the next installment of SystemVerilog, mainly a “clarifications and corrections” release, that will be voted on in early 2017 with an IEEE release just before the end of the year. In the meantime, we are all looking at Accellera’s Portable Stimulus group to see what it will come up with in 2017.”

Regarding the Portable Stimulus activity, Adnan Hamid, CEO of Breker Verification Systems goes into more detail.  “While it’s been a long time coming, Portable Stimulus is now an important component of many design verification flows and that will increase significantly in 2017. The ability to specify verification intent and behaviors reusable across target platforms, coupled with the flexibility in choosing vendor solutions, is an appealing prospect to a wide range of engineering groups, and the appeal is growing. While much of the momentum is rooted in Accellera’s Portable Stimulus Working Group, verification engineers deserve credit for recognizing its value to their productivity and effectiveness. Count on 2017 to be a big year for both its technological evolution and its standardization as it joins the ranks of SystemVerilog, UVM and others.”

Conclusion

Given the number of contributions received, it would be overwhelming to present all of them in one article.  The remaining topics will therefore be covered in a follow-on article next week.

Interview with Pim Tuyls, President and CEO of Intrinsic-ID

Tuesday, October 4th, 2016

Gabe Moretti, Senior Editor

After the article on security published last week, I continued the conversation with more corporations.  The Apple vs. FBI case showed that the stakes are high and the debate is heated.  Privacy is important, not only for guarding sensitive information but also for ensuring functionality in our digital world.

I asked Pim Tuyls for his impressions on security in electronic systems.

Pim:

“Often, privacy is equated with security. However, ‘integrity’ is often the more important issue. This is especially true with the Internet of Things (IoT) and autonomous systems, which rely on the inputs they receive to operate effectively.  If these inputs are not secure, how can they be trusted?  Researchers have already tricked the sensors of semi-autonomous cars with imaginary objects on the road, triggering emergency braking actions.  Counterfeit sensors are already on the market.

Engineers have built in redundancy and ‘common-sense’ rules to help ensure input integrity. However, such mechanisms were built primarily for reliability, not for security. So something else is needed. Looking at the data itself is not enough. Integrity needs to be built into sensors and, more generally, all end-points.”

Chip Design: What approaches do you think could be effective in increasing security?

Pim:

“One way to do this is to append a Message Authentication Code (MAC) to each piece of data. This is essentially a short piece of information that authenticates a message or confirms that the message came from the claimed sender (its authenticity) and has not been changed in transit (its integrity). To protect against replay attacks, the message is augmented with a timestamp or counter before the MAC is calculated.  Another way to implement a MAC is based on hash functions (HMAC, or hash-based message authentication code). Hash functions such as the SHA-2 family are well-known and widely supported cryptographic primitives with efficient and compact implementations.”
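
The scheme Pim describes maps directly to a few lines of code. Below is a minimal sketch using Python’s standard hmac, hashlib and struct modules; the message layout, the shared key and the freshness window are illustrative assumptions, not a description of any vendor’s implementation.

    import hmac, hashlib, struct, time

    # Hypothetical per-sensor key, provisioned at manufacture (illustrative only).
    SHARED_KEY = b"per-sensor secret"

    def tag_reading(reading: bytes) -> bytes:
        """Prepend a timestamp and append an HMAC-SHA256 tag to a sensor reading."""
        stamped = struct.pack(">Q", int(time.time())) + reading  # timestamp defeats replay
        return stamped + hmac.new(SHARED_KEY, stamped, hashlib.sha256).digest()

    def verify_reading(message: bytes, max_age_s: int = 5) -> bytes:
        """Check integrity, authenticity and freshness; return the raw reading."""
        stamped, mac = message[:-32], message[-32:]
        expected = hmac.new(SHARED_KEY, stamped, hashlib.sha256).digest()
        if not hmac.compare_digest(mac, expected):   # constant-time comparison
            raise ValueError("MAC mismatch: data forged or corrupted")
        (ts,) = struct.unpack(">Q", stamped[:8])
        if abs(time.time() - ts) > max_age_s:        # stale message: possible replay
            raise ValueError("stale timestamp: possible replay")
        return stamped[8:]

    print(verify_reading(tag_reading(b"temp=21.5C")))   # b'temp=21.5C'

The verification cost Pim mentions next is visible even here: every reading pays for one hash computation and carries 40 extra bytes on the wire.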

Chip Design: These approaches sound easy, so why are they not widely adopted?

Pim:

“First, even though an algorithm like HMAC is efficient and compact, it may still be too high of a burden on the tiny microcontrollers and sensors that are the nerves of a complex system.  Authenticating every piece of data naturally takes up resources such as processing, memory and power.  In some cases, like in-vitro medical sensors, any reduction in battery life is not acceptable. Tiny sensor modules often do not have any processing capabilities. In automotive, due to the sheer number of sensors and controllers, costs cannot be increased.”

Chip Design: It is true that many IoT devices are very cost sensitive. However, recent years have seen increasing use of more powerful, 32-bit, often ARM-based microcontrollers. Many of these now come with basic security features like crypto accelerators and memory management, so some of the issues that prevent adoption of security are quickly being eroded.

Pim continued:

“A second obstacle relates to the complex logistics of configuring such a system. HMAC relies on a secret key that is shared between the sensor and the host.  Ensuring that each sensor has a unique key and that the key is kept secret via a centralized approach creates a single point of failure and introduces large liabilities for the party that manages the keys.”

Chip Design: What could be a cost-effective solution?

Pim concluded:

“A new solution to all these issues is based on SRAM Physical Unclonable Functions (PUFs). An SRAM PUF can reliably extract a unique key from a standard SRAM circuit on a standard microcontroller or smart sensor. The key is determined by tiny manufacturing differences unique to each chip. There is no central point of failure and no liability for key loss at the manufacturer.  Furthermore, as nothing is programmed into the chip, the key cannot even be extracted through reverse engineering or other chip-level attacks.

Of course adopting a new security paradigm is not something that should be done overnight. OEMs and their suppliers are rightly taking a cautious approach. After all, the vehicle that is being designed now will still be on the road in 25 years. For industrial and medical systems, the lifecycle of a product may be even longer.

Still, with technologies like SRAM PUF the ingredients are in place to introduce the next level of security and integrity, and pave the road for fully autonomous systems. Using such technologies will not only help to enhance privacy but will also ensure a higher level of information integrity.”

This brought me back to the article where a solution using PUF was mentioned.

Hardware Based Security

Friday, August 5th, 2016

Gabe Moretti, Senior Editor

If there is one thing that is obvious about the IoT market, it is that security is essential.  IoT applications will be, if they are not already, invasive to the lives of their users, and the privacy of each individual must be preserved.  The European Union has stricter privacy laws than the US, but even in the US privacy is valued and protected.

Intrinsic-ID has published a white paper, “SRAM PUF: The Secure Silicon Fingerprint”, that you can read in the Whitepapers section of this emag, or you can go to www.intrinsic-id.com and read it under the “Papers” pull-down menu.

For many years, silicon Physical Unclonable Functions (PUFs) have been seen as a promising and innovative security technology that was making steady progress. Today, Static Random-Access Memory (SRAM)-based PUFs offer a mature and viable security component that is achieving widespread adoption in commercial products. They are found in devices ranging from tiny sensors and microcontrollers to high performance Field-Programmable Gate Arrays (FPGAs) and secure elements where they protect financial transactions, user privacy, and military secrets.

Intrinsic-ID’s goal in publishing this paper is to show that the SRAM PUF is a mature technology for embedded authentication. The behavior of an SRAM cell depends on the difference between the threshold voltages of its transistors. Even the smallest difference is amplified and pushes the SRAM cell into one of two stable states. The cell’s PUF behavior is therefore much more stable than the underlying threshold voltages, making it the most straightforward and most stable way to use threshold voltages to build an identifier.

It turns out that every SRAM cell has its own preferred state every time the SRAM is powered, resulting from the random differences in threshold voltages. This preference is independent of the preferences of neighboring cells and independent of the location of the cell on the chip or on the wafer.

Hence an SRAM region yields a unique and random pattern of 0’s and 1’s. This pattern can be called an SRAM fingerprint, since it is unique per SRAM and hence per chip, and it can be used as a PUF. Keys derived from the SRAM PUF are not stored ‘on the chip’ but extracted ‘from the chip’, only when they are needed. In that way they are present in the chip only during a very short time window. When the SRAM is not powered there is no key present on the chip, making the solution very secure.
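
To make the mechanism concrete, here is a minimal sketch of key derivation from a (simulated) SRAM power-up pattern, in Python. The noise model and the majority-vote stabilizer are illustrative assumptions; a commercial SRAM PUF product uses a fuzzy extractor (an error-correcting code plus public helper data) on a single read rather than repeated reads, but the essential property is the same: the key is recomputed from the silicon on demand and never stored.

    import hashlib, random

    random.seed(1)  # deterministic demo

    # Hypothetical stand-in for an SRAM block: each cell has a fixed preferred
    # power-up value (the silicon fingerprint), but a few cells flip noisily.
    FINGERPRINT = [random.randint(0, 1) for _ in range(256)]

    def power_up(noise=0.03):
        """One simulated power-up read: the fingerprint plus random bit flips."""
        return [bit ^ (random.random() < noise) for bit in FINGERPRINT]

    def derive_key(reads=15):
        """Majority-vote several power-ups into a stable pattern, then hash it.
        Real SRAM-PUF products use a fuzzy extractor (error-correcting code
        plus public helper data) on a single read instead of this voting."""
        votes = [sum(cell) for cell in zip(*(power_up() for _ in range(reads)))]
        stable = bytes(int(v * 2 >= reads) for v in votes)
        return hashlib.sha256(stable).digest()  # key exists only while derived

    assert derive_key() == derive_key()  # same chip, same key, power-up after power-up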

Intrinsic-ID has bundled error correction, randomness extraction, security countermeasures and anti-aging techniques into a product called Quiddikey. This product extracts cryptographic keys from the SRAM PUF in a very secure manner and is available as Hardware IP (netlist), firmware (ANSI C Code), or a combination of these.

The hardware IP is small and fast – around 15K gates / 100K cycles – and connects to common interconnects like AMBA AHB and APB as well as proprietary interfaces. A Built-In Self-Test (BIST) and health checks are included in the logic. Since it is purely digital, single-clock logic, it synthesizes readily to any technology.  Software reference implementations start from 10KB of code and are available for major platforms like ARM, ARC, Intel and MIPS. Software implementations can be used to add PUF technology to existing products through a firmware upgrade.

I will deal with security issues in more depth in September.  In the meantime, the Intrinsic-ID white paper is worth your attention.

The ARM – Softbank Deal: Heart Before Mind

Tuesday, July 19th, 2016

Gabe Moretti, Senior Editor

If you happen to hold ARM stock, congratulations: you are likely to make a nice profit on your investment.  SoftBank, a Japanese company with diversified interests, including Internet services, has offered to purchase ARM for $32.4 billion in cash.  SoftBank is a large company whose latest financial results show a profit of $9.82 billion before interest payments and tax obligations.

ARM, on the other hand, reported fiscal year 2015 revenue of $1,488.6 million with a profit of £414.8 million and an operating margin of 42%.  This is a very healthy operating margin, showing remarkable efficiency in all aspects of the company.  There is, then, little to improve in the way ARM operates.

What seems logical, then, is that SoftBank expects a significant increase in ARM revenue after the acquisition, or an effect on its profit due to ARM’s impact on other parts of the company.  ARM’s profit for 2015 was £414.8 million on revenue of £968.3 million, a ratio of 42.8%.  Let’s assume that SoftBank instead invested the $32.4 billion elsewhere and obtained a 5% return, or $1.62 billion per year.  To obtain the same result from the ARM acquisition, ARM would have to generate a profit of 3.9 times what it generated in 2015.  This is a very large increase: if we assume that all other financial ratios stay the same, revenue would have to be a little over $5.5 billion. Yet applying the 15% growth realized between 2014 and 2015 to every year between 2015 and 2020 “only” reaches the $2,913.6 million mark.  And keeping the growth rate constant as revenue increases gets harder and harder, since it means a larger absolute increase every year.
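
The arithmetic is easy to reproduce. Here is a minimal sketch using the figures quoted above (the profit figure is treated in dollars, as earlier in the article; note that compounding at 15% lands near $3.0 billion, in the neighborhood of the $2,913.6 million cited):

    # Back-of-the-envelope check of the deal math, using the figures quoted above.
    offer = 32.4e9                  # SoftBank's cash offer for ARM, USD
    target_profit = offer * 0.05    # 5% return elsewhere: $1.62B per year

    arm_profit_2015 = 414.8e6       # ARM 2015 profit as quoted
    print(target_profit / arm_profit_2015)    # ~3.9x the 2015 profit is needed

    revenue_2015 = 1.4886e9         # ARM 2015 revenue, USD
    print(revenue_2015 * 1.15 ** 5) # 15% yearly growth: ~$3.0B by 2020, near the
                                    # $2,913.6M cited and far below the ~$5.5B needed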

So the numbers do not make sense to me.  I can believe that ARM could be worth $16 billion, but not twice as much.  And here is another observation.  I have read in many publications that financial analysts expect the IoT market to be $20 billion by 2020.  Assuming that the SoftBank investment, net of interest charges, returns 5% per year in 2020, ARM’s revenue would have to be $5.5 billion, or over 25% of the TAM (Total Available Market).  This I consider impossible to achieve, simply because the IoT market will be price sensitive, opening ARM to competition from other companies offering competitive microcontrollers.  SoftBank cannot possibly believe that Intel will go away, or that every person will own three cell phones, or that Google will use only ARM processors in its offerings, or even that IP companies like Cadence and Synopsys will decide to ignore the IoT market.

I am afraid that the acquisition marks the end of ARM as we know it.  It will be squeezed for revenue and profit like it has never been before and the quality of its products will suffer.

Lucio Lanza Joins ESDA Board

Friday, April 22nd, 2016

Gabe Moretti, Senior Editor

In a major change to its board structure, the Electronic System Design Alliance (ESDA) has elected Dr. Lucio Lanza to its Board of Directors.  Previously the Board members were only officers, and in most cases CEOs, of member companies, but broadening the outlook of the board to encompass ESDA’s new mission requires a new perspective on the electronic industry.

Dr. Lanza, the 2014 Phil Kaufman award winner, has spent all of his professional career in the electronic industry and is now a leading Silicon Valley venture capitalist and industry observer.  I asked Lucio the reason for his decision to join the ESDA board.  “It is a very important time for the next generation of electronic products. In the next few years we are going to have more products created than in the history of mankind.  The responsibility of organizations like ESDA to help people create those products is pretty significant.  So we need to make sure that we get well organized and we all cooperate.”

As is typical of Lucio, he looks at the entire scope of the problem and sees organizations like ESDA as principal facilitators of the upcoming major change in the industry.  When Lucio looks at the purpose of what was EDAC, he points out that the EDA industry traditionally concentrated on enabling the electronic industry to step from one process node to the next while keeping development costs practically flat.   This has been EDA’s own Moore’s Law.  It has been a success, but the challenges are different now, at least for the vast majority of companies and ventures that are looking at developing IoT products.  One signal that the EDA industry has recognized this fact is the change of the EDAC name to ESDA.

Much has already been written about the new name, ESDA, and how there might have been better choices.  Personally I am glad that at the time we did not name Accellera the Electronic Standards Development Association, or EDAC would have had a much harder task renaming itself.  But the aim was to make a statement that electronic products are now much more than an orderly collection of silicon transistors, and that engineers require more than traditional EDA tools to develop them efficiently.  The word “system” thus provides the best description of the problem to be solved.  The method of developing hardware has changed with the use of IP blocks, and the use of software has increased significantly.  The way Lucio explains it makes this so clear that I now appreciate what motivated the name change.

Lucio points out that: “The traditional EDA tools are no longer sufficient to fulfill its own Moore’s Law.  In the last few years what engineers needed to design SoCs was availability of IP.  The issue became “Is there the right IP?”  Because up to 80% of the chip is not new development but new or modified IP.  Somehow we ended up designing new chips by assembling IP and designing only a minor portion from scratch.”

In looking at the state of development today, Lucio found that many developers are spending ten times more on software than on hardware development and debug. He continued by pointing out that this is the problem to be addressed.  His message is that EDA, understood as the provider of all development tools and methods, must find a way to bring the IP vendors and the providers of software modules and tools to realize that everyone will benefit from well-planned coordination on the supply side of the equation.

My next question dealt with how ESDA could contact a software tool company and convince it that it is in the common interest to “design together”.  “There are two steps here,” said Lucio. “The first step, which I cannot say I am an expert in, so I am very, very humble, is to find out what the environment is today.  Are there companies that are already trying to do that?  If so, is there something we can do to help these people acquire visibility?”  The follow-on question is whether there is a way that potential users can encourage these companies and others to strengthen and expand such an approach.  “Organizations like ESDA must become the leaders in organizing and supporting this work.”

The just-announced agreement between ESDA and Semico, says Lucio, is a way to show that we are all after the same goals.   “If we cooperate, the efficiency of the industry will increase and we all will benefit, not just as businesses, but as a society.”

The Year in Review: Thoughts on 2015

Thursday, December 3rd, 2015

Gabe Moretti, Senior Editor

I want to start by covering two sad events that occurred this year.  Two people that made significant contributions to the EDA industry passed on: Gary Smith and Marie Pistilli.  It may be time for EDAC to consider instituting an award that parallels the Phil Kaufman award to recognize people that made significant contributions to the industry without necessarily significantly enriching themselves through inventions or business skills.  If it did, then certainly both Gary and Marie would have earned the award.

Marie Pistilli was for many years the nucleus around which the world of DAC revolved.  Although Pat invented the concept, Marie provided the organizational skill that grew DAC into what it is today.  My first encounter with Marie was as a first-time exhibitor at DAC.  She had been described to me as an unbending taskmaster who followed the rules to the letter and had no understanding for compromises.  My experience with her was not like that.  Marie understood that cash flow is the first rule of a small startup and, within reason, did compromise.  Marie was also a champion of professional women and started the Women in Engineering award given yearly during DAC.

Gary was the EDA analyst everyone listened to.  He moved from direct contributor to the marketing and development of electronic products to industry observer, but an observer who had lived inside the industry he was analyzing.  His opinions carried added value because he brought not only financial and statistical knowledge but also an understanding of the technology and its impact on the growth of EDA.  Gary was unassuming, always ready to share his point of view, and always accessible to me as I made my own transition from technology producer to analyst.

The third important thing that happened this year is the marketing of IoT within the industry, a three-letter acronym invented some years ago to stand for Internet of Things.  It turns out that we are not really talking about the Internet, and we still have to define “things”, but the label has stuck and we are not letting reality change it.   IoT is changing our industry by introducing a new set of customers: systems integrators who deal mostly with reasonably small mixed-signal circuits.  The vast majority of these circuits have MEMS or RF modules, or both, and do not use the latest semiconductor process nodes; suppliers of tools and IP for these customers could be found in the exhibit hall at ARM TechCon this year.  Most IoT products also contain embedded software modules.  The result is that covering EDA by dividing it into vertical specialized areas will no longer work, since the system, not any specific tool, is the most important topic.  All the tools must be seamlessly integrated and must facilitate the dialogue among experts in different disciplines.  The year 2015 was the start of a fundamental change for EDA vendors; the quickest to adapt will without doubt be the new leaders of the industry.

Synopsys’ Relaunched ARC Is Not The Answer

Wednesday, October 14th, 2015

Gabe Moretti, Senior Editor

During the month of September Synopsys spent considerable marketing resources relaunching its ARC processor family by leveraging the IoT.  First, on September 10, it published a release announcing two additional versions of the ARC EM family of deeply embedded DSP cores.  Then on September 15 the company held a free one-day ARC Processor Summit in Santa Clara, and on September 22 it issued another press release about its involvement in IoT, again mentioning the embARC Open Software Platform and the ARC Access Program.  It is not clear that ARC will fare any better in the market after this effort than it did in the past.

Background

Almost ten years ago a company called ARC International LTD designed and developed a RISC processor called Argonaut RISC Core.  Its architecture has roots in the Super FX chip for the Super Nintendo Entertainment System.  In 2009 Virage Logic purchased ARC International.  Virage specialized in embedded test systems and was acquired by Synopsys in 2010.  This is how Synopsys became the owner of the ARC architecture, although it was just interested in the embedded test technology.

Since that acquisition ARC has seen various developments that produced five product families, all within the DesignWare group.  Financial success of the ARC family has been modest, especially when compared with the much more popular product families in the company.  The EM family, where the two new products reside, is one of those five.  During this year’s DVCon, at the beginning of March, I had an interview with Joachim Kunkel, Sr. Vice President and General Manager of the Solutions Group at Synopsys, who is responsible, among other things, for the IP products.  We talked about the ARC family and how Synopsys had not yet found a way to use this core efficiently.  We agreed that IoT applications could benefit from such an IP, especially if well integrated with other DesignWare pieces and security software.

The Implementation

I think that the ARC family will never play a significant part in Synopsys’ revenue generation, even after this latest marketing effort.

It seems clear to me that the IoT strategy is built on more viable corporate resources than just the ARC processor.  The two new cores are the EM9D and EM11D which implement an enhanced version of the ARCv2DSP instruction set architecture, combining RISC and DSP processing with support for an XY memory system to boost digital signal processing performance while minimizing power consumption.  Synopsys claims that the cores are from 3 to 5 times more efficient than the two previous similar cores, but the press release specifically avoids comparison with similar devices from other vendors.

When I read the data sheets of devices from possible competitors, I appreciate the wisdom of avoiding direct comparison.  Although the engineering work behind the two new cores seems quite good, there is only so much that can be done with a ten-year-old architecture.  ARC becomes valuable only if sold as part of a sub-system that integrates other Synopsys IP and the security products owned by the company.

It is also clear that those other resources will generate more revenue for Synopsys when integrated with other DSP processors from ARM, Intel, and maybe Apple or even Cadence.  ARC has been neglected for too long to be competitive by itself, especially in the IoT market.  ARC is best used at the terminals, or data acquisition nodes.  Such nodes are highly specialized, small, and above all very price sensitive: a variation of a few cents makes the difference between adoption and rejection.  This is not a market Synopsys is comfortable with.  Synopsys prefers markets it can control by offering the best solution at a price it finds acceptable.

Conclusion

The ARC world will remain small.  Synopsys’ mark on the IoT will possibly be substantial, but certainly not because of ARC.

Cadence Introduced Tensilica Vision P5 DSP

Thursday, October 8th, 2015

Gabe Moretti, Senior Editor

DSP devices are indispensable in electronic products that deal with the outside environment.  Whether one needs to see, to touch, or in any way gather information from the environment, DSP devices are critical.  Improvements in their performance characteristics, therefore, have a direct impact not only on the capability of a circuit but, more importantly, on its level of competitiveness.  Cadence Design Systems has just announced the Cadence Tensilica Vision P5 digital signal processor (DSP), which it calls its flagship high-performance vision/imaging DSP core. Cadence claims that the new imaging and vision DSP core offers up to a 13X performance boost, with an average of 5X less energy usage on vision tasks compared to the previous generation IVP-EP imaging and video DSP.

Jeff Bier, co-founder and president of Berkeley Design Technology, Inc. (BDTI) noted that: “There is an explosion in vision processing applications that require dedicated, efficient offload processors to handle the large streams of data in real time.  Processor innovations like the Tensilica Vision P5 DSP help provide the backbone required for increasingly complex vision applications.”

The Tensilica Vision P5 DSP core includes a significantly expanded and optimized Instruction Set Architecture (ISA) targeting mobile, automotive advanced driver assistance systems (or ADAS, which includes pedestrian detection, traffic sign recognition, lane tracking, adaptive cruise control, and accident avoidance) and Internet of Things (IoT) vision systems.

“Imaging algorithms are quickly evolving and becoming much more complex – particularly in object detection, tracking and identification,” stated Chris Rowen, CTO of the IP Group at Cadence. “Additionally, we are seeing a lot more integrated systems with multiple sensor types, feeding even more data in for processing in real time. These highly complex systems are driving us to provide more performance in our DSPs than ever before, at even lower power. The Tensilica Vision P5 DSP is a major step forward to meeting tomorrow’s market demands.”

Modern electronic system architecture treats hardware and software with the same amount of attention.  They must balance each other in order to achieve the best possible execution while minimizing development costs.  The Tensilica Vision P5 DSP further improves the ease of software development and porting, with comprehensive support for integer, fixed-point and floating-point data types and an advanced toolchain with a proven, auto-vectorizing C compiler. The software environment also features complete support of the standard OpenCV and OpenVX libraries, with over 800 library functions, for fast, high-level migration of existing imaging/vision applications.

The Tensilica Vision P5 DSP is specifically designed for applications requiring ultra-high memory and operation parallelism to support complex vision processing at high resolution and high frame rates. As such, it allows off-loading vision and imaging functions from the main CPU to increase throughput and reduce power. End-user applications that can benefit from the DSP’s capabilities include image and video enhancement, stereo and 3D imaging, depth map processing, robotic vision, face detection and authentication, augmented reality, object tracking, object avoidance and advanced noise reduction.

The Tensilica Vision P5 DSP is based on the Cadence Tensilica Xtensa architecture and combines flexible hardware choices with a library of DSP functions and numerous vision/imaging applications from Cadence’s established ecosystem partners. It also shares the comprehensive Tensilica partner ecosystem for other applications software, emulation and probes, silicon, services and much more.  The Tensilica Vision P5 core includes these new features:

  • Wide 1024-bit memory interface with SuperGather technology for maximum performance on the complex data patterns of vision processing
  • Up to 4 vector ALU operations per cycle, each with up to 64-way data parallelism
  • Up to 5 instructions issued per cycle from 128-bit wide instruction delivering increased operation parallelism
  • Enhanced 8-, 16- and 32-bit ISA tuned for vision/imaging applications
  • Optional 16-way IEEE single-precision vector floating-point processing unit delivering a massive 32GFLOPs at 1GHz

Coventor’s MEMS+ 6.0 Enables MEMS/IoT Integration

Tuesday, October 6th, 2015

Gabe Moretti, Senior Editor

In thinking about the architecture and functioning of the IoT, I have come to represent it as a nervous system.  Commands and data flow through the architecture of the IoT while computations are performed at the appropriate locations in the system.  The terminal end points of the IoT, just like those of the human nervous system, function as the interface with the outside world.  MEMS are indispensable to the proper functioning of that interface, yet, focused as we are on electronics, we seldom give prominence to MEMS when the IoT is discussed in EDA circles.

Coventor, Inc., a leading supplier of MEMS design automation solutions, introduced MEMS+ 6.0, the latest version of its MEMS design platform.   The tool is available immediately.  MEMS+ 6.0 is a significant advance toward a MEMS design automation flow that complements the well-established CMOS design flow, enabling faster integration of MEMS with electronics and packaging.  MEMS+ 6.0 features new enablement of MEMS process design kits (PDKs) and second-generation model reduction capabilities.

“The fast growing Internet of Things market will increasingly require customization of MEMS sensors and customized package-level integration to achieve lower power, higher performance, smaller form factors, and lower costs,” said Dr. Stephen R. Breit, Vice President of Engineering at Coventor.  “MEMS+6.0 is focused on enabling rapid customization and integration of MEMS while enforcing design rules and technology constraints.”

With MEMS+ 6.0, users can create a technology-defined component library that imposes technology constraints and design rules during design entry, resulting in a “correct-by-construction” methodology. This new approach reduces design errors and enables MEMS foundries to offer MEMS Process Design Kits (PDKs) to fabless MEMS designers. Both parties will benefit, with submitted designs having fewer errors, and ultimately fewer design spins and fab cycles required to bring new and derivative products to market.

“We have collaborated with Coventor in defining the requirements for MEMS PDKs for MEMS+,” said Joerg Doblaski, Director of Design Support at X-FAB Semiconductor Foundries. “We see the new capabilities in MEMS+ 6.0 as a big step toward a robust MEMS design automation flow that will reduce time to market for fabless MEMS developers and their foundry partners.”

MEMS+6.0 also includes a second-generation model reduction capability with export to MathWorks Simulink as well as the Verilog-A format. The resulting reduced-order models (ROMs) simulate nearly as fast as simple hand-crafted models, but are far more accurate. This enables system and IC designers to include accurate, non-linear MEMS device models in their system- and circuit-level simulations. For the second generation, Coventor has greatly simplified the inputs for model reduction and automatically includes the key dynamic and electrostatic non-linear effects present in capacitive motion sensors such as accelerometers and gyroscopes. ROMs can be provided to partners without revealing critical design IP.   Figure 1 shows one such integration architecture.

Figure 1: Integration of MEMS with digital/analog design

Additional advances in MEMS+ 6.0 include:

  • Support for design hierarchy, encouraging time-saving re-use of device sub-structures.
  • Refined support for including packaging effects in thermal stability analysis of sensors, reducing the impact ambient temperature can have on the thermal stability of sensor outputs such as zero offset in accelerometers and bias drift in gyros.
  • Improved modeling of devices that rely on piezo-electric effects for sensing. Interest in piezo sensing is growing because the underlying process technology for piezo materials has matured and because of its potential benefits over capacitive sensing, the current market champion.
  • An expanded MATLAB scripting interface that now allows design entry as well as simulation control.

A Point to Ponder from Gary Smith’s DAC Presentation

Monday, July 27th, 2015

Gabe Moretti, Senior Editor

The integration of electrical and mechanical tools is one of the subjects that Gary Smith covered in his presentation at this year’s DAC.  It is clear that we are finally reaching an understanding among EDA analysts that a system is more than just its electronic and electrical portions.  Gary suggested that an ideal integration of electronic and mechanical tools could be achieved through  acquisitions.

To Gary, who was a fervent supporter of EDA, the ideal situation would be for EDA companies to acquire companies that offer mechanical and structural development tools.  And clearly this is an option.  But integrating mechanical design automation tools with electrical and electronic DA tools through corporate collaboration is also a possible solution.  By integration I do not mean building one tool that supports both electronic and mechanical design; I mean creating a family of products that share data and are supported by one organization.  The centralized support is critical to avoid misunderstanding of problems that can arise from improper integration of tools.  I know of at least three companies that already offer some level of such integration: Ansys, MathWorks, and Mentor.

Ansys is an example of a company that started supporting system design with tools dealing with mechanical and physical problems and then that acquired companies dealing with electrical and electronic problems.  Today it offers integrated support to system designers and developers that span the entire product development sequence in various markets.

MathWorks is an example of a company that started out developing tools spanning both physical and electronic development, and that worked with leading EDA tool providers to integrate its Simulink tool with additional EDA capabilities, creating solutions without the need to acquire companies or develop the tools itself.

In the automotive market Mentor has integrated its tools with third party tools to provide a multi-discipline development environment.  Its SystemVision multi-discipline development environment provides an integrated simulation opportunity to analyze both electronic and mechanical issues.

Clearly Gary’s proposed solution has additional benefits for the EDA industry.  For one, it increases the revenue of the sector, and it also decreases the dependence of that revenue solely on electronics and semiconductor demand.  At a time when the IoT sector is talked about as the major contributor to increased revenue, calmer voices are pointing out that most of the foreseen products are consumer products that do not require advanced technology.  But they do require increased security and improved integration of their electro-mechanical and software constituent parts.  The opportunity, then, is not to focus solely on improving support for leading-edge process technologies, but to provide fail-safe integration of the hardware/software/mechanical portions of products at a reasonable price point.  And this is something that will require a change in the growth plans of the EDA industry.
