Posts Tagged ‘security’

EDA in the year 2017 – Part 1

Thursday, January 12th, 2017

Gabe Moretti, Senior Editor

The EDA industry's performance depends on two other major economies: one technological and one financial.  EDA provides the tools and methods that leverage the growth of the semiconductor industry, and it generally begins to receive its financial rewards a couple of years after a new product is introduced on the market.  It takes that long for the product to prove itself and achieve general distribution.

David Fried from Coventor addressed the most important topics that may impact the foundry business in 2017.  He made two points.

“Someone is going to commit to Extreme Ultra-Violet (EUV) for specific layers at 7nm, and prove it.  I expect EUV will be used to combine 3-4 masks currently using 193i in a multi-patterning scheme (“cut” levels or Via levels) for simplicity (reduced processing), but won’t actually leverage a pattern-fidelity advantage for improved chip area density.

The real density benefit won’t come until 5nm, when the entire set of 2D design rules can be adjusted for pervasive deployment of EUV.  This initial deployment of EUV will be a “surgical substitution” for cost improvement at very specific levels, but will be crucial for the future of EUV to prove out additional high-volume manufacturing challenges before broader deployment.  I am expecting this year to be the year that the wishy-washy predictions of who will use EUV at which technology for which levels will finally crystallize with proof.

7nm foundry technology is probably going to look mostly evolutionary relative to 10nm and 14nm. But 5nm is where the novel concepts are going to emerge (nanowires, alternate channel materials, tunnel FETs, stacked devices, etc.) and in order for that to happen, someone is going to prove out a product-like scaling of these devices in a real silicon demonstration (not just single-device research).  The year 2017 is when we’ll need to see something like an SRAM array, with real electrical results, to believe that one of these novel device concepts can be developed in time for a 5nm production schedule.”

Rob Knoth, Product Marketing Director, Digital and Signoff Group at Cadence offered the following observation.  “This past year, major IDM and pure-play foundries began to slow the rate at which new process nodes are planned to be released. That cadence was one of the main drivers of the relentless semiconductor-based advances we’ve seen over the past 50 years.

Going forward, fabs and equipment makers will continue to push the boundaries of process technology, and the major semiconductor companies will continue to fill those fabs. While it may be slowing, Moore’s Law is not “dead.” However, there will be increased selectivity about who jumps to the “next node,” and greater emphasis will be placed on the ability of design engineers and their tools/flows/methods to innovate and deliver value to the product. The importance of an integrated design flow in making a difference in product power/performance/area (PPA) and schedule/cost will increase.

The role that engineering innovation and semiconductors play in making the world a better place doesn’t get a holiday or have an expiration date.”

The semiconductor market, in turn, depends on the general state of the worldwide economy.  This is determined mostly by consumer sentiment: when consumers buy, all industries benefit, from industrial to financial.  It does not take much of a negative inflection in consumer demand to diminish the requirement for electronics-based products and thus semiconductor parts.  That, in turn, has a negative effect on the EDA industry.

While companies that sell multi-year licenses can smooth the impact, new licenses, both multi-year and yearly, are more difficult to sell and result in lower revenue.

The electronics industry will evolve to deal with the increased complexity of designs.  Complex chips are the only vehicle that can make advanced fabrication nodes profitable.  It makes no sense to decrease feature dimensions and power requirements at the cost of increased noise and leakage just for technology’s sake.  As unit costs increase, only additional functionality can justify new projects.  Such designs will require new methodologies, new versions of existing tools, and new industry organization to improve the use of the development/fabrication chain.

Michael Wishart, CEO of Efabless believes that in 2017 we will begin to see full-fledged community design, driven by the need for customized silicon to serve emerging smart hardware products. ICs will be created by a community of unaffiliated designers on affordable, re-purposed 180nm nodes and incorporate low cost, including open source, processors and on-demand analog IP. An online marketplace to connect demand with the community will be a must.

Design Methods

I asked Lucio Lanza of Lanza techVentures what factors would become important in 2017 regarding EDA.  As usual his answer was short and to the point.  “Cloud, machine learning, security and IoT will become the prevailing opportunities for design automation in 2017. Design technology must progress quickly to meet the needs of these emerging markets, requiring as much as possible from the design automation industry. Design automation needs to willingly and quickly take up the challenge at maximum speed for success. It’s our responsibility, as it’s always been.”

Bob Smith, Executive Director of the ESD alliance thinks that in 2017, the semiconductor design ecosystem will continue evolving from a chip-centric (integration of transistors) focus to a system-centric (integration of functional blocks) worldview. While SoCs and other complex semiconductor devices remain critical building blocks and Moore’s Law a key driver, the emphasis is shifting to system design via the extensive use of IP. New opportunities for automation will open up with the need to rapidly configure and validate system-level design based on extensive use of IP.  Industry organizations like the Electronic System Design Alliance have a mission to work across the entire design ecosystem as the electronic design market makes the transition to system-level design.

Wally Rhines, Chairman and CEO of Mentor Graphics addressed the required changes in design as follows: “EDA is changing.  Most of the EDA industry’s effort in the last two decades has focused on the automation of integrated circuit design. Virtually all aspects of IC design are now automated with the use of computers.  But system design is in the infancy of an evolution to virtual design automation. While EDA has now given us the ability to do first-pass functional integrated circuit designs, we are far from providing the same capability to system designers.

What’s needed is the design of “systems of systems”.  That capability is coming.  And it is sooner than you might think.  Designers of planes, trains and automobiles hunger for virtual simulation of their designs long before they build the physical prototypes for each sub-system.  In the past, this has been impossible.  Models were inadequate.  Simulation was limited to mechanical or thermal analysis.  The world has changed.  During 2017, we will see the adoption of EDA by companies that have never before considered EDA as part of their methodology.”

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence offered the following observation.  “IoT that spans across application domains will further grow, especially in the industrial domain. Dubbed “Industrie 4.0” in Germany, industrial applications are probably the strongest IoT driver. Value shifts will accelerate from pure semiconductor value to systemic value in IoT applications. The edge node sensor itself may not contribute to profits greatly, but the systemic value of combining the edge node with a hub accumulating data and sending it through networks to cloud servers, in which machine learning and big data analysis happen, allows for cross monetization. The value definitely is in the system. Interesting shifts lie ahead in this area from a connectivity perspective. 5G is supposed to arrive broadly in 2020, with early deployments in 2017. There are already discussions going on regarding how the connectivity within the “trifecta” of IoT/Hub/Server is going to change, with more IoT devices bypassing the aggregation at the hub and directly accessing the network. Look for further growth in the area that Cadence calls System Design Enablement, together with some customer names you would not previously have expected to create chips themselves.

Traditionally, ecosystems have been centered on processor architectures. Mobile and server are key examples, with their respective leading architectures holding the lion’s share of their respective markets. The IoT is mixing this up a little as more processor architectures can play and offer unique advantages, with configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even open-source hardware architectures look like they will be very relevant, judging from the recent momentum, which eerily reminds me of the early Linux days. It’s definitely one of the most entertaining spaces to watch in 2017 and for years to come.”

Standards

Standards have played a key role in EDA.  Without them, designers would be locked in to one vendor for all of the required tools, and given the number of necessary tools, very few EDA companies would be able to offer everything required to complete, verify, and transfer a design to manufacturing.  Michiel Ligthart, President and COO at Verific, sees two standards, in particular, playing a key role in 2017.  “Watch for quite a bit of activity on the EDA standards front in 2017. First in line is the UVM standard (IEEE 1800.2), approved by the Working Group in December 2016. The IEEE may ratify it as early as February. Another one to watch is the next installment of SystemVerilog, mainly a “clarifications and corrections” release, that will be voted on in early 2017 with an IEEE release just before the end of the year. In the meantime, we are all looking at Accellera’s Portable Stimulus group to see what it will come up with in 2017.”

In regard to the Portable Stimulus activity, Adnan Hamid, CEO of Breker Verification Systems goes into more detail.  “While it’s been a long time coming, Portable Stimulus is now an important component of many design verification flows and that will increase significantly in 2017. The ability to specify verification intent and behaviors reusable across target platforms, coupled with the flexibility in choosing vendor solutions, is an appealing prospect to a wide range of engineering groups and the appeal is growing. While much of the momentum is rooted in Accellera’s Portable Stimulus Working Group, verification engineers deserve credit for recognizing its value to their productivity and effectiveness. Count on 2017 to be a big year for both its technological evolution and its standardization as it joins the ranks of SystemVerilog, UVM and others.”

Conclusion

Given the number of contributions received, it would be overwhelming to present all of them in one article.  The remaining topics will therefore be covered in a follow-on article next week.

Interview with Pim Tuyls, President and CEO of Intrinsic-ID

Tuesday, October 4th, 2016

Gabe Moretti, Senior Editor

After the article on security published last week, I continued the conversation with more corporations.  The Apple vs. FBI case showed that the stakes are high and the debate is heated.  Privacy is important, not only for guarding sensitive information but also for ensuring functionality in our digital world.

I asked Pim Tuyls his impressions on security in electronics systems.

Pim:

“Often, privacy is equated with security. However, ‘integrity’ is often the more important issue. This is especially true with the Internet of Things (IoT) and autonomous systems, which rely on the inputs they receive to operate effectively.  If these inputs are not secure, how can they be trusted?  Researchers have already tricked sensors of semi-autonomous cars with imaginary objects on the road, triggering emergency braking actions.  Counterfeit sensors are already on the market.

Engineers have built in redundancy and ‘common-sense’ rules to help ensure input integrity. However, such mechanisms were built primarily for reliability, not for security. So something else is needed. Looking at the data itself is not enough. Integrity needs to be built into sensors and, more generally, all end-points.”

Chip Design: Are there ways you think could be effective in increasing security?

Pim:

“One way to do this is to append a Message Authentication Code (MAC) to each piece of data. This is essentially a short piece of information that authenticates a message or confirms that the message came from the claimed sender (its authenticity) and has not been changed in transit (its integrity). To protect against replay attacks the message is augmented with a timestamp or counter before the MAC is calculated.  Another approach to implement a MAC is based on hash functions (HMAC or Hash-based message authentication code). Hash functions such as the SHA-2 family are well-known and widely supported cryptographic primitives with efficient and compact implementation.”
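To make the mechanism concrete, here is a minimal Python sketch of the scheme Tuyls describes: the sensor appends a monotonic counter to each reading and tags the whole message with HMAC-SHA-256, and the host rejects anything whose tag fails or whose counter does not advance. The key value, field layout and counter width are illustrative assumptions, not taken from any particular product.

```python
import hmac
import hashlib
import struct

SHARED_KEY = b"\x13" * 16   # illustrative 128-bit key shared by sensor and host
counter = 0                 # monotonic counter to defeat replay attacks


def sensor_send(reading: bytes) -> bytes:
    """Append a counter and an HMAC-SHA-256 tag to a sensor reading."""
    global counter
    counter += 1
    message = struct.pack(">Q", counter) + reading   # 8-byte counter || payload
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message + tag


def host_verify(packet: bytes, last_counter: int) -> bytes:
    """Check integrity, authenticity and freshness before trusting the data."""
    message, tag = packet[:-32], packet[-32:]
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("MAC check failed: data altered or wrong sender")
    ctr = struct.unpack(">Q", message[:8])[0]
    if ctr <= last_counter:
        raise ValueError("stale counter: possible replay")
    return message[8:]


# usage
pkt = sensor_send(b"temp=21.5C")
print(host_verify(pkt, last_counter=0))   # -> b'temp=21.5C'
```

Even in this toy form, the cost Tuyls mentions is visible: every reading grows by a 32-byte tag plus a counter, and every endpoint must compute a hash per message.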

Chip Design: These approaches sound straightforward, so why are they not more widely adopted?

Pim:

“First, even though an algorithm like HMAC is efficient and compact, it may still be too high of a burden on the tiny microcontrollers and sensors that are the nerves of a complex system.  Authenticating every piece of data naturally takes up resources such as processing, memory and power.  In some cases, like in-vitro medical sensors, any reduction in battery life is not acceptable. Tiny sensor modules often do not have any processing capabilities. In automotive, due to the sheer number of sensors and controllers, costs cannot be increased.”

Chip Design: It is true that many IoT devices are very cost sensitive, I said. However, over recent years there has been increasing use of more powerful, 32-bit, often ARM-based microcontrollers. Many of these now come with basic security features like crypto accelerators and memory management. So some of the issues that prevent adoption of security are quickly being eroded.

Pim continued:

“A second obstacle relates to the complex logistics of configuring such a system. HMAC relies on a secret key that is shared between the sensor and the host.  Ensuring that each sensor has a unique key and that the key is kept secret via a centralized approach creates a single point of failure and introduces large liabilities for the party that manages the keys.”

Chip Design: What could be a cost-effective solution?

Pim concluded:

“A new solution to all these issues is based on SRAM Physical Unclonable Functions (PUFs). An SRAM PUF can reliably extract a unique key from a standard SRAM circuit on a standard microcontroller or smart sensor. The key is determined by tiny manufacturing differences unique to each chip. There is no central point of failure and no liability for key loss at the manufacturer.  Furthermore, as nothing is programmed into the chip, the key cannot even be extracted through reverse engineering or other chip-level attacks.

Of course, adopting a new security paradigm is not something that should be done overnight. OEMs and their suppliers are rightly taking a cautious approach. After all, the vehicle that is being designed now will still be on the road in 25 years. For industrial and medical systems, the lifecycle of a product may be even longer.

Still, with technologies like SRAM PUF the ingredients are in place to introduce the next level of security and integrity, and pave the road for fully autonomous systems. Using such technologies will not only help to enhance privacy but will also ensure a higher level of information integrity.”

This brought me back to the article where a solution using PUF was mentioned.

Hardware Based Security

Friday, August 5th, 2016

Gabe Moretti, Senior Editor

If there is one thing that is obvious about the IoT market it is that security is essential.  IoT applications will be, if they are not already, invasive to the lives of their users, and the privacy of each individual must be preserved.  The European Union has stricter privacy laws than the US, but even in the US privacy is valued and protected.

Intrinsic-ID has published a white paper “SRAM PUF: The Secure Silicon Fingerprint” that you can read in the Whitepapers section of this emag, or you can go to www.intrinsic-id.com and read it under the “Papers” pull down.

For many years, silicon Physical Unclonable Functions (PUFs) have been seen as a promising and innovative security technology that was making steady progress. Today, Static Random-Access Memory (SRAM)-based PUFs offer a mature and viable security component that is achieving widespread adoption in commercial products. They are found in devices ranging from tiny sensors and microcontrollers to high performance Field-Programmable Gate Arrays (FPGAs) and secure elements where they protect financial transactions, user privacy, and military secrets.

Intrinsic-ID’s goal in publishing this paper is to show that the SRAM PUF is a mature technology for embedded authentication. The behavior of an SRAM cell depends on the difference between the threshold voltages of its transistors. Even the smallest differences will be amplified and push the SRAM cell into one of two stable states. Its PUF behavior is therefore much more stable than the underlying threshold voltages, making it the most straightforward and most stable way to use the threshold voltages to build an identifier.

It turns out that every SRAM cell has its own preferred state every time the SRAM is powered up, resulting from the random differences in the threshold voltages. This preference is independent of the preference of the neighboring cells and independent of the location of the cell on the chip or on the wafer.

Hence an SRAM region yields a unique and random pattern of 0’s and 1’s. This pattern can be called an SRAM fingerprint since it is unique per SRAM and hence per chip. It can be used as a PUF. Keys derived from the SRAM PUF are not stored ‘on the chip’ but are extracted ‘from the chip’ only when they are needed. In that way they are present in the chip only during a very short time window. When the SRAM is not powered, there is no key present on the chip, making the solution very secure.
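As a rough illustration of the idea (not Intrinsic-ID’s algorithm, which relies on helper data and error-correcting codes), the toy Python sketch below models an SRAM region whose cells have preferred power-up values plus a few noisy cells, stabilizes the readout by majority voting over several power-ups, and hashes the result into a device-unique key. All sizes and noise rates are invented for the example.

```python
import hashlib
import random

random.seed(42)

# Toy model of an SRAM region: each cell has a strong preferred power-up state,
# but a small fraction of cells are noisy and may flip between power-ups.
N_CELLS = 256
preferred = [random.getrandbits(1) for _ in range(N_CELLS)]   # the "silicon fingerprint"
noisy = set(random.sample(range(N_CELLS), 12))                # a handful of unstable cells


def power_up() -> list:
    """Simulate one power-up read of the SRAM start-up values."""
    return [b ^ 1 if (i in noisy and random.random() < 0.3) else b
            for i, b in enumerate(preferred)]


def extract_key(readings: int = 9) -> bytes:
    """Majority-vote several power-ups to remove noise, then hash into a key.
    (Real PUF engines use helper data plus error-correcting codes instead.)"""
    votes = [0] * N_CELLS
    for _ in range(readings):
        for i, b in enumerate(power_up()):
            votes[i] += b
    stable = bytes(int(votes[i] > readings // 2) for i in range(N_CELLS))
    return hashlib.sha256(stable).digest()   # 256-bit device-unique key


print(extract_key().hex())   # same chip -> same key; a different chip -> a different key
```

The point of the sketch is the property the white paper emphasizes: the key exists only while it is being reconstructed from the powered SRAM, and nothing key-related is ever programmed into non-volatile storage.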

Intrinsic-ID has bundled error correction, randomness extraction, security countermeasures and anti-aging techniques into a product called Quiddikey. This product extracts cryptographic keys from the SRAM PUF in a very secure manner and is available as Hardware IP (netlist), firmware (ANSI C Code), or a combination of these.

The hardware IP is small and fast – around 15K gates / 100K cycles – and connects to common interconnects like AMBA AHB and APB as well as proprietary interfaces. A Built-In Self-Test (BIST) and health checks are included in the logic. Since it is purely digital, single-clock logic, it synthesizes readily to any technology.  Software reference implementations start from 10KB of code and are available for major platforms like ARM, ARC, Intel and MIPS. Software implementations can be used to add PUF technology to existing products through a firmware upgrade.

I will deal with security issues in more depth in September.  In the meantime, the Intrinsic-ID white paper is worth your attention.

Synopsys’ Relaunched ARC Is Not The Answer

Wednesday, October 14th, 2015

Gabe Moretti, Senior Editor

During the month of September, Synopsys spent considerable marketing resources relaunching its ARC processor family by leveraging the IoT.  First, on September 10, it published a release announcing two additional versions of the ARC EM family of deeply embedded DSP cores.  Then, on September 15, the company held a free one-day ARC Processor Summit in Santa Clara, and on September 22 it issued another press release about its involvement in IoT, again mentioning the embARC Open Software Platform and ARC Access Program.  It is not clear that ARC will fare any better in the market after this effort than it did in the past.

Background

Almost ten years ago a company called ARC International LTD designed and developed a RISC processor called Argonaut RISC Core.  Its architecture has roots in the Super FX chip for the Super Nintendo Entertainment System.  In 2009 Virage Logic purchased ARC International.  Virage specialized in embedded test systems and was acquired by Synopsys in 2010.  This is how Synopsys became the owner of the ARC architecture, although it was just interested in the embedded test technology.

Since that acquisition, ARC has seen various developments that produced five product families, all within the DesignWare group.  Financial success of the ARC family has been modest, especially when compared with the much more popular product families in the company.  The EM family, where the two new products reside, is one of those five.  During this year’s DVCon, at the beginning of March, I had an interview with Joachim Kunkel, Sr. Vice President and General Manager of the Solutions Group at Synopsys, who is responsible, among other things, for the IP products.  We talked about the ARC family and how Synopsys had not yet found a way to use this core efficiently.  We agreed that IoT applications could benefit from such an IP, especially if well integrated with other DesignWare pieces and security software.

The Implementation

I think that the ARC family will never play a significant part in Synopsys revenue generation, even after this last marketing effort.

It seems clear to me that the IoT strategy is built on more viable corporate resources than just the ARC processor.  The two new cores are the EM9D and EM11D, which implement an enhanced version of the ARCv2DSP instruction set architecture, combining RISC and DSP processing with support for an XY memory system to boost digital signal processing performance while minimizing power consumption.  Synopsys claims that the cores are 3 to 5 times more efficient than the two previous similar cores, but the press release specifically avoids comparison with similar devices from other vendors.

When I read the data sheets of devices from possible competitors, I appreciate the wisdom of avoiding direct comparison.  Although the engineering work behind the two new cores seems quite good, there is only so much that can be done with a ten-year-old architecture.  ARC becomes valuable only if sold as part of a sub-system that integrates other Synopsys IP and the security products owned by the company.

It is also clear that those other resources will generate more revenue for Synopsys when integrated with other DSP processors from ARM, Intel, and maybe Apple or even Cadence.  ARC has been neglected for too long to be competitive by itself, especially when considering the IoT market.  ARC is best used at the terminals or data acquisition nodes.  Such nodes are highly specialized, small, and above all very price sensitive.  A variation of a few cents makes the difference between adoption and rejection.  This is not a market Synopsys is comfortable with.  Synopsys prefers to control a market by offering the best solution at a price it finds acceptable.

Conclusion

The ARC world will remain small.  Synopsys’ mark on the IoT will possibly be substantial, but certainly not because of ARC.

Cadence Introduced Tensilica Vision P5 DSP

Thursday, October 8th, 2015

Gabe Moretti, Senior Editor

DSP devices are indispensable in electronic products that deal with the outside environment.  Whether one needs to see, to touch, or in any way gather information from the environment, DSP devices are critical.  Improvements in their performance characteristics, therefore, have a direct impact not only on the capability of a circuit but, more importantly, on its level of competitiveness.  Cadence Design Systems has just announced the Cadence Tensilica Vision P5 digital signal processor (DSP), which it calls its flagship high-performance vision/imaging DSP core. Cadence claims that the new imaging and vision DSP core offers up to a 13X performance boost, with an average of 5X less energy usage on vision tasks compared to the previous-generation IVP-EP imaging and video DSP.

Jeff Bier, co-founder and president of Berkeley Design Technology, Inc. (BDTI) noted that: “There is an explosion in vision processing applications that require dedicated, efficient offload processors to handle the large streams of data in real time.  Processor innovations like the Tensilica Vision P5 DSP help provide the backbone required for increasingly complex vision applications.”

The Tensilica Vision P5 DSP core includes a significantly expanded and optimized Instruction Set Architecture (ISA) targeting mobile, automotive advanced driver assistance systems (or ADAS, which includes pedestrian detection, traffic sign recognition, lane tracking, adaptive cruise control, and accident avoidance) and Internet of Things (IoT) vision systems.

“Imaging algorithms are quickly evolving and becoming much more complex – particularly in object detection, tracking and identification,” stated Chris Rowen, CTO of the IP Group at Cadence. “Additionally, we are seeing a lot more integrated systems with multiple sensor types, feeding even more data in for processing in real time. These highly complex systems are driving us to provide more performance in our DSPs than ever before, at even lower power. The Tensilica Vision P5 DSP is a major step forward to meeting tomorrow’s market demands.”

Modern electronic system architectures treat hardware and software with the same amount of attention.  They must balance each other in order to achieve the best possible execution while minimizing development costs.  The Tensilica Vision P5 DSP further improves the ease of software development and porting, with comprehensive support for integer, fixed-point and floating-point data types and an advanced toolchain with a proven, auto-vectorizing C compiler. The software environment also features complete support of the standard OpenCV and OpenVX libraries for fast, high-level migration of existing imaging/vision applications, with over 800 library functions.

The Tensilica Vision P5 DSP is specifically designed for applications requiring ultra-high memory and operation parallelism to support complex vision processing at high resolution and high frame rates. As such, it allows off-loading vision and imaging functions from the main CPU to increase throughput and reduce power. End-user applications that can benefit from the DSP’s capabilities include image and video enhancement, stereo and 3D imaging, depth map processing, robotic vision, face detection and authentication, augmented reality, object tracking, object avoidance and advanced noise reduction.

The Tensilica Vision P5 DSP is based on the Cadence Tensilica Xtensa architecture, and combines flexible hardware choices with a library of DSP functions and numerous vision/imaging applications from our established ecosystem partners. It also shares the comprehensive Tensilica partner ecosystem for other applications software, emulation and probes, silicon and services and much more.  The Tensilica Vision P5 core includes these new features:

  • Wide 1024-bit memory interface with SuperGather technology for maximum performance on the complex data patterns of vision processing
  • Up to 4 vector ALU operations per cycle, each with up to 64-way data parallelism
  • Up to 5 instructions issued per cycle from 128-bit wide instruction delivering increased operation parallelism
  • Enhanced 8-, 16- and 32-bit ISA tuned for vision/imaging applications
  • Optional 16-way IEEE single-precision vector floating-point processing unit delivering a massive 32GFLOPs at 1GHz

Synopsys to Acquire Codenomicon

Wednesday, April 22nd, 2015

Gabe Moretti, Senior Editor

After what many thought was a diversion of focus when Synopsys acquired Coverity, the company is making another bold move with the announced acquisition of Codenomicon.

Based in Finland, Codenomicon is well-known and highly respected in the global software security world with a focus on software embedded in chips and devices.

The official Synopsys release states: “The additional talent, technology and products will expand Synopsys’ presence in the software security market segment and extend the Coverity quality and security platform to help software developers throughout various organizations quickly find and fix security vulnerabilities and protect applications from security attacks.”

A fine thought, and certainly true.  But looking at the security problems in the IoT architecture, those already found and those yet to be written about, I think that Synopsys should not minimize the impact that the technologists at Codenomicon will have on the EDA market.

“Businesses are increasingly concerned about the security of their applications and protecting customer data. Adding the Internet of Things to the mix increases the complexity of security even further. During the past 15 months, the world was hit by major security breaches such as Heartbleed, Shellshock, etc.,” said Chi-Foon Chan, president and co-CEO of Synopsys. “By combining the Coverity platform with the Codenomicon product suite, Synopsys will expand its reach to provide a more robust software security solution with a full set of tools to help ensure the integrity, privacy and safety of an organization’s most critical software applications.”

Codenomicon’s customer base includes some of the world’s leading organizations in telecommunications, finance, manufacturing, software development, healthcare, automotive and government agencies.  As part of Synopsys, Codenomicon’s solutions will deliver a more comprehensive security offering for the software development lifecycle by adding its Defensics tool for file and protocol fuzz testing, and its AppCheck tool for software composition analysis and vulnerability assessment, to the embedded software used in electronics systems.

The Codenomicon Defensics tool, used to discover the Heartbleed bug, automatically tests the target system for unknown vulnerabilities, helping developers find and fix them before a product goes to market. It is a systematic solution for making systems more robust, hardening them against cyber-attacks, and mitigating the risk of 0-day vulnerabilities. The Defensics tool also helps expose failed cryptographic checks, privacy leaks or authentication-bypass weaknesses. It is heavily used by buyers of Internet-enabled products to validate and verify that procured products meet their stringent security and robustness requirements.
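The core idea behind such fuzz testing can be sketched in a few lines of Python: start from a known-good input, apply random mutations, and watch the target for crashes or other misbehavior. The toy parser and mutation operators below are invented for illustration and bear no relation to Defensics’ actual engine.

```python
import random

def parse_packet(data: bytes) -> str:
    """Stand-in for the code under test: a naive parser with hidden robustness bugs."""
    length = data[0]
    payload = data[1:1 + length]
    return payload.decode("ascii")        # fails on non-ASCII bytes or malformed input

def mutate(seed: bytes) -> bytes:
    """Randomly flip, insert or truncate bytes of a known-good input."""
    data = bytearray(seed)
    for _ in range(random.randint(1, 4)):
        op = random.choice(("flip", "insert", "truncate"))
        if op == "flip" and data:
            data[random.randrange(len(data))] ^= 1 << random.randrange(8)
        elif op == "insert":
            data.insert(random.randrange(len(data) + 1), random.randrange(256))
        elif op == "truncate" and len(data) > 1:
            del data[random.randrange(len(data)):]
    return bytes(data)

seed = bytes([5]) + b"hello"              # a valid packet: length byte + payload
for i in range(10_000):
    case = mutate(seed)
    try:
        parse_packet(case)
    except Exception as exc:              # a real fuzzer also watches for hangs and restarts
        print(f"input {case!r} triggered {type(exc).__name__}: {exc}")
        break
```

Commercial fuzzers add what this sketch lacks: protocol-aware test generation, instrumentation of the target, and reporting that ties each failure back to a reproducible test case.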

The Codenomicon AppCheck tool adds software composition analysis (SCA) capabilities to the Coverity platform, helping customers reduce risks in third-party and open source components. When using the AppCheck tool, customers are able to obtain a software bill of materials (BOM) for their application portfolios, and identify components with known vulnerabilities.
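The essence of software composition analysis can likewise be sketched simply: extract a bill of materials and match each component version against a feed of known-vulnerable versions. The component names and advisory identifiers in this Python sketch are hypothetical; a real tool such as AppCheck works from curated vulnerability databases and fingerprints the binaries themselves.

```python
# Hypothetical vulnerability feed: component name -> {affected version: advisory id}
KNOWN_VULNERABILITIES = {
    "examplelib":   {"1.0.1": "ADVISORY-0001"},
    "samplecrypto": {"2.3.0": "ADVISORY-0002"},
}

# Bill of materials extracted from a (hypothetical) application build
BOM = [
    ("examplelib", "1.0.1"),
    ("samplecrypto", "2.4.1"),
    ("otherlib", "0.9.0"),
]

def check_bom(bom):
    """Flag every component whose exact version appears in the advisory feed."""
    findings = []
    for name, version in bom:
        advisory = KNOWN_VULNERABILITIES.get(name, {}).get(version)
        if advisory:
            findings.append((name, version, advisory))
    return findings

for name, version, advisory in check_bom(BOM):
    print(f"{name} {version} has a known vulnerability ({advisory}) - upgrade or patch")
```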