
Posts Tagged ‘Aldec’


Blog Review – Monday, October 24, 2016

Monday, October 24th, 2016

The how, what and why of time-of-flight sensors; Conference season: ARM TechCon 2016 and IoT Solutions Congress; Save time on big data analysis; In praise of FPGAs; Is it time for augmented and virtual reality?

Drastically reducing big data analysis time is music to a data scientist’s ears. Larry Hardesty reports that researchers at MIT (Massachusetts Institute of Technology) have presented an automated system that can reduce data preparation and analysis from months to just hours.

Keeping an eye on the nation’s bank vaults, Robert Vamosi, Synopsys, looks at what bank regulators are doing to ramp up cybersecurity.

If you can’t head to Barcelona, Spain this week for IoT Solutions World Congress (October 25-27), Jonathan Ballon, Intel, reveals what the company will unveil, including a keynote: IoT: From Hype to Reality, what 5G means, smart cities and a hackathon.

Tired of the buzz, and seeking enlightenment, Jeff Bier, Berkeley Design, delves into just what augmented reality and virtual reality are. He examines hardware and software, markets and what is needed for widespread adoption.

Closer to home, at 2016 ARM TechCon in Santa Clara, California (October 25-27), Phil Brumby, Mentor Graphics, offers a heads-up on the company’s industrial robot demo, using Nucleus RTOS separated by ARM TrustZone, and the ECU (Electronic Control Unit) demo in a Linux-hosted In-Vehicle Infotainment (IVI) system. There is also a technical session: Making Sure your UI makes the most of the ARM-based SoC (Thurs, 10.30am, Ballroom E).

The role of memory is reviewed by Paul McLellan, Cadence Design Systems, as he discusses MemCon keynotes by Hugh Durdan, VP of the IP Group, and Steve Pawlowski, VP of Advanced Computing Solutions at Micron. There is a comprehensive look at pricing strategy and industry trends.

A teardown of the Apple iPhone 7, by Dick James, Chipworks, links STMicroelectronics’ time-of-flight sensors with the Starship Enterprise. The blog has comprehensive answers to questions such as what these sensors are and why they are in phones.

If the IoT is flexible, Zibi Zalewski, Aldec, argues, then FPGAs can tailor solutions without major investments in an ASIC. He takes Xilinx’s Zynq-7000 All-Programmable SoC as a starting point and illustrates how it can boost performance for IoT gateways.

Elegantly illustrating how multiple Eclipse projects can be run on a single microcontroller with MicroEJ, Charlottem, ARM, runs through a connected washing machine that can communicate via Bluetooth, MQTT, Z-Wave and LWM2M.

Caroline Hayes, Senior Editor

Blog Review Monday, August 29, 2016

Monday, August 29th, 2016

This week’s blogs are futuristic, with machine learning, from Intel, augmented reality from Synopsys, smart city software from Dassault Systèmes, questions and answers about autonomous vehicles, and security issues, around devices and MQTT on the IoT.

Artificial intelligence is the next great wave, predicts Lenny Tran, Intel. His post looks at machine learning and how Intel’s High Performance Computing architecture is part of the way forward.

On a similar theme, Hezi Saar, Synopsys, examines the Microsoft 28nm SoC and is impressed with the possibilities that the HoloLens Processing Unit holds for the developing augmented reality marketplace.

If you are dissatisfied with your present office location, Dassault Systèmes has plans for smart facilities, reports Akio. He describes some illuminating projects using 3D Experience City, real-time monitoring, the IoT and systems operations for a comfortable workspace in smart cities.

It’s all about teamwork according to Brandon Wade, Aldec, who offers an introduction to the AXI protocol. His post summarizes the protocol specifications and shares his revelation of how understanding the protocol opens up a world of design possibilities.

Autonomous cars are occupying a lot of the time of Eamonn Ahearne, ON Semiconductor. Living in the hotbed of self-drive testing, he reads, visits and analyses what is happening, and is disappointed that hardware is being eclipsed by software in the popularity stakes.

Also occupied with autonomous vehicles, Andrew Macleod, Mentor Graphics, starts with an update on electric vehicles, and moves on to the disconnect between ADAS technologies and autonomous vehicles and the engineering challenges that can be addressed using a single ECU (Electronic Control Unit).

Attending the Linley Mobile & Wearables Conference, Paul McLellan, Cadence Design Systems, pays attention to Asaf Ashkenazi of Cryptography Research (now part of Rambus) and his well-illustrated post reports how devices can be secured.

An IoT network, powered by the ISO/IEC PRF 20922 standard MQTT (MQ Telemetry Transport) can be at risk, warns Wilfred Nilsen, ARM. It is a sound warning about personal information being vulnerable to MQTT brokers. Luckily, he offers a solution, introducing the SMQ IoT protocol.

Caroline Hayes, Senior Editor

Blog Review – Monday, May 16, 2016

Monday, May 16th, 2016

Ramifications for Intel; Verification moves to ASIC; Connected cars; Deep learning is coming; NXP TFT preview

Examining the industry’s transition to 5G, Dr. Venkata Renduchintala, Intel, describes the revolution of connectivity, why the company is shifting its SoC focus and how it will exploit its ecosystem.

Coming from another angle, Chris Ciufo, Intel Embedded, assesses the impact of the recently announced changes at Intel, including the five pillars designed to support the company (data center, memory, FPGAs, IoT and 5G), with his thoughts on what it has in its arsenal to achieve the new course.

As FPGA verification flows move closer to those of ASICs, Dr. Stanley Hyduke, Aldec, looks at why the company has extended its verification tools for digital ASIC design, including the steps involved.

Software in vehicles is a sensitive topic for some, since the VW emissions scandal, but Synopsys took the opportunity of the Future Connect Cars Conference in Santa Clara, to highlight its Software Integrity Platform. Robert Vamosi, Synopsys, reports on some of the presentations at the event on the automotive industry.

Identifying excessive blocking in sequential programming as evil, Miro Samek, ARM, writes a spirited and interesting blog on real-time design strategy and the need to keep it flexible, from the earliest stages.

Santa Clara also hosted the Embedded Vision Summit, and Chris Longstaff, Imagination Technologies, writes about deep learning on mobile devices. He notes that Cadence Design Systems highlighted the increase in the number of sensors in devices today, and Google Brain’s Jeff Dean talked about the use of deep learning via GoogLeNet Inception architecture. The blog also includes examples of Convolutional Neural Networks (CNN) and how PowerVR mobile GPUs can process the complex algorithms.

This week, NXP FTF (Freescale Technology Forum), in Austin, Texas, is previewed by Ricardo Anguiano, Mentor Graphics. He looks at a demo from the company, where a simultaneous debug of a patient monitoring system runs Nucleus RTOS on the ARM Cortex-M4. He hints at what attendees can see using Sourcery CodeBench with ARM processors, with a link to heterogeneous solutions from the company.

Caroline Hayes, Senior Editor

Blog Review – Monday, March 21, 2016

Monday, March 21st, 2016

Coffee breaks and C layers; Ideas for IoT security; Weather protection technology; Productivity boost; Shining a light on dark silicon

Empathizing with his audience, Jacek Majkowski sees the need for coffee but not necessarily a C layer in the Standard Co-Emulation Modeling Interface (SCE-MI).

At last week’s Bluetooth World, in Santa Clara, CA, there was a panel discussion – Is the IoT hype or hope? Brian Fuller, ARM, reports on the to-and-fro of ideas from experts from ARM and Google, moderated by Mark Powell, executive director of the Bluetooth SIG.

Of all the things to do on a sabbatical, Matt Du Puy, ARM, chose to climb Dhaulagiri (26,795 feet / 8,167m), described as one of the most dangerous 8,000m mountains. Brian Fuller, ARM, reports that he is armed with a GPS watch with cached terrain data and some questionable film choices on a portable WiDi disk station.

Still with extremes of weather, the Atmel team enthuses about a Kickstarter project for the Oombrella, a smart umbrella that uses sensors to analyse temperature, pressure, humidity and light, to let you know if you will need it because rain is coming your way. Very clever, as long as you remember to bring it with you. Not so appealing is the capacity to share via social media the type of weather you are experiencing – and they say the Brits are obsessed with the weather!

IoT protection is occupying an unidentified blogger at Rambus, who longs for a Faraday cage to shield it. The blog has some interesting comments about the makeup of, and security measures for, the IoT, while promoting the company’s CryptoManager.

Still with IoT security, Ricardo Anguiano, Mentor Graphics, examines a gateway using ARM TrustZone and heterogeneous operating system configurations, running Nucleus RTOS and Mentor Embedded Linux. There is a link provided to the Secure Converged IoT Gateway and the complete end-to-end IoT solution.

Europe is credited as the birthplace of the Workplace Transformation by Thomas Garrison, Intel. Ahead of CeBIT, he writes about the role of Intel’s 6th Generation Core vPro processor and what it could mean for a PC’s battery life, compute performance and the user’s productivity.

The prospects for MIPI and future uses in wearables, machine learning, virtual reality and automotive ADAS are uppermost in the mind of Hezi Saar, Synopsys, following MIPI Alliance meetings. He was particularly taken with a Movidius vision processing unit, and includes a video in the blog.

Examining dark silicon, Paul McLellan, Cadence Design Systems, wonders what will supersede Dennard Scaling to overcome the limitations on power on large SoCs.

Caroline Hayes, Senior Editor

Blog Review – Monday, January 11, 2016

Monday, January 11th, 2016

In this week’s review, as one blog has predictions for what 2016 holds, another reviews 2015. Others cover an autonomous flight drone; a taster of DesignCon 2016 and a bionic leg development.

Insisting it’s not black magic or fortune telling but a retelling of notes from past press announcements, Dick James, Chipworks, thinks 2016 will be a year of mixed fortunes, with a low profile for leading edge processes and plenty of activity in memory and sensors as the sectors reap the rewards of developments being realized in the marketplace.

Looking back on 2015, Tom De Schutter, Synopsys, is convinced that the march of software continues and world domination is but a clock cycle away. His questions prompted some interesting feedback on challenges, benefits and working lives.

Looking ahead to autonomous drone flight, Steve Leibson, Xilinx, reports on the beta release of Aerotenna’s OCPoC (Octagonal Pilot on Chip) ready-to-fly drone controller, based on a Zynq Z-7010 All Programmable SoC with integrated IMU (inertial measurement unit) sensors and GPS receiver.

Bigger isn’t always better, explains Doug Perry, Doulos, in a guest blog for Aldec. As well as outlining the issues facing those verifying larger FPGAs, he provides a comprehensive, and helpful, checklist to tackle this increasingly frequent problem, while throwing in a plug for two webinars on the subject.

Some people have barely unpacked from CES, and ANSYS is already preparing for DesignCon 2016. Margaret Schmitt previews the company’s plan for ‘designing without borders’ and what, and who, can be seen there.

A fascinating case study is related by Karen Schulz, Gumstix, on the ARM Community blog site. The Rehabilitation Institute of Chicago (RIC) has developed the first neural-controlled bionic leg, without using nerve redirection surgery or implanted sensors. The revolution is powered by the Gumstix Overo Computer-on-Module.

Showing empathy for engineers struggling with timing closure, Joe Hupcey III, Mentor Graphics, has some sound advice and diagnoses CDC problems. It’s not as serious as it sounds: CDC, or clock domain crossing, can be addressed with the IEEE 1801 low-power standard. Just what the doctor ordered.

Caroline Hayes, Senior Editor

Blog Review – Monday, November 16, 2015

Monday, November 16th, 2015

ARM TechCon 2015 highlights: IoT, mbed and magic; vehicle monitoring systems; the road ahead for automotive design

It’s crunch time for IoT, announced ARM CEO Simon Segars at ARM TechCon. Christine Young, Cadence, reports on what Segars believes is needed to get the IoT right.

Posing as a ‘booth babe’, Richard Solomon, Synopsys, was also at ARM TechCon demonstrating the latest iteration of DesignWare IP for PCI Express 4.0. As usual, there are pictures illustrating some of the technology, this time around switch port IP and Gen2 PCI, and quirky pictures from the show floor, to give readers a flavor of the event.

Tracking the progress of mbed OS, Chris Ciufo, eecatalog, prowled the mbed Zone at this year’s ARM TechCon, finding IoT ‘firsts’ and updates of wearables.

Enchanted by IoT, Eric Gowland, ARM, found ARM TechCon full of wonder and magic – or, to paraphrase Arthur C. Clarke, technology that was indistinguishable from magic. There are some anecdotes from the event – words and pictures – of how companies are using the cloud and the IoT and inspiring the next generation of magicians.

Spotting where Zynq devices are used in booth displays might become an interesting distraction when I am visiting some lesser shows in future. I got the idea from Steve Leibson, Xilinx, who happened upon the Micrium booth at ARM TechCon where one was being used; stopping to investigate, he found out about free μC/OS for Makers.

Back to Europe, where DVCon Europe was held in Munich, Germany (November 11-12). John Aynsley, Doulos, was pleased that UVM is alive and well and that companies like Aldec are realising that help and support is needed.

Identifying the move from behavior-based driver monitoring systems to inward-looking, camera-based systems, John Day, Mentor Graphics, looks at what this use of sensors will mean for automakers who want to combine value and safety features. Deciding how many functions to offer will be increasingly important for automakers, he advises.

Still with the automotive industry, Tomvanvu, Atmel, addresses anyone designing for automotive embedded systems and looks at what is driving progression toward the inevitable self-driving cars.

Caroline Hayes, Senior Editor

Blog Review – Monday, September 28, 2015

Monday, September 28th, 2015

ARM Smart Design competition winners; Nordic Semiconductor Global Tour details; Emulation alternative; Bloodhound and bridge-building drones; Imagination Summit in Taiwan; Monolithic 3D ‘game changer’; Cadence and collaboration; What size is wearable technology?

Winners of this year’s ARM Smart Product Design competition had no prior experience of using ARM tools, yet managed, in just three months, to produce a sleep apnea observer app (by first prize winner, Clemente di Caprio), an amateur radio satellite finder, a water meter, an educational platform for IoT applications and a ‘CamBot’ camera-equipped robot, marvels Brian Fuller, ARM.

This year’s Nordic Semiconductor Global Tech Tour will start next month, and John Leonard, ARM, has details of how to register and more about this year’s focus – the nRF52 Series Bluetooth Smart SoC.

Offering an alternative to the ‘big box’ emulation model, Doug Amos, Aldec, explains FPGA-based emulation.

Justin Nescott, Ansys, has dug out some great stories from the world of technology, including the UK’s Bloodhound project and the sleek vehicle’s speed record attempt, and a story published by Gizmag about how drones created a bridge – with video proof that it is walkable.

A review of the 2015 Imagination Summit in Taiwan earlier this month is provided by Vicky Hewlett. The report includes some photos from the event, of attendees and speakers at Hsinchu and Taipei.

It is with undeniable glee that Zvi Or-Bach, MonolithIC 3D, announces that the company has been invited to a panel session titled “Monolithic 3D: Will it Happen and if so…” at the IEEE 3D-Test Workshop, Oct. 9th, 2015. It is not all about the company, but a discussion of the technology challenge and the teaser of the unveiling of a ‘game changer’ technology.

A review of the TSMC Open Innovation Platform (OIP) Ecosystem Forum, held earlier this month, is presented in the blog by Christine Young, Cadence. There are some observations from Rick Cassidy, TSMC North America, on automotive, IoT and foundry collaboration.

How big is wearable, ponders Ricardo Anguiano, Mentor Graphics. Unwrapping a development kit, he provides a link to Nucleus RTOS and wearable devices to help explain what’s wearable and what’s not.

Graham Bell, RealIntent, offers a brief history of Calypto Design Systems, recently acquired by Mentor Graphics, and considers what the change of ownership means for existing partners.

Beginning a mini series of blogs about the HAPS-80 with ProtoCompiler, Michael Posner, Synopsys, starts with a focus on the design flow and time constraints. He provides many helpful illustrations. (The run-on piece about a visit to the tech museum in Shanghai shows how he spends his free time: seeking out robots!)

Caroline Hayes, Senior Editor

Blog Review – Monday, June 08, 2015

Monday, June 8th, 2015

DAC duo announce DDA; Book a date for DAC with ARM, Ansys, Cadence; Synopsys and Xilinx; True FPGA-based verification

Announcing a partnership with Cadence Design Systems at DAC 2015, Dennis Brophy, Mentor Graphics, teases with some details of the Debug Data API (DDA). Full details will be unveiled at a joint presentation at the Verification Academy Booth (2408) on Tuesday at 5pm.

Amongst demonstrations of an IoT sub-system for Cortex-M processors, ARM will show a new IP tooling suite and the ARM Cordio radio core IP. There will be over a dozen partners in the Connected Community Pavilion, reports Brenda Westcott, ARM, as well as the ARM Scavenger Hunt. (DAC June 7-11, ARM booth 2428).

As if justifying its place at DAC 2015, Ravi Ravikumar, Ansys, explains how the show has evolved beyond EDA for SoCs. The company will host videos on automotive, IoT and mobile, and presentations from foundry partners. (DAC June 7-11, Ansys booth 1232).

If you are interested in the continuum of verification engines, DAC is the place to be this week. Frank Schirrmeister, Cadence, summarizes the company’s offerings to date, with a helpful link to a COVE (Continuum of Verification Engines) article, and provides an overview of some of the key verification sessions at the Moscone Center. (DAC June 7 – 11, Cadence booth 3515).

Back with the HAPS FPGA prototyping system, Michael Posner, Synopsys, invites visitors to DAC to come see the Xilinx UltraScale VU440-based HAPS. As well as proudly previewing the hardware software development support, he also touches on the difficulties of mapping ASICs to FPGAs.

More Xilinx-DAC news, as Doug Amos’s guest blog at Aldec announces the era of true FPGA-based verification. He believes the end of big-box emulation is nigh, following the adoption of Xilinx’s Virtex UltraScale devices in its HES-7 (Hardware Emulation Solution, seventh generation) technology.

Caroline Hayes, Senior Editor

EDA in the year 2017 – Part 2

Tuesday, January 17th, 2017

Gabe Moretti, Senior Editor

The first part of the article, published last week, covered design methods and standards in EDA, together with industry predictions that impacted all of our industry.  This part will cover automotive, design verification and FPGAs.  I found it interesting that David Kelf, VP of Marketing at OneSpin Solutions, thought that machine learning would begin to penetrate the EDA industry as well.  He stated: “Machine Learning hit a renaissance and is finding its way into a number of market segments. Why should design automation be any different?  2017 will see the start of machine learning being used to create a new breed of design automation tool, equipped with this technology and able to configure itself for specific designs and operations to perform them more efficiently. By adapting algorithms to suit the input code, many interesting things will be possible.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence, touched on an issue that is being talked about more recently: security.  He noted that: “In 2016, IoT bot-net attacks brought down large swaths of the Internet – the first time the security impact of IoT was felt by many. Private and nation-state attacks compromised personal/corporate/government email throughout the year.

In 2017, we have the potential for security concerns to start a retreat from always-on social media and a growing value on private time and information. I don’t see a silver bullet for security on our horizon. Instead, I anticipate an increasing focus for products to include security managers (like their safety counterparts) on the design team and to consider safety from the initial concept through the design/production cycle.”

Figure 1.  Just one of the many electronics systems found in an automobile (courtesy of Mentor)

Automotive

The automotive industry has increased its use of electronics year over year for a long time.  At this point an automobile is a true intelligent system, at least as far as what the driver and passengers can see and hear: the “infotainment system”.  Late model cars also offer collision avoidance and stay-in-lane functions, but more is coming.

Here is what Wally Rhines thinks: “Automotive and aerospace designers have traditionally been driven by mechanical design.  Now the differentiation and capability of cars and planes is increasingly being driven by electronics.  Ask your children what features they want to see in a new car.  The answer will be in-vehicle infotainment.  If you are concerned about safety, the designers of automobiles are even more concerned.  They have to deal with new regulations like ISO 26262, as well as other capabilities, in addition to environmental requirements and the basic task of “sensor fusion” as we attach more and more visual, radar, laser and other sensors to the car.  There is no way to reliably design vehicles and aircraft without virtual simulation of electrical behavior.

In addition, total system simulation has become a requirement.  How do you know that the wire bundle will fit through the hole in the door frame?  EDA tools can tell you the answer, but only after seeking out the data from the mechanical design.  Wiring in a car or plane is a three dimensional problem.  EDA tools traditionally worry about two dimension routing problems.  The world is changing.  We are going to see the basic EDA technology for designing integrated circuits be applied to the design of systems. Companies that can remain at the leading edge of IC design will be able to apply that technology to systems.”

David Kelf, VP of Marketing at OneSpin Solutions, observed: “OneSpin called it last year and I’ll do it again –– Automotive will be the “killer app” of 2017. With new players entering the market all the time, we will see impressive designs featured in advanced cars, which themselves will move toward a driverless future.  All automotive designs currently being designed for safety will need to be built to be as secure as possible. The ISO 26262 committee is working on security as well as safety, and I predict security will feature in the standard in 2017. Tools to help predict vulnerabilities will become more important. Formal, of course, is the perfect platform for this capability. Watch for advanced security features in formal.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence, noted: “In 2016, autonomous vehicle technology reached an inflection point. We started seeing more examples of private companies operating at SAE Level 3 in America and abroad (Singapore, Pittsburgh, San Francisco).  We also saw active participation by the US and world governments to help guide tech companies in the proliferation and safety of the technology (e.g. US DOT V2V/V2I standard guidelines, and creating federal ADAS guidelines to prevent state-level differences). Probably the most unique example was the first drone delivery by a major retailer, something which was hinted at three years prior and seemed just a flight of fancy then.

Looking ahead to 2017, both the breadth and depth are expected to expand, including the first operation of SAE Level 4/5 in limited use on public streets outside the US, and on private roads inside the US. Outside of ride sharing and city driving, I expect to see the increasing spread of ADAS technology to long distance trucking and non-urban transportation. To enable this, additional investments from traditional vehicle OEMs partnering with both software and silicon companies will be needed to enable high levels of autonomous functions. To help bring these to reality, I also expect the release of new standards to guide both the functional safety and reliability of automotive semiconductors. Even though the pace of government standards can lag, for ADAS technology to reach its true potential, it will require both standards and innovation.”

FPGA

The IoT market is expected to provide a significant opportunity for the electronics industry to grow revenue and open new markets.  I think the use of FPGAs in IoT devices will increase the adoption of these devices in system design.

I asked Geoff Tate, CEO of FlexLogix, his opinions on the subject.  He offered four points that he expects to become reality in 2017:

1. the first customer chip will be fabricated using embedded FPGA from an IP supplier

2. the first customer announcements will be made of customers adopting embedded FPGA from an IP supplier

3. embedded FPGAs will be proven in silicon running at 1GHz+

4. the number of customers doing chip design using embedded FPGA will go from a handful to dozens.

Zibi Zalewski, Hardware Division General Manager at Aldec also addressed the FPGA subject.

“I believe FPGA devices are an important technology player to mention when talking about what to expect in 2017. With the growth of embedded electronics driven by the Automotive, Embedded Vision and/or IoT markets, FPGA technology becomes a core element, particularly in products that require low power and re-programmability.

Features of FPGA such as pipelining and the ability to execute and easily scale parallel instances of the implemented function allow for the use of FPGA for more than just the traditionally understood embedded markets. FPGA computing power usage is exploding in High Performance Computing (HPC), where FPGA devices are used to accelerate different scientific algorithms and big data processing, and to complement CPU-based data centers and clouds. We can’t talk about FPGA these days without mentioning SoC FPGAs, which merge the microprocessor (quite often ARM) with reprogrammable space. Thanks to such configurations, it is possible to combine the software and hardware worlds in one device with the benefits of both.

All those activities have led to solid growth in FPGA engineering, which is pushing further growth of FPGA development and verification tools. This includes not only typical solutions in simulation and implementation. We should also observe solid growth in tools and services simplifying the usage of FPGA for those who don’t even know this technology, such as high-level synthesis or engineering services to port C/C++ sources into FPGA-implementable code. The demand for development environments like compilers supporting both software and hardware platforms will only grow, with the main goal focused on ease of use by a wide group of engineers who were not even considering the FPGA platform for their target application.

At the other end of the FPGA rainbow are the fast-growing, largest FPGAs, offered by both Xilinx and Intel/Altera. ASIC design emulation and prototyping will push harder and harder on the so-called big-box emulators, offering higher performance and significantly lower price per gate and so becoming more affordable for even smaller SoC projects. This is especially true when partnered with high quality design mapping software that handles multi-FPGA partitioning, interconnections, clocks and memories.”

Figure 2. Verification can look like a maze at times

Design Verification

There are many methods to verify a design and companies will, quite often, use more than one on the same design.  Each method – simulation, formal analysis, and emulation – has its strong points.

For many years, logic simulation was the only tool available, although hardware acceleration of logic simulation was also available.

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence, submitted a thorough analysis of verification issues.  He wrote: “From a verification perspective, we will see further market specialization in 2017 – mobile, server, automotive (especially ADAS) and aero/defense markets will further create specific requirements for tools and flows, including ISO 26262 TCL1 documentation and support for other standards. The Internet of Things (IoT) with its specific security and low power requirements really runs across application domains.  Continuing the trend in 2016, verification flows will continue to become more application-specific in 2017, often centered on specific processor architectures. For instance, verification solutions optimized for mobile applications have different requirements than for servers and automotive applications or even aerospace and defense designs. As application-specific requirements grow stronger and stronger, this trend is likely to continue going forward, but cross-impact will also happen (like mobile and multimedia on infotainment in automotive).

Traditionally ecosystems have been centered on processor architectures. Mobile and Server are key examples, with their respective leading architectures holding the lion’s share of their respective markets. The IoT is mixing this up a little as more processor architectures can play and offer unique advantages, with configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even open-source hardware architectures look like they will be very relevant, judging from the recent momentum, which eerily reminds me of the early Linux days. It’s one of the most entertaining spaces to watch in 2017 and for years to come.

Verification will become a whole lot smarter. The core engines themselves continue to compete on performance and capacity. Differentiation further moves into how smart the applications running on top of the core engines are, and how smartly the engines are used in conjunction.

For the dynamic engines in software-based simulation, the race towards increased speed and parallel execution will accelerate together with flows and methodologies for automotive safety and digital mixed-signal applications.

In the hardware emulation world, differentiation for the two basic ways of emulating – processor-based and FPGA-based – will be more and more determined by how the engines are used. Specifically, the various use models for core emulation, like verification acceleration, low power verification, dynamic power analysis and post-silicon validation – often driven by the ever-growing software content – will extend further, with more virtualization joining real world connections. Yes, there will also be competition on performance, which clearly varies between processor-based and FPGA-based architectures – depending on design size and how much debug is enabled – as well as the versatility of use models, which determines the ROI of emulation.

FPGA-based prototypes address the designer’s performance needs for software development, using the same core FPGA fabrics. Therefore, differentiation moves into the software stacks on top, and the congruency between emulation and FPGA-based prototyping, using multi-fabric compilation, allows mapping the same design into both emulation and FPGA-based prototyping.

All this is complemented by smart connections into formal techniques and cross-engine verification planning, debug and software-driven verification (i.e. software becoming the test bench at the SoC level). Based on standardization driven by the Portable Stimulus working group in Accellera, verification reuse between engines and cross-engine optimization will gain further importance.

Besides horizontal integration between engines – virtual prototyping, simulation, formal, emulation and FPGA-based prototyping – the vertical integration between abstraction levels will become more critical in 2017 as well. For low power specifically, activity data created from RTL execution in emulation can be connected to power information extracted from .lib technology files using gate-level representations or power estimation from RTL. This allows designers to estimate hardware-based power consumption in the context of software using deep cycles over longer timeframes that are emulated.”

Anyone who knows Frank will not be surprised by the length of the answer.

Wally Rhines, Chairman and CEO of Mentor Graphics, was less verbose, returning to his earlier theme that total system simulation has become a requirement and that the basic EDA technology for designing integrated circuits will be applied to the design of systems.

“This will create a new market for EDA.  It will be larger than the traditional IC design market for EDA.  But it will be based upon the basic simulation, verification and analysis tools of IC design EDA.  Sometime in the near future, designers of complex systems will be able to make tradeoffs early in the design cycle by using virtual simulation.  That know-how will come from integrated circuit design.  It’s no longer feasible to build prototypes of systems and test them for design problems.  That approach is going away.  In its place will be virtual prototyping.  This will be made possible by basic EDA technology.  Next year will be a year of rapid progress in that direction.  I’m excited by the possibilities as we move into the next generation of electronic design automation.”

The increasing size of chips has made emulation a more popular tool than in the past.  Lauro Rizzatti, Principal at Lauro Rizzatti LLC, is a pioneer in emulation and continues to be thought of as a leading expert in the method.  He noted: “Expect new use models for hardware emulation in 2017 that will support traditional market segments such as processor, graphics, networking and storage, and emerging markets currently underserved by emulation –– safety and security, along with automotive and IoT.

Chips will continue to be bigger and more complex, and include an ever-increasing amount of embedded software. Project groups will increasingly turn to hardware emulation because it’s the only verification tool to debug the interaction between the embedded software and the underlying hardware. It is also the only tool capable of estimating power consumption in a realistic environment, when the chip design is booting an OS and processing software apps. More to the point, hardware emulation can thoroughly test the integrity of a design after the insertion of DFT logic, since it can verify gate-level netlists of any size, a virtually impossible task with logic simulators.

Finally, its move to the data center solidifies its position as a foundational verification tool that offers a reasonable cost of ownership.”

Formal verification tools, sometimes referred to as “static analysis tools” have seen their use increase year over year once vendors found human interface methods that did not require a highly-trained user.  Roger Sabbagh, VP of Application Engineering at Oski Technology pointed out: “The world is changing at an ever-increasing pace and formal verification is one area of EDA that is leading the way. As we stand on the brink of 2017, I can only imagine what great new technologies we will experience in the coming year. Perhaps it’s having a package delivered to our house by a flying drone or riding in a driverless car or eating food created by a 3-D printer. But one thing I do know is that in the coming year, more people will have the critical features of their architectural design proven by formal verification. That’s right. System-level requirements, such as coherency, absence of deadlock, security and safety will increasingly be formally verified at the architectural design level. Traditionally, we relied on RTL verification to test these requirements, but the coverage and confidence gained at that level is insufficient. Moreover, bugs may be found very late in the design cycle where they risk generating a lot of churn. The complexity of today’s systems of systems on a chip dictates that a new approach be taken. Oski is now deploying architectural formal verification with design architects very early in the design process, before any RTL code is developed, and it’s exciting to see the benefits it brings. I’m sure we will be hearing a lot more about this in the coming year and beyond!”

Finally, David Kelf, VP of Marketing at OneSpin Solutions, observed: “We will see tight integrations between simulation and formal that will drive adoption among simulation engineers in greater numbers than before. The integration will include the tightening of coverage models, joint debug and functionality where the formal method can pick up from simulation and even emulation with key scenarios for bug hunting.”


Conclusion

The two combined articles are indeed quite long.  But the EDA industry is serving a multi-faceted set of customers with varying and complex requirements.  To do it justice, length is unavoidable.

CDC Verification: Using Both Static and Dynamic Checking is Key to Success

Wednesday, May 13th, 2015

Pavlo Leshtaiev, ALINT Product Manager, Aldec

Clock domain crossing (CDC) verification has become a critical element for success in modern digital electronic designs.  Unlike “the old days” when relatively simple and slow digital designs could be run on a single synchronous clock, today’s complex, high-speed designs use multiple asynchronous clocks to drive separate high-frequency logic sections.  The CDC challenges come into play where these separate clock domains interface because any weaknesses in the crossing design can result in data errors, control problems or even overall system failure.

Multiple clocks running at different frequencies exhibit differing phase and latency characteristics, which means the relationships between clock edges in any two domains cannot be relied upon.  To overcome these clocking mismatches, specialized synchronization logic must be implemented at domain crossings between asynchronous domains to prepare signals for detection by receiving domains and avoid problems such as metastability.
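
For a single-bit control signal, the classic form of this synchronization logic is a chain of two flip-flops clocked by the receiving domain. The SystemVerilog below is a minimal sketch of that structure, included here for illustration only; the module and signal names are ours, not taken from the article or from ALINT-PRO-CDC.

module sync_2ff (
  input  logic clk_rx,   // receiving-domain clock
  input  logic rst_n,    // active-low reset in the receiving domain
  input  logic d_async,  // signal arriving from another clock domain
  output logic d_sync    // safe to consume anywhere in the clk_rx domain
);
  logic meta;  // first stage may go metastable; second stage filters it
  always_ff @(posedge clk_rx or negedge rst_n)
    if (!rst_n) {d_sync, meta} <= 2'b00;
    else        {d_sync, meta} <= {meta, d_async};
endmodule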

The CDC verification process comprises two main parts: static checks and dynamic checks.  Both are important for effective verification.  Static checks come first, using a linting tool to determine the completeness and correct design of the CDC structure.  Static checks have the advantage that they can be run fast, without requiring active stimulus. However, static checks alone are not sufficient to address the full spectrum of CDC issues.  Dynamic checks use active stimulus to allow discovery of a wider range of design faults, but require a testbench and take more time to run in order to dig down and identify the sources of problems.

This article describes the static and dynamic checks that are needed for effective CDC verification and shows how they are useful and what the differences are between them.

Static Checks

The first step in the process is to verify the CDC logic to ensure the presence of synchronizers on the crossings and the validity of their structure.  Design Rule Checking (DRC) is used to analyze the static HDL code and validate it against a specific set of best practices for ASIC and FPGA designs.  The goal of the initial static DRC process is to detect bugs hidden in the code upfront and to avoid the hassles of finding them later through redundant design iterations. The escalating complexity of chip design and tighter time-to-market pressures have made upfront automated code review an absolute necessity for success.

The static verification flow performs code linting at specific design-entry points, including Parse, Elaborate, Synthesis, and Constraints, with comparisons against a special set of rules to assure conformance before proceeding with further analysis.  Figure 1 illustrates a typical set of Clock and Reset Network Checks.

Figure 1 ALINT-PRO-CDC Netlist Checks

The main goal of static checking is to assure proper synchronization structures and correctly defined clocks and other design constraints.  Issues that are addressed in the static review include:

  • Unsynchronized crossings (a minimal example follows this list)
  • Convergence/Divergence in the crossing
  • Incorrect synchronizer structures
  • Impacts of combinational logic in the crossings
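
As a hedged illustration of the first item, the fragment below shows the kind of unsynchronized crossing a static DRC run would flag; the module and signal names are invented for this example.

module bad_crossing (
  input  logic clk_a, clk_b,
  input  logic data_a,
  output logic data_b
);
  logic reg_a;
  always_ff @(posedge clk_a) reg_a <= data_a;  // launched in the clk_a domain
  // Violation: reg_a is sampled directly in the clk_b domain,
  // with no synchronizer on the crossing.
  always_ff @(posedge clk_b) data_b <= reg_a;
endmodule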

The outputs of the static linting process are violations – special data structures, each representing a single problem/fault in the design.  For example, the static process could identify specific structural issues in combinational logic such as those illustrated in Figure 2.

Figure 2 Combinational Logic Structures

The static review should provide a rich set of information regarding the overall structure of the HDL code and specific detailed elements including:

- Show all Clock and Reset endpoints

- Show all nets and pins that form the clock/reset networks

- Discover any clock and reset sources that are unconstrained

- Identify any incorrect connection to clock or reset pins

- Detect any unwanted connections to logic cell pins

- Display design hierarchy with cross-probing for entity/module, architecture, instantiation

- Filter violations for viewing by various criteria as well as generating reports

- Highlight violated paths and elements in Schematic Viewer

By identifying and resolving violations, static checks prepare the design for dynamic analysis.

Dynamic CDC Checking

After the static checking is complete, it is time to proceed with dynamic CDC verification. Some might ask, “If static checking has verified the structural issues, why do we need dynamic checking?” Dynamic analysis enables testing of the logic that controls the synchronizers during active stimulus. In essence, static checks make sure synchronizers are on the crossings, while dynamic checking makes sure that they perform correctly.

Dynamic CDC checking, using a system such as ALINT-PRO-CDC, also requires a simulator such as Riviera-PRO or Active-HDL to execute the target code-under-test.  Integration with the simulator is implemented by generating a testbench file that inserts specialized test conditions, including metastability emulation as well as assertion and coverage statements (See Figure 3).

Figure 3 Dynamic Checks Process Flow

The auto-generated testbench inserts the following test elements:

1. Assertions

2. Coverage

3. Metastability emulation / metastability insertion

Figure 4 Relations between dynamic checks

Assertions


We generate assertions on the recognized and correctly defined synchronizers, because if the crossing is not synchronized or the synchronizer structure is incorrect, there is no need to emulate metastability.  (Such situations should have been handled by the static checks already.)

For example, assertion statements can be used to insert and check specific conditions at the NDFF synchronizer for each crossing, such as “high pulse missed by receiving domain” or “low pulse missed by receiving domain”.  In addition, assertions can be used for MUX checking by inserting appropriate statements at the enable signal and checking the data stability while enable is active.
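
As a sketch of what such a generated check might look like in SystemVerilog Assertions, the fragment below encodes a minimum pulse width in the sending domain so that a high pulse cannot be missed by the receiving domain. The clock and signal names, and the three-cycle bound, are illustrative assumptions, not ALINT-PRO-CDC output.

// Assumed context: bound into the design scope that owns clk_tx/tx_sig.
property p_high_pulse_not_missed;
  @(posedge clk_tx) disable iff (!rst_n)
    $rose(tx_sig) |-> tx_sig[*3];  // stay high at least 3 tx cycles (assumed bound)
endproperty
a_high_pulse: assert property (p_high_pulse_not_missed)
  else $error("high pulse may be missed by receiving domain");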


Coverage
Coverage should check the user’s testbench to ensure that it stresses the design under test enough – making sure that it triggers all tricky situations and that data was sent over the crossings. For example, coverage statements for NDFFs can check that data was sent, and that the desired metastability effects occurred during simulation in all possible ways.
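
A minimal sketch of such coverage statements in SystemVerilog follows; again, the clock and signal names are illustrative assumptions rather than generated tool output.

// Confirm the stimulus actually exercised the crossing in both
// directions and that the receiving side saw the data change.
c_pulse_high: cover property (@(posedge clk_tx) $rose(tx_sig));
c_pulse_low : cover property (@(posedge clk_tx) $fell(tx_sig));
c_rx_toggle : cover property (@(posedge clk_rx) rx_sync != $past(rx_sync));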

Metastability
Metastability Insertion is used to emulate the fact that sequential logic can randomly settle to “1” or “0” when setup and hold times are violated.  The simulation process should be able to selectively insert metastability conditions for all properly described synchronizers within the code. This enables dynamic testing to identify problem areas and correct them to assure that the system can successfully resolve metastability conditions during normal operation.

Metastability emulation uses behavioral HDL code (SystemVerilog) that alters synchronizer behavior by introducing random delays on the crossings.  Metastability emulation doesn’t actually perform any checks, but it enhances all other types of checks. For example, if some signals are synchronized with independent synchronizers and the relations between these signals are important for proper design function, metastability emulation can create situations where data correlation is lost, and thus user-defined assertions may fail and point to the problem source.
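
A rough sketch of the idea, assuming the emulation layer substitutes a behavioral model for the first synchronizer stage, is shown below; the randomization scheme is a simplification for illustration, not the tool’s actual generated code.

// Behavioral stand-in for the first synchronizer stage (meta): when the
// input changes, randomly settle to the old or the new value, emulating
// a metastable flop that may resolve either way.
logic d_prev;
always_ff @(posedge clk_rx) begin
  d_prev <= d_async;
  if (d_async !== d_prev && $urandom_range(1))
    meta <= d_prev;    // resolved to the old value: a one-cycle slip
  else
    meta <= d_async;   // resolved to the new value: normal capture
end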

The actual checks are performed by three elements: automatically generated assertions, automatically generated coverage, and the testbench created by the user.

The bottom line is that both static checking and dynamic checking are needed to assure proper CDC verification. By first conducting comprehensive static checks on the structural validity of the code and confirming the presence of required synchronizers, we establish a solid foundation that can act as our roadmap for further verification.  Then with dynamic checks, we actually drive that road at full speed and even insert some speed bumps to ensure the resilience and robustness of the system.

For further discussion and information on this subject visit Aldec’s booth at DAC.
