
Posts Tagged ‘ADAS’


Blog Review – Monday, April 24, 2017

Monday, April 24th, 2017

This week’s blogs are concerned with AI and intelligent, connected vehicles, sometimes both. There are quests to find the facts behind myths and searches for answers for power management and software security too.

Is an effective tool for verification the stuff of legends? Gabe Moretti, Chip Design Magazine, seeks the truth behind Pegasus – no, not the winged horse, but the more earthly verification engine from Cadence.

A power strategy is one thing, but a free trial adds a new dimension to energy management. Don Dingee, Sonics, elaborates on the company’s plan to bring power to the masses, using hardware IP and ICE-Grain Power architecture.

If you are unsure about USB, Senad Lomigora’s blog for ON Semiconductor should help. It looks at what USB is for, and why we can’t get enough of USB Type-C, USB 3.1, connectors and re-drivers.

Every vehicle’s ADAS relies on good visuals and good connectivity, observes Jim Harrison, guest blogger at Maxim Integrated. He looks at the securely connected autonomous car, then homes in on how Maxim Integrated exploits GMSL, an alternative to Ethernet, in its MAX96707 and MAX96708 chips to create an effective in-car communication network.

Still with the connected car, Pete Decher, Mentor Graphics, is fresh from the Autotech Council meeting in San Jose. The company’s DRS360 Autonomous Driving Platform launch was high on the list of discussion topics, along with the role of artificial intelligence (AI) in the future of driving.

Still with AI, Evens Pan, ARM, provides an in-depth blog on how Chinese start-up Peceptin enables embedded deep learning. The case study is fascinating and well reported in this comprehensive essay.

Making any software engineer feel insecure about software security is an everyday occurrence; helping them out is a little more out of the ordinary. So it is refreshing to see a post from the Synopsys editorial team letting the put-upon software engineer know there is a webinar coming soon (May 2) on the Building Security In Maturity Model (BSIMM), with a link to register to attend.

Caroline Hayes, Senior Editor

Blog Review – Monday, March 27, 2017

Monday, March 27th, 2017

How AI can be used for medical breakthroughs; What’s wired and what’s not; A new compiler from ARM targets functional safety; Industry 4.0 update

A personal history lesson from Paul McLellan, Cadence Design Systems, as he charts the evolution of the company from its beginnings, via his own career, various milestones with different companies, and the trials of DAC over the decades.

Post Embedded World, ARM announced the ARM Compiler 6. Tony Smith, ARM, looks at its role for functional safety and autonomous vehicles.

A review of industrial IoT at Embedded World 2017 is the focus of Andrew Patterson’s blog. Mentor Graphics had several demonstrations for Industry 4.0. He explains the nature of Industry 4.0 and where it is going, the role of OPC-UA (Open Platform Communications – Unified Architecture) and support from Mentor.

What’s wired and what’s wireless, asks David Andeen, Maxim Integrated. His blog looks at vehicle sub-systems and wired communications standards, building automation and wired interface design, with a link to an informative tutorial.

There are few philosophical questions posed in the blogs that I review, but this week throws up an interesting one from Philippe Laufer, Dassault Systemes. The quandary: does science drive design, or does design drive science? Topically posted ahead of the Age of Experience event in Milan next month, the answer relies on size and data storage, influenced by both design and science.

Security issues for medical devices are considered by David West, Icon Labs. He looks at the threats and security requirements that engineers must consider.

A worthy competition is announced on the Intel blog – the Artificial Intelligence Kaggle competition to combat cervical cancer. Focused on screening, the competition, run with MobileODT using its optical diagnostic devices and software, challenges Kagglers to develop an algorithm that classifies cervix types for treatment referrals. The first prize is $50,000 and there is a $20,000 prize for best use of Intel tools. “We aim to challenge developers, data scientists and students to develop AI algorithms to help solve real-world challenges in industries including medical and health care,” said Doug Fisher, senior vice president and general manager of the Software and Services Group at Intel.

Caroline Hayes, Senior Editor

Blog Review – Tuesday, November 22, 2016

Tuesday, November 22nd, 2016

New specs for PCI Express 4.0; Smart homes gateway webinar this week; sensors – kits and tools; the car’s the connected star; Intel unleashes AI

Change is good – isn’t it? Richard Solomon, Synopsys, prepares for the latest draft of PCI Express 4.0, with some hacks for navigating the 1,400 pages.

Following a triumphant introduction at ARM TechCon 2016, Diya Soubra, ARM, examines the ARM Cortex-M33, from the dedicated co-processor interface to security around the IoT.

Steer clear of manipulating a layout hierarchy, advises Rishu Misri Jaggi, Cadence Design Systems. She advocates the Layout XL Generation command to put together a Virtuoso Layout XL-compliant design, with some sound reasoning – and a video – to back up her promotion.

A study to save effort is always a winner, and Aditya Mittal and Bhavesh Shrivastava, Arrow, include the results of their comparisons of typical debug tasks. Although the aim is to save time, the authors have clearly spent time doing a thorough job on this study.

Are smart homes a viable reality? Benny Harven, Imagination Technologies, asks for a diary note for a webinar later this week (Nov 23) on smart home gateways – how to make them cost-effective and secure.

Changes in working practice mean sensors and security need attention and some help. Scott Jones, Maxim Integrated, looks at the company’s latest reference design.

Still with sensors, Brian Derrick, Mentor Graphics Design, looks at how smartphones are opening up opportunities for sensor-based features for the IoT.

This week’s LA Auto Show inspires Danny Shapiro, NVIDIA, to look at how the company is driving technology trends in vehicles. Amongst the name dropping (Tesla, Audi, IBM Watson), some of the pictures in the blog inspire pure auto-envy.

A guide to artificial intelligence (AI) by Douglas Fisher, Intel, has some insights into where and how it can be used and how the company is ‘upstreaming’ the technology.

Caroline Hayes, Senior Editor

Blog Review – Monday, August 29, 2016

Monday, August 29th, 2016

This week’s blogs are futuristic, with machine learning, from Intel, augmented reality from Synopsys, smart city software from Dassault Systèmes, questions and answers about autonomous vehicles, and security issues, around devices and MQTT on the IoT.

Artificial intelligence is the next great wave, predicts Lenny Tran, Intel. His post looks at machine learning and how Intel’s High Performance Computing architecture is part of the way forward.

On a similar theme, Hezi Saar, Synopsys, examines the Microsoft 28nm SoC and is impressed with the possibilities the HoloLens Processing Unit opens up for the developing augmented reality marketplace.

If you are dissatisfied with your present office location, Dassault Systèmes has plans for smart facilities, reports Akio. He describes some illuminating projects using 3D Experience City, real-time monitoring, the IoT and systems operations for a comfortable workspace in smart cities.

It’s all about teamwork according to Brandon Wade, Aldec, who offers an introduction to the AXI protocol. His post summarizes the protocol specifications and shares his revelation at how understanding the protocol opens up a world of design possibilities.

Autonomous cars are occupying a lot of Eamonn Ahearne’s, ON Semiconductor, time. Living in the hotbed of self-drive test, he reads, visits and analyses what is happening and is disappointed that hardware is being eclipsed by software in the popularity stakes.

Also occupied with autonomous vehicles, Andrew Macleod, Mentor Graphics, starts with an update on electric vehicles, and moves on to the disconnect between ADAS technologies and autonomous vehicles and the engineering challenges that can be addressed using a single ECU (electronic control unit).

Attending the Linley Mobile & Wearables Conference, Paul McLellan, Cadence Design Systems, pays attention to Asaf Ashkenazi of Cryptography Research (now part of Rambus) and his well-illustrated post reports how devices can be secured.

An IoT network, powered by the ISO/IEC PRF 20922 standard MQTT (MQ Telemetry Transport) can be at risk, warns Wilfred Nilsen, ARM. It is a sound warning about personal information being vulnerable to MQTT brokers. Luckily, he offers a solution, introducing the SMQ IoT protocol.

Caroline Hayes, Senior Editor

Blog Review – Monday, August 15 2016

Monday, August 15th, 2016

In this collection, we define the IoT, investigate IP fingerprinting, and break into vehicles in the name of crypto-research. There is also prophesying about 5G and disruptive technology for the automotive industry, and relationship advice for computing and data.

Empathizing with anyone who has ever struggled with the CMSIS RTOS API, Liviu Ionescu, ARM, offers a helping hand: he catalogues the issues that can be encountered, reassures designers they are not alone and, more importantly, offers practical advice.

Putting IP fingerprints to work may sound like the brief for an episode of CSI, but it is Warren Savage’s (IP-extreme) recipe for successful SoC tapeout. He does some CSI-style digging to thoroughly explain how to delve into a chip’s IP to limit the risks associated with IP reuse.

Listening intently at the Linley Mobile Conference, Paul McLellan, Cadence, sees the advent of 5G as good news for high-capacity, high-speed, low-latency wireless networks and linked with all things IoT.

Famous couplings – love and marriage, horse and carriage – could be joined by computing and data. Rob Crooke, Intel, believes that more data and more computing will transform cloud computing, but that memory and storage have to keep up to realize smart cities, autonomous vehicles, industrial automation, medicine and immersive gaming, to name a few. His post covers 3D XPoint and 3D NAND technology.

On security detail this week, Gabe Moretti, Chip Design magazine, finds a white paper from Intrinsic-ID that he recommends on the topic of embedded authentication which is vital to the secure operation of the IoT.

At the end of this year, the last Volkswagen Camper (or Kombi) van will roll off the assembly line in Brazil. Robert Vamosi, Synopsys, includes the iconic vehicle in his post about a hack described in a paper authored by researchers at the University of Birmingham to clone a VW remote entry system. The paper was presented at the Usenix cybersecurity conference in Austin, Texas, with reassurances that the group is in ‘constructive’ talks with VW.

From vintage automobiles to the latest EVs and PHEVs, Andrew Macleod, Mentor Graphics, looks at the disruption they may bring to the automotive industry. Referring to account technology manager Paul Johnston’s presentation at 2016 IESF, he touches on the electrical engineering and embedded software challenges as well as the predicted scale of the EV industry.

Still looking at a market rather than the technology, Alex Voica, Imagination Technologies, looks at the IoT. He has some interesting graphs and statistics and asks some searching questions around definitions, from what the IoT is to what defines a device.

Caroline Hayes, Senior Editor

Blog Review – Tuesday, May 31 2016

Tuesday, May 31st, 2016

Security issues around IoT and maritime vessels; CCIX Consortium accelerates data centers; Cheers for metering; Noise integrity in ADAS; Virtual Reality in practice

Protecting IoT devices is clearly and elegantly outlined by Jim Wallace, ARM. He includes illustrations, a lot of information and guidance on how security can produce new business models.

Accelerating data centers always raises interest, especially when names like AMD, ARM, Huawei, IBM, Mellanox, Qualcomm, and Xilinx come together. Steve Leibson, Xilinx, describes how the companies, via the CCIX (Cache Coherent Interconnect for Accelerators) Consortium, are developing a single interconnect technology specification whereby processors using different instruction set architectures can share data with accelerators and enable efficient heterogeneous computing.

Advocating an alternative to the plan of drinking beer when the fresh water runs out, David Andeen, Maxim Integrated, explains the importance of an ultrasonic water meter design which can accelerate design cycles and reduce the cost of meters.

All in the name of research, Alexandru Voica, Imagination, tries his hand at Daydream, the Virtual Reality (VR) platform built on Android N and outlines the rules of VR.

Another cyber threat is identified by Robert Vamosi, Synopsys. His blog looks at research from Plymouth University and how vulnerable marine vessels can be.

The undeniable increase in Advanced Driver Assistance Systems (ADAS) needs careful design consideration, and Ravi Ravikumar, ANSYS, discusses how ANSYS CPS simulation helps power noise integrity requirements to be met. His blog is informative, with some clear graphics to illustrate ADAS design.

For a quick catch-up on USB 3.1 and the Type-C connector, turn to Chris A Ciufo, eecatalog, for a quick reference guide. He includes some handy links for extra reading.

A review of the Bangalore, India, Design&Reuse event is provided by Steve Brown, Cadence Design Systems. A rundown of keynotes ends with a heads-up for the next event.

Blog Review – Monday, March 21 2016

Monday, March 21st, 2016

Coffee breaks and C layers; Ideas for IoT security; Weather protection technology; Productivity boost; Shining a light on dark silicon

Empathizing with his audience, Jacek Majkowski sees the need for coffee but not necessarily a C layer in the Standard Co-Emulation Modeling Interface (SCE-MI).

At last week’s Bluetooth World, in Santa Clara, CA, there was a panel discussion – is the IoT hype or hope? Brian Fuller, ARM, reports on the to-and-fro of ideas from experts from ARM and Google, in a discussion moderated by Mark Powell, executive director of the Bluetooth SIG.

Of all the things to do on a sabbatical, Matt Du Puy, ARM, chose to climb Dhaulagiri (26,795 feet / 8,161 m), described as one of the most dangerous 8,000m mountains. Brian Fuller, ARM, reports that he is armed with a GPS watch with cached terrain data and some questionable film choices on a portable WiDi disk station.

Still with extremes of weather, the Atmel team enthuses about a KickStarter project for the Oombrella, a smart umbrella that uses sensors to analyse temperature, pressure, humidity and light, to let you know if you will need it because rain is coming your way. Very clever, as long as you remember to bring it with you. Not so appealing is the capacity to share via social media the type of weather you are experiencing – and they say the Brits are obsessed with the weather!

IoT protection is occupying an unidentified blogger at Rambus, who longs for a Faraday cage to shield it. The blog has some interesting comments about the make-up of, and security measures for, the IoT, while promoting the company’s CryptoManager.

Still with IoT security, Richard Anguiano, Mentor Graphics, examines a gateway using ARM TrustZone and heterogeneous operating system configurations, running Nucleus RTOS and Mentor Embedded Linux. There is a link provided to the Secure Converged IoT Gateway and the complete end-to-end IoT solution.

Europe is credited as the birthplace of the Workplace Transformation, says Thomas Garrison, Intel. Ahead of CeBIT, he writes about the role of Intel’s 6th Generation Core vPro processor and what it could mean for a PC’s battery life, compute performance and the user’s productivity.

The prospects for MIPI and future uses in wearables, machine learning, virtual reality and automotive ADAS are uppermost in the mind of Hezi Saar, Synopsys, following MIPI Alliance meetings. He was particularly taken with a Movidius vision processor unit, and includes a video in the blog.

Examining dark silicon, Paul McLellan, Cadence Design Systems, wonders what will supersede Dennard Scaling to overcome the limitations on power in large SoCs.

Caroline Hayes, Senior Editor

Blog Review – Monday, February 29, 2016

Monday, February 29th, 2016

ARM and Xilinx Embedded World highlights; Mobile World Congress news; Sensors are on a roll; What makes MIPI?

Ahead of the ARM Cortex-A32 processor announcement at Embedded World and Mobile World Congress, ARM announced its latest real-time processor IP, the ARM Cortex-R8, designed for LTE-Advanced and 5G designs. Neil Wermuller, ARM, goes into detail about the Cortex-R8 quad-core, real-time processor, building on the ARMv7-R architecture.

Also at Embedded World, Mentor Embedded teamed up with Xilinx, which demonstrated the Xilinx Zynq-7000 platform hosting Nucleus RTOS. Andrew Patterson, Mentor, describes how this can be used in advanced driver assistance systems (ADAS).

More power for fewer dollars is driving demand in the consumer market. Alexandru Voica, Imagination Technologies, explains how the latest addition to the PowerVR family, the PowerVR Series8XE, meets efficiency and performance requirements.

When someone says “pass the masking tape” do check that it’s not a sensor network. The Atmel team blogs about SensorTape, the MIT Media Lab’s Responsive Environments group project for a sensor network that is on a roll.

Ahead of the MIPI Alliance event (March 7), Hezi Saar, Synopsys, looks at what makes up the specification as it expands beyond the mobile marketplace.

Using a real-life crime to illustrate hazards, ARM’s Simon Segars focused on security at Mobile World Congress in Barcelona, Spain last week, reports Paul McLellan, Cadence. Other areas of interest were virtual reality, and an appearance by F1 racing driver Lewis Hamilton, under the guise of discussing CAN in vehicles and what street cars could learn from F1.

Still with Mobile World Congress, Gary Bronner, Rambus, is quoted in a report of the demonstration there of thermal-enabled lensless smart sensor (LSS) technology by Rambus Labs. With the capability to replace traditional thermal lenses for IoT applications in medical equipment and manufacturing, as well as the less obvious smart cities and transportation, this is a new approach to imaging, driven by computing rather than optics.

Striving to reduce debug effort and increase productivity is a noble cause, championed by Aditya Mittal, Arrow Devices. He looks at the AXI3 system bus and its virtues, as well as the company’s PDA tool.

Caroline Hayes, Senior Editor

Blog Review – Monday, July 27 2015

Monday, July 27th, 2015

IoT for ADAS; ESC 2015 focuses on security; untangling neural networks; what drives new tools; consolidation conundrum; IoT growth forecast; three ages of FPGA

Likening a business collaboration to a road trip may be stretching a metaphor that would make Jack Kerouac blush, but David McKinney, Intel, presses on as he explains Intel and QNX’s ADAS solution, based on Intel IoT for automobiles. He includes some interesting links and a video to inform the reader.

A review of ESC 2015 shows that Chris Ciufo is not only ahead of the curve, advocating embedded security, but also not one to pass by a freebie at a show. He relates some of the highlights from the first day of the Santa Clara event.

Neural network processors hold promise for computer vision, believes Jeff Bier, BDTI. His blog explains what work is needed for the scale of computation the industry expects.

Posing an interesting question, Carey Robertson, Mentor Graphics, asks what prompts the development of new tools. He blends this with helpful information about the newly launched Calibre xACT extraction tool, without too much “hard sell”.

“It works!” is the triumphant message of the blog co-authored by Jacek Duda and Steve Brown, Cadence, reporting from this month’s workshop where USB Type-C was put through its paces.

What to do with wireless IP is asked and answered by Navari Nandra, Synopsys. He explains what can be done and how it can contribute to the IoT.

The SoC market is consolidating fast, says Rupert Baines, UltraSoC, on an IP-extreme blog. This poses two challenges that he believes licensed IP can simplify.

A common proposition is to move from Intel to ARM, and Rich Nass, ARM, presents a well-rounded blog on how to make the transition, with some input from WinSystems hardware and software experts.

Forget consumer; the future of IoT growth is in the enterprise, reports Brian Fuller, ARM, observing analyst IDC’s webinar on which parts of the IoT will be lucrative and why.

Recalling the talk by Xilinx Fellow Dr. Steve Trimberger, Steve Leibson explains the three ages of the FPGA, with a link to a video on the history of the technology.

Caroline Hayes, Senior Editor

Formal, Logic Simulation, Hardware Emulation/Acceleration: Benefits and Limitations

Wednesday, July 27th, 2016

Stephen Bailey, Director of Emerging Technologies, Mentor Graphics

Verification and validation are key terms with the following differentiation: verification (specifically, hardware verification) ensures the design matches R&D’s functional specification for a module, block, subsystem or system. Validation ensures the design meets the market requirements, that it will function correctly within its intended usage.

Software-based simulation remains the workhorse for functional design verification. Its advantages in this space include:

- Cost: SW simulators run on standard compute servers.

- Speed of compile and turn-around time (TAT): When verifying the functionality of modules and blocks early in the design project, software simulation has the fastest turn-around time for recompiling and re-running a simulation.

- Debug productivity: SW simulation is very flexible and powerful in debug. If a bug requires interactive debugging (perhaps due to a potential UVM testbench issue with dynamic – stack and heap memory based – objects), users can debug it efficiently and effectively in simulation. Users have very fine-grained controllability of the simulation – the ability to stop/pause at any time, and the ability to dynamically change values of registers, signals, and UVM dynamic objects.

- Verification environment capabilities: Because it is software simulation, a verification environment can easily be created that peeks and pokes into any corner of the DUT. Stimulus, including traffic generation/irritators, can be tightly orchestrated to inject stimulus with cycle accuracy. (A minimal conceptual sketch of this kind of controllability follows this list.)

- Simulation’s broad and powerful verification and debug capabilities are why it remains the preferred engine for module and block verification (the functional specification and implementation at the “component” level).
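
To make the controllability point concrete, here is a minimal, conceptual sketch in Python of a cycle-based testbench driving a toy DUT. It is not a real UVM/SystemVerilog environment; the counter DUT, its signal names and the test values are invented purely to illustrate peeking, poking and cycle-accurate stimulus.

```python
# Conceptual sketch only: a toy, cycle-based "simulation" in Python to
# illustrate the controllability a software verification environment offers
# (peek into any state, poke values, inject stimulus at an exact cycle).
# Real flows use SystemVerilog/UVM; this DUT and its names are invented.

class ToyCounterDUT:
    """Stand-in design under test: an 8-bit counter with an enable input."""
    def __init__(self):
        self.enable = 0   # input "signal" the testbench drives
        self.count = 0    # internal register the testbench can peek/poke

    def clock(self):
        """Advance one clock cycle."""
        if self.enable:
            self.count = (self.count + 1) & 0xFF


def run_test(cycles=12):
    dut = ToyCounterDUT()
    for cycle in range(cycles):
        # Cycle-accurate stimulus: assert enable only on cycles 5..9.
        dut.enable = 1 if 5 <= cycle < 10 else 0
        dut.clock()
        # Peek into the DUT every cycle and check the expected value.
        expected = min(max(cycle - 4, 0), 5)
        assert dut.count == expected, f"cycle {cycle}: got {dut.count}"

    # Poke internal state directly to reach a corner case (wrap-around)
    # without running hundreds of additional stimulus cycles.
    dut.count = 0xFF
    dut.enable = 1
    dut.clock()
    assert dut.count == 0x00, "counter should wrap to zero"
    print("toy test passed")


if __name__ == "__main__":
    run_test()
```

The same peek/poke access and cycle-level orchestration is what real simulation-based verification environments provide, at far greater scale.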

If software-based simulation is so wonderful, then why would anyone use anything else? Simulation’s biggest negative is performance, especially when combined with capacity (very large, as well as complex, designs). Performance – getting verification done faster – is why all the other engines are used. Historically, the hardware acceleration engines (emulation and FPGA-based prototyping) were employed relatively late in the project cycle, when validation of the full chip in its expected environment was the objective. However, both formal and hardware acceleration are now being used for verification as well. Let’s continue with the verification objective by first exploring the advantages and disadvantages of formal engines.

- Formal’s number one advantage is its comprehensive nature. When provided a set of properties, a formal engine can verify – exhaustively (for all time) or, more typically, for a broad but bounded number of clock cycles – that the design will not violate those properties. The prototypical example is verifying the functionality of a 32-bit wide multiplier. In simulation, it would take far too many years to exhaustively check every possible pair of multiplicand and multiplier inputs against the expected product for it to be feasible; formal can do it in minutes to hours. (A back-of-the-envelope calculation follows this list.)

- At one point, a negative for formal was that it took a PhD to define the properties and run the tool. Over the past decade, formal has come a long way in usability. Today, formal-based verification applications package properties for specific verification objectives with the application. The user simply specifies the design to verify and, if needed, provides additional data that they should already have available; the tool does the rest. There are two great examples of this approach to automating verification with formal technology:

  • CDC (Clock Domain Crossing) Verification:  CDC verification uses the formal engine to identify clock domain crossings and to assess whether the (right) synchronization logic is present. It can also create metastability models for use with simulation to ensure no metastability across the clock domain boundary is propagated through the design. (This is a level of detail that RTL design and simulation abstract away. The metastability models add that level of detail back to the simulation at the RTL instead of waiting for and then running extremely long full-timing, gate-level simulations.)
  • Coverage Closure: During the course of verification, formal, simulation and hardware-accelerated verification will generate functional and code coverage data. Most organizations require full (or nearly 100%) coverage completion before signing off the RTL. But today’s designs contain highly reusable blocks that are also very configurable. Depending on the configuration, functionality may or may not be included in the design. If it isn’t included, then coverage related to that functionality will never be closed. Formal engines analyze the design in the actual configuration(s) that apply and perform a reachability analysis for any code or (synthesizable) functional coverage point that has not yet been covered. If a point can be reached, the formal tool will provide an example waveform to guide development of a test to achieve coverage. If it cannot be reached, the manager has a very high level of certainty in approving a waiver for that coverage point.

- With comprehensiveness being its #1 advantage, why doesn’t everyone use and depend fully on formal verification?

  • The most basic shortcoming of formal is that you cannot simulate or emulate the design’s dynamic behavior. At its core, formal simply compares one specification (the RTL design) against another (a set of properties written by the user or incorporated into an automated application or VIP). Both are static specifications. Human beings need to witness dynamic behavior to ensure the functionality meets marketing or functional requirements. There remains no substitute for “visualizing” the dynamic behavior to avoid the GIGO (Garbage-In, Garbage-Out) problem. That is, the quality of your formal verification is directly proportional to the quality (and completeness) of your set of properties. For this reason, formal verification will always be a secondary verification engine, albeit one whose value rises year after year.
  • The second constraint on broader use of formal verification is capacity or, in the vernacular of formal verification:  State Space Explosion. Although research on formal algorithms is very active in academia and industry, formal’s capacity is directly related to the state space it must explore. Higher design complexity equals more state space. This constraint limits formal usage to module, block, and (smaller or well pruned/constrained) subsystems, and potentially chip levels (including as a tool to help isolate very difficult to debug issues).
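
To put the multiplier example in the first bullet into perspective, here is a back-of-the-envelope calculation. The throughput figure of one million multiply checks per second is an assumed, illustrative number, not a measurement.

```python
# Back-of-the-envelope arithmetic for the 32-bit multiplier example above.
# The simulation throughput below is an assumed, illustrative figure only.

input_combinations = 2**32 * 2**32        # every multiplicand/multiplier pair = 2^64
checks_per_second = 1_000_000             # assume one million multiply checks per second
seconds_per_year = 60 * 60 * 24 * 365

years = input_combinations / (checks_per_second * seconds_per_year)
print(f"{input_combinations:.3e} input combinations")   # ~1.845e+19
print(f"~{years:,.0f} years of simulation")             # roughly 585,000 years
```

Even at a million checks per second, exhaustive dynamic verification of a 32-bit multiplier is hopeless; this is exactly the kind of gap that formal closes in minutes to hours.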

The use of hardware acceleration has a long, checkered history. Back in the “dark ages” of digital design and verification, gate-level emulation of designs had become a big market in the still young EDA industry. Zycad and Ikos dominated the market in the late 1980’s to mid/late-1990’s. What happened?  Verilog and VHDL plus automated logic synthesis happened. The industry moved from the gate to the register-transfer level of golden design specification; from schematic based design of gates to language-based functional specification. The jump in productivity from the move to RTL was so great that it killed the gate-level emulation market. RTL simulation was fast enough. Zycad died (at least as an emulation vendor) and Ikos was acquired after making the jump to RTL, but had to wait for design size and complexity to compel the use of hardware acceleration once again.

Now, 20 years later, it is clear to everyone in the industry that hardware acceleration is back. All 3 major vendors have hardware acceleration solutions. Furthermore, there is no new technology able to provide a similar jump in productivity as did the switch from gate-level to RTL. In fact, the drive for more speed has resulted in emulation and FPGA prototyping sub-markets within the broader market segment of hardware acceleration. Let’s look at the advantages and disadvantages of hardware acceleration (both varieties).

- Speed: Speed is THE compelling reason for the growth in hardware acceleration. In simulation today, the average performance (of the DUT) is perhaps 1 kHz. Emulation expectations are for around 1 MHz, and for FPGA prototypes 10 MHz (or at least 10x that of emulation). The ability to get thousands more verification cycles done in a given amount of time is extremely compelling. What began as the need for more speed (and effective capacity) to do full-chip, pre-silicon validation – driven by Moore’s Law and the increase in size and complexity enabled by RTL design and design reuse – continues to push into earlier phases of the verification and validation flow, AKA “shift-left.” Let’s review a few of the key drivers for speed (a quick cycles-per-day comparison follows this list):

  • Design size and complexity:  We are well into the era of billion gate plus design sizes. Although design reuse addressed the challenge of design productivity, every new/different combination of reused blocks, with or without new blocks, creates a multitude (exponential number) of possible interactions that must be verified and validated.
  • Software:  This is also the era of the SoC. Even HW compute intensive chip applications, such as networking, have a software component to them. Software engineers are accustomed to developing on GHz speed workstations. One MHz or even 10’s of MHz speeds are slow for them, but simulation speeds are completely intolerable and infeasible to enable early SW development or pre-silicon system validation.
  • Functional capabilities of blocks and subsystems: It can be the size of input data/stimuli required to verify a block’s or subsystem’s functionality, the complexity of the functionality itself, or a combination of both that drives the need for huge numbers of verification cycles. Compute power is so great today that smartphones are able to record 4K video and replay it. Consider the compute power required to enable Advanced Driver Assistance Systems (ADAS) – the car of the future. ADAS requires vision and other data acquisition and processing horsepower, software systems capable of learning from mistakes (artificial intelligence), and high fault tolerance and safety. Multiple blocks in an ADAS system will require verification horsepower that would stress the hardware-accelerated performance available even today.

- As a result of these trends, which appear to have no end, hardware acceleration is shifting left and being used earlier and earlier in the verification and validation flows. The market pressure to address its historic disadvantages is tremendous.

  • Compilation time:  Compilation in hardware acceleration requires logic synthesis and implementation / mapping to the hardware that is accelerating the simulation of the design. Synthesis, placement, routing, and mapping are all compilation steps that are not required for software simulation. Various techniques are being employed to reduce the time to compile for emulation and FPGA prototype. Here, emulation has a distinct advantage over FPGA prototypes in compilation and TAT.
  • Debug productivity:  Although simulation remains available for debugging purposes, you’d be right in thinking that falling back on a (significantly) slower engine as your debug solution doesn’t sound like the theoretically best debug productivity. Users want a simulation-like debug productivity experience with their hardware acceleration engines. Again, emulation has advantages over prototyping in debug productivity. When you combine the compilation and debug advantages of emulation over prototyping, it is easy to understand why emulation is typically used earlier in the flow, when bugs in the hardware are more likely to be found and design changes are relatively frequent. FPGA prototyping is typically used as a platform to enable early SW development and, at least some system-level pre-silicon validation.
  • Verification capabilities:  While hardware acceleration engines were used primarily or solely for pre-silicon validation, they could be viewed as laboratory instruments. But as their use continues to shift to earlier in the verification and validation flow, the need for them to become 1st class verification engines grows. That is why hardware acceleration engines are now supporting:
    • UPF for power-managed designs
    • Code and, more appropriately, functional coverage
    • Virtual (non-ICE) usage modes which allow verification environments to be connected to the DUT being emulated or prototyped. While a verification environment might be equated with a UVM testbench, it is actually a far more general term, especially in the context of hardware accelerated verification. The verification environment may consist of soft models of things that exist in the environment the system will be used in (validation context). For example, a soft model of a display system or Ethernet traffic generator or a mass storage device. Soft models provide advantages including controllability, reproducibility (for debug) and easier enterprise management and exploitation of the hardware acceleration technology. It may also include a subsystem of the chip design itself. Today, it has become relatively common to connect a fast model written in software (usually C/C++) to an emulator or FPGA prototype. This is referred to as hybrid emulation or hybrid prototyping. The most common subsystem of a chip to place in a software model is the processor subsystem of an SoC. These models usually exist to enable early software development and can run at speeds equivalent to ~100 MHz. When the processor subsystem is well verified and validated, typically a reused IP subsystem, then hybrid mode can significantly increase the verification cycles of other blocks and subsystems, especially driving tests using embedded software and verifying functionality within a full chip context. Hybrid mode can rightfully be viewed as a sub-category of the virtual usage mode of hardware acceleration.
    • As with simulation and formal before it, hardware acceleration solutions are evolving targeted verification “applications” to facilitate productivity when verifying specific objectives or target markets. For example, a DFT application accelerates and facilitates the validation of test vectors and test logic which are usually added and tested at the gate-level.
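
Using the ballpark DUT clock rates quoted in the Speed bullet above (roughly 1 kHz for simulation, 1 MHz for emulation and 10 MHz for FPGA prototyping), a quick calculation shows what “thousands more verification cycles” means in practice. The 100 MHz target clock used for the last column is an assumption chosen only for illustration.

```python
# Rough cycles-per-day comparison using the ballpark DUT clock rates quoted
# in the Speed bullet (1 kHz simulation, ~1 MHz emulation, ~10 MHz prototype).
SECONDS_PER_DAY = 60 * 60 * 24
TARGET_HZ = 100_000_000   # assumed 100 MHz target clock, for illustration only

engines = {
    "software simulation": 1_000,
    "emulation": 1_000_000,
    "FPGA prototype": 10_000_000,
}

for name, hz in engines.items():
    cycles_per_day = hz * SECONDS_PER_DAY
    # How long it takes to cover one second of real time at the target clock.
    days_per_real_second = TARGET_HZ / cycles_per_day
    print(f"{name:20s}: {cycles_per_day:.2e} cycles/day, "
          f"{days_per_real_second:.4f} days per second of target time")
```

At simulation speeds, a single second of target-clock activity takes more than a day of wall-clock time; emulation brings it down to minutes and FPGA prototyping to seconds, which is what makes early software development and pre-silicon system validation feasible.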

In conclusion, it may seem that simulation is being used less today. But it is all relative. The total number of verification cycles is growing exponentially. More simulation cycles are being performed today even though hardware acceleration and formal cycles are taking relatively larger pieces of the overall verification pie. Formal is growing in appeal as a complementary engine. Because of its comprehensive verification nature, it can significantly bend the cost curve for high-valued (difficult/challenging) verification tasks and objectives. The size and complexity of designs today require the application of all verification engines to the challenges of verifying and validating (pre-silicon) the hardware design and enabling early SW development. Hardware acceleration continues to shift left and be used earlier in the verification and validation flow, causing emulation and FPGA prototyping to evolve into full-fledged verification engines (not just ICE validation engines).
