Part of the Chip Design Magazine Network


Posts Tagged ‘FPGA’


Blog Review – Monday, June 26, 2017

Monday, June 26th, 2017

This week, hot on the heels of DAC, a review of the Austin event; Intel administers a dose of precision medicine; challenges for drivers; how to choose between a GPU and an FPGA; and a blockchain reaction for the IoT.

DAC 2017 took place in Austin, Texas, and Paul McLellan, Cadence Design Systems, was there and has collated a wide-ranging report, with day-by-day news, including bats and bagpipes, from the 54th incarnation of the event.

Writing from a very personal viewpoint, Bryce Olson, Intel, advocates precision medicine and looks at Intel’s scalable reference architecture to speed up research and answers in medical care.

Vehicle safety is critical, and Stephen Pateras, Mentor Graphics, looks at self-test and monitoring in autonomous cars using the Tessent MissionMode architecture. He explains, in a clear, detailed manner, the IC test capabilities and simulation for self-driving cars.

Still with vehicle design, Robert Vamosi, Synopsys, flags up the security hazards around the connected car as sensors proliferate and hackers ramp up their assaults. He advocates software security and the communication protection afforded by the IEEE 802.11p protocol.

A handy white paper is brought to our attention by Steve Leibson, Xilinx, for those deciding whether a GPU is better than an FPGA in cloud computing, machine learning, and video and image processing applications.

I learned a couple of things from Christine Young, Maxim Integrated this week. One is that there is a job title of ‘chief IoTologist’, the other was to put the term ‘blockchain’ into context for the IoT. She reports from the IoT World Conference about how blockchain, using advanced cryptography, provides a “tamper-proof distributed record of transactions” and how the IoT Alliance is occupied in developing a shared blockchain protocol as a common identifier to secure IoT products.
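To make the “tamper-proof distributed record” idea concrete, here is a minimal hash-chain sketch in Python. It is a generic illustration of the principle, not the IoT Alliance’s actual protocol: each block stores the hash of its predecessor, so editing any historical record invalidates every later link.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with its predecessor's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link records into a chain; each block stores its parent's hash."""
    chain, prev = [], "0" * 64  # genesis predecessor
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; an edit to any earlier block breaks the links."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain([{"device": "sensor-1", "reading": 21.5},
                     {"device": "sensor-1", "reading": 22.0}])
assert verify(chain)
chain[0]["record"]["reading"] = 99.9   # tamper with history
assert not verify(chain)               # the chain no longer verifies
```

Distributing copies of such a chain across many parties is what makes tampering detectable: a forger would have to rewrite every subsequent block on a majority of the copies.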

Starstruck, John Blyler looks at the reality behind the stardust and interviews Dr Clifford Johnson, physicist at the University of Southern California and script adviser for the National Geographic Channel’s TV program Genius, about Albert Einstein.

Cadence Launches New Verification Solutions

Tuesday, March 14th, 2017

Gabe Moretti, Senior Editor

During this year’s DVCon U.S., Cadence introduced two new verification solutions: the Xcelium Parallel Simulator and the Protium S1 FPGA-Based Prototyping Platform, which incorporates innovative implementation algorithms to boost engineering productivity.

Xcelium Parallel Simulator

The new simulation engine is based on innovative multi-core parallel computing technology, enabling system-on-chip (SoC) designs to get to market faster. On average, customers can achieve 2X improved single-core performance and more than 5X improved multi-core performance versus previous-generation Cadence simulators. The Xcelium simulator is production proven, having been deployed to early adopters across mobile, graphics, server, consumer, internet of things (IoT) and automotive projects.

The Xcelium simulator offers the following benefits aimed at accelerating system development:

  • Multi-core simulation improves runtime while also reducing project schedules: The third generation Xcelium simulator is built on the technology acquired from Rocketick. It speeds runtime by an average of 3X for register-transfer level (RTL) design simulation, 5X for gate-level simulation and 10X for parallel design for test (DFT) simulation, potentially saving weeks to months on project schedules.
  • Broad applicability: The simulator supports modern design styles and IEEE standards, enabling engineers to realize performance gains without recoding.
  • Easy to use: The simulator’s compilation and elaboration flow assigns the design and verification testbench code to the ideal engines and automatically selects the optimal number of cores for fast execution speed.
  • Incorporates several new patent-pending technologies to improve productivity: New features that speed overall SoC verification time include SystemVerilog testbench coverage for faster verification closure and parallel multi-core build.
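To put the quoted speedups into schedule terms, a back-of-the-envelope calculation (my own illustration, not a Cadence figure) shows that the saving applies only to the portion of the schedule that simulation actually occupies, Amdahl-style:

```python
def schedule_after_speedup(total_weeks: float, sim_fraction: float,
                           speedup: float) -> float:
    """New schedule length when only the simulation portion is accelerated."""
    sim = total_weeks * sim_fraction       # weeks spent simulating
    other = total_weeks - sim              # everything else is unchanged
    return other + sim / speedup

# Hypothetical 20-week verification schedule, 60% spent in RTL simulation,
# accelerated by the quoted 3X average:
new = schedule_after_speedup(20, 0.6, 3.0)
print(round(new, 1))  # 12.0
```

Under these hypothetical numbers, a 3X RTL simulation speedup on a 20-week schedule that is 60% simulation saves eight weeks; the smaller the simulation fraction, the smaller the saving.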

“Verification is often the primary cost and schedule challenge associated with getting new, high-quality products to market,” said Dr. Anirudh Devgan, senior vice president and general manager of the Digital & Signoff Group and the System & Verification Group at Cadence. “The Xcelium simulator combined with JasperGold Apps, the Palladium Z1 Enterprise Emulation Platform and the Protium S1 FPGA-Based Prototyping Platform offer customers the strongest verification suite on the market.”

The new Xcelium simulator further extends the innovation within the Cadence Verification Suite and supports the company’s System Design Enablement (SDE) strategy, which enables system and semiconductor companies to create complete, differentiated end products more efficiently. The Verification Suite is comprised of best-in-class core engines, verification fabric technologies and solutions that increase design quality and throughput, fulfilling verification requirements for a wide variety of applications and vertical segments.

Protium S1

The Protium S1 platform provides front-end congruency with the Cadence Palladium Z1 Enterprise Emulation Platform. By using Xilinx Virtex UltraScale FPGA technology, the new Cadence platform features 6X higher design capacity and an average 2X performance improvement over the previous generation platform. The Protium S1 platform has already been deployed by early adopters in the networking, consumer and storage markets.

Protium S1 is fully compatible with the Palladium Z1 emulator

To increase designer productivity, the Protium S1 platform offers the following benefits:

  • Ultra-fast prototype bring-up: The platform’s advanced memory modeling and implementation capabilities allow designers to reduce prototype bring-up from months to days, thus enabling them to start firmware development much earlier.
  • Ease of use and adoption: The platform shares a common compile flow with the Palladium Z1 platform, which enables up to 80 percent re-use of the existing verification environment and provides front-end congruency between the two platforms.
  • Innovative software debug capabilities: The platform offers firmware and software productivity-enhancing features including memory backdoor access, waveforms across partitions, force and release, and runtime clock control.

“The rising need for early software development with reduced overall project schedules has been the key driver for the delivery of more advanced emulation and FPGA-based prototyping platforms,” said Dr. Anirudh Devgan, senior vice president and general manager of the Digital & Signoff Group and the System & Verification Group at Cadence. “The Protium S1 platform offers software development teams the required hardware and software components, a fully integrated implementation flow with fast bring-up and advanced debug capabilities so they can deliver the most compelling end products, months earlier.”

The Protium S1 platform further extends the innovation within the Cadence Verification Suite and supports the company’s System Design Enablement (SDE) strategy, which enables system and semiconductor companies to create complete, differentiated end products more efficiently. The Verification Suite is comprised of best-in-class core engines, verification fabric technologies and solutions that increase design quality and throughput, fulfilling verification requirements for a wide variety of applications and vertical segments.

Blog Review – Monday May 16, 2016

Monday, May 16th, 2016

Ramifications for Intel; Verification moves to ASIC; Connected cars; Deep learning is coming; NXP TFT preview

Examining the industry’s transition to 5G, Dr. Venkata Renduchintala, Intel, describes the revolution of connectivity and why the company is shifting its SoC focus and exploiting its ecosystem.

Coming from another angle, Chris Ciufo, Intel Embedded, assesses the impact of the recently announced changes at Intel, including the five pillars designed to support the company (data center, memory, FPGAs, IoT and 5G), with his thoughts on what it has in its arsenal to achieve the new course.

As FPGA verification flows move closer to those of ASICs, Dr. Stanley Hyduke, Aldec, looks at why the company has extended its verification tools for digital ASIC design, including the steps involved.

Software in vehicles has been a sensitive topic for some since the VW emissions scandal, but Synopsys took the opportunity of the Future Connect Cars Conference in Santa Clara to highlight its Software Integrity Platform. Robert Vamosi, Synopsys, reports on some of the presentations at the event on the automotive industry.

Identifying excessive blocking in sequential programming as evil, Miro Samek, ARM, writes a spirited and interesting blog on real-time design strategy and the need to keep it flexible from the earliest stages.

Santa Clara also hosted the Embedded Vision Summit, and Chris Longstaff, Imagination Technologies, writes about deep learning on mobile devices. He notes that Cadence Design Systems highlighted the increase in the number of sensors in devices today, and Google Brain’s Jeff Dean talked about the use of deep learning via GoogLeNet Inception architecture. The blog also includes examples of Convolutional Neural Networks (CNN) and how PowerVR mobile GPUs can process the complex algorithms.
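For readers new to the topic, the workhorse operation those mobile GPUs accelerate is the 2D convolution at the heart of a CNN. A minimal NumPy sketch (illustrative only; real frameworks apply the same sliding-window idea across many kernels and channels, heavily optimized):

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2D convolution as used in CNNs (no kernel flip):
    slide the kernel over the image and sum elementwise products."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 3x3 edge-detecting kernel applied to a tiny linear-ramp 'image'
image = np.arange(25, dtype=float).reshape(5, 5)
edge = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
print(conv2d(image, edge))  # all zeros: a smooth ramp has no edges
```

A network like GoogLeNet Inception stacks hundreds of such filter applications, which is why dedicated GPU hardware matters on power-constrained mobile devices.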

This week, NXP FTF (Freescale Technology Forum), in Austin, Texas, is previewed by Ricardo Anguiano, Mentor Graphics. He looks at a demo from the company, where a simultaneous debug of a patient monitoring system runs Nucleus RTOS on the ARM Cortex-M4. He hints at what attendees can see using Sourcery CodeBench with ARM processors and a link to heterogeneous solutions from the company.

Caroline Hayes, Senior Editor

Blog Review – Monday, February 15, 2016

Monday, February 15th, 2016

Research converts contact lens to computer screens; What to see at Embedded World 2016; Remembering Professor Marvin Minsky; How fast is fast and will the IoT protect us?

The possibilities for wearable technology, where a polymer film coating can turn a contact lens into a computer screen, are covered by Andrew Spence of the University of South Australia’s Future Industries Institute. Applications range from a lens used as a sensor to measure blood glucose levels to a pair of glasses acting as a computer screen.

If you are preparing your Embedded World 2016, Nuremberg, schedule, Philippe Bressy, ARM, offers an overview of what will be at his favourite event. He covers the company’s offerings for IoT and connectivity, single board computing, software productivity and automotive, plus what can be seen from ARM’s partners on the ARM booth (Hall 5, stand 338), as well as some of the technical conference’s sessions and classes.

Other temptations can be found at the Xilinx booth at Embedded World (Hall 1, stand 205). Steve Leibson, Xilinx, explains how visitors can win a Digilent ARTY Dev Kit based on an Artix-7 A35T-1LI FPGA, with the Xilinx Vivado HLx Design Edition.

Showing more of what can be done with the mbed IoT Device Platform, Liam Dillon, ARM, writes about the reference system for SoC design for IoT endpoints, and its latest proof-of-concept platform, Beetle.

How fast is fast, muses Richard Mitchell, Ansys. He focuses on the Ansys 17.0 and its increased speeds for structural analysis simulations and flags up a webinar about Ansys Mechanical using HPC on March 3.

If the IoT is going to be omnipresent, proposes Valerie C, Dassault, can we be sure that it can protect us? And what lies ahead?

A pioneer of artificial intelligence, Professor Marvin Minsky has died at the age of 88. Rambus fellow Dr David G Stork remembers the man, his career and his legacy in this field of technology.

I do enjoy Whiteboard Wednesdays, and Corrie Callenback, Cadence, has picked a great topic for this one – Sachin Dhingra’s look at automotive Ethernet.

Another thing I particularly enjoy is a party, and Hélène Thibiéroz, Synopsys, reminds us that it is 35 years since HSPICE was introduced. (Note to other party-goers: fireworks to celebrate are nice, but cake is better!)

Caroline Hayes, European Editor

Blog Review – Monday, January 11, 2016

Monday, January 11th, 2016

In this week’s review, as one blog has predictions for what 2016 holds, another reviews 2015. Others cover an autonomous flight drone; a taster of DesignCon 2016 and a bionic leg development.

Insisting it’s not black magic or fortune telling but a retelling of notes from past press announcements, Dick James, Chipworks, thinks 2016 will be a year of mixed fortunes, with a low profile for leading edge processes and plenty of activity in memory and sensors as the sectors reap the rewards of developments being realized in the marketplace.

Looking back on 2015, Tom De Schutter, Synopsys, is convinced that the march of software continues and world domination is but a clock cycle away. His questions prompted some interesting feedback on challenges, benefits and working lives.

Looking ahead to autonomous drone flight, Steve Leibson, Xilinx, reports on the beta release of Aerotenna’s OCPoC (Octagonal Pilot on Chip) ready-to-fly drone-control platform, based on a Zynq Z-7010 All Programmable SoC with integrated IMU (inertial measurement unit) sensors and a GPS receiver.

Bigger isn’t always better, explains Doug Perry, Doulos, in a guest blog for Aldec. As well as outlining the issues facing those verifying larger FPGAs, he provides a comprehensive, and helpful, checklist to tackle this increasingly frequent problem, while throwing in a plug for two webinars on the subject.

Some people have barely unpacked from CES, and ANSYS is already preparing for DesignCon 2016. Margaret Schmitt previews the company’s plan for ‘designing without borders’, with a look at what, and who, can be seen there.

A fascinating case study is related by Karen Schulz, Gumstix, on the ARM Community blog site. The Rehabilitation Institute of Chicago (RIC) has developed the first neural-controlled bionic leg, without nerve redirection surgery or implanted sensors. The revolution is powered by the Gumstix Overo Computer-on-Module.

Showing empathy for engineers struggling with timing closure, Joe Hupcey III, Mentor Graphics, has some sound advice and diagnoses CDC problems. It’s not as serious as it sounds: CDC, or clock domain crossing, can be addressed with the IEEE 1801 low-power standard. Just what the doctor ordered.

Caroline Hayes, Senior Editor

Blog Review – Monday, October 26, 2015

Monday, October 26th, 2015

Counting gates til the chickens come home to roost; Bio lab on a desk; Twin city goes digital; Back to the Future Day; Graphics SoC playground; Wearables get graphic

Something is troubling Michael Posner, Synopsys: when is a gate not a gate? He discusses the FPGA capacity of Xilinx’s UltraScale FPGAs and tries to find the answer. He also describes the Heath Robinson-style, light-controlled chicken feeder he has installed in the chicken coop.

A desktop biolab sounds like something in a teenage boy’s room, but Amino, relates Atmel, is the ‘brainchild’ of Julie Legault. The Arduino-based bio-engineering system enables anyone to grow and take care of living cells. The mini lab allows the user to genetically transform an organism’s DNA through guided interactions. The Arduino-driven hardware monitors the resulting synthetic organism, which needs to be fed and kept warm. For those old enough to remember the Tamagotchi craze: it just moved up a gear.

3D computer models of buildings and cities take on a new role, demonstrated by Dassault Systèmes, whose 3DEXPERIENCity continuously generates the city as a digital twin. Ingeborg Rocker explains how the IoT is used by the multi-dimensional data model, which integrates population density, traffic density, weather, energy supply and recycling volumes in real time to support city planners.

Recent acquisitions in the industry are analysed by Paul McLellan, Cadence Design Systems. Beginning with ARM’s acquisition of Carbon Design Systems, McLellan puts the deal in a market and engineering context. He moves on to Lam Research’s acquisition of KLA-Tencor and Western Digital’s purchase of SanDisk.

Putting the AMD R-Series through its paces, Christopher Hallinan, Mentor Graphics, delights in the versatility of the SoC, as discovered with Mentor Embedded Linux. He gives real-life examples of algorithms and how the visuals apply to industrial and scientific applications.

Celebrating a noteworthy date, Back to the Future Day (October 21, 2015), Tobias Wilson-Bates, Georgia Tech, looks at how time travel has been portrayed in fiction. It gets philosophical: “One way to think about future speculations is to imagine that there are all these failed futures that co-exist with a present reality”, but Marty would approve.

The acceptance of the Mali-470 GPU into the wearables camp is complete. Dan Wilson, ARM, explains how the GPU exploits its OpenGL ES 2.0 graphics support and low power consumption for wearable and IoT applications.

Caroline Hayes, Senior Editor

Focus on France in Nuremberg

Friday, March 27th, 2015

Although the venue was the German city of Nuremberg, there was a distinctive coterie of French companies at Embedded World, writes Caroline Hayes, senior editor.

The market is not standing still, as evidenced by ReFLEX CES’ acquisition of the hardware companies in the PLDA Group. The modified-off-the-shelf (MOTS) and turnkey embedded systems company used Embedded World 2015 as a platform for the FPGA-based board and system-on-module (SoM) lines acquired in the sale of the sister companies in January.

The acquisition brings the Accelize range, dedicated to the financial market, with FPGA accelerator boards, FPGA network processing boards, FPGA prototyping boards and SoMs alongside ReFLEX CES’ complex, high-speed and mezzanine FPGA boards based on PCIe, VPX and CompactPCI form factors.

“By uniting PLDA Group’s hardware solutions at ReFLEX CES, we strengthen our position as a leading provider of proven FPGA-based hardware for both COTS and MOTS FPGA solutions,” said Sylvain Neveu, ReFLEX CES and PLDA Group COO. He added that the diversification is expected to pave the way for a forecast growth rate of over 20% this year.

FPGA accelerator boards, network processing boards, and the XpressGX4, XpressGX5, XpressK7, XpressV7, XPressKUS prototyping boards as well as SoMs based on Xilinx and Altera FPGA devices took centre stage.

The company also launched FPGA boards based on Altera’s Arria10 FPGA at the show. The Arria10 GX FPGA board and an Arria10 SoC board target military and defense, communications, broadcast, high-performance computing, test and measurement and medical applications.

The Stimulus tool from Argosim saves rework time, says the company

Another Altera FPGA, the Cyclone V, was used in a GigE vision video demonstration board showing the company’s baseboard design, custom embedded design for system, board, firmware and software, as well as manufacturing capabilities.

In another hall, Argosim was demonstrating a modelling and simulation tool that validates requirements before the design begins to help developers speed time to market.

The Stimulus modeling and simulation environment edits, debugs and tests design requirements so that system designers can verify the requirements before design begins.

The company estimates that around 40 to 60% of design bugs are caused by faulty requirements, so being able to verify them before beginning work reduces specification errors, process iterations and compliance costs for certification.

The tool’s high level modeling language formalizes natural language requirements, allowing the developer to express natural language requirements using formalized language templates, state machines, and block diagrams. These requirements can then be fully tested to identify errors before any design or coding takes place. A simulation engine generates and analyses executable traces to test requirements for real-time, safety-critical systems. The tool allows developers to define generic test scenarios and debug against inputs and generate test vectors for software-in-the-loop validation to reduce integration rework.
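The article does not show Stimulus’s actual notation, but the underlying idea of executing a formalized requirement against a simulation trace can be sketched generically. Here a hypothetical bounded-response requirement (“whenever req is asserted, ack must follow within N steps”) is checked over a recorded trace; the signal names and trace format are illustrative assumptions, not Argosim’s:

```python
def check_bounded_response(trace, trigger, response, max_delay):
    """Check that every step where `trigger` holds is followed by
    `response` within `max_delay` steps.
    Returns the first violating step index, or None if the trace passes."""
    for i, step in enumerate(trace):
        if step.get(trigger):
            window = trace[i:i + max_delay + 1]
            if not any(s.get(response) for s in window):
                return i
    return None

# A hypothetical trace: req at step 0 is acked at step 2; req at step 4 never is.
trace = [{"req": True}, {}, {"ack": True}, {}, {"req": True}, {}, {}]
print(check_bounded_response(trace, "req", "ack", 3))  # 4
```

Running many generated traces through checks like this before any design exists is what lets faulty requirements surface early, rather than during integration testing.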

Fabien Gaucher, CEO, Argosim, explained that it is the ability to strip out some of the rework currently incurred that shortens time to market; today, faults are often only found after coding, as a result of testing. One application example is the French electricity supplier, EDF. The utility company generates over 60GWe from more than 50 nuclear reactors and must comply with rigorous safety-critical standards. It uses Stimulus to validate functional system requirements early, and independently of the design choices made by third-party sub-contractors.

It was a busy show for IP company, Cortus, with three software partners announced and two new cores.

The first partner is Oryx Embedded, whose CycloneTCP dual IPv4/v6 stack supports IPv6, the next-generation Internet Protocol that enables the larger address space the Internet of Things will need. The second is Nabto, with an integrated design environment (IDE) designed to offer licensees secure point-to-point connectivity from mobile devices, PCs and data systems to Nabto’s interfaces. Finally, the company announced that Blunk Microsystems’ TargetTools is now interfaced to the Cortus toolchain as an IDE for TargetOS, the pre-emptive real-time operating system (RTOS) ported to the Cortus APS architecture.

The company has also announced design wins for some significant companies, such as Atmel’s WINC1500, the Cortus APS3-based, IEEE 802.11b/g/n IoT network controller, used in, among other WiFi applications, the Atmel Arduino WiFi shield 101. Microsemi has also integrated the Cortus APS1 in a family of smart sensor interface ICs, and StarChip, also based in France, has announced the third generation of Cortus-based SIM IC controllers.

I asked Roddy Urquhart, Vice President of Sales and Marketing, Cortus, about the ‘groundswell’ of companies from France at this year’s Embedded World. “Cortus is based near the University of Montpellier and other technology companies, so there is the local base, but also initiatives from the French Government, at local and national level,” he says, “together with grants to promote R&D.” He also talked about the synergy of the south of France, with its strength in smartcards and SIM technology, such as StarChip, which supplies controllers for cards up to and including LTE technology.

“The Cortus minimalist core makes it suitable for the cost-sensitive smartcard and SIM market,” he continues. “The minimal processor architecture can be used for any embedded application to save silicon, power and to add security,” he adds. The company reports a ramping up of licensees in the first half of this year in connected intelligent devices such as IoT network controllers, smart sensors, touchscreen controllers and next-generation SIM cards.

The company’s latest cores, the APS23 and APS25, were released in October 2014 and are based on the Cortus v2 instruction set.

By Caroline Hayes, senior editor.

Week in Review October 29

Tuesday, October 29th, 2013


Altera chooses quad-core 64bit ARM Cortex-A53 for Stratix 10 SoCs
At ARM TechCon this week, Altera announced that its Stratix 10 SoC devices, manufactured on Intel’s 14nm Tri-Gate process, will incorporate the quad-core, 64bit ARM Cortex-A53 processor. It is the first 64bit processor used on an SoC FPGA, says the company. The devices will roll out in May 2014 and will supersede the previous FPGA series, Stratix V, says Chris Balough, senior director, SoC Products, Altera.

Spreadtrum signs ARM access license agreement
Fabless semiconductor company Spreadtrum has signed a license agreement with ARM for the supply of Artisan physical IP for IC foundry and 28nm processes, providing the Chinese company access to Artisan standard cells; next-generation memory compilers, including single- and dual-port SRAM compilers, one- and two-port register file compilers and ROM compilers; as well as POP IP for ARM Cortex processors and Mali GPUs.

IP Core interconnects mixed FPGAs
French design and manufacturing company ReFLEX CES has released an FPGA Aurora-like 64B/66B IP core to interconnect Xilinx and Altera high-speed transceiver FPGAs for embedded military and telecommunications applications. It is based on Altera FPGAs and supports encoding and high-speed interfaces up to 14.4Gbit/s, enabling interoperability between the two competitors’ FPGAs with an effective bandwidth of up to 97%, says the company.
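The quoted figure of “up to 97%” effective bandwidth follows from the 64B/66B line code itself: each 66-bit transmitted block carries 64 payload bits. A quick calculation (protocol framing overhead would reduce the real figure slightly further):

```python
line_rate_gbps = 14.4          # raw transceiver line rate
efficiency = 64 / 66           # 64B/66B: 64 payload bits per 66-bit block
payload_gbps = line_rate_gbps * efficiency

print(f"{efficiency:.1%}")                   # 97.0%
print(f"{payload_gbps:.2f} Gbit/s payload")  # 13.96 Gbit/s payload
```

The same arithmetic explains why 64B/66B displaced the older 8B/10B code (80% efficient) for 10Gbit/s-class serial links.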

Programmable SoCs use energy-friendly ARM technology
Multi-core microcontroller company XMOS has teamed up with Silicon Labs in a technology partnership to integrate ARM technology into its xCORE multi-core microcontrollers and produce the next generation of programmable SoCs (systems on chip). Silicon Labs will contribute its energy-efficient, ARM Cortex-M3-based EFM32 Gecko microcontrollers to the resulting xCORE-XA (eXtended Architecture) family. The xCORE-XA technology was shown at ARM TechCon 2013.

Blog Review October 10 2013

Thursday, October 10th, 2013

By Caroline Hayes

At the TSMC Open Innovation Platform (TSMC OIP) Ecosystem Forum, Richard Goering hears that 16nm FinFET design and 3D ICs are moving closer to volume production. Dr Cliff Hou, vice president, R&D, TSMC, warned that although EDA tools and flows have been qualified, foundation IP has been validated and interface IP is under development, one tool does not guarantee success, calling for a “more rigorous validation methodology”.

Steve Favre was also at TSMC OIP, discussing 450mm wafers. He wondered why EUV (extreme ultraviolet) patterning has become a gating item for the move to 450mm, and how the two are related. Money, as usual, is the answer: it would cost billions of dollars to build a 450mm wafer fab and billions more to move to EUV – why pay twice?

Lakshmi Mandyam from ARM’s smart connected community reflects on her journey from the power-hungry, slow-booting laptop to a touch-sensor, multi-screen tablet. She ends by marking the anniversary of her laptop-free life. Maybe she should start an LA (Laptops Anonymous) support group?

Chip Design’s John Blyler cringes with embarrassment while following up a nanotechnology lead at IEF in Dublin, Ireland. The lapse of US government funding is proclaimed on the National Institute of Standards and Technology website, accounting for the closure of the site and its affiliated websites. He turns to the French for further research, over a croissant – naturellement.

Pity Brian Fuller, caught off-guard by the usually genial analyst Gary Smith in an interview for Unhinged. Smith urged EDA vendors to be bolder, pooh-poohed the idea of industry consolidation, held forth on the power of the press and then complimented John Cooley. What is the world coming to?

Michael Posner sounds the alarm that “My RTL is an alien”, neatly timed to coincide with a white paper by Synopsys which details ways to accelerate FPGA (field programmable gate array)-based prototyping. With over 70% of today’s ASICs and systems-on-chips (SoCs) being prototyped in an FPGA, designers are looking for ways to ease the creation of FPGA-based prototypes directly from the ASIC design source files.

Gabe Moretti is feeling nostalgic in preparation for the Back to the Future Dinner organized by the EDA Consortium at the Computer Museum, Mountain View, California, this month.

In this blog he remembers the early days of EDA, when it was called CAD (computer aided design) and rubylith was cut by hand. Those were the days!

EDA in the year 2017 – Part 2

Tuesday, January 17th, 2017

Gabe Moretti, Senior Editor

The first part of the article, published last week, covered design methods and standards in EDA, together with industry predictions that affect all of our industry. This part will cover automotive, design verification and FPGAs. I found it interesting that David Kelf, VP of Marketing at OneSpin Solutions, thought that machine learning will begin to penetrate the EDA industry as well. He stated: “Machine learning hit a renaissance and is finding its way into a number of market segments. Why should design automation be any different? 2017 will be the start of machine learning to create a new breed of design automation tool, equipped with this technology and able to configure itself for specific designs and operations to perform them more efficiently. By adapting algorithms to suit the input code, many interesting things will be possible.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence, touched on an issue that is being talked about more recently: security. He noted: “In 2016, IoT bot-net attacks brought down large swaths of the Internet – the first time the security impact of IoT was felt by many. Private and nation-state attacks compromised personal, corporate and government email throughout the year.”

In 2017, we have the potential for security concerns to start a retreat from always-on social media and a growing value on private time and information. I don’t see a silver bullet for security on our horizon. Instead, I anticipate an increasing focus for products to include security managers (like their safety counterparts) on the design team and to consider safety from the initial concept through the design/production cycle.

Figure 1.  Just one of the many electronics systems found in an automobile (courtesy of Mentor)

Automotive

The automotive industry has increased its use of electronics year over year for a long time. At this point an automobile is a true intelligent system, at least as far as what the driver and passengers can see and hear: the “infotainment system”. Late-model cars also offer collision avoidance and stay-in-lane functions, but more is coming.

Here is what Wally Rhines thinks: “Automotive and aerospace designers have traditionally been driven by mechanical design.  Now the differentiation and capability of cars and planes is increasingly being driven by electronics.  Ask your children what features they want to see in a new car.  The answer will be in-vehicle infotainment.  If you are concerned about safety, the designers of automobiles are even more concerned.  They have to deal with new regulations like ISO 26262, as well as other capabilities, in addition to environmental requirements and the basic task of “sensor fusion” as we attach more and more visual, radar, laser and other sensors to the car.  There is no way to reliably design vehicles and aircraft without virtual simulation of electrical behavior.

In addition, total system simulation has become a requirement.  How do you know that the wire bundle will fit through the hole in the door frame?  EDA tools can tell you the answer, but only after seeking out the data from the mechanical design.  Wiring in a car or plane is a three dimensional problem.  EDA tools traditionally worry about two dimension routing problems.  The world is changing.  We are going to see the basic EDA technology for designing integrated circuits be applied to the design of systems. Companies that can remain at the leading edge of IC design will be able to apply that technology to systems.”

David Kelf, VP of Marketing at OneSpin Solutions, observed: “OneSpin called it last year and I’ll do it again –– Automotive will be the ‘killer app’ of 2017. With new players entering the market all the time, we will see impressive designs featured in advanced cars, which themselves will move toward a driverless future. All automotive designs currently being designed for safety will need to be built to be as secure as possible. The ISO 26262 committee is working on security as well as safety and I predict security will feature in the standard in 2017. Tools to help predict vulnerabilities will become more important. Formal, of course, is the perfect platform for this capability. Watch for advanced security features in formal.”

Rob Knoth, Product Management Director, Digital and Signoff Group at Cadence, noted: "In 2016, autonomous vehicle technology reached an inflection point. We started seeing more examples of private companies operating SAE Level 3 vehicles in America and abroad (Singapore, Pittsburgh, San Francisco). We also saw active participation by the US and world governments to help guide tech companies in the proliferation and safety of the technology (e.g., US DOT V2V/V2I standard guidelines, and federal ADAS guidelines created to prevent state-level differences). Probably the most striking example was the first drone delivery by a major retailer, something hinted at three years prior that seemed a mere flight of fancy then.

Looking ahead to 2017, both the breadth and depth of deployment are expected to expand, including the first operation of SAE Level 4/5 vehicles in limited use on public streets outside the US, and on private roads inside the US. Outside of ride sharing and city driving, I expect to see ADAS technology spread increasingly to long-distance trucking and non-urban transportation. To enable this, additional investments from traditional vehicle OEMs partnering with both software and silicon companies will be needed to deliver high levels of autonomous function. To help bring these to reality, I also expect the release of new standards to guide both the functional safety and reliability of automotive semiconductors. Even though the pace of government standards can lag, for ADAS technology to reach its true potential, it will require both standards and innovation."

FPGA

The IoT market is expected to provide a significant opportunity for the electronics industry to grow revenue and open new markets. I think the use of FPGAs in IoT devices will increase the role these devices play in system design.

I asked Geoff Tate, CEO of Flex Logix, for his opinions on the subject. He offered four points that he expects to become reality in 2017:

1. the first customer chip will be fabricated using embedded FPGA from an IP supplier

2. the first announcements will be made of customers adopting embedded FPGA from an IP supplier

3. embedded FPGAs will be proven in silicon running at 1GHz+

4. the number of customers doing chip design using embedded FPGA will grow from a handful to dozens.

Zibi Zalewski, Hardware Division General Manager at Aldec, also addressed the FPGA subject.

"I believe FPGA devices are an important technology to mention when talking about what to expect in 2017. With the growth of embedded electronics driven by the Automotive, Embedded Vision, and IoT markets, FPGA technology becomes a core element, particularly in products that require low power and re-programmability.

Features of FPGAs such as pipelining, and the ability to execute and easily scale parallel instances of the implemented function, allow FPGAs to serve more than just the traditionally understood embedded markets. FPGA computing power is exploding in High Performance Computing (HPC), where FPGA devices are used to accelerate various scientific algorithms and big data processing, and to complement CPU-based data centers and clouds. We can't talk about FPGAs these days without mentioning SoC FPGAs, which merge a microprocessor (quite often an ARM core) with reprogrammable fabric. Thanks to such configurations, it is possible to combine the software and hardware worlds in one device, with the benefits of both.

All those activities have led to solid growth in FPGA engineering, which is pushing further growth of FPGA development and verification tools. This includes not only typical simulation and implementation solutions. We should also see solid growth in tools and services that simplify the use of FPGAs for those who don't even know the technology, such as high-level synthesis or engineering services to port C/C++ sources into FPGA-implementable code. The demand for development environments like compilers supporting both software and hardware platforms will only grow, with the main goal focused on ease of use by a wide group of engineers who were not even considering the FPGA platform for their target application.

At the other end of the FPGA rainbow are the fast-growing, largest FPGAs offered by both Xilinx and Intel/Altera. ASIC design emulation and prototyping built on these devices will push harder and harder on the so-called big-box emulators, offering higher performance and a significantly lower price per gate, and so becoming affordable for even smaller SoC projects. This is especially true when partnered with high-quality design mapping software that handles multi-FPGA partitioning, interconnections, clocks and memories."

Figure 2. Verification can look like a maze at times

Design Verification

There are many methods to verify a design, and companies will quite often use more than one on the same design. Each method (simulation, formal analysis, and emulation) has its strong points.

For many years, logic simulation was the only tool available, later supplemented by hardware acceleration of logic simulation.

Frank Schirrmeister, Senior Product Management Group Director, System and Verification Group at Cadence, submitted a thorough analysis of verification issues. He wrote: "From a verification perspective, we will see further market specialization in 2017: mobile, server, automotive (especially ADAS) and aero/defense markets will further create specific requirements for tools and flows, including ISO 26262 TCL1 documentation and support for other standards. The Internet of Things (IoT), with its specific security and low-power requirements, really runs across application domains. Continuing the trend of 2016, verification flows will become more application-specific in 2017, often centered on specific processor architectures. For instance, verification solutions optimized for mobile applications have different requirements than those for servers and automotive applications, or even aerospace and defense designs. As application-specific requirements grow stronger and stronger, this trend is likely to continue going forward, but cross-impact will also happen (like mobile and multimedia on infotainment in automotive).

Traditionally, ecosystems have been centered on processor architectures. Mobile and server are key examples, with their respective leading architectures holding the lion's share of their markets. The IoT is mixing this up a little, as more processor architectures can play and offer unique advantages with configurable and extensible architectures. No clear winner is in sight yet, but 2017 will be a key year in the race between IoT processor architectures. Even open-source hardware architectures look like they will be very relevant, judging from recent momentum that eerily reminds me of the early Linux days. It's one of the most entertaining spaces to watch in 2017 and for years to come.

Verification will become a whole lot smarter. The core engines themselves will continue to compete on performance and capacity. Differentiation moves further into how smart the applications running on top of the core engines are, and how smartly the engines are used in conjunction.

For the dynamic engines in software-based simulation, the race towards increased speed and parallel execution will accelerate together with flows and methodologies for automotive safety and digital mixed-signal applications.

In the hardware emulation world, differentiation between the two basic ways of emulating, processor-based and FPGA-based, will be more and more determined by how the engines are used. Specifically, the various use models for core emulation, such as verification acceleration, low-power verification, dynamic power analysis, and post-silicon validation, often driven by the ever-growing software content, will extend further, with more virtualization joining real-world connections. Yes, there will also be competition on performance, which clearly varies between processor-based and FPGA-based architectures, depending on design size and how much debug is enabled, as well as on the versatility of use models, which determines the ROI of emulation.

FPGA-based prototypes address the designer's performance needs for software development, using the same core FPGA fabrics. Differentiation therefore moves into the software stacks on top, while multi-fabric compilation provides congruency between the two platforms by allowing the same design to be mapped into both emulation and FPGA-based prototyping.

All this is complemented by smart connections into formal techniques and cross-engine verification planning, debug and software-driven verification (i.e. software becoming the test bench at the SoC level). Based on standardization driven by the Portable Stimulus working group in Accellera, verification reuse between engines and cross-engine optimization will gain further importance.

Besides horizontal integration between engines (virtual prototyping, simulation, formal, emulation and FPGA-based prototyping), the vertical integration between abstraction levels will become more critical in 2017 as well. For low power specifically, activity data created from RTL execution in emulation can be connected to power information extracted from .lib technology files using gate-level representations, or to power estimation from RTL. This allows designers to estimate hardware-based power consumption in the context of software, using deep cycles over the longer timeframes that are emulated."

Anyone who knows Frank will not be surprised by the length of the answer.

Wally Rhines, Chairman and CEO of Mentor Graphics, was less verbose. Echoing his earlier comments, he said: "Total system simulation has become a requirement. How do you know that the wire bundle will fit through the hole in the door frame? EDA tools can tell you the answer, but only after seeking out the data from the mechanical design. Wiring in a car or plane is a three-dimensional problem, while EDA tools traditionally worry about two-dimensional routing problems. The world is changing. We are going to see the basic EDA technology for designing integrated circuits be applied to the design of systems. Companies that can remain at the leading edge of IC design will be able to apply that technology to systems.

This will create a new market for EDA.  It will be larger than the traditional IC design market for EDA.  But it will be based upon the basic simulation, verification and analysis tools of IC design EDA.  Sometime in the near future, designers of complex systems will be able to make tradeoffs early in the design cycle by using virtual simulation.  That know-how will come from integrated circuit design.  It’s no longer feasible to build prototypes of systems and test them for design problems.  That approach is going away.  In its place will be virtual prototyping.  This will be made possible by basic EDA technology.  Next year will be a year of rapid progress in that direction.  I’m excited by the possibilities as we move into the next generation of electronic design automation.”

The increasing size of chips has made emulation a more popular tool than in the past. Lauro Rizzatti, Principal at Lauro Rizzatti LLC, is a pioneer in emulation and continues to be regarded as a leading expert in the method. He noted: "Expect new use models for hardware emulation in 2017 that will support traditional market segments such as processor, graphics, networking and storage, as well as markets currently underserved by emulation: safety and security, along with automotive and IoT.

Chips will continue to get bigger and more complex, and to include an ever-increasing amount of embedded software. Project groups will increasingly turn to hardware emulation because it is the only verification tool able to debug the interaction between the embedded software and the underlying hardware. It is also the only tool capable of estimating power consumption in a realistic environment, when the chip design is booting an OS and processing software apps. More to the point, hardware emulation can thoroughly test the integrity of a design after the insertion of DFT logic, since it can verify gate-level netlists of any size, a virtually impossible task for logic simulators.

Finally, its move to the data center solidifies its position as a foundational verification tool that offers a reasonable cost of ownership.”

Formal verification tools, sometimes referred to as "static analysis tools," have seen their use increase year over year once vendors found user interfaces that do not require a highly trained user. Roger Sabbagh, VP of Application Engineering at Oski Technology, pointed out: "The world is changing at an ever-increasing pace, and formal verification is one area of EDA that is leading the way. As we stand on the brink of 2017, I can only imagine what great new technologies we will experience in the coming year. Perhaps it's having a package delivered to our house by a flying drone, riding in a driverless car, or eating food created by a 3-D printer. But one thing I do know is that in the coming year, more people will have the critical features of their architectural design proven by formal verification. That's right: system-level requirements, such as coherency, absence of deadlock, security and safety, will increasingly be formally verified at the architectural design level. Traditionally, we relied on RTL verification to test these requirements, but the coverage and confidence gained at that level are insufficient. Moreover, bugs may be found very late in the design cycle, where they risk generating a lot of churn. The complexity of today's systems of systems on a chip dictates that a new approach be taken. Oski is now deploying architectural formal verification with design architects very early in the design process, before any RTL code is developed, and it's exciting to see the benefits it brings. I'm sure we will be hearing a lot more about this in the coming year and beyond!"
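An architectural property like "absence of deadlock" is, at its core, a reachability question over an abstract model of the design, which is what makes it provable before any RTL exists. The sketch below is a deliberately tiny explicit-state check, nothing like the capacity of a commercial formal tool, and the handshake state machine in it is invented purely for illustration.

```python
from collections import deque

def find_deadlock(initial, transitions):
    """Breadth-first reachability over an explicit state graph.
    Returns a reachable state with no outgoing transitions (a
    deadlock), or None if every reachable state can make progress."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        successors = transitions.get(state, [])
        if not successors:
            return state          # counterexample: a stuck state
        for s in successors:
            if s not in seen:
                seen.add(s)
                queue.append(s)
    return None                   # property proven on this model

# Hypothetical handshake protocol: 'wait_ack' can never progress.
fsm = {
    "idle":     ["req_sent"],
    "req_sent": ["acked", "wait_ack"],
    "acked":    ["idle"],
    "wait_ack": [],               # the bug a formal check would flag
}
stuck = find_deadlock("idle", fsm)   # -> "wait_ack"
```

The appeal of doing this at the architectural level is that the state space is small enough to exhaust completely, so a "None" result is a proof over the model rather than a sample of simulated behavior.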

Finally, David Kelf, VP of Marketing at OneSpin Solutions, observed: "We will see tight integrations between simulation and formal that will drive adoption among simulation engineers in greater numbers than before. The integration will include the tightening of coverage models, joint debug, and functionality where the formal method can pick up from simulation, and even emulation, with key scenarios for bug hunting."


Conclusion

The two combined articles are indeed quite long.  But the EDA industry is serving a multi-faceted set of customers with varying and complex requirements.  To do it justice, length is unavoidable.
