Get Real Results with Virtual Prototypes

The tools are now mature enough for even risk-averse engineering teams to embrace.

The complexity of chip design and its associated software continues to grow at a breathtaking pace. As a result, the growing system-level design-tools market has recently enjoyed a lot of attention: several startups and all three major electronic-design-automation (EDA) players have entered it. One can safely say that the time for virtual prototyping has come. It's in the common interest of chip companies and their customers to quickly bring to market new products based on systems-on-a-chip (SoCs), and that common interest is helping to drive the adoption of this technology. Virtual prototyping is joining other, more established techniques such as field-programmable-gate-array (FPGA)-based prototyping, which is already in use for 70% of today's chip designs.

A typical electronic system used to consist of a simple piece of hardware running a small amount of software to enable a small set of end features or functions. Times have definitely changed. In today's electronic systems, the hardware may contain tens or even hundreds of processors, all of which must interoperate seamlessly. As if that weren't challenge enough, the amount of software designed into an electronic system has also grown exponentially over the past decade and is now a key element of differentiation for most end products. The growing cost of semiconductor manufacturing has driven approaches like platform-based design, in which one hardware platform serves different markets and the actual product differentiation is done mostly in software.

Not surprisingly, software also accounts for roughly 80% of product delays. The complexity of software, along with the effort required to integrate and validate the combined hardware and software system, has clearly ballooned and continues to grow. The simple question for system designers remains: “How do you get the whole thing to work?”

 

Figure 1: Designers were polled on the percentage of project effort spent on software development.

Exploring All the Options

There are various ways for engineers to create and validate a complex hardware and software system. Traditionally, designers built a chip, then a board, and waited for sample chips to come back from the fab. Next came the integration of the chips onto the boards, after which designers loaded what they thought was working software and tried to bring the whole thing up in what was nothing less than a "big-bang test." System bring-up became a case of trial and error. With no clear picture of what was and wasn't working, the problems designers encountered could originate in the new chips, in the software or, most likely, in both. Slowly, with the aid of debug tools and a lot of intuition, they got more and more of the system to work until finally the product shipped.

The problem with this traditional, sequential approach is that it's unpredictable and usually takes a very long time. Some schedule time is gained by designing the boards at the same time as the chips. But designers still can’t validate that the software will run correctly until they have assembled the system.

One way to develop the software in parallel with the chips is to use an emulator. Once there's working RTL, it can be loaded into the emulator and the software run on top of it. By the time the chips come back from the fab, it's "likely" that the software that ran on the emulator also will work when loaded onto the boards. The big problem here is cost: emulators are relatively expensive. While the cost can typically be justified for hardware verification, that may not be the case for the development of all of the software. In addition, the execution speed of emulators doesn't lend itself to running long, exhaustive software workloads, and the lack of speed often makes it a challenge to connect to real-world interfaces at their native rates.

Another common approach, FPGA-based prototyping, is less expensive and provides much higher execution speeds than an emulator. It has a very useful role to play in software development as well as in system integration and validation. Once a prototype is available, its speed is sufficient to connect to many real-world interfaces at their native rates. As with emulators, however, a complete, verified RTL description is needed before software can be developed on an FPGA-based prototype. In addition, controlling execution and debugging are non-trivial and often require intrusive methods, making it difficult to reproduce and efficiently debug defects at the hardware-software interface.

Enter Virtual Prototyping

Each of the previous approaches offers designers tradeoffs among performance, accuracy, debug efficiency, and cost. Virtual prototyping represents an attractive, promising, and complementary alternative to these well-established approaches. The core tenet of virtual prototyping is the use of a software representation of the relevant behavior of the hardware (i.e., a model of the hardware) that executes the embedded software as faithfully as possible compared with the final chip and system.
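
To make the idea concrete, the sketch below shows what a small piece of such a hardware model might look like in SystemC with TLM-2.0, the modeling standard referenced later in this article. It is only an illustration: the timer peripheral, its register map, and the 10-ns access delay are invented for this example, and a real virtual prototype would combine many such models with instruction-set simulators for the processor cores.

#include <cstdint>
#include <cstring>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_target_socket.h>

// Loosely timed TLM-2.0 model of a hypothetical timer peripheral. Software sees
// four 32-bit memory-mapped registers; each access is a single function call
// annotated with an approximate delay rather than being simulated cycle by cycle.
struct TimerModel : sc_core::sc_module {
    tlm_utils::simple_target_socket<TimerModel> socket;
    std::uint32_t regs[4] = {0, 0, 0, 0};   // CTRL, LOAD, VALUE, STATUS (illustrative map)

    SC_CTOR(TimerModel) : socket("socket") {
        socket.register_b_transport(this, &TimerModel::b_transport);
    }

    // Called once per load/store that the simulated processor issues to this device.
    void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
        const unsigned idx = (trans.get_address() >> 2) & 0x3;      // word-aligned registers
        if (trans.is_write())
            std::memcpy(&regs[idx], trans.get_data_ptr(), sizeof(std::uint32_t));
        else
            std::memcpy(trans.get_data_ptr(), &regs[idx], sizeof(std::uint32_t));
        delay += sc_core::sc_time(10, sc_core::SC_NS);              // approximate access time
        trans.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};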

The virtual prototype offers several distinct advantages versus emulation and FPGA-based prototyping. First, software development can commence before the RTL is finalized, because only a high-level model of the hardware is needed, and that model can be developed from the design specifications. Second, virtual prototypes can be updated easily and quickly, whereas changes to an FPGA-based prototype must be re-synthesized and reprogrammed into the FPGAs, which can be a slow and expensive process depending on the number of FPGA-based development boards used in the project. Third, because execution is based on software simulation, it can be tightly controlled and observed with non-intrusive debugging techniques, so every defect can be easily reproduced once identified. Finally, virtual prototypes can be shared easily and cost-effectively across geographically dispersed development teams.

Of course, there also are drawbacks. For example, it's usually difficult and time-consuming to develop accurate models, particularly when clock-cycle accuracy is required. Luckily, for a large number of software-development tasks, including driver development, so-called loosely timed models are sufficient. It also can be quite challenging to develop a model by reverse-engineering legacy RTL code for which no precise specification exists.
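
For driver development, a loosely timed model of the kind sketched above is typically enough, because driver code cares only about register-level behavior, not cycle timing. The hypothetical fragment below is written against the invented timer registers from the earlier sketch; the base address and bit assignments are assumptions, and in practice they would come from the SoC's memory map.

#include <cstdint>

// Hypothetical base address and register layout; a real driver takes these from
// the SoC's memory map and hardware documentation.
static volatile std::uint32_t* const TIMER =
    reinterpret_cast<volatile std::uint32_t*>(0x40001000u);

enum TimerReg { CTRL = 0, LOAD = 1, VALUE = 2, STATUS = 3 };

// Program the countdown value and set the (assumed) enable bit.
void timer_start(std::uint32_t ticks) {
    TIMER[LOAD] = ticks;
    TIMER[CTRL] = 0x1u;
}

// Poll the (assumed) expiry flag.
bool timer_expired() {
    return (TIMER[STATUS] & 0x1u) != 0;
}

Because the same code exercises the same register map on the model and on the eventual silicon, defects found on the virtual prototype can be reproduced and debugged long before sample chips exist.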

 


Figure 2: Shown is an end-to-end prototyping solution.

A Hybrid Future

One key reason that design teams use prototyping is to achieve faster time to market. We see this mostly in the consumer-electronics industry--especially wireless handsets. Companies in this space are looking to develop whatever they can in parallel. In order to capture customer dollars, they must introduce new features in shrinking market windows against stiff competition and then ramp volume production quickly. In this challenging environment, design teams are making use of both virtual and FPGA-based prototyping techniques. Virtual prototyping has unique advantages when the RTL code doesn't yet exist. Once RTL code is available, FPGA-based prototyping exhibits high performance and accuracy. Obviously, there are advantages to combining both approaches.

Other key reasons for adopting prototyping are safety and security concerns, which are especially prominent in the automotive market. Using virtual hardware representations, more test cases can be executed earlier, leading to greater test coverage and more robust code. In addition, test sequences can be driven to extremes that often cannot be safely reproduced in the real world. Automotive companies have been leading the way in the adoption of a true mixed-hybrid approach. They're building very complex prototypes using a combination of software technologies, dedicated hardware platforms, and even components that are part of a real vehicle. There's a trend in this industry to move toward virtual prototyping because hardware prototypes are expensive and because the amount of software used in vehicles is increasing massively.

Shared Platforms, Shared Costs

The rising cost of software and hardware design is encouraging design teams to find common ground between application areas that were previously thought to be far apart. They're motivated to develop platforms that can address the needs of multiple applications and therefore get a better return on their development costs.

For example, take a TV and a mobile handset. One is a home-entertainment application and the other is a wireless application. From a system-level view, however, they actually have a lot in common and share a growing number of features. It's possible to unify parts of the architecture to serve both of these application areas. A phone connects through a wireless interface to the network, which provides the content. You may be watching video or browsing images on a small screen. But you need to be able to process and display the content. A TV gets its images through a cable connection. But again, you need to process the images to display them on a big screen. While the modems for TVs and phones will be very different, the architectures--especially the application processor that deals with the content on both the phone and TV--can be made to look similar, at least from a software point of view.

This shared-platform approach gives design teams the ability to spread investments across projects. By integrating Linux or Android just once for both a phone and TV, for example, design teams can reduce software-infrastructure development costs. The shared-platform approach also allows companies to spread the development cost of virtual prototypes across multiple product lines.

Supply Chain Creates Demand

As an industry, we've moved away from the old ASIC model. Today's system companies define the chip requirements, often choose the processor cores, and build the application software, with the semiconductor company assuming the task of architecting the SoC. The system companies are under pressure to complete their software before they receive any chip samples from vendors. Meanwhile, semiconductor companies are now taking development responsibility for more of the driver software, middleware, and operating system (OS) that come with their chips. Both can benefit greatly from sharing the same virtual prototype, which allows application software to be created in parallel with SoC development. When the sample chips arrive, the software will already be running on the prototype, and the bring-up process will be smooth and fast.

But there’s a problem. It's one thing for a chip vendor to use a virtual prototype for internal development purposes. But that virtual prototype must be robust, documented, supported, and maintained if a vendor is to make it available to a system company as a supported product. This hasn’t always been an easy task to accomplish. Often, the chip company has built its virtual prototype using a mixture of internal tools. In order to share the virtual prototype, the chip company would have to hand off the tools as well. Any advantages gained by the concept can easily be eroded by the issues of documentation, support, and training.

Moving to a virtual prototype with a commercially supported compute engine and validated IP models will address these problems. Semiconductor vendors that do this can then share their designs within a robust virtual environment that will offer significant value to their design chain.

It's not just an engineering or methodology imperative that's driving the move to virtual prototypes within chip companies. Marketing and sales teams also are interested in virtual prototypes because they see them as a way to engage prospective clients very early on. Virtual prototypes allow them to put a demonstration and evaluation vehicle into the hands of their system company prospects. They can show off the capabilities of their products and get feedback so that they can win business opportunities.

Future Commitment

As well as developing in-house technology, Synopsys has recently acquired new technologies from VaST and CoWare. Often, acquired technology can be difficult to integrate with existing solutions. However, there are several standards, including SystemC and the SPIRIT Consortium’s IP-XACT specification, which enable even proprietary technologies to readily work together using common, standards-compliant models.

Synopsys' vision for future development environments is that there will be co-existence between the various approaches to developing the hardware and software that are essential elements of advanced electronic systems. Developers will create some of their software using previous-generation chips. They'll resort to virtual prototypes when prior-generation chips and RTL code of the new chip aren't available or when very large numbers of prototypes must be made available at low cost. They'll use FPGA-based prototypes once RTL code is available if performance and accuracy are important. Designers will continue to use emulators if absolute accuracy is of the essence and there's no time to develop an FPGA-based prototype.

All in all, the benefits offered by virtual-prototyping technologies are too compelling to ignore. The tools are mature enough for even risk-averse engineering teams to embrace. The reality is that there's more risk in the traditional ways of designing systems. Our commitment to virtual prototyping for system design is clear: We believe its time has come.

Joachim Kunkel joined Synopsys in 1994 and is currently senior vice president and general manager of the Solutions Group. In that capacity, he manages the business units responsible for Synopsys DesignWare® intellectual property (IP), strategic market development and system-level design. Before coming to Synopsys, Mr. Kunkel was co-founder of CADIS GmbH in Aachen, Germany. There, he served as managing director and performed myriad duties in engineering, sales and marketing. Before co-founding CADIS, Mr. Kunkel was a research assistant at the Aachen University of Technology, where he conducted research in system-level simulation techniques for digital signal processing, with special emphasis on parallel computing. Mr. Kunkel holds an MSEE degree, the Dipl.-Ing. der Nachrichtentechnik, from the Aachen University of Technology.

