What’s shaping the modeling environment?

By Caroline Hayes, Senior Editor

Increased complexity is not so much a challenge as an impetus, as Cadence, Mentor and National Instruments discuss the trends in model-driven development.

There are many ways to exploit model-driven development to reduce complexity in the chip design process for embedded and cyber-physical systems. Companies such as Cadence Design Systems use the term in the context of software development, with tools to enable model development and to provide models as inputs throughout the hardware/software development flow. For others, it is a way to enhance productivity, with benefits to the development cycle and project costs. This is summed up by Mentor Graphics’ Dean McArthur, product marketing engineer, model-based system design: “[Our] software…enables model-driven development…an upfront modeling and analysis approach [that] flows seamlessly to design implementation significantly enhances productivity, while at the same time reducing cost and risk”.

Types of tools

Model-driven development is used in mil/aero, transportation, medical, industrial and consumer projects and, points out Shelley Gretlein, director of software marketing at National Instruments, in design and test products ranging from academic circuit design to ‘cutting edge’ applications such as power electronics systems and RF design.

National Instruments takes an engineering approach, she says, with tools that enable engineers to integrate third-party simulation tools with the company’s hardware and software. “NI tools act as a framework…to develop sophisticated applications based on a range of simulation environments and programming languages.” The company provides software with its I/O library, which allows engineers to take a simulation and run it in the real world with physical I/O.
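As a rough sketch of that idea, written in Python rather than in NI’s graphical tools, the snippet below runs a control routine against an invented simulated plant; the intent is that the simulated object could later be swapped for a driver backed by physical I/O. The plant model, controller and interface are all hypothetical and only illustrate the pattern.

# Minimal hardware-in-the-loop style sketch (hypothetical interfaces, not NI's API).
class SimulatedPlant:
    """First-order plant model used during desktop simulation."""
    def __init__(self, gain=2.0, time_constant=0.5, dt=0.01):
        self.gain, self.tau, self.dt = gain, time_constant, dt
        self.output = 0.0
    def read(self):
        return self.output
    def write(self, command):
        # Euler step of dy/dt = (gain * u - y) / tau
        self.output += self.dt * (self.gain * command - self.output) / self.tau

def control_loop(io, setpoint=1.0, kp=0.8, steps=200):
    """Simple proportional controller; 'io' may be a model or a real I/O driver."""
    for _ in range(steps):
        error = setpoint - io.read()
        io.write(kp * error)
    return io.read()

print(control_loop(SimulatedPlant()))  # swap SimulatedPlant for a physical I/O driver later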

Having the ability to run unlimited system simulations of models is an advantage that can translate into savings in development time and cost, says McArthur, referring to the strategic investments Mentor Graphics has made in the system-level simulation space to create a model-driven development environment.

For software development, Cadence focuses on IP models, says Frank Schirrmeister. “Users start with high-level system models (the traditional space of model-driven development) that can use high-level functional IP models as input. One of the modeling tasks is to provide models of the hardware to enable software development prior to chips being available,” he says. The company provides tools to develop TLMs (transaction-level models) for virtual prototypes, which enable register-accurate software development before the RTL (register transfer level) descriptions that define the hardware are available. In this way, RTL models can be mapped into emulation for faster execution and into FPGA-based prototypes to achieve higher speeds.
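As a loose illustration of the register-accurate idea, the Python sketch below models an invented UART-like peripheral as a set of addressable registers; driver software written against it can run before any RTL exists. Real virtual prototypes are typically built from SystemC/TLM-2.0 models, and the register map here is made up for the example.

# Register-accurate model of a hypothetical UART-like peripheral.
# Software reads and writes registers by offset, much as it will on silicon.
STATUS, DATA, CTRL = 0x00, 0x04, 0x08     # invented register offsets

class UartModel:
    def __init__(self):
        self.regs = {STATUS: 0x1, DATA: 0x0, CTRL: 0x0}   # STATUS bit 0 = tx ready
        self.tx_log = []
    def read(self, offset):
        return self.regs[offset]
    def write(self, offset, value):
        if offset == DATA:
            self.tx_log.append(chr(value & 0xFF))         # model only the side effect
        else:
            self.regs[offset] = value

# Driver-style code written against the model should run unchanged on the eventual hardware.
uart = UartModel()
for ch in "hello":
    while not (uart.read(STATUS) & 0x1):                  # wait for tx ready
        pass
    uart.write(DATA, ord(ch))
print("".join(uart.tx_log))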

Complexity challenges

It is this tighter integration that is able to meet the challenge of the increased complexity of the next generation of chips. National Instruments’ Gretlein advocates model-driven development for embedded and cyber-physical systems, saying that it can reduce complexity in the chip design process. “Abstraction techniques facilitate the rapid development of models that are appropriate to various domains,” she says. “Multiple domains can be combined into system models and software models can be included.”

“One of the benefits afforded to designers is the tight integration between math and physical modeling software and I/O, so that prototyping and test can occur early in the product design process,” she points out, adding that this approach can benefit both chip and component design as well as overall system integration and test. Both the LabVIEW MathScript RT Module and the LabVIEW Control Design and Simulation Module are examples of algorithm, design and modeling tools that can be used for desktop analysis and simulation as well as combined with I/O for real-time control and dynamic test.

McArthur agrees that complexity is increasing. “Each new generation of products carries with it the expectation of greater capability, more performance, and lower cost. This demand for new and improved functionality in an optimum format puts high demands on the engineering teams who create them.” He looks at the embedded system engineer’s lot, which is to combine software and electronics into larger systems.

Different engineers combine, but may not always communicate. Graphic - Mentor Graphics

This requires several types of engineer: typically, architects or systems engineers to determine the high-level design concept and architecture; hardware engineers, of both the electrical and electronic variety, for specialized hardware tasks; and software engineers to perform detailed design work based on the architect’s specifications. Finally, there are verification or test engineers to confirm that the integrated system satisfies the specification. Inter-discipline communication is still a hurdle, he says. “Oftentimes these specialists overlook imprecise specifications from the system level, the lack of communication between the domains, and the risks associated with waiting until the late stages of development to do integration testing. However, these issues often cause design and integration errors that dramatically increase the cost and development time.”

He advocates the model-driven design methodology enabled by modeling languages and tools, saying that they can dramatically improve design efficiencies, maximize reuse and reduce late-stage risks, as the model can be more fully developed. “[It] not only documents the design, but precisely demonstrates design concepts and unambiguously describes what the final implementation must do.” Applying stimulus and examining the model’s response verifies the high-level design behavior. This executable specification serves as an enhanced method for validating requirements and exploring alternative implementations, he says. The high level of precision means that the process can support transforming the models into starting points for the detailed design process. “In the model-driven development context, the line between model and design is truly blurred,” he says, “and it is this blurring that provides significant value beyond traditional model usage.”
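As a rough illustration of what an executable specification means in practice, the Python sketch below applies a stimulus to a simple behavioral model and checks the response against a stated requirement. The model, the requirement and all the numbers are invented for the example and are not taken from Mentor’s tools.

# Minimal 'executable specification' sketch (hypothetical requirement and model).
# A unit-step stimulus is applied to the behavioral model and its response is
# checked against the requirement, long before detailed design begins.
def settle_time(step_response, tolerance=0.02):
    """Return the first index after which the response stays within tolerance of 1.0."""
    for i in range(len(step_response)):
        if all(abs(y - 1.0) <= tolerance for y in step_response[i:]):
            return i
    return None

def model_step_response(samples=100, alpha=0.1):
    """Behavioral model: first-order response to a unit step."""
    y, out = 0.0, []
    for _ in range(samples):
        y += alpha * (1.0 - y)
        out.append(y)
    return out

REQUIREMENT_MAX_SETTLE = 60  # hypothetical requirement, in samples
response = model_step_response()
settled = settle_time(response)
assert settled is not None and settled <= REQUIREMENT_MAX_SETTLE
print("requirement met: settles in", settled, "samples")

The same check can be re-run later against more detailed models, or against measurements of the implementation, which is what makes the specification executable rather than purely descriptive.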

On chip complexity, Schirrmeister agrees that higher levels of abstraction, combined with automated implementation and verification, allow the development of more complex hardware. He believes that providing early models of hardware for software development is key to parallelizing hardware and software development, which shortens the delays that a serial development flow brings.

He also pinpoints replacing models at one level of abstraction with less complex ones at the next higher level as key. “Verification can be done at the next level up, using more abstract but less accurate models, and with automated synthesis,” he observes. “This verification is followed by equivalence checking.” For him, automating the implementation allows design teams to implement more complex designs.

Gretlein also identifies heterogeneous computing targets, combining floating-point microprocessors and FPGAs, which are increasingly being used. The algorithms and models running on these targets can be simulated and analyzed early in the design process using co-simulation approaches.

Tomorrow’s trends

Complexity is also driving model-driven development today, says McArthur. There are systems within systems, in which everything must work together, he points out, so one miscalculation can result in cost overruns, delays, reliability problems and even product failures.

Increasing system complexity and the higher computational demands of simulation are driving the need for distributed model execution, says National Instruments’ Gretlein. Models from multiple simulation environments must often be integrated and run on a combination of processor-based systems and hardware. For example, she cites power electronics applications that require extremely high-fidelity simulation executing at rates beyond the capabilities of processor-based systems. “In this case, the simulation is based on finite element analysis, and it must be performed in hardware, which has historically not been possible,” she says. “Due to the advancements in FPGA technology and tools [such as LabVIEW FPGA], simulations that used to take hours or even days can now be run in seconds.”

Simulation tool providers are collaborating to provide support. Many simulation tools, such as SimulationX from ITI and MapleSim from MapleSoft, support the Modelica framework and the Functional Mock-up Interface (FMI) standard, which provides a common interface allowing models from different environments to interact seamlessly. Tools such as NI VeriStand provide a common integration framework that enables models from a range of programming languages and simulation environments to communicate with one another and connect to physical I/O. Models can be optimized to execute on specific cores of a multiprocessor computer, or they can be deployed to FPGA-based hardware for even higher performance.
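As an illustration of what the FMI standard enables, the short Python sketch below loads and simulates an FMU using the open-source fmpy package. The FMU file name and the output variable are hypothetical; any FMI co-simulation FMU exported from a supporting tool (SimulationX, MapleSim and many others) could be used in its place.

# Sketch of FMI-based simulation from Python using the open-source 'fmpy'
# package (pip install fmpy). The FMU and its 'shaft_speed' output are invented
# for the example; any FMI 2.0 co-simulation FMU would work the same way.
from fmpy import dump, simulate_fmu

fmu = "motor_drive.fmu"        # hypothetical FMU exported from a modeling tool
dump(fmu)                      # print the model's metadata and variables
result = simulate_fmu(fmu, stop_time=5.0, output=["shaft_speed"])
print(result["time"][-1], result["shaft_speed"][-1])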

For Cadence, Schirrmeister believes that a hardware trend is the increasing use of models as different representations, at different levels of abstraction, of the IP blocks used to assemble complex chips. He believes this has become an essential feature: “Today’s processors come with proper models to execute software prior to the processor being implemented. This significantly shortens development, through concurrent design,” he says, “and also allows different disciplines to interact in new ways”.

The last word belongs to McArthur, who sees executable models replacing physical prototypes, which are, he says, expensive to create and often used at a point where design changes are difficult to incorporate. He observes, “Model-driven development teams have started using executable models to assemble virtual environments. In addition to being cheaper, easier to change and available much earlier, a good virtual environment can deliver better visibility, accessibility and control than the equivalent physical elements, or even provide connections to actual physical components where necessary to ensure fidelity.” The virtual environments allow for what-if tradeoffs, he says: “performance as a function of component tolerances can be determined, and many other analyses can be performed. This is the pay-off for the model development effort”.

Model-driven development provides value beyond traditional model usage. Graphic - Mentor
