
ESL: The Power Savior?

December 17th, 2008

Recently, power has been on the front burner for most design teams. In fact, nearly every corporate executive welcomes a discussion on power solutions, given the strong correlation between low-power design and the income statement. So, here’s the question: Is ESL the savior of our power optimization challenge? Well, yes and no.


Much of the historical progress in ESL can be attributed to the need for system timing analysis. In an RTL-dominant world, timing brings to mind terminology such as “timing closure” and “setup and hold parameters.” But in the world of ESL, timing concerns involve systemic characteristics such as throughput, bottlenecks, and latency. Of course, ESL also has an important role in software development, where timing may not be required at all.


But getting back to power optimization: recent progress in the ESL domain has enabled system-level designers to analyze power characteristics in the context of software and architectural hardware tradeoffs. Arguably (and based on my last blog entry, I anticipate some argument), ESL has always been able to support power analysis.


Applying ESL power analysis simply (or not so simply) requires some power modeling effort. In fact, accurate power modeling at the transaction level is quite a difficult task. Fortunately, solutions exist today that not only automate much of the TLM (transaction-level modeling) power modeling task, but do so while retaining accuracy close to gate-level analysis results.
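
To make that concrete, here is a rough sketch of what power annotation at the transaction level can look like in SystemC. The module, the per-byte energy constants, and the bookkeeping are all hypothetical illustrations, not any particular vendor’s solution; in a real flow, the energy numbers would be calibrated against gate-level analysis.

    // A minimal, hypothetical sketch of TLM power annotation in SystemC.
    // The module and energy constants are illustrative; real flows calibrate
    // per-access energy against gate-level power analysis results.
    #include <cstring>
    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_target_socket.h>

    struct AnnotatedMemory : sc_core::sc_module {
        tlm_utils::simple_target_socket<AnnotatedMemory> socket;

        // Hypothetical per-byte access energies (picojoules).
        static constexpr double READ_ENERGY_PJ  = 5.0;
        static constexpr double WRITE_ENERGY_PJ = 7.5;

        double total_energy_pj = 0.0;      // running energy estimate
        unsigned char mem[1024] = {};

        SC_CTOR(AnnotatedMemory) : socket("socket") {
            socket.register_b_transport(this, &AnnotatedMemory::b_transport);
        }

        void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
            sc_dt::uint64 addr = trans.get_address();
            unsigned int  len  = trans.get_data_length();
            if (addr + len > sizeof(mem)) {
                trans.set_response_status(tlm::TLM_ADDRESS_ERROR_RESPONSE);
                return;
            }
            if (trans.is_read()) {
                std::memcpy(trans.get_data_ptr(), &mem[addr], len);
                total_energy_pj += READ_ENERGY_PJ * len;   // annotate read energy
            } else if (trans.is_write()) {
                std::memcpy(&mem[addr], trans.get_data_ptr(), len);
                total_energy_pj += WRITE_ENERGY_PJ * len;  // annotate write energy
            }
            delay += sc_core::sc_time(10, sc_core::SC_NS); // modeled access latency
            trans.set_response_status(tlm::TLM_OK_RESPONSE);
        }
    };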


So if ESL methodologies combined with TLM power modeling solutions enable system-level power analysis, why isn’t ESL our savior for low-power design? After all, design teams have more knobs to tweak during the architectural design phase, and the number of knobs quickly dwindles to a precious few as the design progresses through the various phases of implementation.


The answer is that power optimization is required at every stage of the design process. Using ESL, system design teams can optimize HW/SW tradeoffs, processor choices, memory and cache layering, and bus parameterization, and can test power-domain and voltage/frequency-scaling strategies. But additional mechanisms, such as clock gating, power gating, and place-and-route optimizations (including clock tree synthesis), can only be tuned as the design progresses into RTL, gate-level, and physical representations. In short, power optimization requires design-team attention and tools at every level of design abstraction.


But there’s good news in all of this. ESL power analysis and modeling opens up huge opportunities to improve power characteristics, and recent advances in TLM power modeling are a major step forward. In fact, my customers tell me that power is one of the key forces driving their active investment in ESL methodologies. While ESL power analysis might not be the savior, it is most certainly a key weapon in the arsenal against increasing power optimization challenges.


Keeping the “E” in ESL

October 23rd, 2008

I was in Boston last week, where I met with a few customers who shared information about their activities around ESL. I’m always pleased to meet with customers who are already down the path of ESL. However, I’m often surprised to hear what constitutes ESL for them. In this case, the customer was creating very high-level, conceptual models of a system, also known as model-based design, to represent a wide range of engineering disciplines across the system (hardware, software, thermal, optical). I noted that the models had no representation of actual hardware, just conceptual functions that may or may not ever end up in actual hardware.

The next night I was in Philadelphia, at a renowned cheesesteak shop, where I watched the Phillies clobber the Dodgers, all the while raising my cholesterol with a great sub. Earlier in the day, I had met with another customer who said they were interested in applying ESL and were planning to start by creating UML models. In less than 24 hours, I had met with two customers with completely different concepts of ESL. Interestingly, neither concept (the model-based design or the UML modeling activity) really fit within my definition of ESL. Specifically, both methodologies neglected the “E” in ESL. The E in ESL implies electronic hardware and some representation of that implementation.

While ESL is certainly a design representation above RTL, that doesn’t mean it encompasses every abstraction above RTL. ESL starts and stops within the abstraction of TLM (transaction-level modeling), which includes some representation of the hardware architecture. While there are certainly benefits and good reasons to also do conceptual modeling above ESL, lumping those methodologies into the ESL bucket dilutes the space and risks confusion in the market. ESL is a major step above RTL and a logical progression in hardware design methodology. Given the long history of ESL, as we watch it come of age, it seems like a good idea to maintain clarity about what is and is not within this design methodology.

The Promise of TLM 2.0 – Exploring New Horizons

October 1st, 2008

The SystemC transaction-level modeling standard, TLM 2.0, recently announced by the Open SystemC Initiative (OSCI), holds a great deal of promise for the electronic design community. This standard has been long anticipated, and we in the electronic system-level (ESL) arena have been eagerly awaiting an interoperability standard that will enable SoC developers to create and design with more efficiency.

For those of you unfamiliar with the TLM 2.0 standard, it provides SystemC model interoperability and reuse at the transaction level, resulting in an ESL “framework” in which to analyze system architecture in the context of hardware and software, perform performance analysis, and carry out system verification. As an industry standard, TLM 2.0 lets users easily integrate models from a variety of different sources, enabling more efficient simulation. This new standard interface promises to speed the integration of tools and models used in verification platforms for hardware/software co-design, a real plus for today’s advanced designs.
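
To give a flavor of that interoperability mechanism, here is a minimal sketch of a TLM 2.0 loosely timed initiator. The module and transaction details are my own illustration, but the generic payload and blocking transport interface it uses are the standard pieces that let models from different sources plug together.

    // Minimal sketch of a TLM 2.0 initiator using the standard generic
    // payload and blocking transport; the module itself is hypothetical.
    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_initiator_socket.h>

    struct Initiator : sc_core::sc_module {
        tlm_utils::simple_initiator_socket<Initiator> socket;

        SC_CTOR(Initiator) : socket("socket") {
            SC_THREAD(run);
        }

        void run() {
            tlm::tlm_generic_payload trans;
            sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
            unsigned int word = 0xCAFEF00D;

            trans.set_command(tlm::TLM_WRITE_COMMAND);
            trans.set_address(0x100);
            trans.set_data_ptr(reinterpret_cast<unsigned char*>(&word));
            trans.set_data_length(sizeof(word));
            trans.set_streaming_width(sizeof(word));
            trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);

            socket->b_transport(trans, delay); // blocking transport call
            wait(delay);                       // sync to the annotated delay

            if (trans.is_response_error())
                SC_REPORT_ERROR("Initiator", "transaction failed");
        }
    };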

TLM 2.0 also provides robust virtual platform capabilities. Its direct memory interface (DMI) enables faster virtual platform execution for software development by providing “backdoor” memory access to processor models, bypassing the need for transactions to move through complex bus hierarchies. The fast, loosely timed modeling style and the more detailed, approximately timed style are fully interoperable within TLM 2.0. Loosely timed modeling allows faster, scalable simulation: processors and peripherals in multicore systems can be executed in parallel. And with SystemC integration and support from RTL simulation tools, TLM 2.0-based virtual platforms can be established at the front end of design and verification flows.
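
Continuing the hypothetical initiator sketch above, a DMI fast path might look like the following. The helper method and its fallback policy are my own illustration; the tlm_dmi descriptor, the get_direct_mem_ptr call, and the raw-pointer access are the standard TLM 2.0 mechanism.

    // Hypothetical helper on the Initiator above (assume a matching
    // declaration in the class): request a raw host pointer once, then
    // use plain stores instead of a transaction per access.
    void Initiator::dmi_write(sc_dt::uint64 addr, unsigned char value) {
        tlm::tlm_dmi dmi;
        tlm::tlm_generic_payload trans;
        trans.set_command(tlm::TLM_WRITE_COMMAND);
        trans.set_address(addr);

        if (socket->get_direct_mem_ptr(trans, dmi) && dmi.is_write_allowed()) {
            // Backdoor access: no transaction, no bus hierarchy traversal.
            dmi.get_dmi_ptr()[addr - dmi.get_start_address()] = value;
        } else {
            // DMI denied: fall back to the normal blocking transport path.
            unsigned char data = value;
            sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
            trans.set_data_ptr(&data);
            trans.set_data_length(1);
            trans.set_streaming_width(1);
            trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);
            socket->b_transport(trans, delay);
            wait(delay);
        }
    }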

There are three major developments that will drive the adoption of TLM 2.0. First, there is increased need for SystemC TLM standards to enable model exchange for existing ESL design tasks, including system-level modeling and architecture design, algorithm design, and TLM reference models for functional verification. Second, TLM modeling standards are needed to support the rapid adoption of SystemC virtual platforms for developing software. Last, but certainly not least, OSCI has more than 2,100 SystemC users worldwide, making their support of the TLM 2.0 standard quite significant.

While the promise of the TLM 2.0 standard is appealing, there are still several hurdles to clear. A key challenge for TLM 2.0 adoption will be persuading its many prospects to abandon their legacy tools and internally developed methods and models in order to realize the benefits of the new standard. Companies that have developed their own internal formats will need to replace them with new technologies that support TLM 2.0. The standard also has room for future improvement, such as fully interoperable third-party IP and models for critical applications like power consumption and software and architectural optimization.

I equate this new standard with Lewis and Clark, the frontiersmen and explorers who ventured west to find and claim new territories for America. Like Lewis and Clark’s discoveries, TLM 2.0 is new, fertile ground that should reap a bounty for its community. At this stage, TLM 2.0 is much like the foundation, walls, and infrastructure of a building waiting to be occupied. While the structure fulfills its basic purpose of sheltering its residents, it still needs utilities, carpeting, furniture, and plumbing.

My intent is not to cast doubt on the viability of TLM 2.0. Quite the contrary: the TLM 2.0 standard, as it stands today, serves as a great foundation. With several companies rallying behind OSCI’s efforts, including my own, I believe we have the perfect “ground floor” opportunity to establish something vital to our industry, particularly for the ESL sector. I am excited about the new frontiers that will be encountered on this journey of discovery. TLM 2.0 and its long list of supporters should only enrich and enliven this “promised land” of opportunity. We will all benefit from our collaborative efforts to make TLM 2.0 the ubiquitous standard for today’s SoC designs, and this will inspire the electronics community to further innovate and explore new possibilities.

–Glenn Perry, Mentor Graphics general manager, ESL and HDL Design Division

glenn_perry@mentor.com


Prevailing Winds in ESL

September 18th, 2008

What do hurricanes and ESL have in common? They are both scheduled to frequent the East Coast during the first two weeks of September 2008. I left for a two-week ESL customer tour beginning September 1st, followed by a week in Cancun. Apparently, a series of hurricanes with friendly names like Gustav, Hanna, Ike, and Josephine had similar travel plans. They all started their journey in Africa; I only had to travel from Portland, Oregon.

I’m early in my trip and hoping to avoid any serious encounters with hurricanes of the weather variety, but it’s not hard to draw analogies between hurricanes and ESL adoption. For instance, there is a lot of FUD (fear, uncertainty, and doubt) swirling around both. With hurricanes, the questions are primarily “where is it going to land?” and “will I have a house to go back to?”

With regards to ESL, the fears are more nuanced. I met with a prominent consumer electronics company today to discuss their challenges in the front-end ASIC flow. We discussed advanced verification methodologies, reuse solutions, and requirements management, but you know what their number one concern was? Implementing an ESL solution that would allow them to analyze system level performance tradeoffs. Namely, latency, throughput, and bottlenecks. Sound familiar? If not, you might be in the minority. I would be hard pressed to find a customer today who isn’t trying to figure out how to address performance and power optimization at the system level.

ESL sounds like a good idea to most people, but many are confused about what ESL is, what to do with it, where to start, and what to avoid. In some ways, ESL shares characteristics with a hurricane: it’s broad, it’s powerful, and it’s not well understood. (Too bad hurricanes don’t have any redeeming qualities to compare.) Nevertheless, several things are becoming clear. ESL is about abstraction, which buys performance and productivity. With that performance and productivity, design teams can explore a broader solution space, one spanning both software and hardware, and ultimately produce designs optimized for performance, power, area, and time to tapeout.

At Mentor Graphics, we view ESL as the next evolution in design methodology, not unlike what RTL design was to gate-level design. To that end, we are taking a holistic approach and investing accordingly. Specifically, we believe that an ESL solution requires three core components: design, synthesis, and verification, all at the level of TLM and above. The ultimate power of ESL comes through the combination of all three disciplines. If you want to learn more about what we are doing in this space, check us out at www.mentor.com/esl. I’m grateful for this new System-Level Design newsletter and portal, and for the opportunity to contribute an ESL blog. It should serve as a good forum for idea exchange and education. Perhaps I can let you know how the hurricanes played out relative to my travel plans in my next update. Until then, I wish you happy travels on the road to ESL and encourage your comments on the blog.

Glenn Perry

Mentor Graphics GM, ESL and HDL Design Division

glenn_perry@mentor.com