Chi-Ping Hsu, Senior Vice President, Chief Strategy Officer, EDA and Chief of Staff to the CEO at Cadence
The new system-design imperative
We’re at a tipping point in system design. In the past, consumers hung on every word from technology wizards, looking longingly at what was to come. Today, the consumer calls the shots, driving the pace and specifications of future technology. This shift has fostered, in part, a new breed of system design companies that take direct control over their semiconductor content.
These systems companies are reaping business benefits (pricing, availability), technical benefits (a broader scope of optimization) and strategic benefits (IP protection, secrecy). This is clearly a trend in which the winning systems companies are taking part.
They’re less interested in plucking components from shelves and soldering them to boards and much more interested in conceiving, implementing and verifying their systems holistically, from application software down to chip, board and package. To this end, they are embracing the marriage of EDA and IP as a speedy and efficient means of enabling their system visions. For companies positioned with the proper products and services, the growth opportunities in 2015 are enormous.
The shift left
Time-to-market pressures and system complexity force another reconsideration of how systems are designed. Take verification, for example. System design companies are increasingly designing at higher levels of abstraction, which requires understanding and validating software earlier in the process. This has led to the “shift left” phenomenon.
The simple way to think about this trend is that everything that was done “later” in the design flow is now being started “earlier” (e.g., software development begins before hardware is completed). Another way to visualize this macroscopic change is to think about the familiar system development “V-diagram” (Figure 1 below). The essence of this evolution is the examination of any and all dependencies in the product planning and development process to understand how they can be made to overlap in time.
This overlap creates the complication of “more moving parts,” but it also enables co-optimization across domains. Thus, the right side of the “V” shifts left (Figure 2 below) to form a more accelerated flow. (A note for the engineers in the room: don’t take the diagram too literally; it is meant to illustrate the theme of the trend.)
Prime examples of the shift left appear at both ends of the spectrum. At one end, software development starts early enough to contemplate hardware changes (i.e., hardware optimization and hardware-dependent software optimization). At the other, the foundry, EDA tool makers and IP suppliers collaborate early to co-optimize the overall enablement offering and maximize the value proposition of a new node.
A by-product of early software development is the enablement of software-driven verification methodologies, which can be used to verify that the integration of sub-systems does not break the design. Another benefit is that performance and energy can be optimized in the system context, with both hardware and software optimizations possible. And it is no longer just performance and power: quality, security and safety are also moving to the top level of concerns.
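To make the idea of software-driven verification concrete, here is a minimal sketch, not any specific Cadence flow: a driver-style test program exercises two hypothetical sub-system models (a toy bus and a DMA engine, both invented for illustration) through the same interface production software would use, checking that their integration still behaves correctly.

```python
# Illustrative sketch of a software-driven integration test.
# The Bus and DmaModel classes are hypothetical stand-ins for sub-systems.

class Bus:
    """Toy memory-mapped bus shared by the sub-system models."""
    def __init__(self):
        self.mem = {}

    def write(self, addr, value):
        self.mem[addr] = value

    def read(self, addr):
        return self.mem.get(addr, 0)


class DmaModel:
    """Hypothetical DMA sub-system: copies a block of bus memory."""
    BASE = 0x1000  # control register address (illustrative)

    def __init__(self, bus):
        self.bus = bus

    def kick(self, src, dst, length):
        self.bus.write(self.BASE, 1)  # "busy" register set
        for i in range(length):
            self.bus.write(dst + i, self.bus.read(src + i))
        self.bus.write(self.BASE, 0)  # back to idle


def software_driven_test():
    """Driver-style software sequence acting as the verification stimulus."""
    bus = Bus()
    dma = DmaModel(bus)
    for i in range(4):  # software sets up a source buffer
        bus.write(0x2000 + i, i * 11)
    dma.kick(src=0x2000, dst=0x3000, length=4)
    # Integration checks: data arrived intact and the engine is idle again.
    assert [bus.read(0x3000 + i) for i in range(4)] == [0, 11, 22, 33]
    assert bus.read(DmaModel.BASE) == 0
    return "pass"


print(software_driven_test())  # → pass
```

The point of the pattern is that the same software sequence can later run against RTL or an emulation model, so an integration break surfaces as a failing software test rather than a late lab debug.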
Another design area being revolutionized is packaging. Form factors, price points, performance and power are drivers behind squeezing out new ideas. The lines between PCB, package, interposer and chip are being blurred.
Design environments that are familiar to the engineers principally responsible for system interconnect creation, whether PCB-, package- or die-centric by nature, provide a cockpit from which cross-fabric structures can be created and optimized. Providing all of these environments also means that data sharing between the domains is smooth and interoperable. Analysis tools that operate independently of the design environment offer consistent results to all parties incorporating the cross-fabric interface data. Power and signal integrity in particular are critical analyses for ensuring design tolerances are met without risking the cost penalties of overdesign.
The rise of mixed-signal design
Driven in general by market demand, and especially by the rise of Internet of Things (IoT) applications, mixed-signal design has soared in recent years. Some experts estimate that as much as 85% of all designs include at least some mixed-signal elements.
Figure 3: IBS Mixed-signal design start forecast (source: IBS)
Being able to leverage high-quality, high-performance mixed-signal IP is a powerful answer to the complexity of mixed-signal design in advanced nodes. Energy-efficient design features are also pervasive. Standards support for power reduction strategies (from multi-supply voltage, voltage/frequency scaling and power shut-down to multi-threshold cells) can be applied across the array of analysis, verification and optimization technologies.
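The power reduction strategies listed above all exploit the same underlying physics: dynamic CMOS switching power scales roughly as activity × capacitance × V² × f. A back-of-the-envelope sketch shows why voltage/frequency scaling is so effective; the parameter values below are illustrative, not drawn from any particular process node.

```python
# Back-of-the-envelope model of dynamic CMOS power: P = alpha * C * V^2 * f.
# All numbers are illustrative assumptions, not real process data.

def dynamic_power(alpha, c_farads, v_volts, f_hz):
    """Switching power of a CMOS block: activity * capacitance * V^2 * f."""
    return alpha * c_farads * v_volts ** 2 * f_hz


# Nominal operating point: 1.0 V at 1 GHz, 20% switching activity, 1 nF.
nominal = dynamic_power(alpha=0.2, c_farads=1e-9, v_volts=1.0, f_hz=1e9)

# Scale voltage and frequency down 20% together (a typical DVFS step):
scaled = dynamic_power(alpha=0.2, c_farads=1e-9, v_volts=0.8, f_hz=0.8e9)

print(f"nominal: {nominal:.3f} W")              # → nominal: 0.200 W
print(f"scaled:  {scaled:.3f} W")               # → scaled:  0.102 W
print(f"savings: {1 - scaled / nominal:.0%}")   # → savings: 49%
```

Because voltage enters quadratically and frequency linearly, a 20% reduction in both cuts dynamic power by roughly half (0.8³ ≈ 0.51), which is why voltage/frequency scaling sits alongside shut-down and multi-threshold cells in every low-power toolbox.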
Verification of these designs has been slower to migrate. The reality is that a design team can digest only so much tool and methodology change while it remains immersed in the machine that cranks out new designs. What has been devised, therefore, is a step-by-step progression that lends itself to incremental improvement. “Beginning with the end in mind” has been the mantra of legions of SoC verification teams, who start with a sketch of the desired outcome in the planning and management phase at the beginning of the program. The industry best practices are summarized as MD-UVM-MS: metrics-driven unified verification methodology with mixed-signal.
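The “begin with the end in mind” idea at the heart of a metrics-driven methodology can be sketched in a few lines: the plan’s coverage goals are written down first, and regression results are then merged and measured against them. The plan items, goals and coverage points below are hypothetical, invented purely for illustration.

```python
# Illustrative sketch of metrics-driven verification planning.
# Plan items, goals and coverage points are hypothetical examples.

from dataclasses import dataclass, field


@dataclass
class PlanItem:
    name: str
    goal: int                       # distinct coverage points needed to close
    hits: set = field(default_factory=set)

    @property
    def done(self):
        return len(self.hits) >= self.goal


class VerificationPlan:
    """Coverage goals written up front; regressions merge results into them."""

    def __init__(self, items):
        self.items = {item.name: item for item in items}

    def record(self, name, point):
        self.items[name].hits.add(point)   # merge a coverage hit from a test

    def closure(self):
        done = sum(item.done for item in self.items.values())
        return done / len(self.items)


# The "end" is defined first, before any tests run:
plan = VerificationPlan([
    PlanItem("adc_interface", goal=2),
    PlanItem("power_modes", goal=3),
])

# Regression results are merged against the plan:
for point in ("read", "write"):
    plan.record("adc_interface", point)
plan.record("power_modes", "shutdown")

print(f"closure: {plan.closure():.0%}")  # → closure: 50%
```

Management then steers the remaining effort by what is still open (here, the under-covered `power_modes` item) rather than by which tests happen to have been written, which is the essential difference between metrics-driven closure and test-list checking.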
Figure 4: Path to MS Verification Greatness