By John Blyler
System-Level Design (SLD) sat down to discuss trends in analog and RF integrated circuit design with Ravi Subramanian, president and CEO of Berkeley Design Automation, at the recent GlobalPress eSummit, and later with Trent McConaghy, CTO of Solido. What follows are excerpts of those talks.
SLD: What are the important trends in analog and RF simulation?
Subramanian: I see two big trends. One is related to physics, namely the need to bring physical effects into the design process early. The second is the increased importance of statistics in design work. Expertise in statistics is becoming a must, and one of the strongest demands made on our company is to help teach engineers how to do statistical analysis. What is required is an appreciation of the Design-of-Experiments (DOE) approach that is common in the manufacturing world. Design engineers need to understand which simulations analog designers need as opposed to digital designers. For example, in a typical pre-layout simulation you may want to characterize a block with very high confidence. You may also want to characterize that block post-layout, with extracted parasitics, with very high confidence. But what does ‘high confidence’ mean? How do you know when you have enough confidence? If you have a normally distributed Gaussian variable, you may have to run 500 simulations to get 95% confidence in that result. Every simulation waveform and data point has a confidence band associated with it.
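As a rough illustration of such a confidence band (with synthetic numbers, not output from any particular simulator), the sketch below estimates a VCO frequency from Monte Carlo samples and shows how the 95% confidence interval on that estimate tightens as the number of simulations grows:

```python
import numpy as np

# Minimal sketch: attach a 95% confidence band to a Monte Carlo estimate of a
# performance metric, assuming the metric is roughly Gaussian. The "VCO
# frequency" distribution below is invented purely for illustration.
rng = np.random.default_rng(0)
for n_sims in (50, 500, 5000):
    freq = rng.normal(loc=1.2e9, scale=30e6, size=n_sims)   # simulated frequencies, Hz
    mean = freq.mean()
    std_err = freq.std(ddof=1) / np.sqrt(n_sims)             # standard error of the mean
    half_width = 1.96 * std_err                               # 95% band for a Gaussian
    print(f"{n_sims:5d} runs: mean = {mean/1e6:.1f} MHz, "
          f"95% CI = +/- {half_width/1e6:.2f} MHz")
```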
McConaghy: There is always a pull from customers for simulators that are faster and better, and in general simulators have been delivering. They are getting faster, both in simulation time for larger circuits and through easier-to-use multi-core and multi-machine implementations. They are also getting better: they converge on a broader range of circuits, handle larger circuits, and support mixed-signal circuits more cleanly.
There’s another trend: meta-simulation. The term describes tools that, from the designer’s perspective, feel like simulators. Just like simulators, meta-simulators take netlists as input and output scalar or vector measures. Under the hood, however, they call circuit simulators in the loop. Meta-simulators are used for fast PVT analysis, fast high-sigma statistical analysis, intelligent Monte Carlo analysis, and sensitivity analysis. They bring the value of simulation to a “meta” (higher) level. I believe we’ll see a lot more meta-simulation as the simulators themselves get faster and the need for higher-level analysis grows.
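A minimal sketch of the idea, built around a hypothetical run_spice() wrapper for an external simulator (no vendor’s actual API), shows how a meta-simulator keeps the same inputs and outputs as a plain simulation while calling the circuit simulator once per corner:

```python
import itertools

def run_spice(netlist, corner):
    """Stand-in for invoking an external SPICE simulator and parsing its measures.
    It returns placeholder numbers so the sketch runs end to end."""
    process, vdd, temp_c = corner
    return {"freq_hz": 1.0e9, "power_w": 1.0e-3}

# Sweep PVT corners, call the simulator for each, and collect scalar measures.
processes = ["ss", "tt", "ff"]
supplies = [0.9, 1.0, 1.1]        # volts
temperatures = [-40, 25, 125]     # degrees C

results = {
    corner: run_spice("vco.sp", corner)
    for corner in itertools.product(processes, supplies, temperatures)
}
print(f"collected measures for {len(results)} corners")
# A real meta-simulator adds the intelligence on top of this loop: pruning
# corners, adaptive sampling, and distributing runs across cores and machines.
```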
SLD: This sounds a lot like the Six Sigma methodology, a manufacturing technique used to find and remove defects from high-volume production, such as CMOS wafers. Will design engineers really be able to incorporate this statistical approach into their design simulations?
Subramanian: Tools can help engineers incorporate statistical methods into their work. But let’s talk about the need for high sigma values. To achieve high sigma, you need a good experiment and a very accurate simulator. If you have a good experiment but you want to run it quickly and give up accuracy, you may have a Six-Sigma setup but a simulator that has been relaxed so much that the Six-Sigma data is meaningless. That is the difference between accuracy and precision: you can have a very precise answer that isn’t accurate.
To summarize: today’s advanced process nodes come with physical effects that can only be handled by statistical methods. Together, these two trends mean that new types of simulation must be run, and engineers need to give more thought to which corners should be covered in their design simulations. Semiconductor foundries provide corners that are slow, fast, and typical, based upon the rise and fall times of flip-flops. How relevant is that for a voltage-controlled oscillator (VCO)? Are there, in fact, more analog-specific corners? Yes, there are.
SLD: Statistical analysis, design of experiments, corner models: designers already hear many of these terms from the yield experts in the foundries. Should they now expect to hear them from the analog and RF simulator communities as well?
Subramanian: Designers must understand statistical processes, or have tools that help them deal with them. For example, how do you know whether a VCO will yield well? It must have frequency and voltage characteristics that are reliable over a range of conditions. But if you only test it over the common digital corners, you may miss some important analog corners where the VCO performs poorly. Here, a corner corresponds to a performance metric, such as output frequency, and you want to measure that metric within a particular confidence level, which is where statistics are needed. It may turn out that, in addition to the digital corners, you’ll need to include a few analog ones.
McConaghy: These terms imply the need to address variation, and designers do need to make sure that variation doesn’t kill their design. Variation causes engineers either to overdesign, wasting circuit performance, power, and area, or to underdesign, hitting yield failures. To take full advantage of a process node, designers need tools that allow them to achieve optimal performance and yield. Since variation is such a big issue, it won’t be surprising if simulator companies start using these terms with designers. The best EDA tools handle variation while allowing the engineer to focus efficiently on design, with familiar flows like corner-based design and familiar analyses like PVT and Monte Carlo. But now the corners must be truly accurate: PVT corners must cause the actual worst-case behavior, and Monte Carlo corners must bound circuit (not device) performances, like “gain,” at the three-sigma or even six-sigma level. These PVT and Monte Carlo analyses must also be extremely fast, handling thousands of PVT corners or billions of Monte Carlo samples.
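As a sketch of what a Monte Carlo corner on a circuit performance might look like (the gain samples below are synthetic; a real flow would read them from simulator output), one can pick the sampled point whose gain sits at the three-sigma tail and hand that back to the designer as a corner:

```python
import numpy as np

# Minimal sketch: turn Monte Carlo results on a circuit performance ("gain")
# into a 3-sigma corner, i.e. the sample that bounds gain at roughly 99.87% yield.
rng = np.random.default_rng(1)
gain_db = rng.normal(loc=40.0, scale=1.5, size=10_000)   # synthetic Monte Carlo gains

target = np.quantile(gain_db, 0.00135)                    # Phi(-3): low-gain tail
corner_index = int(np.argmin(np.abs(gain_db - target)))
print(f"3-sigma low-gain corner: sample #{corner_index}, "
      f"gain = {gain_db[corner_index]:.2f} dB")
```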
SLD: Would a typical digital corner be a transistor’s switching speed?
Subramanian: Yes. Foundries parameterize transistors to be slow, typical, and fast in terms of performance. The actual transistor model parameters vary around those three cases; a very fast transistor, for example, will have fast rise and switching times. So far, the whole notion of corners has been driven by the digital guys, which is natural. But now analog shows up at the party at the same time as digital, especially at 28nm geometries.
The minimal requirement today is that all designs must pass the digital corners. But for analog circuits to yield, they must pass both the digital corners and specific analog corners; that is, they must also pass the conditions and variations relevant to the performance of that analog block. How do you find out what those other corners are? Most designers don’t have time to run a billion simulations. That is why people need to start doing distribution analysis for analog measures like frequency, gain, signal-to-noise ratio, jitter, power-supply rejection ratio, and so on. For each of these analog circuit measurements, a distribution curve is created from which Six-Sigma data can be obtained. Will it always be a Gaussian curve? Perhaps not.
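A quick back-of-the-envelope calculation, assuming a Gaussian distribution for the measurement, shows why brute force runs out of steam at high sigma: the farther out the spec sits, the more samples are needed before even a single failure is expected.

```python
from scipy import stats

# One-sided Gaussian tail probabilities and the brute-force Monte Carlo sample
# counts they imply (roughly one failing sample expected per 1/p simulations).
for sigma in (3, 4.5, 6):
    p_fail = stats.norm.sf(sigma)
    print(f"{sigma:>4} sigma: P(fail) ~ {p_fail:.2e}, "
          f"~{1/p_fail:,.0f} samples per expected failure")
```

At six sigma the one-sided tail probability is about 1e-9, which is where the billion-simulation figure comes from.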
SLD: How will this increase in statistical distribution analysis affect traditional analog electronic circuit simulators like SPICE?
Subramanian: SPICE needs to start generating these statistically based distribution curves. I think we are in the early days of that frontier, where you can literally see yourself having a design cockpit that makes statistics simple to use. You have to make it simple to use; otherwise it won’t happen. I think that is the responsibility of the EDA industry.
McConaghy: The traditional simulators will be used more than ever, as the meta-simulators call upon them to do fast and efficient PVT and statistical variation analysis up to six-sigma design. The meta-simulators incorporate intelligent sampling algorithms to cut down the number of simulations required compared with brute-force analysis. Today, many customers use hundreds of traditional SPICE simulator licenses to do these variation-analysis tasks. However, they would like to get the accuracy of billions of Monte Carlo samples in only thousands of actual simulations. These analyses are being done on traditional analog/RF and mixed-signal designs, as well as on memory, standard-cell library, and other custom digital designs.
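One family of such intelligent-sampling algorithms is importance sampling. The sketch below (a textbook illustration, not Solido’s algorithm, with an assumed pass/fail test standing in for a real simulated measure) estimates a rare failure probability by sampling from a distribution shifted toward the failure region and reweighting, so far fewer simulations are needed than with brute force:

```python
import numpy as np
from scipy import stats

# Minimal importance-sampling sketch for a rare "failure" region. The failure
# test below is a stand-in for a simulated circuit measure violating its spec.
rng = np.random.default_rng(2)

def fails(x):
    return x > 4.5                     # assumed failure: parameter beyond 4.5 sigma

n = 10_000
shift = 4.5                                          # sample near the failure region
x = rng.normal(loc=shift, scale=1.0, size=n)         # proposal distribution
weights = stats.norm.pdf(x) / stats.norm.pdf(x, loc=shift)   # reweight to true N(0,1)
p_fail = float(np.mean(fails(x) * weights))

print(f"importance-sampling estimate: {p_fail:.3e}")
print(f"exact tail probability:       {stats.norm.sf(4.5):.3e}")
```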
SLD: I know that several of the major EDA tool vendors have recently released tools to make the statistical nature of low-process-node yields more accessible and usable by digital chip designers. Are there similar tools for the world of analog mixed-signal design?
Subramanian: Analog and RF designs are now going through this same process, moving from an art to a science. That’s why I say that the nanometer mixed-signal era is here (see figure). Simulation tools are needed, but so are analysis capabilities. This is why our simulation tools have become platforms for analysis. We support the major EDA simulators but add an analysis cockpit for designers.
Figure 1: Mixed-Signal and RF designs are now part of the nanometer SoC design process.
SLD: Why now? What is unique about the leading-edge 28nm process geometries? I’d have expected similar problems at an earlier node, e.g., 65nm. Is it a yield issue?
Subramanian: Exactly. At 65nm, designers were still able to margin their designs sufficiently. But now the cost of that margin becomes more significant, because you pay for it either with area or with power, which is really current. At 28nm, with SerDes (high frequency and high performance) and tighter power budgets, the cost of the margin becomes too high. If you don’t do power-collapsing, then you won’t meet the power targets.
SLD: Is memory design becoming a bigger market for simulation?
Subramanian: Memory has traditionally had some analog pieces like charge pumps, sensitivity chains, etc. Now, in order to achieve higher and higher memory density, vendors are going to multi-level cells. This allows storage of 2, 4 or 8 bits on a single cell. But to achieve this density you need better voltage resolution between the different bit levels, which means you need more accurate simulation to measure the impact of noise. Noise can appear as a bit error when you have tighter voltage margins. You might wonder if this is really a significant problem. Consider Apple’s purchase of Anobit, a company that corrected those types of errors. If you can design better memory, you can mitigate the need for error-correction hardware and software. But to do that, you need more accurate analog simulation of memory. You cannot use a digital fast-SPICE tool, which uses a transistor table-lookup model. Instead, you must use a transistor BSIM (Berkeley Short-channel IGFET Model) model.
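A rough arithmetic sketch (with an assumed 3V threshold-voltage window, not numbers from any specific memory technology) shows why packing more bits into a cell tightens the voltage margin that noise has to fit under:

```python
# Each extra bit per cell doubles the number of threshold-voltage levels that
# must share the same window, shrinking the spacing between adjacent levels.
window_v = 3.0                              # assumed usable window, volts
for bits in (1, 2, 3, 4):
    levels = 2 ** bits
    spacing = window_v / (levels - 1)
    print(f"{bits} bit(s)/cell -> {levels:2d} levels, "
          f"~{spacing*1e3:.0f} mV between adjacent levels")
```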