
Experts Roundtable: Design-for-Test

By Caroline Hayes

System-Level Design discusses the evolution of design for test (DFT) with the following: Chris Allsup, marketing manager, high-level synthesis and test, and Sandeep Kaushik, senior marketing manager, embedded test and repair, both Synopsys; David Park, vice president marketing, OptimalTest; Steve Pateras, marketing director, Mentor Graphics; and Bassilios Petrakis, product marketing manager, Cadence Design Systems.

SLD: Can you identify how design-for-test has changed recently?

Figure 1: The Synopsys DFTMAX Ultra reduces test time by 20 to 30x compared with standard compression technology, addressing the trend toward shorter test execution times.

Allsup: The need for higher scan compression has increased as the amount of data required to test today’s more complex designs has exploded. To address nanometer test quality requirements, a growing number of semiconductor companies have begun using advanced pattern generation capabilities. Higher scan compression is needed to reduce test data volume, enable faster test execution and lower test cost.
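
As a rough illustration of the economics Allsup describes, the sketch below models how compression shrinks both stimulus data volume and shift time. The formula and all figures (pattern counts, flop counts, shift frequency) are invented for the example, not Synopsys data.

```python
# Back-of-the-envelope model of how scan compression cuts stimulus data
# volume and shift time. All numbers are illustrative assumptions.

def scan_test_cost(num_patterns, num_flops, scan_pins, shift_mhz, compression=1):
    """Return (stimulus_bits, shift_time_ms) for a flat scan architecture.

    With compression, an on-chip decompressor feeds many short internal
    chains from the same few pins, so the effective chain length (and the
    data the tester must store) shrinks by roughly the compression factor.
    """
    chain_length = num_flops / (scan_pins * compression)  # cells per chain
    shift_cycles = num_patterns * chain_length            # dominant cost term
    stimulus_bits = num_patterns * num_flops / compression
    shift_time_ms = shift_cycles / (shift_mhz * 1e6) * 1e3
    return stimulus_bits, shift_time_ms

flat = scan_test_cost(10_000, 2_000_000, scan_pins=8, shift_mhz=50)
comp = scan_test_cost(10_000, 2_000_000, scan_pins=8, shift_mhz=50, compression=100)
print(f"no compression:   {flat[0]:.1e} bits, {flat[1]:,.0f} ms of shifting")
print(f"100x compression: {comp[0]:.1e} bits, {comp[1]:,.0f} ms of shifting")
```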

The second DFT trend is the adoption of test methods designed specifically to reduce test cost through reduced test execution time (see Figure 1). For example, multi-site testing screens multiple dies simultaneously to reduce test execution time. In many situations mixed-signal, embedded memory, flash and quiescent current testing are the bottlenecks, with scan testing consuming little of the total test time even though it requires the most pins. When production volumes are very high, substantial cost savings can be achieved by “sacrificing” pins in the interests of parallelism. Multi-site testing poses a challenge to standard scan compression technology because it must still deliver high test time/test data reduction and high defect coverage while utilizing fewer pins.
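
A minimal model of the pin/parallelism trade-off follows; the tester channel count, scan times and cost rate are assumptions chosen only to show why fewer pins per die can still lower cost per die at high volume.

```python
# Toy multi-site economics: cutting scan pins per die lengthens each die's
# scan time, but testing several dies in parallel shares tester seconds.
# Channel counts, times and the cost rate are assumptions for the sketch.

def cost_per_die(sites, pins_per_die, tester_pins=256,
                 base_scan_s=2.0, base_pins=32, tester_cost_per_s=0.05):
    assert sites * pins_per_die <= tester_pins, "not enough tester channels"
    scan_s = base_scan_s * base_pins / pins_per_die  # time scales with 1/pins
    return tester_cost_per_s * scan_s / sites        # sites run concurrently

print(f"1 site  x 32 pins: ${cost_per_die(1, 32):.4f} per die")
print(f"8 sites x  8 pins: ${cost_per_die(8, 8):.4f} per die")  # cheaper
```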

Kaushik: And the third trend is the growing reliance on both internally developed and third-party IP/cores (“cores”) to implement SoC functionality. Most large designs these days comprise many cores, including user-defined digital logic, high-performance multi-core processors, mixed-signal circuits, embedded memory and interface blocks. Traditional ad hoc methods for adding the DFT logic and generating test patterns for each block, and then later combining and sequencing all these patterns at the SoC level, are time-consuming and error-prone. A more automated approach is needed for SoC test integration that accommodates a variety of cores, more cores per design and fewer test pins per core.
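
One way to picture the SoC-level integration problem is as a scheduling exercise: wrapped cores, each tested through a few dedicated pins, must share a limited top-level pin budget. The greedy packing below is a toy sketch with hypothetical cores; real retargeting tools also rewrite the core pattern data itself.

```python
# Toy SoC test-scheduling sketch: wrapped cores with standalone patterns
# must share a limited top-level pin budget. Cores and numbers are
# hypothetical; real flows also retarget the pattern data itself.

from dataclasses import dataclass

@dataclass
class Core:
    name: str
    test_pins: int   # scan pins the core's wrapper needs
    test_ms: float   # standalone core test time

def schedule(cores, top_pins):
    """Greedily group cores into parallel sessions within the pin budget."""
    sessions, todo = [], sorted(cores, key=lambda c: -c.test_ms)
    while todo:
        free, group = top_pins, []
        for c in list(todo):
            if c.test_pins <= free:
                group.append(c)
                free -= c.test_pins
                todo.remove(c)
        sessions.append(group)
    return sessions

cores = [Core("cpu", 8, 120), Core("gpu", 8, 200),
         Core("modem", 4, 80), Core("usb", 2, 20)]
for i, group in enumerate(schedule(cores, top_pins=16)):
    names = [c.name for c in group]
    print(f"session {i}: {names}, {max(c.test_ms for c in group)} ms")
```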

Park: There has been an increasing need for all manufacturers to improve both product quality and yield.

One of these areas is scan chain diagnostics. When a scan chain failure occurs in manufacturing test, it means there is a hardware problem with the component. This is one of the primary reasons that DFT is done on the design side.
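
The reason scan chain failures are diagnosable at all is that cells between the defect and scan-out still unload correctly. A minimal, hypothetical model of that localization idea:

```python
# Minimal model of scan chain diagnosis: cells between the defect and
# scan-out unload intact, while bits from cells upstream of a stuck cell
# are forced to the stuck value. The first mismatch localizes the defect.
# Chain contents and the fault are made up for the example.

def unload(chain, stuck_at=None):
    """Simulate unloading a chain; index 0 is the cell nearest scan-out.

    stuck_at = (cell_index, value): every bit that must pass through
    the faulty cell on its way out reads back as `value`.
    """
    observed = []
    for i, bit in enumerate(chain):
        if stuck_at is not None and i >= stuck_at[0]:
            bit = stuck_at[1]        # bit passed through the stuck cell
        observed.append(bit)
    return observed

captured = [1, 0, 1, 1, 0, 1, 0, 0]           # values captured in the flops
observed = unload(captured, stuck_at=(3, 0))  # stuck-at-0 in cell 3
suspect = next(i for i, (o, e) in enumerate(zip(observed, captured)) if o != e)
print(f"first mismatch at cell {suspect}")    # -> cell 3
```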

Petrakis: From 1975 to 1985, most of the DFT focus was on inserting scan and improving coverage of stuck-at faults. From 1985 to 1999 the focus was on automating reduced-pin wafer testing, along with logic BIST and boundary scan for higher-level package testing. From 2000 to 2010, test compression began to dramatically reduce the time and data volume associated with logic testing; the same period saw the rise of automation for reducing switching power during test and of high-speed, PLL-driven delay testing. Since then there have been refinements, such as higher levels of compression and core-based hierarchical testing.

SLD: What has driven these changes?

Park: The need for improved testing capabilities is driven by both technological and business factors. For instance, a 1% improvement in manufacturing test can yield multimillion-dollar savings.
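
Park's figure is easy to sanity-check with assumed numbers (the volume and unit cost below are illustrative, not quoted by anyone in the roundtable):

```python
# Sanity check on the "1% = multi-million dollars" claim with assumed
# (not quoted) figures for a high-volume consumer part.

units_per_year = 100_000_000   # assumption: annual shipped volume
cost_per_unit = 3.00           # assumption: manufactured cost, dollars
improvement = 0.01             # 1% fewer dies lost to test/yield issues

savings = units_per_year * cost_per_unit * improvement
print(f"annual savings: ${savings:,.0f}")  # -> $3,000,000
```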

Petrakis: Customer needs. Logic built-in self-test (LBIST) became very popular when it became easy to compute expected signatures and to improve fault coverage with test point insertion. Boundary scan was important for customers with multi-chip modules and other higher-level packages to aid package testing. Test compression came along at a time when customers were starting to have to drop tests because they would not all fit on the tester and re-loading the tester was expensive. Hierarchical test is becoming important to reduce the size of the circuit that ATPG needs to process and possibly to further improve test compression results. As more chip designs include multiple copies of cores, we expect to see increasing use of partially bad chips, with one or two bad cores, that can still be sold with reduced functionality.
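
The "easy to compute expected signatures" point refers to simulating the response compactor in software. A minimal sketch of a MISR-style signature computation, with an arbitrary polynomial and made-up response words:

```python
# Software model of a MISR (multiple-input signature register), the
# compactor whose expected signature LBIST flows precompute. The
# polynomial taps, width and response words are arbitrary choices.

def misr_signature(responses, width=8, taps=(7, 5, 4, 3)):
    """Compact a stream of `width`-bit response words into one signature."""
    sig = 0
    mask = (1 << width) - 1
    for word in responses:
        feedback = 0
        for t in taps:                        # XOR of tap bits feeds the LSB
            feedback ^= (sig >> t) & 1
        sig = ((sig << 1) | feedback) & mask  # LFSR shift step
        sig ^= word                           # fold in this cycle's response
    return sig

good_responses = [0x3A, 0x5C, 0x91, 0x0F]  # fault-free simulation output
print(f"expected signature: {misr_signature(good_responses):#04x}")
```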

Pateras: One of the largest challenges is maintaining test efficiency as design size and complexity grow exponentially. This is driving three trends. The first is hierarchical DFT, with techniques such as wrapping cores with dedicated scan chains for isolation, generating patterns for each core in isolation, and merging and retargeting core-level patterns to the chip's top-level pins.

The second is greater pattern compression. Increasing compression levels is one way to keep test pattern volumes, and hence test time, manageable. A number of new DFT techniques are being developed to help improve compression levels, for example Scan Bandwidth Management.

The third is BIST: adding test resources directly on chip enables greater test efficiency through increased test application bandwidth and test parallelization. BIST adoption is evolving beyond embedded memories to logic and mixed-signal blocks.

SLD: What progress has been made in automating mixed-signal testing?

Kaushik: For standard mixed-signal interfaces, such as PCIe, USB 3.0, HDMI and other SERDES interfaces, customers now expect the IP to contain BIST for running analog loopback tests. To accelerate turn-around time, tools should use the IEEE 1500 standard to coordinate the IP test and provide the ability to trim waveforms via on-chip fuses.
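
An ATE-side loopback sequence through an IEEE 1500 wrapper might look roughly like the sketch below. The WIR shift/update steps follow the 1500 wrapper architecture, but the opcodes, the 16-bit status layout and the `tester` object's API are all invented for illustration:

```python
# Hypothetical ATE-side sequence for an analog loopback BIST behind an
# IEEE 1500 wrapper. WIR = wrapper instruction register, per the 1500
# architecture; the opcodes, status layout and `tester` API are invented.

LOOPBACK_BIST = 0b0110   # assumed user-defined WIR opcode
READ_STATUS   = 0b0111   # assumed opcode selecting the result register

def run_serdes_loopback(tester, core, prbs_cycles=1_000_000):
    tester.wir_shift(core, LOOPBACK_BIST)  # shift instruction into the WIR
    tester.pulse_update()                  # latch it; loopback mode engages
    tester.run_clocks(prbs_cycles)         # let the PRBS checker accumulate
    tester.wir_shift(core, READ_STATUS)
    status = tester.wdr_shift_out(core, bits=16)
    done, error_count = status >> 15, status & 0x7FFF
    return bool(done), error_count
```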

For custom mixed-signal blocks, designers traditionally create the test program. Again, tools should make use of IEEE 1500 interfaces and infrastructure to control and observe the digital signals of the blocks. Designers then have direct access to the digital signals and can focus primarily on stimulating and observing the analog signals.

Petrakis: For the digital side of a mixed-signal device, there are often very few pins available through which to test logic. A combination of reduced-pin testing and compression techniques has allowed such devices to be tested using very few pins. On the analog side, we have found that functional testing is still the norm, with most companies using some ad hoc approach. Customers do not see a big need to improve analog testing other than to reduce costs where possible. Without a big demand, there is little incentive for EDA companies to pursue this area.

Pateras: Mentor’s ETS 2011 paper describes a digital infrastructure for general mixed-signal DFT and BIST. It is the first proposal that would allow any mixture of stimulus and analysis by proprietary and third-party IP blocks, as well as ATE, for very small to very large chips. It builds on P1687 and includes P1687-like serial paths for sampled analog signals that provide almost unlimited voltage and time resolution at an unlimited number of circuit nodes. Our ETS 2013 workshop paper describes a new approach to analog defect simulation for verifying any mixed-signal test’s quality and whether a proposed new test or BIST could replace a more costly, silicon-proven test.

SLD: What other trends have you identified?

Park: We see a focus shift from yield improvement to product quality. Companies need to be able to collect comprehensive data for all of their products. This cannot be done through DFT applications and methodologies alone; it must be addressed by a complete data and business intelligence approach.

Allsup: One trend is potentially higher DPPM (defective parts per million) and slower yield ramp due to subtle defects associated with advanced manufacturing processes. At 20nm and below, not only are defect densities higher, there are also significant on-chip process variations that affect transistor sizes, transistor threshold voltages and wire resistances. Synopsys provides a comprehensive test, physical diagnostics and yield analysis solution to lower DPPM and achieve faster yield ramp for all process nodes, with value links among the test products and across the Galaxy Implementation Platform.

Petrakis: All testing approaches, including test compression and hierarchical testing, will have to continue to provide means to reduce switching activity during test. Customers want to pursue hierarchical and higher compression approaches to reduce test time and test data, but tests must not fail due to excessive power use during scan or capture clocking.
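
Petrakis's power concern comes down to how many cells toggle per shift cycle. A quick way to see why pattern fill matters (the patterns and the toggle metric here are assumptions, not any vendor's method):

```python
# Why pattern fill matters for scan shift power: as data shifts by one
# position per cycle, a cell toggles exactly when its neighbor holds the
# opposite value. Patterns and the metric are illustrative assumptions.

def shift_toggle_rate(pattern):
    """Fraction of chain cells that flip when this pattern shifts by one."""
    flips = sum(a != b for a, b in zip(pattern, pattern[1:]))
    return flips / (len(pattern) - 1)

random_fill = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]   # ~random care-bit fill
low_power   = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0]   # repeat-fill of don't-cares

for name, p in (("random fill", random_fill), ("low-power fill", low_power)):
    print(f"{name}: {shift_toggle_rate(p):.0%} of cells toggle per cycle")
```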

Pateras: A move to FinFET-based technology nodes. For the first time, the critical dimensions of these transistors are significantly smaller than the node size, leading to concerns over defect levels and test coverage. Test tools should allow for modeling and targeting complex transistor-level defect behaviors.

We also see a move to 2.5D and 3D ICs, which brings new test challenges.
