October 22, 2008 -- Maybe it's not as dramatic as a tsunami, but a perfect storm certainly seems to be brewing for chip and circuit board test, driven by factors such as the effects of sub-100-nm chip fabrication processes, the capacitance anomalies introduced by test probing, newer packaging technologies, and more.
- Sub-100-nm chip fabrication processes have dramatic effects on device-level parametric performance, and traditional characterization and test techniques are ineffective at identifying the resulting problems. External instruments probing the corners of a chip cannot see the rampant parametric variations across the die. Only on-chip instruments can effectively monitor characteristics such as thermal conditions, timing, clock propagation delays and power distribution.
- Placing an oscilloscope's probe on a test pad of a high-speed serial bus such as PCI Express 2.0 or 3.0, Fibre Channel, 10-Gbps Ethernet, InfiniBand or Intel's QuickPath Interconnect (QPI) architecture introduces capacitance anomalies on the bus. The validation or test engineer can't tell the difference between a probe-induced anomaly and a fault in the design or on the manufactured assembly.
- Newer packaging technologies such as system-in-package (SiP), system-on-package (SoP), package-on-package (PoP) and others considerably complicate the test and design-for-test (DFT) process. For example, multi-die devices with through-silicon vias may not provide access to embedded DFT intellectual property (IP), even when that IP resides on one of the die in the package.
These instances and others point to the need for innovation in chip and circuit board design validation, test and debug. But there is good news, too: solutions are available today, and others will arrive in the not-too-distant future.
For example, greater chip-level densities at the 90-, 65- and 45-nm process nodes mean that spare gates are available for inserting validation and test facilities such as embedded instrumentation on-chip. In fact, many chip suppliers are already placing embedded instrumentation IP into their products (see Figure 1).
Figure 1. Transition of external hardware to chip IP and software.
Several open industry standards and proprietary de facto standards are also being developed, or are already available, to address the need for new test processes. An IEEE working group is developing the P1687 standard, which will define a standard method to control, manage and interface to embedded instrumentation on chips, no matter where the embedded IP comes from: chip vendors, design automation providers, IP providers, in-house developers or others. IP conforming to P1687 can be accessed and controlled through one standard mechanism, so designers won't have to learn and deploy a new set of tools for each source of the embedded DFx IP they use.
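To illustrate the idea behind such a standard access mechanism, here is a minimal, purely hypothetical sketch in Python. It is not an implementation of P1687; it only models the general concept of a reconfigurable scan network in which segment-insertion bits (SIBs) splice individual instruments into one shift path, so a single host-side mechanism can reach instruments regardless of which vendor supplied them. All class names and behaviors below are invented for illustration.

```python
# Hypothetical model of a P1687-style reconfigurable scan network.
# NOT real P1687; names and behaviors are invented for illustration.

class Instrument:
    """A generic embedded instrument exposed through a scan segment."""
    def __init__(self, name, width):
        self.name = name
        self.width = width              # length of the instrument's shift register
        self.register = [0] * width

    def update(self, bits):
        """Latch new control/data bits into the instrument."""
        assert len(bits) == self.width
        self.register = list(bits)

class SIB:
    """Segment-insertion bit: when open, its instrument joins the chain."""
    def __init__(self, instrument):
        self.open = False
        self.instrument = instrument

class ScanNetwork:
    """A chain of SIBs; the host sees one flat shift path no matter
    which vendor supplied each instrument."""
    def __init__(self, sibs):
        self.sibs = sibs

    def chain_length(self):
        # Each SIB contributes 1 bit, plus its instrument's bits when open.
        return sum(1 + (s.instrument.width if s.open else 0)
                   for s in self.sibs)

    def write_instrument(self, name, bits):
        """Open the target SIB and deliver bits to its instrument."""
        for sib in self.sibs:
            if sib.instrument.name == name:
                sib.open = True
                sib.instrument.update(bits)
                return True
        return False

# Usage: two instruments from different "vendors" on one chain.
net = ScanNetwork([SIB(Instrument("thermal_sensor", 8)),
                   SIB(Instrument("ibist_core", 16))])
print(net.chain_length())           # both SIBs closed: only the SIB bits
net.write_instrument("thermal_sensor", [1] * 8)
print(net.chain_length())           # the 8-bit segment is now in the path
```

The point of the sketch is the access pattern, not the bit-level protocol: the host addresses any instrument through the same generic network operations, which is the tool-reuse benefit the standard aims to deliver.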
In addition, Intel has developed its own embedded instrumentation IP, Interconnect Built-In Self-Test (IBIST), to validate designs incorporating high-speed serial buses. Intel is placing IBIST into its next-generation high-end server chipsets. Another chip supplier, Avago, has incorporated IBIST into its Intel QPI SerDes cores for ASIC developers. ASSET's ScanWorks is currently the only third-party tool set for embedded IBIST test technology in Intel chipsets and Avago's QPI SerDes cores.
Another IC supplier, Maxim, is embedding instrumentation at the chip level to perform system-level management and monitoring functions. One family of system monitors incorporates multiple digital-to-analog converters to perform voltage margining and tracking. ScanWorks, for example, can control these devices to measure and manipulate system voltages so that power supplies on circuit boards and in systems can be functionally validated.
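The voltage-margining idea can be sketched in a few lines. The register map, reference voltage and scaling below are invented for illustration and do not correspond to any real Maxim device; consult the actual datasheet for real parameters. The sketch only shows the arithmetic a test tool performs: translating a desired +/- margin percentage on a supply rail into a DAC code.

```python
# Hypothetical voltage-margining math for a system-monitor DAC.
# DAC_BITS, VREF and sense_gain are assumptions, not real device specs.

DAC_BITS = 8
VREF = 1.25          # assumed DAC full-scale reference, in volts

def margin_to_code(nominal_v, margin_pct, sense_gain=0.5):
    """Translate a +/- margin percentage on a rail into a DAC code.

    sense_gain models how strongly the DAC output steers the supply's
    feedback node; it is an illustrative assumption.
    """
    target = nominal_v * (1 + margin_pct / 100.0)
    # Offset from mid-scale moves the rail up or down from nominal.
    dac_v = (target - nominal_v) / sense_gain + VREF / 2
    code = round(dac_v / VREF * (2 ** DAC_BITS - 1))
    return max(0, min(2 ** DAC_BITS - 1, code))    # clamp to DAC range

# Margin a 1.2 V rail by -5% and +5% for a power-integrity test:
low = margin_to_code(1.2, -5)    # pulls the rail down 60 mV
high = margin_to_code(1.2, +5)   # pushes the rail up 60 mV
print(low, high)
```

A tool like ScanWorks would write such codes to the monitor's DAC registers, then run functional tests at each margin corner to confirm the board's power supplies hold up across the specified tolerance band.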
The new test process
These and other developments point to the need for a new test process. Let's face it: as chip, board and system-level technologies become faster, denser and more complex, test methodologies must respond in kind. And as an open embedded instrumentation tools platform with roots in JTAG, one that reaches into the worlds of both chip and board design validation and test, ASSET's ScanWorks is positioned at the center of this perfect storm.
By coming at this challenge from the perspective of board-level design validation, test and debug, ScanWorks brings a unique set of insights and capabilities. Its ability to access, automate and analyze the output of instruments embedded at the chip level enables a holistic, life-cycle test process (see Figure 2).
Figure 2. "Bridging the Gap" for total re-use of design and test data from IC to PCB.
In essence, ScanWorks offers a validation and test migration strategy that begins at chip design with debug and characterization, continues to the board and sub-assembly level with design validation and structural and functional test, and culminates at the system level with manufacturing test and diagnostics as well as field-service troubleshooting.
The new test process I've been referring to is rapidly emerging, evolving right along with the rest of technology. It will be different, and it will require adjustments in attitudes toward test, but that's nothing new; evolving technologies have had the same effect on the way design is done. At the end of the day, the new test process will yield higher-quality electronic products and continued growth of our industry.
By Glenn Woppman.
Glenn Woppman is President and CEO of ASSET InterTech, Inc.