February 21, 2008 -- It’s just geometry. As system-level ICs grow larger and more complex, they become impossible to observe and stimulate. Internal nodes aren’t accessible via bonding pads or even probes. Signal voltages are small, noise margins are tiny, and drive strengths are negligible. As critical circuits reach gigahertz frequencies, it becomes physically impossible to get an accurate representation of signals off the die, even if you can probe the circuit.
Yet the need remains. Chip designers must be able to observe and stimulate individual blocks in an SOC (system-on-chip) to bring up the silicon. Manufacturing-test engineers must be able to create fast test programs on affordable test equipment. Increasingly, chip designers must also create autocalibration routines that can compensate critical circuits for process, voltage, temperature, impedance, and noise variations while the chip is in use. The only apparent option is to move test-and-measurement instruments—the racks of logic analyzers, bus analyzers, communications testers, and oscilloscopes that populate the bring-up lab—onto the chip itself.
And this option is now available. Beginning perhaps with debugging facilities built into CPU cores, and extending through bus-diagnostic blocks and logic built-in self-test blocks, on-chip instrumentation is today reaching into high-speed transceivers and RF circuits. In the future, on-chip analog instrumentation for characterization and calibration may become a routine part of analog design.
By Ron Wilson, EDN Executive Editor
This brief introduction has been excerpted from the original copyrighted article.
View the entire article on the EDN Magazine website.