July 1, 2005 -- All of us in the semiconductor test business know the issues: the cost of test is too high, quality standards are being raised, parts are more complex, excessive test data is overflowing our networks and data warehouses, and finger-pointing continues between ATE vendors, test floor operations, test and product engineering, and the foundries.
Semiconductor test remains primitive compared to silicon fabrication, because little investment has been made in changing historic approaches to the business, and the cost of test continues to increase. Over the past ten years, the fabs have made significant progress by installing process monitoring and optimization methods. Instead of waiting for equipment to break and processes to go out of control, continuous monitoring against statistical guidelines helps keep the multi-billion-dollar fabs running with high efficiency.
Meanwhile, in test, we run batches of parts until something fails, or worse yet, our customers receive bad parts. When a failure occurs, we cannot immediately determine where it came from: the part, the tester, or the operator? We are continually in crisis-fighting mode. Worse, analyzing the failure means digging through historical data that has gone stale long after the problem occurred, and we then ask junior engineers to painstakingly isolate the cause through manual methods that demand an advanced degree in statistics.
The general methodology we have used for the past 20 years is this: attempt to calibrate the tester; run parts and record volumes of data in STDF format; identify bad parts; and, when a failure occurs, analyze the accumulated database off-line, after the fact. Because this is the way we do it, solution providers have come to the rescue with traditional offerings like:
- Data Warehouse Systems: Since we generate so much data, we now need data warehouse tools.
- Data Analysis Solutions: Now that we have so much data, we need sophisticated data mining and mathematical tools to extract any meaningful information from it.
What if there was a whole new approach?
What if we skipped storing 100% of the test data and instead recorded only the exceptions and a statistical footprint of the individual test results in real time, as each device completes and as each lot closes?
Doing so would cut data storage by several orders of magnitude and deliver information rather than raw data. Test and product engineers are simply too busy to deal with mountains of data; they would rather be told about exceptions and emerging issues than dig through databases to find problems. In other words, they want to know what is going on without having to go searching for it.
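To make the idea concrete, here is a minimal sketch of such a per-test "footprint," using Welford's online algorithm to maintain running statistics in constant memory. The class and field names are illustrative assumptions, not a description of any vendor's actual implementation:

```python
import math

class TestFootprint:
    """Running statistical footprint for one parametric test, updated as
    each device completes (Welford's online algorithm). Only summary
    statistics and out-of-limit exceptions are kept, never the full
    per-device result stream."""

    def __init__(self, test_name, lo_limit, hi_limit):
        self.test_name = test_name
        self.lo, self.hi = lo_limit, hi_limit
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations
        self.min = math.inf
        self.max = -math.inf
        self.exceptions = []     # (device_id, value) for failing parts only

    def record(self, device_id, value):
        # Update running mean/variance in O(1) time and memory.
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        self.min = min(self.min, value)
        self.max = max(self.max, value)
        if not (self.lo <= value <= self.hi):
            self.exceptions.append((device_id, value))

    def close_lot(self):
        # Emitted once when the lot closes: the footprint plus exceptions.
        std = math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0
        return {"test": self.test_name, "n": self.n, "mean": self.mean,
                "std": std, "min": self.min, "max": self.max,
                "exceptions": self.exceptions}
```

A per-test instance of this kind could live on the tester itself; at lot close, only the few-hundred-byte summary would travel upstream instead of megabytes of raw STDF records.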
Of course, one problem is that the semiconductor manufacturing model continues to change. More and more OEMs are turning to outsourced test and packaging operations, and communication is suffering as a result. The lag between when a problem is observed and when the responsible parties are notified can be a real killer for supply-chain continuity and meeting shipments, and geographical separation makes long delays even longer.
However, if all testers provided real-time data that had already been statistically characterized, a vast range of business intelligence solutions could be driven off that data source. A new generation of OEE systems, life-cycle management systems, quality auditing systems, and accounting systems would bring improved productivity and efficiency to every part of the test organization. The growing gap between OEMs and their offshore test services could be bridged, and Web-enabled performance dashboards could deliver timely information and intelligence the moment it is needed.
A new generation of intelligent analysis tools could be created, operating on a database that is a fraction of the size STDF forces us to cope with. Correlation studies could be automatic and continuous. The intelligent test floor would become a reality, in which ATE systems determine automatically how well their calibration and reliability are holding up, and data is fed back continuously, in real time, to the fabs and to the design engineers.
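As one hypothetical sketch of such a self-check, the function below consumes the lot-level summary produced by the footprint sketch above and flags a tester whose results have drifted from the fleet baseline. The z-test, the threshold, and the field names are all assumptions for illustration:

```python
def calibration_drift(footprint, baseline_mean, baseline_std, z_crit=3.0):
    """Flag a tester whose lot-level footprint has drifted from the
    pooled fleet baseline for the same test, using a z-test on the
    lot mean. baseline_mean/baseline_std describe the historical
    distribution across all testers running this test."""
    if footprint["n"] < 2 or baseline_std == 0:
        return False
    # Standard error of the lot mean under the baseline distribution.
    se = baseline_std / footprint["n"] ** 0.5
    z = (footprint["mean"] - baseline_mean) / se
    return abs(z) > z_crit
```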
And that is just the beginning.
Real-time access to test information is an enabler as well. It opens the door to test process improvement software that catches the test process going out of control before equipment breaks, time is wasted, and material is scrapped. Likewise, it allows test performance improvement software that dramatically reduces test time using SPC techniques far more granular than is possible manually. Finally, it permits close monitoring of critical tests in production and can facilitate on-line, real-time binning of parts against quality standards like Parts Average Testing.
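As an illustration of that last point, here is a minimal sketch of dynamic part-average-test binning. The six-sigma limit width follows the commonly cited AEC-Q001 guideline (robust estimators are often substituted in practice), and the function names and bin labels are hypothetical:

```python
def pat_limits(values, k=6.0):
    """Compute dynamic Part Average Testing limits for one test from
    the lot's own results: mean +/- k * sigma."""
    n = len(values)
    assert n >= 2, "need at least two results to estimate sigma"
    mean = sum(values) / n
    sigma = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return mean - k * sigma, mean + k * sigma

def bin_part(value, static_lo, static_hi, pat_lo, pat_hi):
    # Static datasheet limits decide pass/fail; PAT limits flag
    # statistical outliers among otherwise-passing parts for
    # downgrade or containment.
    if not (static_lo <= value <= static_hi):
        return "fail"
    if not (pat_lo <= value <= pat_hi):
        return "pat_outlier"
    return "pass"

# Example: a part well inside both the static and lot-derived limits.
lo, hi = pat_limits([1.02, 0.98, 1.01, 0.99, 1.00, 1.03])
print(bin_part(1.00, 0.5, 1.5, lo, hi))   # -> "pass"
```

The point of the sketch is that the outlier decision uses only the running lot statistics already being collected, so it adds no extra measurements and no extra test time.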
Of course, all of this has to be done without requiring additional equipment or additional test time.
Leading companies like Qualcomm, Texas Instruments, and STATS ChipPAC, along with rapidly growing startups like Volterra, are finding that real-time access to test results provides dramatic improvements in their test operations. These and other companies are harnessing the power of real-time data access to achieve:
- 20-50% test time reduction using existing designs and test equipment with minor test program edits;
- Reduced DPM using advanced Parts Average Testing techniques on-line;
- A common solution that works at both wafer probe and final test with no impact to test time;
- Reduced errors and scrap due to real-time, on-line monitoring;
- Rapid access to useful information rather than analysis of volumes of old data;
- Reduced finger pointing between product engineers and test operations;
- Faster time-to-volume on new product rollouts.
The mantra of less data, more information, and real-time precision control is paying off for the forward-thinking companies embracing this vision of real-time access to worldwide data. With Web-enabled systems providing instant test process information to any user anywhere in the world, from test and product engineering to operations and management, the gap between OEMs and their subcontractors is shrinking, and companies are making real inroads toward dramatically improving the performance of their test operations.
By Scott Bibbee, Director of Marketing and a co-founder of Pintail Technologies, Inc.
Bibbee is responsible for strategic and tactical marketing for the company. He holds a master's degree in IT Project Management and Commercial Contract Management from The George Washington University and received his bachelor's degree in Applied Learning and Development from the University of Texas.
Go to the Pintail Technologies, Inc. website to learn more.