July 1, 2005 -- With each new process node, additional defect mechanisms appear and hinder the ability to achieve desired yield. The trend toward declining yields has created a resurgence in the application of design for manufacturing (DFM) methodologies. Much of this reinvigorated effort relies heavily on a new breed of tools and technologies destined for the designer's hands.
But before designers adopt a comprehensive DFM methodology, it is important to understand the primary types of yield loss. These include 'random' (typically associated with particle defects), 'systematic' (induced by the process or lithography applications), and 'parametric' (which causes timing or other failures as a result of device physics and interconnect effects).
Figure 1. Initial production yields have not improved, and mature yields have declined.
At larger process geometries, random defects are the dominant yield loss mechanism. Despite constant clean room improvements, particles still land on chips or masks. These particles cause catastrophic failures, such as shorts (extra metal bridging two metal lines) or opens (missing metal), or result in parametric issues (resistive pinching, added coupling).
Random defects are being driven down in conjunction with feature scaling in advanced processes, enabling similar yield for a chip taken through a simple process shrink. At the same time, the increase in functional integration enabled by smaller features makes these designs even more susceptible to particle-defect-limited yield.
Systematic yield issues are not random in nature; they result from interaction between the layout and process variations. These include chemical variations in materials, mechanical variations in CMP, optical variations in lithography and plasma variations in etch. Some examples of systematic defects are planarity (the difference in metal heights for a given area on a design), antenna effects (charge accumulation on interconnect), via opens and electromigration.
A significant systematic effect occurs within the copper manufacturing process. Via opens, due to the copper cladding process, result from voids traveling down into the via. The results can vary from resistive via connections to completely opened lines. Some improvement to yield is gained by using redundant vias. If one via fails, the redundant via compensates for the failing one.
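The benefit of via redundancy can be sketched numerically. The model below is illustrative, not process data: it assumes each via fails independently with a small probability, so a redundant pair fails only when both vias fail, and chip-level via-limited yield is the product over all via sites.

```python
def net_failure_prob(p_single: float, redundant: bool) -> float:
    """Probability that a via connection fails, assuming each via
    fails independently with probability p_single (illustrative model)."""
    return p_single ** 2 if redundant else p_single

def chip_via_yield(n_vias: int, p_single: float, frac_redundant: float) -> float:
    """Yield limited only by via failures: every via site must work."""
    n_red = int(n_vias * frac_redundant)
    n_single = n_vias - n_red
    return ((1 - p_single) ** n_single) * ((1 - p_single ** 2) ** n_red)

# Hypothetical numbers: 10 million vias, a 1-in-10-million single-via failure rate.
y_none = chip_via_yield(10_000_000, 1e-7, 0.0)  # all single vias
y_all = chip_via_yield(10_000_000, 1e-7, 1.0)   # all doubled
```

Even with a tiny per-via failure rate, the all-single-via chip yields only about 37 percent in this toy model, while full via doubling pushes the via-limited yield to essentially 100 percent, which is why redundant vias are such an attractive yield lever.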
As feature sizes have decreased, manufacturers have turned to Resolution Enhancement Techniques (RET) to help compensate for the physical limitations of stepper wavelengths. But the ability to apply RET appropriately can be limited by the design layout itself. Certain types of RET have requirements on pitches. (Pitch is the center-to-center distance between repeated features: the width plus the spacing of the polygon in question.) Today, this is most problematic for poly gate transistors, but is quickly becoming important for metal lines as well.
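A pitch-based RET constraint can be expressed as a simple check. Pitch here is taken as the center-to-center distance (width plus spacing); some texts quote the half-pitch, half that value. The forbidden-pitch band below is invented for illustration and does not come from any real process deck.

```python
def pitch(width: float, spacing: float) -> float:
    """Center-to-center pitch of a repeated line: width plus spacing (nm)."""
    return width + spacing

def violates_forbidden_pitch(width, spacing, forbidden_ranges):
    """True if the line pitch falls in a lithographically forbidden band.
    forbidden_ranges is a list of (lo, hi) pitch intervals in nm --
    illustrative values only, not real process rules."""
    p = pitch(width, spacing)
    return any(lo <= p <= hi for lo, hi in forbidden_ranges)

# Hypothetical poly rule: pitches between 260nm and 320nm do not print well.
in_band = violates_forbidden_pitch(100, 180, [(260, 320)])   # pitch 280
out_of_band = violates_forbidden_pitch(100, 140, [(260, 320)])  # pitch 240
```

A layout checker built on this idea would flag gates sitting in the forbidden band and suggest shifting the spacing to the nearest legal pitch.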
The biggest impact in nanometer designs is attributed to parametric yield loss. This is when all elements of a chip are functioning, but timing or other electrical requirements, such as power, fall out of specification. Parametric defects became an issue at 180nm, grew dramatically at 130nm, and are of significant concern in 90nm designs. Parametric defects are the result of interconnect parasitics and device physics. Parametric issues can cause a circuit to behave in non-ideal ways, such as power bus voltage drop, increased noise, and timing skew in clock and signal lines.
Designers need device parameter and parasitic extraction tools that capture complex silicon effects and account for them in the design. If not accurately simulated, modeled and handled, the impact of these defects can have dire consequences on the success of a chip.
Practical approaches to DFM in the nanometer era
While the sources of yield loss appear ominous and difficult to overcome, significant work is being done in EDA tools to help produce designs that are much more resistant to yield issues. Currently, EDA tool development is concentrated on several areas: analysis of DFM rules, yield enhancement, physical test, litho-friendly design and nanometer silicon modeling.
Traditionally, design rule checking (DRC) acted as one of the key communication vehicles between manufacturing and design. The DRC rules tell designers the limits imposed on them by the manufacturing process. Most of these constraints represent true process limitations, which, if not followed, produce nonfunctioning silicon or considerably lower yields.
Unlike traditional manufacturing constraints, which are easily quantified in DRC rules, yield issues are more difficult to pin down, because they are dependent on a number of variables and are largely statistically driven (i.e. the greater the occurrence of a certain configuration, the greater the likelihood of chip failure).
Recently, in addition to DRC rules, foundries have started delivering a new set of manufacturing rules. These 'DFM recommended rules' are given to designers to indicate where a design becomes easier to manufacture by adhering to the DFM rule, rather than the minimum-spacing DRC rule. This all sounds useful in theory, but to a designer, the whole point of moving to the next process node is to take advantage of more aggressive minimum spacing. The quest to comply with DFM recommended rules opens the door to many questions (Figure 2). What tools help the designer determine, for an area of the design, whether DRC rules or DFM rules have a more positive impact on yield? One way is to gather layout statistics on the feature in question. For instance, statistical modeling and critical area analysis highlight not only how often an issue occurs, such as antennas and vias, but also in which combination and level of severity the issue occurs.
Figure 2. With DFM rules, a simple DFM check may produce millions of errors. Designers need statistical methods for evaluating the impact of these errors.
Statistical models reveal what cumulative effect a yield issue has on an area as well as on the design as a whole. Armed with comprehensive statistics, a design manager can determine which issues have the most impact and which are cost effective to fix.
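One standard way to turn critical-area statistics into a yield number is the classical Poisson limited-yield model, Y = exp(-A_crit x D0). The sketch below uses that model to rank yield detractors by their individual impact; the critical areas and defect density are invented for illustration.

```python
import math

def poisson_yield(critical_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Classical Poisson limited-yield model: Y = exp(-A_crit * D0)."""
    return math.exp(-critical_area_cm2 * defect_density_per_cm2)

def ranked_yield_detractors(detractors, d0):
    """Rank yield issues by individual yield loss so a design manager can
    see which are cost-effective to fix.
    detractors: {issue name: critical area in cm^2}; d0: defects per cm^2."""
    losses = {name: 1.0 - poisson_yield(area, d0)
              for name, area in detractors.items()}
    return sorted(losses.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative critical areas, not real process data:
ranking = ranked_yield_detractors(
    {"single vias": 0.30, "min-space metal": 0.55, "antenna nets": 0.05},
    d0=0.5)
```

In this made-up example the minimum-space metal dominates the loss budget, so effort spent relaxing spacing there pays off more than fixing the rarer antenna configurations.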
In the future, tools must address this 'DFM recommended rule' data in terms of statistics, and not only provide information about the feature-based improvement in yield, but also a means to determine how these modifications impact the 'litho-friendly' and 'timing-aware' aspects of the design. New tools must also give designers a way to make yield predictions, allowing the designer to prove whether modifications to the design really resulted in better-yielding silicon.
Automatic yield enhancement emerges
Clearly, the physical layout of a design has a primary impact on yield. The second area of DFM consists of yield enhancement, which, like DFM rule adherence, results in a modification to the physical design, either through the addition of features or the moving of existing ones.
A common modification that designers make to the layout to improve yield is reducing single via transitions. The number of single vias, and the number of via transitions with minimal overlap, can contribute significantly to yield loss. With tools currently available, users can analyze the layout for these via conditions and then automatically add redundant vias to increase the robustness of the design.
Adding vias must be done intelligently, though, so that re-routing is not required. Designers need the ability to add different via types to reduce the number of single vias without moving any routes, and to do so by adding the least amount of metal. This allows more via placements without creating DRC violations, and the added metal can automatically be oriented in preferred directions (i.e., along existing metal lines), which decreases the capacitive impacts that can occur. Certain operations are also needed for 'island' or isolated vias, where stress effects can make a via unstable.
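The preferred-direction placement described above can be sketched as a candidate-ordering problem. This is a minimal illustration, not any tool's actual algorithm: the data model is hypothetical, and the DRC check is stood in for by a caller-supplied predicate.

```python
def candidate_positions(x, y, preferred_horizontal: bool, offset: float):
    """Order redundant-via candidate sites: along the existing route
    direction first, so any added metal lands in the preferred direction."""
    along = [(x - offset, y), (x + offset, y)] if preferred_horizontal \
        else [(x, y - offset), (x, y + offset)]
    across = [(x, y - offset), (x, y + offset)] if preferred_horizontal \
        else [(x - offset, y), (x + offset, y)]
    return along + across

def place_redundant_via(x, y, preferred_horizontal, offset, is_legal):
    """Return the first DRC-legal candidate site, or None (via stays single).
    is_legal is a caller-supplied predicate standing in for a real DRC check."""
    for pos in candidate_positions(x, y, preferred_horizontal, offset):
        if is_legal(pos):
            return pos
    return None
```

Because candidates in the route direction are tried first, the fallback to the cross direction (and its extra coupling) only happens when the preferred sites are blocked.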
Other areas where modifications to the layout can have a profound effect on yield are metal fill, via enclosure expansion, end of line extensions and notch filling.
Most of the failures associated with defect mechanisms in nanometer designs are context sensitive (dependent on layout placement and routing). This theoretically reduces the number of potential failure locations, but most test vector generation tools do not take advantage of this context-specific information, and generate vectors for all possible occurrences.
Knowing which layout features interact with the process, and the impact of that interaction, is the foundation of DFM. Without this knowledge, failure probability information and any correction to the design process are purely guesswork, making tradeoff decisions ineffective.
So how can the minimum physical attributes necessary for effective defect-based testing be defined? A practical approach may be to first identify the number of nets that can be targeted based on pattern volume and tester constraints, then prioritize the defect candidates by their likelihood to fail. After patterns are generated for a variety of physical attributes, they should be simulated for other fault models, such as transition and stuck-at, to see how many faults have already been detected. By targeting only the undetected faults during pattern generation, fewer patterns will be needed to reach high test coverage and ultimately defect coverage.
Using physical information to guide the test pattern generation process is an effective means of ensuring that maximum value is gained from each pattern. As a result, the overall test quality can increase by addition of tests for new faults and potential defect sites.
Litho-friendly design makes its debut
The 130nm node ushered in a new era in semiconductor manufacturing, an era that might have amazed pioneers in the science of optics. More than a century ago, the English physicist Lord Rayleigh expressed what became the lithographic challenge with the following equation: R = k1(λ/NA), where R is resolution, k1 is a figure of merit (the "smart factor"), λ is the wavelength of light, and NA is the numerical aperture of the lens system.
While still valid in nanometer technology, Rayleigh's equation has seen the values of R, k1, λ and NA change drastically over the last several process nodes (Figure 3). One of the most critical parts of the equation is the value of k1, which indicates how astutely the lithographer uses the light source: printing at a high k1 is easy; printing at a low k1 is hard. What is noteworthy is that with a k1 lower than 0.5, RET is a requirement for anything to print at all.
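Plugging era-typical numbers into the Rayleigh criterion shows why low k1 matters. The values below (193nm ArF illumination, NA of 0.75) are representative of the period, not tied to any specific stepper.

```python
def rayleigh_resolution(k1: float, wavelength_nm: float, na: float) -> float:
    """Minimum printable feature per the Rayleigh criterion: R = k1 * lambda / NA."""
    return k1 * wavelength_nm / na

# 193nm ArF illumination with NA = 0.75:
r_easy = rayleigh_resolution(0.6, 193.0, 0.75)   # comfortable k1: ~154nm features
r_hard = rayleigh_resolution(0.35, 193.0, 0.75)  # aggressive k1: ~90nm, RET required
```

At k1 = 0.6 the same optics resolve only about 154nm; squeezing 90nm features out of them requires driving k1 down to roughly 0.35, well inside the regime where RET is mandatory.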
Figure 3. Lithography challenge.
Lithographers encounter numerous conflicts between design layouts and RET on a daily basis, and call for the day when designers learn more about the new era of lithography. The design community, however, remains largely unaware of this activity. While semiconductor manufacturers have worked diligently to minimize variation through advancements in lithographic equipment and RET techniques, it is necessary for the design community to adopt practices that make designs less sensitive to process variation.
The gap between the lithography and design communities seems to arise because the paradigm for communication between the process world and the design world that used to work well - the design rules - can no longer carry enough information to allow the entire system to be optimized. Yet, given historical tendencies, neither side has shown great motivation to break out of its classical paradigm.
What is evolving to fill this gap is a new generation of EDA tools, which communicate with standard EDA formats but include algorithms and simulators tuned to the process requirements of RET. Library generation tools that produce RET-compliant layouts are a step in this direction. The addition of process simulation to layout verification tools likewise allows process effects to be automatically predicted and flaws reported as potential design rule violations, using the same interface familiar from normal DRC checks (Figure 4).
Figure 4. Marginal point detection using the process window model.
As these tools implementing RET-compliant knowledge become standardized, the information they require also becomes part of the standard package downloaded along with the classical design rules. In this way, the "paradigm" is not challenged; only the content of the information package has changed. Designers do not need to take seminars on diffraction, and lithographers do not need to learn how to debug RTL. Each group of experts can remain focused on what it does best, as long as the tools continue to evolve to fill this gap.
Simple parametric extraction is inadequate for nanometer design
New physical effects and layout complexities result from the combined impact of nanometer-scale design and copper process technologies. The proximity of varied design components to one another conspires to make inductance, electromigration, signal integrity and hierarchical timing analysis critical. In view of these effects, transistors must not be modeled in isolation, but evaluated more accurately across the diffusion characteristics of the entire design. In metal layers of certain densities, surfaces tend to dish or become convex, changing the resistance and capacitance characteristics of lines. Contacts and vias, which were not points of concern in earlier process technologies, now contribute substantially to total interconnect capacitance. Substrate effects adversely affect on-chip analog functionality, and coupled capacitance is becoming a significant percentage of total interconnect capacitance. Line width variations, and optical and process effects applied post-layout, must also be considered.
Traditional analysis required accurate parasitic information to model resistance, capacitance, coupled capacitance and inductance. The new analysis paradigm requires more. Accurate parasitics must be annotated to accurate geometric information. This means delivering parasitics plus intentional device measurement including X and Y locations, width, length, layer and number of contacts. Current density measurement is necessary to model IR drop and electromigration. This requires a strong link between a robust layout vs. schematic (LVS) tool and a transistor-level parasitic extraction tool.
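The current-density requirement can be made concrete with a small sketch. This is an illustrative average-current model, not a real electromigration signoff: the segment names, geometries and limit are hypothetical, but it shows why extraction must annotate width, thickness and layer alongside the parasitics.

```python
def current_density(current_ma: float, width_um: float, thickness_um: float) -> float:
    """Average current density in a wire cross-section, mA/um^2: J = I / (w * t)."""
    return current_ma / (width_um * thickness_um)

def em_risk(segments, j_limit):
    """Flag segments whose current density exceeds an electromigration limit.
    Each segment carries the geometry (width, thickness) that extraction
    must annotate alongside its parasitics; the limit is illustrative.
    segments: {name: (current mA, width um, thickness um)}."""
    return [name for name, (i_ma, w_um, t_um) in segments.items()
            if current_density(i_ma, w_um, t_um) > j_limit]

# Hypothetical net segments: a narrow M1 wire and a wide M5 strap.
risky = em_risk({"m1_seg4": (2.0, 0.2, 0.3),   # J ~ 33 mA/um^2
                 "m5_seg1": (5.0, 2.0, 0.8)},  # J ~ 3.1 mA/um^2
                j_limit=10.0)
```

The narrow M1 segment carries less current than the M5 strap yet is the one at risk, which is exactly the kind of conclusion that cannot be reached from parasitics alone, without the annotated geometry.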
Non-standard elements, such as handcrafted analog components or MOS devices with irregularly shaped diffusion regions, are more susceptible to current-related issues. The resulting stresses impact electrical behavior, power, and ultimately the lifespan of devices. The only way to accurately assess the stress effect of these components is through post-layout re-simulation, with both parasitic effects and accurately measured physical device parameters. Without a tight link between an LVS tool that can extract all device parameters and a transistor-level parasitic extraction tool, designs cannot be properly netlisted for re-simulation purposes.
Design for manufacturing in nanometer technologies has the potential to change almost every aspect of the IC design and manufacturing process. It has radically changed the lithographic process in manufacturing, and continues to mold advancements in IC manufacturing equipment and materials. For the design team, it calls for an extensive review of current design practices. Will concurrent change in both the design and manufacturing disciplines overcome the challenges we face? For those chip design and manufacturing companies already on the road to adopting new DFM tools and methodologies, the answer is simple: there is no time to delay.
By Joseph Sawicki, VP/GM, Design to Silicon Division, Mentor Graphics
Go to the Mentor Graphics Corp. website to learn more.