IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 1996
The paper presents a methodology for evaluating defect levels in an integrated circuit (IC) design environment, focusing on non-equiprobable faults and an extended form of the Williams-Brown formula. A critical examination of fault coverage metrics shows that traditional, unweighted fault coverage is an unreliable indicator of test quality. The proposed weighted fault coverage accounts for the differing probabilities with which faults occur, enabling a more accurate assessment of test quality in relation to defect densities and physical layout.
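For context, the classical Williams-Brown relation links defect level to yield and fault coverage; the probability-weighted coverage shown alongside it is only a plausible form inferred from the abstract's mention of non-equiprobable faults, not the paper's exact extension.

```latex
% Classical Williams-Brown defect level relation (yield Y, fault coverage T),
% and one plausible probability-weighted coverage, where w_i is the occurrence
% probability of fault i and d_i = 1 if the test set detects fault i (assumed form):
\[
  DL = 1 - Y^{\,1 - T},
  \qquad
  T_w = \frac{\sum_i w_i\, d_i}{\sum_i w_i}
\]
```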
Quality and Reliability Engineering International, 1993
There has been a great amount of publicity about Taguchi methods, which employ deterministic sampling techniques for robust design. Also given wide exposition in the literature is tolerance design, which achieves similar objectives but employs random sampling techniques. The question arises as to which approach, random or deterministic, is more suitable for robust design of integrated circuits. Robust design is a two-step process, and quality analysis, the first step, involves the estimation of 'quality factors', which measure the effect of noise on the quality of system performance. This paper concentrates on the quality analysis of integrated circuits. A comparison is made between the deterministic sampling technique based on Taguchi's orthogonal arrays and the random sampling technique based on the Monte Carlo method, the objective being to determine which of the two gives more reliable (i.e. more consistent) estimates of quality factors. The results indicate that the Monte Carlo method gave estimates of quality that were at least 40 per cent more consistent than those from orthogonal arrays. The accuracy of quality prediction with Taguchi's orthogonal arrays is strongly affected by the choice of parameter quantization levels, a disadvantage, since there is a very large (theoretically infinite) number of possible quantization levels for each parameter of an integrated circuit. The cost of the Monte Carlo method is independent of the dimensionality (number of designable parameters), being governed only by the confidence levels required for the quality factors, whereas the size of orthogonal array required for a given problem depends in part on the number of circuit parameters. Two integrated circuits, a 7-parameter CMOS voltage reference and a 20-parameter bipolar operational amplifier, were employed in the investigation. Quality factors of interest included performance variability, acceptability (relative to customer specifications) and deviation from target.
Key words: quality analysis, quality design, tolerance analysis, tolerance design, integrated circuit design, Taguchi methods, robust design.
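A minimal sketch of the comparison described above, assuming a toy performance function and parameter spreads in place of the paper's circuits; it contrasts a Monte Carlo estimate of performance variability with one obtained from a standard three-level L9 orthogonal array.

```python
# Hedged sketch: Monte Carlo vs. orthogonal-array estimates of a "quality
# factor" (here, the standard deviation of a performance measure). The toy
# performance function, parameter spreads, and level mapping are illustrative
# assumptions, not the circuits or arrays used in the paper.
import numpy as np

rng = np.random.default_rng(0)

def performance(params):
    # Stand-in for a circuit performance measure (e.g., a reference voltage).
    return params[0] * params[1] + 0.5 * params[2] ** 2

NOMINAL = np.array([1.0, 2.0, 0.5])
SIGMA = 0.05 * NOMINAL          # assumed parameter spreads

def monte_carlo_quality(n_samples):
    samples = rng.normal(NOMINAL, SIGMA, size=(n_samples, 3))
    perf = np.array([performance(p) for p in samples])
    return perf.std(ddof=1)      # quality factor: performance variability

def orthogonal_array_quality():
    # Standard three-level L9 array (first three columns); levels map to
    # -sigma, nominal, +sigma for each parameter.
    l9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
                   [1, 0, 1], [1, 1, 2], [1, 2, 0],
                   [2, 0, 2], [2, 1, 0], [2, 2, 1]])
    levels = np.array([-1.0, 0.0, 1.0])
    perf = [performance(NOMINAL + levels[row] * SIGMA) for row in l9]
    return np.std(perf, ddof=1)

print("Monte Carlo estimate :", monte_carlo_quality(500))
print("Orthogonal array (L9):", orthogonal_array_quality())
```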
1998
In semiconductor manufacturing, the observed clustering of defects on a wafer has often led to the practice of discarding die from wafers, or parts of the wafer, that display a high incidence of failures. In recent work we have formalized and refined this process with the goal of minimizing test escapes during production testing. In the new approach, in evaluating
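As an illustration of the screening practice the abstract describes, the sketch below rejects otherwise-passing dice that sit in wafer neighbourhoods with many failures; the wafer map, neighbourhood size, and cut-off are assumptions, not the paper's formalized criterion.

```python
# Hedged sketch of neighbourhood-based die screening: discard passing dice in
# wafer regions with a high local incidence of failures. Purely illustrative.
import numpy as np

rng = np.random.default_rng(7)

# Wafer map: 1 = die failed wafer probe, 0 = die passed (toy data).
fail_map = (rng.random((20, 20)) < 0.15).astype(int)

def neighbor_failures(fmap, r, c):
    """Count failing dice among the (up to 8) immediate neighbours of (r, c)."""
    rows, cols = fmap.shape
    total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                total += fmap[rr, cc]
    return total

THRESHOLD = 3   # assumed cut-off: discard passing dice with >= 3 failing neighbours
discarded = [(r, c)
             for r in range(fail_map.shape[0])
             for c in range(fail_map.shape[1])
             if fail_map[r, c] == 0 and neighbor_failures(fail_map, r, c) >= THRESHOLD]
print(f"{len(discarded)} otherwise-passing dice discarded as high-risk")
```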
IEEE Transactions on Computer-aided Design of Integrated Circuits and Systems, 1985
IC chips or manufactured wafers run through selection processes at various stages of the fabrication process. Typically, the most important is the selection performed on IC dice which are tested directly on manufacturing wafers. This paper deals with the problem of the optimal assignment of the upper and lower selection thresholds applied for selecting dice during the wafer measurements. The tolerance assignment is defined as a statistical optimization problem, where the optimization objective function is a measure of the manufacturing profit. In the paper a method for computing a solution of this optimization problem is proposed, and an example of the industrial application of this method in the Computer-Aided Manufacturing (CAM) area is given.
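A rough sketch of the threshold-assignment idea, assuming made-up measurement distributions for good and faulty dice and an arbitrary profit model; the paper's statistical formulation and solution method are not reproduced here.

```python
# Hedged sketch: choosing lower/upper wafer-probe selection thresholds to
# maximize an assumed profit measure. Distributions and cost figures are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)

# Simulated wafer-probe measurement for a mixed population of dice:
good = rng.normal(1.00, 0.05, 20000)   # dice that would pass final test
bad  = rng.normal(1.00, 0.25, 2000)    # dice that would fail final test

PROFIT_GOOD_SHIPPED = 1.0   # assumed profit for packaging a good die
COST_BAD_SHIPPED    = 4.0   # assumed cost of packaging a die that later fails
# A rejected die contributes nothing (its wafer cost is treated as sunk).

def profit(lo, hi):
    shipped_good = np.count_nonzero((good >= lo) & (good <= hi))
    shipped_bad  = np.count_nonzero((bad  >= lo) & (bad  <= hi))
    return shipped_good * PROFIT_GOOD_SHIPPED - shipped_bad * COST_BAD_SHIPPED

# Brute-force search over a grid of (lower, upper) threshold pairs.
candidates = np.linspace(0.5, 1.5, 101)
best = max(((profit(lo, hi), lo, hi)
            for i, lo in enumerate(candidates)
            for hi in candidates[i + 1:]),
           key=lambda t: t[0])
print(f"best profit {best[0]:.1f} at thresholds [{best[1]:.2f}, {best[2]:.2f}]")
```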
2000
Understanding the effectiveness of their production tests is a critical task for IC suppliers. Numerous trends suggest that conventionally applied test methods must change to meet future needs, which will make the task even more critical, and more difficult, in the future. This paper presents characterization and diagnostic data and ideas aimed at helping IC suppliers understand test effectiveness.
IEEE Transactions on Electronics Packaging Manufacturing, 2004
Both cost and quality are important when manufacturing today's high-performance electronics. Unfortunately, the two design goals, low cost and high quality, work against each other: high testing effort (and thus high quality) comes at considerable cost, while reducing test activity has a significant impact on delivered quality. In this paper, we present a new structured search method for finding the best combination of these two goals. It features a Petri-net-oriented cost/quality modeling approach and uses a Pareto chart to visualize the results. The search for the Pareto-optimal points is performed by means of a genetic algorithm. With our method, we optimize the manufacturing process for a global positioning system (GPS) front end. The optimized process clearly outperformed the standard fabrication process.
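The sketch below illustrates only the Pareto-filtering step on assumed (cost, defect level) pairs; the Petri-net cost/quality model and the genetic-algorithm search described in the abstract are not reproduced.

```python
# Hedged sketch: identifying Pareto-optimal cost/quality trade-off points.
# The candidate test-flow figures are made-up placeholders.
from typing import List, Tuple

def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Keep points not dominated by any other (minimize cost, minimize defect level)."""
    front = []
    for cost, dpm in points:
        dominated = any(c <= cost and d <= dpm and (c, d) != (cost, dpm)
                        for c, d in points)
        if not dominated:
            front.append((cost, dpm))
    return sorted(front)

# (test cost per unit, shipped defects per million) for candidate test flows
candidates = [(1.0, 900.0), (1.4, 500.0), (1.5, 520.0),
              (2.1, 200.0), (2.0, 260.0), (3.0, 150.0)]
print(pareto_front(candidates))
```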
IEEE Design and Test of Computers, 2004
Today's semiconductor fabrication processes for nanometer technology allow the creation of very high-density and high-speed SoCs. Unfortunately, this results in defect susceptibility levels that reduce process yield and reliability. This lengthens the production ramp-up period and hence affects profitability. The impact of nanometer technology on yield and reliability creates a dilemma for users of the conventional chip realization flow. Each chip realization phase affects manufacturing yield and field reliability. To optimize yield and reach acceptable reliability levels, the industry uses advanced optimization solutions, designed in and leveraged at different phases of the chip realization flow. Recognizing the importance of this topic, IEEE Design & Test has dedicated this special issue to design for yield and reliability solutions.
International Conference on Microelectronic Test Structures, 2000
The complexity of integrated circuits has led to hundreds of millions of transistors, wiring lines, and layer-to-layer via connections on every chip. To allow accurate yield evaluation, process characterization test chips must grow in complexity as well, which has led to a significant bottleneck in testing them. Wafers that could be tested in less than two hours in a 0.35 um technology now require 10 hours and more in a 0.13 um technology. This paper presents methods for redesigning test structures to better support testing. Based on these, we present modified test algorithms that reduce test time by 50% and more, which accelerates data analysis and increases the efficient use of parametric test systems.
2002
The issue of economics in design and test is a very troubled and turbulent one. Even amongst our accepted testability experts, there can be widely differing opinions as to the implications of design and test decisions at an economic level, e.g. the continuing debate on full scan vs. partial scan. This paper will outline some of the testability issues facing IC and system designers, especially the costs involved in making circuits testable.
Strojniški vestnik – Journal of Mechanical Engineering, 2012
Robustness is the key to successful product design when many variation sources exist throughout the product lifecycle. Variations arise from many sources, such as material defects, machining errors, and the conditions under which the product is used. Product performance simulations are traditionally carried out using the numerical model created in the CAD system, which represents only the nominal information about the product, so it is difficult to take these variations into account when predicting product performance. A method is proposed in this paper for integrating the effect of these variation sources into the product performance simulation. The method is based on a random design of experiments. As a result, an image of the "real" performance of the product is obtained.
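A minimal sketch of the random design-of-experiments idea, assuming a simple cantilever-beam deflection model and arbitrary tolerance values in place of a real CAD model; sampling the variation sources yields an empirical picture of the "real" performance around the nominal design.

```python
# Hedged sketch: building an "image" of real product performance by randomly
# sampling several variation sources around the nominal model. The beam model
# and tolerance values are illustrative assumptions only.
import random
import statistics

random.seed(42)

def deflection(force, length, youngs_modulus, width, height):
    # Tip deflection of a cantilever beam: F*L^3 / (3*E*I), with I = w*h^3/12.
    inertia = width * height ** 3 / 12.0
    return force * length ** 3 / (3.0 * youngs_modulus * inertia)

NOMINAL = dict(force=100.0, length=0.5, youngs_modulus=70e9, width=0.02, height=0.01)

samples = []
for _ in range(2000):
    p = dict(NOMINAL)
    # Variation sources: machining error on dimensions, material scatter on E.
    p["length"] *= random.gauss(1.0, 0.002)
    p["width"]  *= random.gauss(1.0, 0.01)
    p["height"] *= random.gauss(1.0, 0.01)
    p["youngs_modulus"] *= random.gauss(1.0, 0.03)
    samples.append(deflection(**p))

nominal_perf = deflection(**NOMINAL)
print(f"nominal deflection  : {nominal_perf * 1e3:.3f} mm")
print(f"mean / std (sampled): {statistics.mean(samples) * 1e3:.3f} / "
      f"{statistics.stdev(samples) * 1e3:.3f} mm")
```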