DFT (Design for Test)
The transistor count, the logic connections, and the overall complexity of chip designs keep soaring. Ten million random logic gates are now common in an IC. Such large designs deliver a great user experience thanks to their superior performance, but the problem lies in testing them.
Today, a single functional unit within a design is as complex as an entire chip was some time ago. If a problem occurs in any logic connection of a single unit, the overall functionality of the chip is adversely affected. The complications of size, schedule, and fabrication complexity can be eased with a proven design-and-test approach.
The tactics and tools a design team selects are of pivotal importance to a project’s success. A central concern in the whole design process is DFT (Design for Test), which encompasses a vast range of design-related test tasks.
Let’s have a detailed look at the DFT approach.
Table of Contents
- DFT’s Role
- Comparing Testing against Verification
- DFT Techniques
- Can DFT Permanently Eliminate Faults?
1.1 General Issues in Crafting New ICs
New technology is accompanied by a new set of challenges, and ICs with smaller die sizes mean the potential for more errors. The most probable issues in this regard are:
The introduction of deep-submicron design technologies has greatly complicated the fabrication process. Design elements are becoming smaller and thinner while moving closer together. A single VLSI chip involves billions of transistors, so the chances of short circuits and wire breakage are fairly high.
These are only a few sources of faults in a design; many more defects can creep in during the fabrication phase, and as densities increase, so does the probability of errors.
The design software itself may also introduce errors into an IC design because of bugs. Small errors at the simulation stage can translate into significant issues in the final design.
Mode of Application
SoCs and microcontrollers are part of nearly all digital appliances. In some industries, such as medical and healthcare products, a single error can put a life at risk. Space vehicles run on cryogenically stored fuel whose consumption is controlled by a microprocessor; such microprocessors must be tested not only for functionality but also for performance at both high and low temperatures.
In case of failure, one needs the exact coordinates of the fault. With the shrinking size of printed circuit boards, multimeter probing is losing its relevance, which in turn makes maintenance more expensive.
It is important to note that even if an IC has been fabricated with utmost perfection, the possibility of errors always remains. Damaged or loose packaging can let in moisture, electromagnetic radiation, stray voltages, and the like, which translate into faults in the IC.
1.2 DFT: The Solution
Modern microprocessors encapsulate a myriad of functions. There are thousands of pins and millions of transistors, and if a single transistor is faulty the whole functionality can be paralyzed; sometimes the entire chip must be discarded.
The biggest problem lies in identifying that single faulty transistor: every function would have to be tested with all possible input combinations, and even then one might not find the true cause of the fault. If this were the fault-finding methodology, the time to market would stretch so far that the IC might never reach consumers on time.
To counter all this, a special methodology is deployed that adds a dedicated feature to the chip. The methodology is Design for Testability, and the feature it adds is testability.
DFT is a design tactic that adds extra circuitry to the chip, making testing possible, easy, and cost-effective. It improves the controllability and observability of internal nodes, which allows the embedded functions to be verified.
2. DFT’s Role
2.1 Tests of Sequential Circuits
Logic circuits whose output depends not only on the current input values but also on the history of past inputs are termed sequential logic in automata theory. Testing simple combinational circuits is straightforward, but sequential circuits require shift registers, linked together in a technique called scan chains.
Testing sequential circuits is more challenging because the circuit holds a specific state at every instant. Starting from a known initial condition, a very large number of test vector cycles may be required just to drive the circuit into the desired state.
So sequential logic systems need a specific set of test features alongside the conventional circuitry, and this is exactly what the DFT methodology provides.
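The scan-chain idea can be sketched behaviorally. The model below is a hypothetical toy, not a real EDA flow: in scan mode all flip-flops link into one shift register, so any internal state can be loaded serially, exercised for one functional clock, and read back serially.

```python
# Minimal behavioral sketch of a scan chain (hypothetical model).
# In scan mode the flip-flops form one shift register with serial access.

class ScanChain:
    def __init__(self, length: int):
        self.flops = [0] * length               # flip-flop states

    def shift_in(self, bits):
        """Scan mode: load a test state one bit per clock (newest at flop 0)."""
        for b in bits:
            self.flops = [b] + self.flops[:-1]

    def capture(self, comb_logic):
        """One functional-mode clock: latch the combinational outputs."""
        self.flops = comb_logic(self.flops)

    def shift_out(self):
        """Scan mode: unload the captured response serially."""
        out, self.flops = list(self.flops), [0] * len(self.flops)
        return out

# Toy 'circuit under test' that simply inverts every state bit.
chain = ScanChain(4)
chain.shift_in([1, 0, 1, 1])                    # state becomes [1, 1, 0, 1]
chain.capture(lambda state: [1 - b for b in state])
print(chain.shift_out())                        # [0, 0, 1, 0]
```

This serial access is what lets test equipment set and read every flip-flop through just a few pins, so the sequential circuit can be tested almost as if it were combinational.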
2.2 Easing the Manufacturing Process
In the chip fabrication process, DFT helps achieve two significant objectives:
Eliminating faulty modules:
Testing reveals the errors that might create a fault in the IC. Because they are detected early, the underlying defects can be removed at that very phase. This saves not only time but also re-manufacturing cost, since a defective design is discarded even before production.
Improving the manufacturing process:
DFT solutions are applicable across all phases of development. They provide visibility into where process parameters are drifting out of range, which eases failure analysis because probable defects, along with their locations, are identified easily.
Such monitoring enhances accuracy and reduces the potential for defects.
3. Comparing Testing against Verification
Verification proves the correctness of the design’s logical function before chip fabrication. It is carried out once the RTL design has been coded, and it is done only once before manufacturing, typically using the Universal Verification Methodology (UVM).
Design for Testability, on the other hand, aims to assure the correctness of the chip at every level of abstraction: chip level, board level, and system level all undergo DFT to verify their correctness. Testing covers every component and unit of the system, since each part has an equal chance of being faulty. DFT thus helps improve the overall quality of the turnkey product that reaches end consumers.
4. DFT Techniques
The techniques deployed in the DFT methodology are broadly categorized into two parts:
4.1 Ad-Hoc Technique
Distilled from experience, these are collections of tactics that make design testability an easier goal to attain: a set of rules about the dos and don’ts of the IC design process.
The biggest advantage of the ad-hoc approach lies in the simplicity of test vector generation. The techniques are easy to implement, the area overhead is fairly small, and no hard and fast design constraints are imposed.
There are challenges in implementation, though, since each design has its own needs and testability problems. Ad-hoc tactics are not always reusable, and they are not systematic enough to ensure a uniform approach to a testable circuit.
Breaking a large circuit into smaller units, inserting test access points to improve controllability and observability, increasing the number of observable nodes, and multiplexing internal nodes onto primary outputs are a few examples of ad-hoc techniques.
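One of these tactics, multiplexing an internal node onto a primary output, can be illustrated with a tiny gate-level model. The gates and signal names below are invented purely for illustration:

```python
# Hypothetical sketch of one ad-hoc tactic: multiplexing a buried internal
# net onto a primary output so the tester can observe it directly.

def circuit(a: int, b: int, c: int, test_mode: bool = False) -> int:
    internal = a & b            # buried net, normally invisible at the pins
    normal_out = internal | c   # the only value visible in normal operation
    # Ad-hoc observation point: in test mode, drive the pin from the net.
    return internal if test_mode else normal_out

# In normal mode a fault on 'internal' is masked whenever c == 1:
assert circuit(0, 1, 1) == circuit(1, 1, 1) == 1
# With the test point, the net is directly observable, so the two differ:
assert circuit(0, 1, 1, test_mode=True) != circuit(1, 1, 1, test_mode=True)
```

The same principle scales up: each inserted observation or control point cuts down how deep a fault effect must propagate before the tester can see it.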
4.2 Structured Technique
In this technique, additional logic and test signals are added within the circuit so that it can be tested against predefined criteria.
Compared with the ad-hoc approach, structured DFT offers greater reusability and better testability. Whatever core function a circuit performs, structured techniques can be applied to its logic, which is why they have become the standard approach across the electronics industry.
The only challenges to overcome are accepting a new set of design rules, extra silicon area, and added propagation delays.
Scan path, partial scan, BIST (built-in self-test), level-sensitive scan design, and boundary scan are some examples of structured DFT techniques.
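BIST, for example, can be sketched behaviorally with its two on-chip blocks: an LFSR that generates pseudo-random test patterns, and a signature register that compacts the circuit's responses into one value. The tap positions, widths, and signature scheme below are illustrative only, not taken from any real standard:

```python
# Toy behavioral sketch of BIST (hypothetical taps and signature scheme).

def lfsr_patterns(seed: int, taps=(3, 2), width: int = 4, count: int = 10):
    """Fibonacci LFSR: the feedback bit is the XOR of the tap bits."""
    state = seed
    for _ in range(count):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & ((1 << width) - 1)

def signature(responses, width: int = 4) -> int:
    """Crude signature analyzer: rotate-left-and-XOR compression."""
    sig = 0
    mask = (1 << width) - 1
    for r in responses:
        sig = (((sig << 1) | (sig >> (width - 1))) & mask) ^ r
    return sig

stand_in_logic = lambda v: v ^ 0b1010   # stands in for the logic under test
patterns = list(lfsr_patterns(seed=0b0001))
golden = signature(stand_in_logic(p) for p in patterns)
# On silicon the same LFSR and compactor run on-chip; any mismatch with the
# design-time golden signature flags the chip as faulty.
print(patterns, f"golden signature = {golden:#06b}")
```

Because both blocks live on the chip itself, the test can run at full clock speed in the field, with only a pass/fail comparison against the golden signature crossing the pins.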
5. Can DFT Permanently Eliminate Faults?
Do these test sets ensure that the chip will never face a functional issue?
The answer is no. Faults can occur at any instant, depending on various physical or electrical parameters. Humid environments, dielectric breakdown, high temperatures, and aging are some of the important factors leading to chip failure.
As discussed earlier, DFT techniques are deployed at all phases of chip fabrication, and after final development there is also a heat test: the chip’s functionality is checked at high temperature in an oven, its thermal durability is verified, and the final nod is given for dispatch.
Proper packaging must also be ensured when dispatching the chip to the end consumer.
Prolonged overclocking may stress the design and shorten its lifespan, causing intermittent flaws and random crashes down the line. These are some of the overt, explicit causes of chip failure.
DFT enhances developers’ confidence by testing the chip at every phase against its potential modes of breakdown. It also provides the means to trace the cause of a failure and apply the related fixes.