Wednesday, February 28, 2007

6. TESTING / DESIGN FOR TEST

I decided to cover the subsections one by one.
First, let's go with test...

One major ingredient of CAD is testing and design for test (DFT).

The rule of thumb in chip design is "you can't market it until you test it".

Testability has become an essential part of today's chips. Once fabricated, each chip sits on a tester where functional as well as ATPG patterns are run on it to catch fabrication defects. It also undergoes burn-in to accelerate failures that would otherwise show up later in the field. Only working chips make it to market.
Some chips are inevitably lost in this process, which is why we define yield: (working chips / total chips manufactured) * 100.
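
As a toy illustration (the wafer counts below are made-up numbers), yield is just the fraction of good dies expressed as a percentage:

```python
# Toy yield calculation; the counts are hypothetical.
working_chips = 920
total_chips = 1000

yield_pct = (working_chips / total_chips) * 100
print(f"Yield = {yield_pct:.1f}%")  # Yield = 92.0%
```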

Defects in silicon are targeted after we define them at a higher level of abstraction called faults (fault modeling is a major area of research in itself). Most defects in silicon can be modeled using these fault models, and this higher level of abstraction simplifies the process of DFT.

Here we go with the most popular fault models...

stuck-at-0 (single/multiple), stuck-at-1, coupling faults, bridging faults, and delay faults.
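
To make the stuck-at model concrete, here is a minimal sketch in Python: a made-up two-gate circuit, y = (a AND b) OR c, where we can pin an internal net to a constant to model a single stuck-at fault. The circuit and net names are purely illustrative.

```python
# A minimal sketch of single stuck-at fault modeling on a toy
# two-gate circuit: y = (a AND b) OR c.

def circuit(a, b, c, stuck=None):
    """Evaluate the circuit; `stuck` optionally pins one internal
    net to a constant, e.g. ("n1", 0) for n1 stuck-at-0."""
    n1 = a & b                      # internal net n1
    if stuck and stuck[0] == "n1":
        n1 = stuck[1]               # override: fault injected here
    return n1 | c

# The input (a=1, b=1, c=0) distinguishes the good circuit from one
# with n1 stuck-at-0, so it is a test pattern for that fault.
print(circuit(1, 1, 0))                   # good machine -> 1
print(circuit(1, 1, 0, stuck=("n1", 0)))  # faulty machine -> 0
```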

Testability is most commonly accomplished in ASICs by full scan or partial scan. Although it is an overhead in terms of area (muxed flip-flops or LSSD), full scan breaks down a sequential test problem into a combinational one and thereby reduces the complexity of test.

Partial scan still requires an ATPG (automatic test pattern generation) tool to generate both sequential and combinational test patterns, but it is less of an area overhead.

These scan flops are connected in a serial shift-register configuration. The test vectors are scanned in serially (during shift mode) and then applied to the logic during capture mode to catch defects. The responses are then serially scanned out and compared against golden responses.
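
Here is a minimal sketch of that shift/capture mechanics, with a made-up three-flop chain and an arbitrary stand-in function in place of the real combinational logic under test:

```python
# A minimal sketch of scan shift/capture on a 3-flop chain.

def comb_logic(state):
    a, b, c = state
    return [b ^ c, a & c, a | b]   # stand-in for the logic under test

def scan_test(vector, chain_len=3):
    chain = [0] * chain_len
    # Shift mode: clock the test vector in serially, one bit per cycle.
    for bit in vector:
        chain = [bit] + chain[:-1]
    # Capture mode: one functional clock loads the logic's response.
    chain = comb_logic(chain)
    # Shift mode again: unload the response serially for comparison.
    response = []
    for _ in range(chain_len):
        response.append(chain[-1])
        chain = [0] + chain[:-1]
    return response

print(scan_test([1, 0, 1]))  # compare against the golden response
```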

A test pattern is a vector which distinguishes a good circuit from a faulty one.
So any satisfying assignment of f(good) XOR f(faulty) = 1 is a test pattern, and all ATPG tools essentially work on solving this problem.
There are various algorithms for generating test vectors. The most fundamental ones are the D-algorithm (Roth), PODEM (path-oriented decision making), and FAN (Fujiwara). More recently, SAT (satisfiability-based) techniques have appeared (Larrabee).
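
As a toy illustration of the f(good) XOR f(faulty) = 1 formulation, the sketch below brute-forces all input combinations of the same two-gate circuit used above. Real engines (D-algorithm, PODEM, FAN, SAT) search this space far more cleverly than exhaustive enumeration.

```python
# Brute-force "ATPG" via the XOR miter, on the toy circuit
# y = (a AND b) OR c with internal net n1 = a AND b.
from itertools import product

def good(a, b, c):
    return (a & b) | c

def faulty(a, b, c):   # n1 stuck-at-0: the AND output is pinned to 0
    return 0 | c

tests = [v for v in product([0, 1], repeat=3)
         if good(*v) ^ faulty(*v) == 1]
print(tests)           # every satisfying assignment is a test pattern
# -> [(1, 1, 0)]
```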

The test vectors can be driven from a tester or generated by BIST (built-in self-test) logic. Each has its advantages and disadvantages. BIST can be an overhead if we also make it drive deterministic test patterns (as they then need to be stored in on-chip memory, in case test is happening at speed). Random BIST is a small area overhead but usually may not give the desired coverage.

So there are pseudo-random pattern generating BISTs, which are a trade-off between the two.
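
The usual pseudo-random source is an LFSR. Below is a minimal sketch of a 4-bit maximal-length LFSR, with taps chosen for the primitive polynomial x^4 + x^3 + 1 (a textbook choice); production BIST controllers use much wider registers, often with phase shifters.

```python
# A 4-bit Fibonacci LFSR cycling through all 15 nonzero states.

def lfsr_patterns(seed=0b1001, n=15):
    state = seed
    for _ in range(n):
        yield state
        feedback = ((state >> 3) ^ (state >> 2)) & 1  # taps 4 and 3
        state = ((state << 1) | feedback) & 0xF

for p in lfsr_patterns():
    print(f"{p:04b}")   # pseudo-random patterns fed to the scan chains
```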
Apart from the BIST types we have discussed so far, there is mixed-signal BIST for test coverage on mixed-signal circuits.

Memories have their own class of defects (like pattern sensitivity faults, NPSF, etc.) and hence have their own test requirements. So memory BIST uses techniques like march tests and GALPAT (galloping pattern test) to get test coverage on memories. The complexity involved in testing for NPSF is very high, so some people may skip testing for them altogether.
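
As a sketch of what a march algorithm does, here is a March C- style sequence run against a memory modeled as a plain Python list (a real MBIST engine would of course drive the physical array):

```python
# March C-: {up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); down(r0)}

def march_c_minus(mem):
    n = len(mem)
    up, down = range(n), range(n - 1, -1, -1)

    def element(order, ops):
        for addr in order:
            for op, val in ops:
                if op == "w":
                    mem[addr] = val
                elif mem[addr] != val:      # read and compare
                    return False            # fault detected
        return True

    return (element(up,   [("w", 0)]) and
            element(up,   [("r", 0), ("w", 1)]) and
            element(up,   [("r", 1), ("w", 0)]) and
            element(down, [("r", 0), ("w", 1)]) and
            element(down, [("r", 1), ("w", 0)]) and
            element(down, [("r", 0)]))

print(march_c_minus([0] * 16))   # a fault-free memory passes -> True
```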

A major advantage of BIST is at-speed test. This is needed to catch path delay faults which can only be caught at speed, especially when chip frequency is high (say, above 500 MHz).

Since each chip has to be tested to a desired coverage before it can be marketed, a major area of research is test compression. This is needed to reduce test time and hence time to market. During compression we try to target those vectors which can catch multiple faults and use them as the patterns for the tester. This way we reduce the number of test vectors and hence test time (roughly, test time = (number of vectors * scan chain length) / test frequency). Most ATPG tools these days are capable of test compression.
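
A simple form of this is static compaction: test cubes whose specified bits do not conflict can be merged into a single tester pattern. A minimal sketch, with made-up three-bit cubes and None standing for a don't-care bit:

```python
# Static test compaction: merge compatible test cubes.

def merge(c1, c2):
    """Return the merged cube, or None if the cubes conflict."""
    out = []
    for a, b in zip(c1, c2):
        if a is not None and b is not None and a != b:
            return None
        out.append(a if a is not None else b)
    return out

cubes = [[1, None, 0], [1, 1, None], [0, None, 1]]
print(merge(cubes[0], cubes[1]))  # [1, 1, 0]: one pattern, two faults
print(merge(cubes[0], cubes[2]))  # None: incompatible, keep separate
```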

Mentor Graphics has come up with a unique way of compressing test patterns called EDT (embedded deterministic test). Compression and decompression logic is placed on chip, and only a few compressed test patterns are applied externally from the tester. The decompressor expands these few external patterns into many deterministic patterns internally, which are then applied to the logic to get the desired fault/test coverage. The resulting responses are compressed back on chip and collected by the tester. Because the mapping between external and internal patterns is known (one-to-many going in, many-to-one coming out), it can be decomposed to pinpoint the defects in the chip.
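
The sketch below is a deliberately oversimplified illustration of the expand/compact idea only: it uses an arbitrary XOR spreading network and an XOR-tree compactor, not Mentor's actual EDT ring-generator design.

```python
# Toy illustration: few tester channels fan out to many scan chains,
# and many chain outputs fold back down for the tester.

def decompress(tester_bits):           # 2 tester channels -> 4 chains
    a, b = tester_bits
    return [a, b, a ^ b, a]            # made-up spreading network

def compact(chain_bits):               # 4 chain outputs -> 1 channel
    out = 0
    for bit in chain_bits:
        out ^= bit                     # XOR-tree space compactor
    return out

internal = decompress([1, 0])
print(internal)                        # [1, 0, 1, 1]
print(compact(internal))               # 1
```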

The advantage of this mechanism is that only a few patterns are applied externally from the tester. This reduces tester memory requirements as well as test time, and hence saves a lot of $$$. The many decompressed patterns are applied at speed, further reducing test time and also catching tough-to-catch at-speed defects, which translates to higher test coverage.

One major concern remains about this mechanism: how will heat dissipation be handled during test? This concern is common to most at-speed testing scenarios.

There are other DFT mechanisms which reconfigure scan chains on the fly to reduce test time.

In addition to these mechanisms, there are various parametric tests like IDDQ and IDDT (quiescent and transient power supply current testing). The defects these target are caught by various current/voltage sensors built on and off chip. This area has little to do with CAD automation except when it comes to generating the appropriate vectors for these tests during ATPG. :)

1 comment:

Anonymous said...

Hey! Good post.

I can't vouch for the ILP out-of-order reference, but your description of the various technologies at-hand was very good.

I have a website (blog) called DFT Digest, which concentrates solely on design-for-test issues and trade-offs, if you're interested. It's light on the math, heavy on the methodology.

A couple things in your post, FYI: Both scan and Logic BIST technologies can run at-speed. 'Logic BIST' is distinct from, but related to, 'memory BIST', and BTW, carries all the overhead of scan (since a BISTed circuit must be full-scan), plus some, since its pseudo-random nature sometimes requires the insertion of test points to achieve the same high fault coverage as scan.

Anyway, come check out DFT Digest when you get a chance, and feel free to comment and ask questions!

Keep up the blogging!

John