
Future IC Test Challenges Quality, Cost and Time to Market








  1. Future IC Test Challenges: Quality, Cost and Time to Market. Korea Test Conference Workshop, Oct. 15, 2014. Jin-Soo Ko, Teradyne Inc. (Jin-soo.ko@Teradyne.com)

  2. Mobile Device Challenges • Fast ramps: essentially full volume from product launch for high-profile new products (chart: number of units sold during the first weekend of product introduction) • Must bring up a new silicon process at the same time (14nm, FinFET) • Yield must be good to reduce cost, but the defect rate must be extremely low ("zero" DPM) • Ramp from silicon samples to mass production in days to weeks • COT is important, but getting to market quickly with the best quality is what really counts!

  3. Advanced silicon process and VDD

  4. ITRS semiconductor roadmap • Roadmap for Gate Length and Supply Voltage

  5. Market trends driving IC test

  6. What Is Most Important for Test? (from ITRS) • Cost of Test, Time To Market and Test Quality are equally important • Why? • Test costs are a small part of the overall cost to make an IC; focusing only on cost does not increase profit much • In the mobile space, being first to market captures more market share, which increases profit the most • Equipment manufacturers will not accept poor IC quality • High-quality devices have higher value and bring more profit • Higher yield also means lower overall cost

  7. Cost of Test (COT): Cost Trends (ITRS) • ATE capital costs are actually decreasing • This also improves Time To Market • "Consumable" items like probe cards and sockets are increasing in cost

  8. Cost of Test (COT): Cost Trends (ITRS) • Very dependent on DFT technology • Could potentially eliminate the System Level Test insertion to lower costs

  9. Design Complexity and Scan Depth • Compression contained ATE memory requirement growth from 2000 to 2010, but is approaching theoretical limits • Reduced Pin Count Test will drive memory requirements higher • Also puts strain on datalog and post-processing features (chart: current industry practice)

  10. More Scan Testing = More Test Time • As devices get more complex and scan compression can't keep up, test times will get longer (chart: roughly 3X scan test time in 5 years due to higher gate count) • Increased ATE efficiency keeps COT flat • Higher site count (multi-site testing) is the most efficient way to reduce costs, similar to memory test
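  The gate-count effect above can be made concrete with a rough shift-time model. The sketch below is not from the slides; the pattern count, scan-cell ratio, chain count, compression ratio and shift frequency are all illustrative assumptions, chosen only to show why test time grows when compression stops scaling.

```python
# Rough scan shift-time model (illustrative assumptions, not slide data).
def scan_test_time_s(gate_count, compression, num_chains=8,
                     patterns=20_000, shift_mhz=100.0, cells_per_gate=0.1):
    """Estimate scan shift time in seconds.

    Hypothetical assumptions: scan cells are ~10% of gates, the cells are
    divided across num_chains tester-driven chains after on-chip
    decompression by `compression`, and shifting dominates test time.
    """
    scan_cells = gate_count * cells_per_gate
    chain_length = scan_cells / (compression * num_chains)
    cycles = patterns * chain_length
    return cycles / (shift_mhz * 1e6)

# With compression held flat, tripling the gate count roughly triples the
# shift time, which is the "3X scan test time in 5 years" effect on the slide.
print(scan_test_time_s(gate_count=100e6, compression=100))  # ~2.5 s
print(scan_test_time_s(gate_count=300e6, compression=100))  # ~7.5 s
```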

  11. COT: Multi-site, Concurrent Test and TTR
  (diagram: a serial flow tests the PMIC, RF Tx/Rx, ABB, BB, FM, BT and USB blocks one after another in ~17s; running them concurrently along the time axis takes ~11s (estimated), i.e. ~35% TTR (estimated); shared device functions prevent some concurrency)
  • Test throughput improvement: N = 16 sites, PTE 0.98, 35% test time reduction from concurrent test
  • Multi-site test throughput = 0.98 × N sites; concurrent test throughput = 1/(1 - 0.35)
  • Combined throughput = 0.98 × 16 / (1 - 0.35) = 24.12
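  The throughput figure on this slide follows directly from the two factors given: multi-site scaling at the stated parallel test efficiency, and the concurrent-test time reduction. A minimal sketch of that arithmetic, using only the numbers from the slide:

```python
# Relative throughput vs. a single-site, serial-flow baseline (slide numbers).
sites = 16
pte = 0.98             # parallel test efficiency
concurrent_ttr = 0.35  # 35% test time reduction from concurrent test

multi_site_gain = pte * sites                # 15.68x from multi-site
concurrent_gain = 1 / (1 - concurrent_ttr)   # ~1.54x from concurrency

throughput = multi_site_gain * concurrent_gain
print(f"{throughput:.2f}x")  # ~24.12x, matching the slide
```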

  12. Concurrent Test Programming and Debug Tool
  (diagram: a serial test flow runs initial tests, then tests of blocks A through F, then a full functional test in sequence; the concurrent test flow overlaps the block tests to shorten total test time; a timeline viewer displays the schedule)
  • Development challenges: common bus/pins, shared test resources, flow manipulation, multi-site implementation, adaptive test & retest, debug tools

  13. Multi-site Test Roadmap
  (timeline, 2000-2016: Catalyst/Tiger, FLEX, UltraFLEX, UltraFLEX-HC and UltraFLEX-XD platforms moving from 2-or-4-site through 8-, 16- and 32-site to >32-site solutions: 4-site optical disk drive SOC, 8-site CD/DVD player processor SOC, 16-site and 32-site mobile A/V processor SOC test solutions; TIU+DSA, 16-32-site, >1000-pin-count SOC device test, HIB design solution using new high-density (x2 to x4) digital, AC and DC options)
  • Multi-site capability is the key strategy to achieve low COT
  • 4-site codec in 2001
  • 8-site CDP/DVDP in 2004
  • 16-site mobile A/V processor in 2007
  • 32-site mobile A/V processor in 2009
  • 16-site mobile application processor in 2011
  • 32-site from 2015?

  14. COT: Multi-site Count Test Roadmap (chart: number of sites over time) • "High mix" = many different device types tested in small lots • "Low mix" = only a few device types tested in large lots

  15. COT: Multi-site Count vs. Parallel Test Efficiency
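  The slide compares site count against parallel test efficiency. One common way to quantify that efficiency (a standard industry-style definition, not something stated on the slide) compares the N-site test time against N independent single-site runs; the example times below are hypothetical.

```python
# Multi-site (parallel test) efficiency: 1.0 means N sites test in the same
# time as one site; 0.0 means N sites take N times as long (no benefit).
def multi_site_efficiency(t_single_s, t_multi_s, sites):
    return 1.0 - (t_multi_s - t_single_s) / ((sites - 1) * t_single_s)

def throughput_gain(t_single_s, t_multi_s, sites):
    return sites * t_single_s / t_multi_s

# Hypothetical example: 10 s for one site, 10.3 s for all 16 sites together.
print(multi_site_efficiency(10.0, 10.3, 16))  # ~0.998
print(throughput_gain(10.0, 10.3, 16))        # ~15.5x
```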

  16. AWG Sequence for Pattern-Based Programming (plots: the entire AWG sequence for VIN, BUCK1 through BUCK7, LDR1 through LDR3, LNR1, LNR2 and the MCU; the scale reference is different in each plot)

  17. COT: Chip-to-Chip Data

  18. COT: Pattern-Based Programming Test Time • Test time reduction for the BUCK, BUCK_DVS and LDO, LDO_LDR blocks

  19. COT: Upgrade the Test Computer • Next-generation tester computer • Load & validate time improvement of up to ~20% • Average runtime improvement of ~4% to 20% • Windows 7: 33% increase in application memory • Microsoft Office 2010: interoperability between Excel 2010 and Excel 2003, new sheet-grouping and navigation features (chart: benchmark test summary, Tera1 Windows 7)

  20. COT: Buy Rate Down
  (chart: equipment capital buy rate trending down through 2013, split into "front end" and "back end" costs; "buy rate" = ATE cost / IC revenue, now around 1%)
  • $1.00 of IC revenue = $0.005 of test capital
  • Lowering cost of test by 10% only increases profit by 0.05%
  • Raising yield from 95% to 96% increases profit by >1% (a much better investment!)
  • Winning a new socket increases market share (the best investment!)
  • Test equipment is already very efficient
  • Most new "test" investment is focused on Time To Market and Quality to improve IC revenue and market share
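  The relative payoffs on this slide are simple arithmetic on the stated ratios. The sketch below reproduces them; the assumption that every extra good die sells at the same price (so yield gain converts directly into profit) is mine, made only to show the order-of-magnitude comparison.

```python
# Why yield beats cost-of-test as an investment (slide ratios, simplified).
ic_revenue = 1.00     # per the slide: $1.00 of IC revenue...
test_capital = 0.005  # ...corresponds to $0.005 of test capital

# Lowering cost of test by 10% saves 10% of the test-capital share:
cot_saving = 0.10 * test_capital
print(f"COT -10%   -> +{cot_saving / ic_revenue:.2%} of revenue")  # +0.05%

# Raising yield from 95% to 96% ships ~1% more good units for the same
# wafer cost (assumes the extra units sell at the same price):
yield_gain = 0.96 / 0.95 - 1
print(f"Yield +1pt -> +{yield_gain:.2%} more good units")          # ~+1.05%
```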

  21. Time To Market (TTM): What Is It and Why Should You Care? • It directly impacts market share and profits • The smartphone market never waits for test delays

  22. TTM: How to Get Fast Time To Market?
  • Industry-standard test system and SW capability
  • Integration with design and bench test
  • Advanced ATE SW tools for Time to Market
  (diagram: a Design → Test → Design loop linking "on tester" and "off tester" tools; design simulation events and ATPG feed pattern and test program generation transactions; on-tester debug/characterization (hours/minutes) covers timing/levels, mixed signal, repeatability and correlation, with an EDA-based pattern viewer giving simultaneous display of EDA and tester information to diagnose physical device faults; STDF data feeds failure analysis / yield enhancement via EDA systems, ATPG tools, adaptive test, real-time fault isolation, physical failure analysis and "big data" storage)

  23. TTM: Multi-sheet Use Model • Separate test code & data for each sub-program (Sub-Program A and Sub-Program B in the diagram) • Tied together at the Job List Sheet • Multi-sheet model = no more manual merging of sub-programs • Enabler for independent development • Reduces time to integrate

  24. TTM: RF Tools, LTE-A TX Signal Debug Tool (release timeline)
  • IG-XL 7.30: ESA 2.0 (3GPP LTE, TD-SCDMA, 802.11n 4x4 MIMO), VSA 10.01 (1-port vector, power de-embedding, signal sheet support, Smith charting)
  • IG-XL 7.40: ESA 2.5 (3GPP LTE update, Bluetooth 3.0), VSA 11
  • IG-XL 8.00.01: ESA 3.0 (LTE 8.9), VSA 12
  • IG-XL 8.10: ESA 3.5 (LTE-A (R10), 802.11ac), VSA 14
  • IG-XL 8.20: ESA 4.0 (LTE-A (100MHz), 802.11ac (160MHz), 802.11ac (80+80), BT 4.0 (LE)), VSA 16 (90% reduction in VSA instance creation times)

  25. TTM: How To Do Protocol-Level Test?
  • Match an independent part of the tester to each interface
  • Match the device's frequency, timing, etc.
  • Communicate natively in the "language" of the port
  (diagram: a protocol-level ATE provides protocol synchronization & communication, DC and AC test resources, a USB protocol engine for the USB I/F, DRAM emulation engines for the memory I/Fs, a JTAG protocol engine for the JTAG I/F, and modulation-domain RF for the GPS, 4G, WiFi and FM/TV ports of an integrated mobile device containing power management, audio/BB, BB processor, DSP and CPU functions)

  26. TTM: Protocol Aware
  (diagram: a "stored response" ATE tries to test a complex device architecture: an integrated mobile device with power management, audio/BB, BB processor, DSP and CPU functions and USB, DRAM, Flash, JTAG, 3G RF, GPS, WiFi and FM/TV interfaces)
  • Protocol-aware transactions instead of stored vectors, e.g. Write.jtag (ADDR: 04h, DATA: 55h); Read.jtag (ADDR: 0Ah, DATA → read_var)
  • Protocol Definition Editor: for defining and modifying protocols
  • Protocol Studio: for online debug of protocol transactions (transaction results, debug displays, data capture setups, module management, port properties)
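  The Write.jtag / Read.jtag transactions above are the key idea: the tester speaks the port's protocol and works with transaction-level data instead of replaying stored vectors. The sketch below is a generic, hypothetical model of that idea; the class and method names are invented for illustration and do not correspond to any real ATE API.

```python
# Hypothetical protocol-aware port model (illustration only, not a real ATE API).
class JtagPort:
    """Drives a JTAG-style register interface at the transaction level."""

    def __init__(self):
        self.regs = {}  # stand-in for the device under test

    def write(self, addr, data):
        # A real protocol engine would serialize this into IR/DR scans on the fly.
        self.regs[addr] = data

    def read(self, addr):
        # The result comes back as data, not as a pass/fail vector compare.
        return self.regs.get(addr, 0x00)

port = JtagPort()
port.write(0x04, 0x55)        # Write.jtag (ADDR: 04h, DATA: 55h)
read_var = port.read(0x0A)    # Read.jtag  (ADDR: 0Ah, DATA -> read_var)
print(f"0x{read_var:02X}")

# The test program can now branch on read_var (retry, recalibrate, log),
# which a stored-response pattern cannot do.
```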

  27. Device Trends Drive New Test Needs: Quality of Test (QOT) Technical Challenges (figures: mapping physical and electrical defects; power supply stability example; example test flow across Site 1 and Site 2)

  28. Quality of Test (QOT) Technical Challenges: DC Challenges

  29. QOT: ITRS Semiconductor Roadmap for Gate Length and Supply Voltage • Supply voltage levels will continue to decrease • New test requirements for power supplies to be stable and accurate • Needs a very rigorous DIB power-integrity (PI) simulation and design process

  30. Test Quality - Dc power VDD accuracy & droop Power Supply Stability Example • AP requires many supplies • Core supplies • IO Supplies • Requirements are very different • Core: Accuracy, dynamic performance • IO: Wider voltage range, more connections All device supplies will have some momentary “droop” when scan patterns are started. Too large a “droop” will cause good parts to fail, reducing yield Network Processor Example Single Supply Solution Ganged Solution . . .

  31. Quality of Test (QOT) Technical Challenges: RF Challenges

  32. Wireless Industry Trends • Increased demand for higher data rates & connectivity • Overall mobile data traffic is expected to grow at a 61% CAGR to 15.9 exabytes per month by 2018 • Migration to LTE-Advanced is occurring in all market segments (high and low end) • New standards require 2x-3x more active RF device ports • Demand for higher-performance, high-site-count test capability • Internet-of-Things is driving rapid growth of the MCU + RF segment • Shrinking device size while increasing complexity • Mobile ICs are moving away from conventional packages to wafer-level package technologies (flip-chip, WLCSP, FOWLP)
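  The 61% CAGR figure can be sanity-checked with simple compound growth. The sketch below back-projects from the 15.9 exabytes/month 2018 figure; the choice of base year is my assumption, made only to show how the compounding works.

```python
# Compound-growth sanity check for the mobile-traffic forecast on the slide.
cagr = 0.61
traffic_2018_eb = 15.9  # exabytes per month (slide figure)

# Back-project earlier years (base year chosen for illustration only):
for year in range(2014, 2019):
    traffic = traffic_2018_eb / (1 + cagr) ** (2018 - year)
    print(f"{year}: {traffic:5.1f} EB/month")
# Roughly 2.4 EB/month in 2014 growing ~1.61x per year to 15.9 EB/month in 2018.
```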

  33. LTE Test Challenges: Modulation Quality
  • When testing an RF device, we want to measure how much the signal is corrupted by things like phase noise, signal imbalance, and other noise and distortion
  • All of these errors are combined into Error Vector Magnitude (EVM); it is a clear way to measure RF signal quality
  • To do production testing, the EVM of the tester must be much better than that of the Device Under Test
  (constellation diagrams: measured signal vs. ideal signal vs. test limit; device spec limits tighten with modulation order: 16-QAM (802.11a/g, LTE): LTE base station = 13.5%, LTE user equipment = 12.5%, 802.11ac = 11.22%; 64-QAM (802.11a/g/n, LTE-A): LTE base station = 9.0%, 802.11ac = 3.98%; 256-QAM (802.11ac): 2.51%)
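  A common rule of thumb (not stated on the slide) for why the tester's EVM must be much better than the device's is that uncorrelated error contributions add roughly root-sum-square. The sketch below applies that approximation to the 256-QAM 802.11ac limit quoted above; the specific DUT EVM value is a hypothetical example.

```python
import math

# Root-sum-square EVM combination (rule of thumb, assumes uncorrelated errors).
def measured_evm_pct(dut_evm_pct, tester_evm_pct):
    return math.hypot(dut_evm_pct, tester_evm_pct)

limit_pct = 2.51  # 802.11ac 256-QAM device spec limit from the slide
dut_evm = 2.2     # hypothetical true device EVM

for tester_evm in (0.5, 1.0, 1.5):
    m = measured_evm_pct(dut_evm, tester_evm)
    verdict = "pass" if m < limit_pct else "FAIL (good part rejected)"
    print(f"tester {tester_evm:.1f}% -> measured {m:.2f}% -> {verdict}")
# A 0.5% tester barely disturbs the reading; a 1.5% tester pushes this
# good 2.2% part over the 2.51% limit and rejects it.
```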

  34. Example of How Tester Errors Become More Critical for New RF Standards
  • The plot shows the effect of IQ skew imbalances (modulation signals being out of phase)
  • If testing an actual device, the skew, gain and other distortion would all contribute to the EVM error
  (plot: EVM vs. IQ skew for LTE 20M, LTE-A 100M and 802.11ac 160M; against the 802.11ac 160M EVM limit of 2.51%, a tester with 0.5% EVM capability keeps good test margin, while a tester with 1-2% EVM capability can't test)

  35. Quality of Test (QOT) Technical Challenges: High-Speed Interface Challenges

  36. JEDEC Data

  37. LPDDR3 and DDR4 Specs • Extremely difficult timing accuracy requirements • DDR3 used DFT functions to validate • DDR4 may be too difficult for DFT • Might require production test with ATE to guarantee the spec (tables: LP-DDR3 timing after leveling; LP-DDR4)

  38. DDR Test Strategies • Very little DFT needed

  39. High-Speed Serial Test
  • High-speed serial challenges: maintain signal integrity from instrument to DUT; support rapid increases in serial data rates (2013: 16 Gbps, 2014: 28+ Gbps, 2015: 45 Gbps / multi-level; ITRS data rate forecast); minimize tester capital investment; maximize tester capital useful life
  • Test strategy: support high-pin-count interfaces with standard tester instruments; develop re-usable IP that can be implemented on DIBs or DIB modules; deliver solutions as turnkey applications, vendor-designed and manufactured hardware only, or custom hardware made by the ATE vendor, a 3rd party or HiSilicon (under license)
  • Advantages: no need to buy new tester options (much lower cost); minimizes the signal path to the DUT for the best signal quality and device yield; simplifies the DIB by eliminating matching circuitry; can be customized easily

  40. Future Standards • A new PAM (Pulse Amplitude Modulation) standard is coming, targeted for 2015 • PAM4 @ 16 Gsym/s • PAM8 @ 16 Gsym/s • Similar technology is used in hard disk drive and LAN devices • A module-based solution will allow a solution quickly and inexpensively
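  The attraction of PAM is that the bit rate scales with the number of amplitude levels at a fixed symbol rate. A small sketch of that relationship, using the 16 Gsym/s figure from the slide (the NRZ comparison line is my addition):

```python
import math

# Bit rate for a pulse-amplitude-modulated link: symbols/s * bits per symbol.
def pam_bit_rate_gbps(symbol_rate_gsym, levels):
    return symbol_rate_gsym * math.log2(levels)

for name, levels in (("NRZ (PAM2)", 2), ("PAM4", 4), ("PAM8", 8)):
    print(f"{name}: {pam_bit_rate_gbps(16, levels):.0f} Gbps at 16 Gsym/s")
# NRZ: 16 Gbps, PAM4: 32 Gbps, PAM8: 48 Gbps -- more data without a faster
# symbol clock, at the cost of tighter multi-level amplitude accuracy.
```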

  41. Towerless Probe for TSV and Bumped Die
  (diagrams: standard prober docking stacks the tester instrument, PIB, probe tower, probe card and probe head, connected through pogos, an interposer, solder pads and probe needles down to the prober; towerless prober docking stacks the instrument, probe card and probe head directly, connected through an interposer, solder pads and probe needles)
  • Advantages: higher signal fidelity, lower tooling costs, better planarity with the chuck

  42. Conclusion
  • Cost of Test: ATE capital, program development, multi-site and concurrent test, pattern-oriented test, DFT-dependent test solutions
  • Time To Market: SW and debugging tools, adaptive testing, protocol aware
  • Test Quality: DC accuracy and power, RF AC speed and skew, high-speed IO, direct wafer probing

  43. Q&A Q&A Jin-Soo Ko Highbrand building 10’th floor, YangJa, SeoCho, Seoul Korea Jin-Soo.Ko@Teradyne.com
