
Issues in FPGA Verification


Presentation Transcript


  1. Issues in FPGA Verification: Panel Discussion. 2005 MAPLD International Conference, Washington, D.C., September 6, 2005

  2. Panel Members: Rod Barto, NASA Office of Logic Design; Charlie Howard, Southwest Research Institute®; Ronnie Killough, Southwest Research Institute®; Dave Moser, BAE Systems Incorporated; Wesley A. Powell, Goddard Space Flight Center; Jim Westfall, CU-LASP

  3. Purpose and Goals • To gather experienced engineers with different current responsibilities • Engineering management • Software design and management • ASIC design • FPGA design • Board / Box design and test • To use each of their viewpoints to contribute to the discussion about how to properly and completely verify FPGA correctness • To engage the audience in the exchange

  4. Format • Maximum of 3 minutes each for an introduction and brief statement of intent • Each panelist will in turn pick one question of interest and moderate it (15 minutes absolute max) • Comments from the audience solicited • I’ll step in if I must to move it along

  5. Question #1 – Rod Barto (NASA-OLD) • The digital design paradigm at one time used proven components from data books and built systems upwards. Similarly, in the software world, Ada programmers could use the proven Booch components to build software systems. Why has the paradigm of starting with proven components and building upwards not carried over into VHDL-based designs?
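
By way of illustration of the paradigm Question #1 describes, here is a minimal VHDL sketch of building upward from a pre-verified component; the uart_tx transmitter, its ports, and the telemetry_frontend wrapper are hypothetical names invented for this example, not parts from any actual library.

  -- Hypothetical reuse of a pre-verified library component, analogous
  -- to building a board design from proven data-book parts.
  library ieee;
  use ieee.std_logic_1164.all;

  entity telemetry_frontend is
    port (
      clk, rst : in  std_logic;
      data     : in  std_logic_vector(7 downto 0);
      load     : in  std_logic;
      txd      : out std_logic
    );
  end entity telemetry_frontend;

  architecture structural of telemetry_frontend is
    -- Declaration matching the (assumed) proven component's interface.
    component uart_tx is
      port (
        clk, rst : in  std_logic;
        data     : in  std_logic_vector(7 downto 0);
        load     : in  std_logic;
        txd      : out std_logic
      );
    end component;
  begin
    -- The system is composed upward from the verified building block.
    u_tx : uart_tx
      port map (clk => clk, rst => rst, data => data,
                load => load, txd => txd);
  end architecture structural;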

  6. Question #2 – Charlie Howard (SwRI) • How do we produce flight-software (FSW) interactions (in simulation or HW tests) when the FSW is in flux (even after launch)? • Especially hard in the presence of interrupt service routines

  7. Question #3 – Ronnie Killough (SwRI) • What level of requirements analysis and capture is appropriate to apply to the specification of functionality being implemented in an FPGA?

  8. Question #4 – Dave Moser (BAE Systems) • How do you determine what features should be tested in the FPGA and which ones should be tested in simulation?
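
For concreteness on Question #4, "tested in simulation" usually means exercising the design in a self-checking testbench; below is a minimal VHDL sketch, where the parity_gen design under test, its ports, and the even-parity convention are all hypothetical assumptions made for the example.

  -- Minimal self-checking testbench sketch (DUT is hypothetical).
  library ieee;
  use ieee.std_logic_1164.all;

  entity tb_parity_gen is
  end entity tb_parity_gen;

  architecture sim of tb_parity_gen is
    signal d : std_logic_vector(3 downto 0) := "0000";
    signal p : std_logic;
  begin
    -- Direct instantiation of the (hypothetical) design under test.
    dut : entity work.parity_gen port map (d => d, p => p);

    stimulus : process
    begin
      d <= "0110";
      wait for 10 ns;
      -- Even number of ones: expect '0' under the assumed convention.
      assert p = '0' report "parity mismatch for 0110" severity error;
      d <= "0111";
      wait for 10 ns;
      assert p = '1' report "parity mismatch for 0111" severity error;
      wait;  -- suspend forever; test complete
    end process stimulus;
  end architecture sim;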

  9. Question #5 – Wes Powell (NASA) • Given the increasing size of today's FPGAs and the growing complexity of the designs that must be implemented in them, there is a desire to use high-level design tools and IP cores to streamline the application development process. However, there is uncertainty over the quality of the designs these tools produce. To what extent can the large and complex designs produced by these tools be verified?

  10. Question #6 – Jim Westfall (LASP) • There are a number of programmatic factors that play into when you start testing a new FPGA design at the board level. Start testing too early and the software and test engineers may end up spinning their wheels while the hardware folks are debugging the design. Start too late and they might spin their wheels waiting for any kind of hardware to test. Where is the “sweet spot” between extensive simulation and analysis and extensive board level testing?

  11. Question #7 – Rick Katz • Should there be an "independent verification and validation team"? Does the answer to this question depend on the size/complexity of the design?

  12. Question #8 – John Stone (SwRI) • Has simulation become “too” important in verifying high-reliability design implementations?

  13. Question #9 • Can all flow-control-inducing cases be covered during verification? Can they all even be envisioned?

  14. Question #10 • What methodology is used to formulate worst-case scenario tests (e.g., all the relays on at once, device "A" being reset, etc.)?

  15. Question #11 • How have assertion-based verification methods affected the verification process?
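
As a reference point for Question #11, the simplest form of the technique is VHDL's built-in assert statement, which flags violations of a design invariant during simulation; the one-hot state signals below are hypothetical.

  -- Concurrent assertion (placed in an architecture body): fires
  -- whenever two one-hot FSM state bits are active simultaneously.
  assert not (state_a = '1' and state_b = '1')
    report "FSM one-hot violation: states A and B both active"
    severity failure;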

  16. Question #12 • Is formal verification being used with FPGA designs and, if so, what has the experience been like?
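
For context on Question #12, formal property checking of FPGA designs in this timeframe typically meant a property language such as PSL (IEEE 1850), often embedded in VHDL comment pragmas as sketched below; the req/ack handshake and signal names are hypothetical.

  -- PSL properties written as VHDL comment pragmas; a PSL-aware
  -- simulator or formal tool picks these up, other tools ignore them.
  -- psl default clock is rising_edge(clk);
  -- psl assert always (req -> eventually! ack);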

  17. Question #13 • What constraints need to be placed on how and where designs developed using high-level design tools and IP cores should be used?

  18. Question #14 • With tools now available that can automatically instantiate redundancy in a user design, to what extent can and should the output of these tools be verified?
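
To ground Question #14: such tools (for example, TMR-insertion flows) typically triplicate each register and insert majority voters. A hand-written VHDL sketch of the structure they emit, with hypothetical signal names:

  -- Sketch of automatically instantiated triple modular redundancy:
  -- the register is triplicated and a majority voter masks a
  -- single-event upset in any one copy (signal names hypothetical).
  tmr_regs : process (clk)
  begin
    if rising_edge(clk) then
      copy_a <= d;
      copy_b <= d;
      copy_c <= d;
    end if;
  end process tmr_regs;

  voted <= (copy_a and copy_b) or (copy_b and copy_c)
           or (copy_a and copy_c);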

  19. Question #15 • What approaches in software design can/should be applied to the design of an FPGA?

  20. Question #16 • What are appropriate levels of testing in the context of an FPGA?

  21. Question #17 • Over the last few years, as we’ve made the transition (at LASP) from schematic-based designs to VHDL-based designs, it appears that we are killing fewer FPGAs during the development cycle. Is this experience duplicated at other organizations? Why would this be the case? Are we getting better at hiring good engineers? Does the software development model used in VHDL designs provide better insight and review capability than schematic entry? Are the simulation, verification, and test bench capabilities easier to use or more accurate?

  22. Question #18 • We’ve successfully used Actel’s ProASIC parts to “breadboard” and verify everything from simple logic modules all the way up to complex, multiple FPGA designs where the discrete FPGAs and interfaces are partitioned in the test device. What are the limits to using reprogrammable FPGAs in the verification process? How comfortable are we extending this to any reprogrammable device even if the evaluation development environment differs from the flight development environment? For example, how successful are verifications performed when the breadboard uses Vendor A parts and tools and the flight uses parts and tools from Vendor B?

  23. Question #19 • Given that a digital design actually creates a three-dimensional object with complex temporal relationships between its constituent parts, why has the dominant design paradigm become a text description in which the details of connectivity and temporal relationships are suppressed?

  24. Question #20 • How can the space electronics community be convinced that reviewability and transferability are important attributes of a good design?

  25. Question #21 • Under what circumstances can we create a design which is correct and verifiable by inspection?
