
George M. Hrbek, X-8 Computational Science Methods, Los Alamos National Laboratory

The Pinocchio Project and Verifying Complex Multi-Material Flow Simulations: Requirements and Issues for Automation of Complex Multi-Material Flows. Presented at the Workshop on Numerical Methods for Multi-Material Fluid Flows, St Catherine’s College, Oxford, UK, 8 September 2005.



Presentation Transcript


  1. The Pinocchio Project and Verifying Complex Multi-Material Flow Simulations: Requirements and Issues for Automation of Complex Multi-Material Flows. Presented at the Workshop on Numerical Methods for Multi-Material Fluid Flows, St Catherine’s College, Oxford, UK, 8 September 2005. George M. Hrbek, X-8 Computational Science Methods, Los Alamos National Laboratory

  2. The Pinocchio Project • Verification module of Quantitative Simulation, Analysis, and Testing (QSAT)

  3. Mission of QSAT • Identify, create, and maintain products, and provide services that aid in analyzing and certifying coupled physics simulation codes.

  4. Products are Analytic Test Functions (ATFs) • ATFs are analytical tests performed on physics-simulation codes • Essential to our simulation efforts • Aid in interpretation and application of relevant theory • ATFs include • Code- and calculation-verification analyses (e.g., convergence studies) • Error-ansatz characterization (formulating discretization-error models) • Sensitivity analysis • Uncertainty quantification
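To make the convergence-study and error-ansatz items concrete: a minimal sketch of fitting the standard discretization-error model E(h) = C h^p by log-log least squares. The grid spacings and error values below are made up purely for illustration.

```python
import numpy as np

# Hypothetical errors from a grid-refinement study (made-up values).
h = np.array([0.1, 0.05, 0.025, 0.0125])        # grid spacings
E = np.array([4.1e-3, 1.1e-3, 2.8e-4, 7.2e-5])  # measured error norms

# Error ansatz E(h) = C * h**p; in log space this is a straight line:
# log E = log C + p * log h, so a degree-1 fit recovers p and C.
p, logC = np.polyfit(np.log(h), np.log(E), 1)
print(f"observed order p ~ {p:.2f}, constant C ~ {np.exp(logC):.3g}")
# A second-order scheme should give p near 2.
```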

  5. Properties of ATFs • Generally, ATFs • Are mathematically complex • Require multiple procedural steps, each of which requires specialized software • May require significant computing resources to generate the underlying or foundational simulations • ATFs are limited by their complexity and computationally intensive nature • Frequent and independent testing is necessary throughout the development, assessment, and deployment of physics-simulation codes -> Automation can help -> Automation is essential

  6. QSAT Focus Areas • Apply cutting-edge analysis methods incorporated in the ATF modules to interpret experiments through simulation • Demonstrates importance to experimental and simulation efforts • Aids in interpretation and application of relevant theory • Automate as appropriate • Scripting • Streamline and merge similar elements • Common – spawn jobs, manage results, spawn ATFs, write reports • Unique – determine # of jobs, input templates, create ATF analysis tools
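A minimal sketch of the "common" scripting elements (spawn jobs, manage results) driven by the "unique" decision of how many jobs to run. The template file, {resolution} placeholder, and executable name are hypothetical stand-ins, not QSAT's actual interfaces.

```python
import pathlib
import subprocess

# Hypothetical input template containing a {resolution} placeholder.
template = pathlib.Path("noh_template.in").read_text()

for res in (50, 100, 200, 400):          # unique: determine # of jobs
    workdir = pathlib.Path(f"noh_{res}")
    workdir.mkdir(exist_ok=True)
    (workdir / "noh.in").write_text(template.format(resolution=res))
    # Common: spawn the job; results accumulate in the work directory.
    subprocess.run(["my_hydro_code", "noh.in"], cwd=workdir, check=True)
```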

  7. QSAT Focus Areas • Extract common and unique processes across • Codes • Platforms • Organizations • Abstract processes and create a common code framework

  8. Requirements for QSAT ATF Modules • Invoked through CTS (Collaborative Testing System) • Cross-platform compatibility • Works with ALL ASC and Legacy Projects • Meets or exceeds best software engineering practices • Documentation • Maintainability • Code reuse

  9. Requirements for QSAT ATF Modules • Requires uniform test problems and methods • Uniform application of ATFs, coded exact analytics, and report generation software • Standard template for adding new problems • Increases functionality for ATF analyses and report generation

  10. Operational Requirements • Run in automatic and interactive modes • Improves frequency and ease of • Use • Reporting • Archiving • Traceability • Uses good software practices • Increases reliability • Maintainability • Addition of upgrades and new features

  11. Fig. 1. Flowchart of Automatic Verification: results of a physical simulation from CTS -> grid points are extracted -> exact analytic program -> exact analytic solutions for grid points -> perform verification analysis
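A sketch of the Fig. 1 pipeline; the exact-solution routine is an illustrative stand-in, and the "extracted" data are faked (a real run would pull grid points and field values from a CTS simulation dump).

```python
import numpy as np

def exact_solution(x, t):
    """Stand-in for a coded exact analytic routine (illustrative only)."""
    return np.sin(np.pi * x) * np.exp(-np.pi**2 * t)

# Grid points and field values as they might be extracted from a CTS dump
# (faked here as the exact solution plus small noise, purely for illustration).
x = np.linspace(0.0, 1.0, 101)
rng = np.random.default_rng(0)
simulated = exact_solution(x, 0.1) + 1e-4 * rng.standard_normal(x.size)

# Evaluate exact analytic solutions at the grid points, then perform
# the verification analysis (error norms here).
err = simulated - exact_solution(x, 0.1)
print("L1   =", np.mean(np.abs(err)))
print("L2   =", np.sqrt(np.mean(err**2)))
print("Linf =", np.max(np.abs(err)))
```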

  12. Fig. 2. The Pinocchio Project (Verification). Flowchart elements: Specialized Decision Module (How many jobs?); Templates Library; Spawn jobs; Run jobs, results stored, and control deck written; Verification Analysis Module (Pinocchio Project); Verification Control Deck; Simulation Results; Report written

  13. Fig. 3. The Pinocchio Project – Major Modules and their Function: Jiminy – Scripts; Geppetto – Analytic Solutions; Figaro – Data Acquisition and Parsing Tools; Collodi – Automation Tools and Scripts Repository; Cleo – Verification Tools

  14. Problems Automated to Date • Crestone Project: six of the seven tri-lab test problems. • Frank Timmes T-DO, Jim Kamm X-7, and Ron Kirkpatrick X-3 • Noh • Sedov • Reinicke Meyer-ter-Vehn • Su-Olson • Coggeshall 8 • Mader • Shavano Project: one of the seven tri-lab test problems. • Jim Kamm X-7 and Jerry Brock X-7 • Noh

  15. Fig. 4. The General ATF Flowchart. Flowchart elements: Templates Library; Specialized Decision Modules (How many runs?); GUI (spawns jobs); ATF Modules – Uncertainty, Validation, Regression, Verification (Pinocchio Project), …; Manage Simulation Jobs and Results through the Collaborative Testing System (CTS); Simulation Results; ATF Control Deck; Report written

  16. How do we automate an ATF? • Recognize that all code projects seem to implement common ATFs in unique ways • Separate serendipity from real code-dependent requirements (e.g., data structures, file formats) • Identify real code-dependent requirements that affect implementation of ATFs • Break ATFs down into steps or processes that are clearly defined and understood by an independent agent • Drill down into each process and identify it as either a common or a unique element.

  17. What elements should we automate in an ATF? • ONLY the UNIQUE elements particular to the specific ATF analysis • Which jobs to run? • Details of the ATF analysis • Common processes that include a ‘translator’ to handle cell-, vertex-, and face-centered data • Code-unique dump files need to be read • Move towards a ‘universal’ format.
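The "translator" idea in a minimal 1-D sketch: moving data between vertex and cell centerings by simple averaging. The function names are hypothetical; a real translator would also cover face-centered data and multi-dimensional meshes.

```python
import numpy as np

def vertex_to_cell(v):
    """Average 1-D vertex-centered data onto cell centers."""
    return 0.5 * (v[:-1] + v[1:])

def cell_to_vertex(c):
    """Interpolate 1-D cell-centered data to the interior vertices."""
    return 0.5 * (c[:-1] + c[1:])

v = np.linspace(0.0, 1.0, 6)     # 6 vertex coordinates define 5 cells
print(vertex_to_cell(v))         # -> [0.1 0.3 0.5 0.7 0.9]
```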

  18. How do we automate an ATF? • Identify individual ATFs • YOU tell people like ME what you need to do. • Break each ATF down into individual processes that can be clearly defined • People like ME aid YOU (i.e., the Experts) in explaining each step in excruciating detail! • Identify each process as either a common or a unique element. • That’s why I’M here

  19. What is Verification? • Demonstrates that the code • Solves the governing equations correctly • Shows the accuracy of the implementation • Two types • Code verification • Forward analytical problems • Backward analytical problems • Calculation verification • No analytical solution • Often self-convergence
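For calculation verification, where no analytical solution exists, a standard self-convergence procedure uses three grids with a constant refinement ratio r: if f(h) ≈ f_exact + C h^p, then (f_coarse - f_medium)/(f_medium - f_fine) = r^p. A sketch with made-up numbers:

```python
import numpy as np

# Hypothetical point values from three grids with refinement ratio r = 2
# (numbers made up for illustration).
f_coarse, f_medium, f_fine = 1.0480, 1.0120, 1.0031
r = 2.0

# Observed order from successive differences.
p = np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)

# Richardson extrapolation toward the grid-converged value.
f_extrap = f_fine + (f_fine - f_medium) / (r**p - 1.0)
print(f"observed order p ~ {p:.2f}, extrapolated value ~ {f_extrap:.5f}")
```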

  20. Why do we need to verify code? • Only way to realistically measure and demonstrate how well a physics code approximates the variables for a particular physical regime.

  21. Why do we need to do it so often? • Demonstrate that the code has not changed • New features are added • Problems fixed • Demonstrate that the instantiation of algorithms is properly achieved • Second-order algorithms achieve second-order accuracy
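One way to demonstrate that the code has not changed is an automated comparison of a fresh run against archived "gold" results; the file layout, variable names, and tolerance below are hypothetical.

```python
import json
import pathlib

import numpy as np

TOL = 1e-12   # illustrative tolerance for a 'code has not changed' check

# Hypothetical archived and fresh results, stored as JSON arrays by variable.
gold = json.loads(pathlib.Path("gold/noh_results.json").read_text())
new = json.loads(pathlib.Path("run/noh_results.json").read_text())

for var in ("density", "pressure", "velocity"):
    diff = np.max(np.abs(np.asarray(new[var]) - np.asarray(gold[var])))
    print(f"{var:10s} max|diff| = {diff:.3e}  {'PASS' if diff <= TOL else 'FAIL'}")
```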

  22. What are convergent variables? • In addition to space and time • Temperature • Pressure • Velocity • … • It really depends on the test problem!

  23. As an example…. • Consider the ‘instability triad’ • Richtmyer-Meshkov • Rayleigh-Taylor • Kelvin-Helmholtz • What would constitute the parameter space and range of validity of these phenomena? • What are the ranges of validity of the instantiated algorithms? • Is there proper overlap? (i.e., does every algorithm stay within the range of validity of the phenomena?) • What are the ‘Universal’ Test Problems?

  24. A test problem is said to be universally usable when it • Can be understood by all serious researchers in a particular field of research • Can be implemented on all physical simulation codes that are ready for meaningful scientific investigation • Can generate information about the physical or mathematical phenomena that is unambiguous • Can fulfill three requirements: • Is unambiguously defined • Is documented • Is certified as correct • How do we do this?

  25. Forward vs. Backward Problems • The Forward Problem • Classical method of solving PDEs (e.g., solving the heat conduction equation using separation of variables for given ICs, BCs, and coefficients) • The Backward Problem • Solved through the Method of Manufactured Solutions (MMS)

  26. The “Forward Problem” • Direct comparison of code with exact solutions to real problems • Limitations • Simplification of general problem space • Primitive physical domains • Existence of singularities • Many special cases needed to test BCs and/or ICs • Difficult if not impossible to design a full-coverage test suite

  27. The “Backward Problem” • Method of Manufactured Solutions (MMS) • Allows one to test the most general code capability that one intends to use (i.e., the Cardinal Rule of verification) • Limitations • Must think about the ‘types of terms’ that will be exercised in the most general use of the code • Requires code developers to insert a source term into the appropriate differencing equations • Must prevent users from accessing this source term as a ‘knob’

  28. The 10 Steps of the MMS • Determine the governing equations and the theoretical order of accuracy • Design a suite of coverage tests • Construct an exact solution • Perform the test and calculate the error • Refine the grid • Compute the observed order of accuracy • Troubleshoot the test implementation • Fix the test implementation • Find and correct coding mistakes • Report results and conclusions of verification tests

  29. Design a suite of coverage tests • Define what code capabilities will and will not be tested • Determine the level to which each capability will be tested • List and describe what tests will be performed • Describe the subset of the governing equations that each test will exercise

  30. Example of MMS: 1-D Steady-State Thermal Slab • Manufactured solution (made up) • Steady-state condition and BCs • Steady-state solution: $T(x) = C\,(A \cos \lambda x + B \sin \lambda x)$, which with the boundary conditions $T(0) = T_0$ and $T(L) = T_1$ becomes $T(x) = T_0 \cos \lambda x + \csc(\lambda L)\,\sin(\lambda x)\,(T_1 - T_0 \cos \lambda L)$

  31. Determining the Source Function Q(x) • The source function Q is defined from the governing equation • We obtain the corresponding source function $Q(x) = -\lambda^2 \left( T_0 \cos \lambda x + \csc(\lambda L)\,\sin(\lambda x)\,(T_1 - T_0 \cos \lambda L) \right)$ • For this case ONLY (in general this is NOT true!): $Q(x) = -\lambda^2\, T(x)$
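The differentiation that produces Q(x) is easy to automate symbolically. A sketch with SymPy, assuming Q is defined by Q(x) = d²T/dx² (the defining equation did not survive the transcript, but this choice reproduces the slide's result Q = -λ²T):

```python
import sympy as sp

x, lam, L, T0, T1 = sp.symbols("x lambda L T_0 T_1", positive=True)

# Manufactured temperature profile from slide 30.
T = T0*sp.cos(lam*x) + sp.csc(lam*L)*sp.sin(lam*x)*(T1 - T0*sp.cos(lam*L))

# Assumed defining relation: Q(x) = d^2 T / dx^2.
Q = sp.diff(T, x, 2)

# Confirm the slide's identity Q(x) = -lambda^2 T(x), valid for this case only.
print(sp.simplify(Q + lam**2 * T))   # -> 0
```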

  32. Using the Source Function Q(x) • For difference equations, we compute Q and insert it into every zone: $Q(x_n) = -\lambda^2 \left( T_0 \cos \lambda x_n + \csc(\lambda L)\,\sin(\lambda x_n)\,(T_1 - T_0 \cos \lambda L) \right)$
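Putting the pieces together numerically: a sketch that inserts Q(x_n) into every zone of a second-order finite-difference discretization of T'' = Q and checks second-order convergence against the manufactured solution. The boundary values come from Fig. 5; the choice λ = π/(2L) and the grid sizes are made up for illustration.

```python
import numpy as np

T0, T1, L = 100.0, 375.0, 10.0   # boundary temperatures and slab length (Fig. 5)
lam = np.pi / (2.0 * L)          # illustrative lambda; avoids sin(lam*L) = 0

def T_exact(x):
    """Manufactured solution from slide 30."""
    return T0*np.cos(lam*x) + np.sin(lam*x)*(T1 - T0*np.cos(lam*L))/np.sin(lam*L)

def max_error(n):
    """Solve T'' = Q on n zones with the manufactured source at every node."""
    x = np.linspace(0.0, L, n + 1)
    h = L / n
    Q = -lam**2 * T_exact(x[1:-1])           # Q(x_n) = -lambda^2 T(x_n), slide 32
    # Second-order difference: (T[i-1] - 2 T[i] + T[i+1]) / h^2 = Q_i.
    A = (np.diag(-2.0*np.ones(n - 1)) + np.diag(np.ones(n - 2), 1)
         + np.diag(np.ones(n - 2), -1)) / h**2
    rhs = Q.copy()
    rhs[0] -= T0 / h**2                      # fold Dirichlet values into the RHS
    rhs[-1] -= T1 / h**2
    T = np.linalg.solve(A, rhs)
    return np.max(np.abs(T - T_exact(x[1:-1])))

e_coarse, e_fine = max_error(50), max_error(100)
print(f"observed order ~ {np.log(e_coarse/e_fine)/np.log(2.0):.2f}")  # expect ~2
```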

  33. Fig. 5. Temperature and Source Function Profiles ($T_0$ = 100 °C, $T_1$ = 375 °C, L = 10 m)

  34. The ‘Bottom Line’ • Test problems are • Expensive to implement • Tedious => essential to automate! • Tend to be redundant between ATFs

  35. Conclusions • We need to perform ATFs often so we must automate the processes • Must choose problems carefully to properly cover physical regimes and parameter spaces • The development of a single automated ATF framework should allow for easy incorporation of additional ATFs
