
1  The Pinocchio Project and Verifying Complex Multi-Material Flow Simulations: Requirements and Issues for Automation of Complex Multi-Material Flows
George M. Hrbek, X-8 Computational Science Methods, Los Alamos National Laboratory (LA-UR-05-6750)
Presented at the Workshop on Numerical Methods for Multi-Material Fluid Flows, St Catherine's College, Oxford, UK, 8 September 2005

2  The Pinocchio Project
The verification module of Quantitative Simulation, Analysis, and Testing (QSAT).

3  Mission of QSAT
Identify, create, and maintain products, and provide services, that aid in analyzing and certifying coupled-physics simulation codes.

4  Products are Analytic Test Functions (ATFs)
ATFs are analytical tests performed on physics-simulation codes. They are:
- Essential to our simulation efforts
- An aid in the interpretation and application of relevant theory
ATFs include:
- Code- and calculation-verification analyses (e.g., convergence studies)
- Error-ansatz characterization (formulating discretization-error models)
- Sensitivity analysis
- Uncertainty quantification

5  Properties of ATFs
Generally, ATFs:
- Are mathematically complex
- Require multiple procedural steps, each of which requires specialized software
- May require significant computing resources to generate the underlying, or foundational, simulations
ATFs are thus limited by their complexity and their computational intensity. Yet frequent and independent testing is necessary throughout the development, assessment, and deployment of physics-simulation codes.
-> Automation can help -> Automation is essential

6  QSAT Focus Areas
Apply cutting-edge analysis methods, incorporated in the ATF modules, to interpret experiments through simulation:
- Demonstrates importance to experimental and simulation efforts
- Aids in the interpretation and application of relevant theory
Automate as appropriate, via scripting; streamline and merge similar elements (a sketch of the common driver follows this slide):
- Common: spawn jobs, manage results, spawn ATFs, write reports
- Unique: determine the number of jobs, input templates, create ATF analysis tools
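To make the common/unique split concrete, here is a minimal Python sketch of the common driver loop: fill input-deck templates, spawn one simulation job per parameter set, and record where the results land. All names here (the template text, run_sim, output.dat) are illustrative stand-ins, not the actual CTS plumbing.

```python
import subprocess
from pathlib import Path
from string import Template

# Stand-in input-deck template; a real driver would load this from a
# templates library.
TEMPLATE = Template("ncells = $ncells\ntstop = $tstop\n")

def spawn_jobs(ncells_list, tstop):
    """The unique decision module decides *how many* jobs (ncells_list);
    this common driver handles templating, launching, and bookkeeping."""
    results = []
    for i, ncells in enumerate(ncells_list):
        workdir = Path(f"run_{i:03d}")
        workdir.mkdir(exist_ok=True)
        deck = workdir / "input.deck"
        deck.write_text(TEMPLATE.substitute(ncells=ncells, tstop=tstop))
        # Launch the simulation code on this deck (placeholder executable).
        subprocess.run(["run_sim", deck.name], cwd=workdir, check=True)
        results.append(workdir / "output.dat")
    return results

if __name__ == "__main__":
    # A grid-refinement family for a convergence study: 100, 200, 400 cells.
    spawn_jobs([100, 200, 400], tstop=0.6)
```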

7  QSAT Focus Areas (continued)
Extract common and unique processes across:
- Codes
- Platforms
- Organizations
Abstract the processes and create a common code framework.

8  Requirements for QSAT ATF Modules
- Invoked through the Collaborative Testing System (CTS)
- Cross-platform compatibility
- Works with ALL ASC and legacy projects
- Meets or exceeds best software-engineering practices: documentation, maintainability, code reuse

9  Requirements for QSAT ATF Modules (continued)
- Requires uniform test problems and methods
- Uniform application of ATFs, coded exact analytics, and report-generation software
- A standard template for adding new problems, which increases functionality for ATF analyses and report generation

10  Operational Requirements
Run in automatic and interactive modes, which improves the frequency and ease of:
- Use
- Reporting
- Archiving
- Traceability
Use good software practices, which increases:
- Reliability
- Maintainability
- Ease of adding upgrades and new features

11  Fig. 1. Flowchart of automatic verification: results of a physics simulation come from CTS; grid points are extracted; an exact-analytic program generates exact analytic solutions at those grid points; the verification analysis is then performed.

12  Fig. 2. The Pinocchio Project (verification) flowchart: a templates library and a specialized decision module (how many jobs?) drive job spawning; jobs are run, results are stored, and a verification control deck is written; the verification analysis module (the Pinocchio Project) takes the control deck and the simulation results, and a report is written.

13  Fig. 3. The Pinocchio Project: major modules and their function
- Geppetto: analytic solutions
- Collodi: automation tools and scripts repository
- Figaro: data acquisition and parsing tools
- Cleo: verification tools
- Jiminey: scripts

14  Problems Automated to Date
Crestone Project: six of the seven tri-lab test problems (Frank Timmes T-DO, Jim Kamm X-7, and Ron Kirkpatrick X-3):
- Noh
- Sedov
- Reinicke Meyer-ter-Vehn
- Su-Olson
- Coggeshall 8
- Mader
Shavano Project: one of the seven tri-lab test problems (Jim Kamm X-7 and Jerry Brock X-7):
- Noh

15  Fig. 4. The general ATF flowchart: a GUI spawns jobs, and simulation jobs and results are managed through the Collaborative Testing System (CTS); a templates library feeds specialized decision modules (how many runs?); the ATF modules (verification via the Pinocchio Project, uncertainty, ..., regression, validation) receive an ATF control deck and the simulation results, and a report is written.

16  How do we automate an ATF?
- Recognize that all code projects seem to implement common ATFs in unique ways
- Separate serendipity from real code-dependent requirements (e.g., data structures, file formats)
- Identify the real code-dependent requirements that affect implementation of ATFs
- Break down ATFs into steps or processes that are clearly defined and understood by an independent agent
- Drill down into each process and identify it as either a common or a unique element

17  What elements should we automate in an ATF?
ONLY the UNIQUE elements particular to the specific ATF analysis:
- Which jobs to run?
- Details of the ATF analysis
Plus common processes that include a translator to handle cell-, vertex-, and face-centered data (a sketch follows this slide):
- Code-unique dump files need to be read
- Move toward a universal format
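As a concrete illustration of such a translator (not the project's actual tool), here is a minimal sketch that maps 1-D cell-centered data to vertex-centered data by averaging neighboring cells; reading the code-unique dump format is omitted.

```python
import numpy as np

def cell_to_vertex(cell_vals):
    """Convert N cell-centered values to N+1 vertex-centered values:
    interior vertices average the two adjacent cells; boundary vertices
    copy the nearest cell (a simple one-sided choice)."""
    cell_vals = np.asarray(cell_vals, dtype=float)
    vert = np.empty(cell_vals.size + 1)
    vert[1:-1] = 0.5 * (cell_vals[:-1] + cell_vals[1:])  # interior average
    vert[0], vert[-1] = cell_vals[0], cell_vals[-1]      # boundary copy
    return vert

# Example: densities in 4 cells -> 5 vertex values.
print(cell_to_vertex([1.0, 2.0, 4.0, 8.0]))  # [1.  1.5  3.  6.  8.]
```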

18  How do we automate an ATF? (continued)
- Identify individual ATFs: YOU tell people like ME what you need to do
- Break each ATF down into individual processes that can be clearly defined: people like ME aid YOU (i.e., the experts) in explaining each step in excruciating detail!
- Identify each process as either a common or a unique element: that's why I'M here

19  What is Verification?
Verification demonstrates that the code:
- Solves the governing equations correctly
- Shows the accuracy of the implementation
Two types:
- Code verification: forward analytical problems and backward analytical problems
- Calculation verification: no analytical solution exists, so it often relies on self-convergence (a standard formula is sketched below)
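The slide does not spell out the self-convergence formula, so here is the standard Richardson-style version as an illustration. With solutions $f_h$, $f_{2h}$, $f_{4h}$ computed on grids of spacing $h$, $2h$, and $4h$, the observed order $p$ and an extrapolated estimate of the exact value follow without any analytical solution:

$$
p = \frac{\ln\!\left(\dfrac{f_{4h} - f_{2h}}{f_{2h} - f_{h}}\right)}{\ln 2},
\qquad
f_{\text{exact}} \approx f_h + \frac{f_h - f_{2h}}{2^{p} - 1}.
$$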

20  Why do we need to verify code?
It is the only way to realistically measure and demonstrate how well a physics code approximates the variables for a particular physical regime.

21  Why do we need to do it so often?
- To demonstrate that the code has not changed as new features are added and problems are fixed
- To demonstrate that the instantiation of the algorithms is properly achieved (e.g., that second-order algorithms actually achieve second-order accuracy)

22  What are the convergent variables?
In addition to space and time:
- Temperature
- Pressure
- Velocity
- ...
It really depends on the test problem!

23  As an example...
Consider the instability triad:
- Richtmyer-Meshkov
- Rayleigh-Taylor
- Kelvin-Helmholtz
What would constitute the parameter space and range of validity of these phenomena? What are the ranges of validity of the instantiated algorithms? Is there proper overlap (i.e., does every algorithm stay within the range of validity of the phenomena)? What are the universal test problems?

24  A test problem is said to be universally usable when it:
- Can be understood by all serious researchers in a particular field of research
- Can be implemented on all physical-simulation codes that are ready for meaningful scientific investigation
- Can generate unambiguous information about the physical or mathematical phenomena
- Fulfills three requirements: it is unambiguously defined, documented, and certified as correct
How do we do this?

25  Forward vs. Backward Problems
- The forward problem: the classical method of solving PDEs (e.g., solving the heat-conduction equation using separation of variables for given ICs, BCs, and coefficients)
- The backward problem: solved through the Method of Manufactured Solutions (MMS)

26  The Forward Problem
Direct comparison of the code with exact solutions to real problems. Limitations:
- Simplification of the general problem space
- Primitive physical domains
- Existence of singularities
- Many special cases needed to test BCs and/or ICs
- Difficult, if not impossible, to design a full-coverage test suite

27  The Backward Problem
The Method of Manufactured Solutions (MMS) allows one to test the most general code capability that one intends to use (i.e., the Cardinal Rule of verification). Limitations:
- One must think about the types of terms that will be exercised in the most general use of the code
- It requires code developers to insert a source term into the appropriate difference equations (a symbolic sketch of deriving such a source term follows this slide)
- Users must be prevented from accessing this source term as a knob
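To make the source-term step concrete, here is a minimal SymPy sketch of the MMS recipe: pick any smooth "manufactured" solution, push it through the governing operator, and the residual is exactly the source term to insert. The operator used here (1-D unsteady heat conduction) and the manufactured solution are illustrative choices, not the talk's example.

```python
import sympy as sp

x, t, alpha = sp.symbols("x t alpha", positive=True)

T_m = sp.exp(-t) * sp.sin(x)                              # manufactured solution (made up)
residual = sp.diff(T_m, t) - alpha * sp.diff(T_m, x, 2)   # dT/dt - alpha * d2T/dx2
Q = sp.simplify(residual)                                 # the source term: L(T_m) = Q

print(Q)   # (alpha - 1)*exp(-t)*sin(x)
```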

28  The 10 Steps of the MMS
1. Determine the governing equations and the theoretical order of accuracy
2. Design a suite of coverage tests
3. Construct an exact solution
4. Perform the test and calculate the error
5. Refine the grid
6. Compute the observed order of accuracy (a sketch of this step follows this slide)
7. Troubleshoot the test implementation
8. Fix the test implementation
9. Find and correct coding mistakes
10. Report the results and conclusions of the verification tests
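Step 6 reduces to one formula. A minimal sketch, assuming the discretization errors have been measured against the exact (manufactured) solution on two grids with refinement ratio r:

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy from errors on two grids:
    p = log(E_coarse / E_fine) / log(r), with r = h_coarse / h_fine."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# Example: halving h cuts the error by ~4x, so the scheme looks second order.
print(observed_order(1.0e-2, 2.6e-3))  # ~1.94
```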

29  Design a suite of coverage tests
- Define what code capabilities will and will not be tested
- Determine the level at which each capability will be tested
- List and describe what tests will be performed
- Describe the subset of the governing equations that each test will exercise

30  Example of MMS: 1-D Steady-State Thermal Slab
Manufactured solution (made up), of the general form
  T(x) = A cos x + B sin x
Applying the steady-state condition and the BCs T(0) = T_0, T(L) = T_1 fixes the constants, giving the steady-state solution
  T(x) = T_0 cos x + csc(L) sin(x) (T_1 - T_0 cos L)

31  Determining the Source Function Q(x)
The source function Q is defined from the governing equation, here d²T/dx² = Q(x). Substituting the manufactured solution, we obtain the corresponding source function
  Q(x) = -(T_0 cos x + csc(L) sin(x) (T_1 - T_0 cos L))
For this case ONLY (in general this is NOT true!),
  Q(x) = -T(x)
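A quick symbolic check of that last claim, assuming the governing form d²T/dx² = Q(x) as above:

```python
import sympy as sp

x, L, T0, T1 = sp.symbols("x L T_0 T_1", positive=True)

# The manufactured slab solution from the previous slide.
T = T0 * sp.cos(x) + sp.csc(L) * sp.sin(x) * (T1 - T0 * sp.cos(L))
Q = sp.diff(T, x, 2)          # source function: Q(x) = d2T/dx2

print(sp.simplify(Q + T))     # 0  ->  Q(x) == -T(x) for this solution
```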

32  Using the Source Function Q(x)
For the difference equations, we compute Q at each node x_n and insert it into every zone:
  Q(x_n) = -(T_0 cos x_n + csc(L) sin(x_n) (T_1 - T_0 cos L))
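Putting the pieces together, here is a self-contained sketch of the slab test under the same assumed governing form d²T/dx² = Q: build the standard three-point difference equations, insert the manufactured Q in every zone, solve, and confirm second-order convergence. The parameter values are those of Fig. 5 below.

```python
import numpy as np

T0, T1, L = 100.0, 375.0, 10.0   # Fig. 5 parameters

def T_exact(x):
    """The manufactured solution: T0*cos(x) + csc(L)*sin(x)*(T1 - T0*cos(L))."""
    return T0 * np.cos(x) + np.sin(x) / np.sin(L) * (T1 - T0 * np.cos(L))

def max_error(n):
    """Solve (T[i-1] - 2T[i] + T[i+1]) / h^2 = Q(x_i) on n zones with
    Dirichlet BCs, Q inserted in every zone; return the max nodal error."""
    x = np.linspace(0.0, L, n + 1)
    h = L / n
    Q = -T_exact(x)                                   # Q(x) = -T(x) here
    A = (np.diag(-2.0 * np.ones(n - 1))
         + np.diag(np.ones(n - 2), 1)
         + np.diag(np.ones(n - 2), -1))
    b = h * h * Q[1:-1]
    b[0] -= T0                                        # fold BCs into the RHS
    b[-1] -= T1
    T = np.concatenate(([T0], np.linalg.solve(A, b), [T1]))
    return np.max(np.abs(T - T_exact(x)))

e1, e2 = max_error(100), max_error(200)
print(e1, e2, np.log2(e1 / e2))                       # observed order ~2
```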

33  Fig. 5. Temperature and source-function profiles (T_0 = 100 °C, T_1 = 375 °C, L = 10 m).

34  The Bottom Line
Test problems are:
- Expensive to implement
- Tedious => essential to automate!
- Apt to be redundant between ATFs

35  Conclusions
- We need to perform ATFs often, so we must automate the processes
- We must choose problems carefully to properly cover physical regimes and parameter spaces
- The development of a single automated ATF framework should allow for easy incorporation of additional ATFs

