
Meeting Objectives

H.-K. Chung and B. J. Braams, Atomic and Molecular Data Unit, Nuclear Data Section. Joint IAEA-ITAMP Technical Meeting on Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data, Cambridge, Massachusetts, USA, 7-9 July 2014.


Presentation Transcript


  1. Meeting Objectives H.-K. Chung and B. J. Braams Atomic and Molecular Data Unit, Nuclear Data Section Joint IAEA-ITAMP Technical Meeting on Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data Cambridge, Massachusetts, USA 7-9 July 2014

  2. IAEA Atomic and Molecular (A+M) Data Unit: supporting the fusion programme worldwide. "We say that we will put the sun into a box. The idea is pretty. The problem is, we don't know how to make the box." -- Nobel prize winner Pierre-Gilles de Gennes. Fusion research requires huge amounts of material data (A+M/PSI data). • IAEA A+M Unit formed in 1977 • Reviews progress and achievements of atomic, molecular and plasma-surface interaction (A+M/PSI) data for the fusion programme worldwide • Stimulates international cooperation in measurement, compilation and evaluation of A+M/PSI data for fusion. Activities: Coordinated Research Projects, Publications, Knowledgebase, Databases, Meetings. IAEA: 157 Member States, ~2400 staff.

  3. Network Collaboration for A+M/PSI Data for Fusion. Data users (fusion laboratories): ITER; EFDA; JET, UKAEA; ASDEX-Upgrade, IPP; TEXTOR, Jülich, FZJ; KSTAR, NFRI; NIFS; JAEA; PPPL; ORNL. Data centres and evaluators (Data Centre Network): ADAS, H. Summers; CRAAMD, Y. Jun; FZJ, D. Reiter; IAEA, B. J. Braams; JAEA, T. Nakano; KAERI, D. Kwon; Kurchatov, A. B. Kukushkin; NIFS, I. Murakami; NIST, Y. Ralchenko; NFRI, J. Yoon. Data producers (Code Centre Network): Curtin Univ., I. Bray; Kitasato Univ., F. Koike; Univ. Autonoma de Madrid, I. Rabadan; Univ. P. & M. Curie, Paris, A. Dubois; Univ. of Bari, M. Capitelli; Kurchatov Institute, A. Kukushkin; Lebedev Institute, L. Vainshtein; Ernst-Moritz-Arndt Univ., R. Schneider; NIST, Y. Ralchenko; PPPL, D. Stotler; LANL, J. Abdallah Jr.; IAEA, B. J. Braams; HULLAC, M. Klapisch; CNEA, P. D. Fainstein; Weizmann, E. Stambulchik. IAEA coordination: Coordinated Research Projects, Publications, Knowledgebase, Databases, Meetings.

  4. ITER: the biggest tokamak ever built. €15 B, 7 Member States (EU, India, Russia, China, S. Korea, Japan and USA). ITER is twice as big as the world's largest currently operating tokamak (JET). • Construction to be completed in 2020 • Power supplies: 120 MW during operations, up to 620 MW for 30 s periods • Vessel: 5400 tons of steel • Superconducting coils: 9500 tons of steel • Divertor: 10 MW/m² on 8 m² • Cryostat: −270 °C. A huge investment for the sustainable future of mankind.

  5. Evaluated and recommended data needed for ITER design and operation. Typical edge transport code runtime (same model, equations, grid size): ITER (R = 6.2 m), Cadarache, France: 3 months; JET (R = 2.96 m), Oxford, UK: 1-2 weeks; TEXTOR (R = 1.75 m), Jülich, Germany: 1 day. The difference arises because plasma chemistry becomes more important (increased non-linearity and non-locality in the sources).

  6. Need international collaboration for Evaluated Data http://www-amdis.iaea.org/DCN/Evaluation/

  7. TM on Data Evaluation 2012 (http://www-amdis.iaea.org/meetings/NFRI2012/) • More than 20 participants from 11 countries • Proceedings papers published in Fusion Science and Technology (2013) • Community consensus needed to produce evaluated/recommended data: disseminate standard definitions of TERMINOLOGY adopted internationally; disseminate materials that teach CRITICAL ANALYSIS SKILLS (NRC report); involve the COMMUNITY in data evaluation (eMOL, group evaluation) • Technical issues: assessment of THEORETICAL data → UNCERTAINTY ESTIMATES; assessment of EXPERIMENTAL data → self-consistency checks; ERROR PROPAGATION and SENSITIVITY ANALYSIS → uncertainties in "Data" and "Data Processing Toolbox"

  8. Define Terminology: the Uncertainty Approach. It is NOT AN ERROR but AN UNCERTAINTY. • Terminology in metrology adopted by international communities (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.): VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007; GUM (Guide to the Expression of Uncertainty in Measurement), 2008 • Conceptual change of values and uncertainties: True value (Error Approach, ~1984) → a measure of the possible error in the estimated value of the measurand as provided by the result of a measurement; Measured value (Uncertainty Approach) → parameter that characterizes the dispersion of the quantity values being attributed to a measurand, based on the information used (VIM 3).

  9. Critical Assessment for Modeling of Physical Processes • Verification. The process of determining how accurately a computer program (“code”) correctly solves the equations of the mathematical model. • Validation. The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model • Uncertainty quantification (UQ). The process of quantifying uncertainties associated with model calculations of true, physical QOIs, with the goals of accounting for all sources of uncertainty and quantifying the contributions of specific sources to the overall uncertainty. • NSF Division of Mathematics and Physical Sciences should encourage interdisciplinary interaction between domain scientists and mathematicians on the topic of uncertainty quantification, verification and validation, risk assessment, and decision making. (2010)

  10. Bayesian Update for Data Evaluation and Uncertainty Assignment. Bayes' theorem (1763): p(x|σ,M) = p(σ|x,M) p(x|M) / p(σ|M), i.e. posterior = likelihood × prior / evidence, where x is the set of calculation parameters and σ is the set of experimental data; p(x|σ,M) is the posterior distribution, p(x|M) the prior distribution, and p(σ|x,M) the likelihood function giving the probability distribution of the data σ for a model M with parameters x. Example: observable f(x) = a + bx + cx², with simulated experimental data f(y) = (a + by + cy²)(1 + d·r) + Δt(y), where r is a uniform random variable, d is the width of the random interval, and Δt(y) is a systematic error. H. Leeb et al., "Consistent Procedure for Nuclear Data Evaluation Based on Modeling," Nuclear Data Sheets 109 (2008) 2762.
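
The slide's toy example lends itself to a short numerical sketch. In the Python snippet below, pseudo-experimental data are generated from f(y) = (a + by + cy²)(1 + d·r) + Δt(y); the prior widths, the value of d, the form of Δt(y), and the data uncertainties are all illustrative assumptions, not values from the presentation. Because the model is linear in x = (a, b, c), the Gaussian posterior can be written in closed form.

```python
import numpy as np

# Minimal sketch of a Bayesian update p(x|sigma,M) ~ p(sigma|x,M) p(x|M)
# for the toy observable f(y) = a + b*y + c*y**2 used on this slide.
# All numbers below (prior widths, d, noise levels, Delta_t) are assumptions.

rng = np.random.default_rng(0)

# "True" parameters, used only to simulate pseudo-experimental data.
a_true, b_true, c_true = 1.0, 0.5, -0.2
y = np.linspace(0.0, 2.0, 11)
d = 0.05                                   # width of the relative random interval
r = rng.uniform(-1.0, 1.0, size=y.size)    # uniform random variable
delta_t = 0.02 * y                         # assumed systematic error Delta_t(y)
f_true = a_true + b_true * y + c_true * y**2
sigma_data = f_true * (1.0 + d * r) + delta_t          # simulated "experiment"
sigma_unc = np.abs(f_true) * d / np.sqrt(3.0) + 0.02   # crude standard uncertainties

# The model is linear in x = (a, b, c), so with a Gaussian prior and Gaussian
# likelihood the posterior is Gaussian and has a closed-form mean and covariance.
X = np.column_stack([np.ones_like(y), y, y**2])   # design matrix
prior_mean = np.zeros(3)
prior_cov = np.diag([10.0, 10.0, 10.0])           # weak prior p(x|M)
noise_cov_inv = np.diag(1.0 / sigma_unc**2)       # likelihood p(sigma|x,M)

post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + X.T @ noise_cov_inv @ X)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean
                        + X.T @ noise_cov_inv @ sigma_data)

for name, m, s in zip("abc", post_mean, np.sqrt(np.diag(post_cov))):
    print(f"{name} = {m:.3f} +/- {s:.3f}")
```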

  11. Theoretical cross-sections without uncertainty estimates • “The Low-Energy, Heavy-Particle Collisions–A Close-Coupling Treatment” Kimura and Lane, AAMOP, 26, 79 (1989) What is the best way to assess the quality of theoretical data without physical measurements?

  12. 5 Steps in Uncertainty Evaluation Modeling the measurement Identifying uncertainty components for each input quantity Evaluating standard uncertainty Type A, Type B Sensitivity coefficient Combining standard uncertainties of input quantities Coverage factor Expanded uncertainty
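
The five steps can be illustrated with a minimal numerical sketch. The measurement model below (y = x1/x2) and all input values and uncertainties are assumptions chosen for illustration; none of them come from the talk.

```python
import numpy as np

# Minimal sketch of the GUM workflow on this slide, applied to an assumed
# measurement model y = f(x1, x2) = x1 / x2.
# Steps: model, standard uncertainties (Type A/B), sensitivity coefficients,
# combination, coverage factor.

def f(x1, x2):
    return x1 / x2

x1, u1 = 10.0, 0.10   # input quantity 1 and its standard uncertainty (Type A, assumed)
x2, u2 = 2.0, 0.02    # input quantity 2 and its standard uncertainty (Type B, assumed)

# Sensitivity coefficients c_i = df/dx_i, here by central finite differences.
eps = 1e-6
c1 = (f(x1 + eps, x2) - f(x1 - eps, x2)) / (2 * eps)
c2 = (f(x1, x2 + eps) - f(x1, x2 - eps)) / (2 * eps)

# Combined standard uncertainty (uncorrelated inputs) and expanded uncertainty.
u_c = np.sqrt((c1 * u1) ** 2 + (c2 * u2) ** 2)
k = 2.0                      # coverage factor for ~95 % coverage
U = k * u_c

print(f"y = {f(x1, x2):.3f}, u_c = {u_c:.3f}, U (k=2) = {U:.3f}")
```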

  13. 3rd Code Centre Network TM, 2013 • Strategies for developing guidelines for the uncertainty estimates of theoretical data depend on the target, the resolution, and the observable of interest (QOI, in NRC terminology) • Atomic structure: state descriptions, operators, basis sizes, basis parameters, sensitivity; special volume in the journal Atoms with 5 papers on the topic • Atomic collisions: highly accurate, computationally intensive codes vs. production codes; benchmark results, basis sets, different methods, consistency checks • Molecular collisions: target, resonances, different methods, consistency checks

  14. IAEA-ITAMP TM: Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data • Bring together people working on electron collisions with atoms, ions, and molecules, on heavy-particle collisions, and on the electronic structure of atoms and molecules (~25 participants) • Come up with reasonable uncertainty estimates for calculations using the various methods of collision physics: perturbative, non-perturbative, time-independent, time-dependent, semi-classical, etc. • Desirable output → guidelines for estimating uncertainties of theoretical atomic and molecular data

  15. Discussion sessions are extremely important!!!

  16. Supplementary slides

  17. Community Role: Consensus Building • Change of notion: Databases → Data research • Make the younger generation (early-career researchers) aware that data evaluation is a critical part of scientific work • Disseminate materials to train students and researchers in "Critical Analysis Skills" • Disseminate the standard definitions of terminology adopted by international organizations (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.): VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007; GUM (Guide to the Expression of Uncertainty in Measurement), 2008 • Agree on the procedure of evaluation towards standard reference data

  18. Common Workflow Guidelines. Workflow of critical evaluation of data on wavelengths and energy levels (NIST). Advantages: easier to expand the evaluators' network, including early-career researchers; introduces more rigorous procedures for evaluation and increases the dependability of the evaluation. Disadvantages: the quality of the evaluation depends critically on the experience of the evaluators; different people may reach different conclusions using the same guidelines, and the results may not be reproducible. Solutions: collaborations can help reduce the disadvantages; evaluation activities with scientific advisors and editorial panels will be an effective mechanism for producing an evaluated data library.

  19. Evaluation by the Community • Community consensus with an endorsement from the IAEA or other international authorities • Group evaluation: 4-5 panelists, including junior and senior people, like the editorial board of a journal, with broad backgrounds (experimentalists, theoreticians, producers and users) • Self-evaluation: data producers with deep knowledge, in some cases; may work better for theoretical data sets • Establishment of the evaluation guidelines: these will evolve with time and experience, with broad collaboration from the community • Merits of group evaluation: facilitates knowledge transfer to the younger generation; review papers can be written for the evaluation work

  20. Experimental Data Evaluation • Checklist: uncertainty estimates or error assessment are critical; self-consistency checked (see the sketch below); experimental techniques evaluated; reputation of the data producer considered; anomalies in some experimental processes (ro-vibrational / metastable states) • Wish list: evaluation by a group of "established" experts with broad expertise; engage the community making relative cross-section measurements; provide recommended values where possible; include a comparison with theory and an assessment of overall status; evaluation will lead to an understanding of the gaps in the field; establish "benchmarks" where possible, since a benchmark carries a stronger connotation than a recommendation
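
The self-consistency item in the checklist above can be made concrete with a standard weighted-mean and Birge-ratio test across independent measurements of the same quantity. The sketch below is a minimal illustration; the cross-section values and uncertainties are invented.

```python
import numpy as np

# Minimal sketch of a self-consistency check for several independent
# measurements of the same cross section (numbers invented for illustration).

sigma = np.array([1.02, 0.97, 1.10, 0.97])   # cross sections (arb. units)
u = np.array([0.03, 0.04, 0.05, 0.06])       # reported standard uncertainties

w = 1.0 / u**2
mean = np.sum(w * sigma) / np.sum(w)          # weighted mean
u_mean = 1.0 / np.sqrt(np.sum(w))             # internal uncertainty of the mean

# Birge ratio: sqrt(chi^2 per degree of freedom); values well above 1 signal
# inconsistency (underestimated uncertainties or unrecognized systematics).
chi2 = np.sum(w * (sigma - mean) ** 2)
birge = np.sqrt(chi2 / (sigma.size - 1))

print(f"weighted mean = {mean:.3f} +/- {u_mean:.3f}, Birge ratio = {birge:.2f}")
```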

  21. Theoretical Data Evaluation • There are no established criteria for assessing theoretical data; guidelines for uncertainty estimates of theoretical data are needed • We should not try to give a straight recipe for assessing uncertainties; however, there are several starting points: there are prescriptions such as energy grids for resonances and partial waves; one may take a model, check its convergence, and estimate uncertainties based on assumptions within the model (see the sketch below); comparison with experiments can be dangerous; when comparing different theories, a theory that is clearly better than the others may be given benchmark status • For scattering data, maybe we should aim for "ideas" or "suggestions" rather than "guidelines" • Theoreticians may already have an idea of the uncertainty estimates • Journal policies can change the culture: PRA policies (G. Drake)
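
A minimal sketch of the "convergence within a model" idea mentioned above: repeat the calculation with an increasing number of coupled states or basis functions and turn the residual change into a rough uncertainty estimate. The cross-section values and the safety factor below are invented for illustration; a real application would use the output of repeated scattering-code runs.

```python
import numpy as np

# Minimal sketch of a convergence-based uncertainty estimate for a computed
# cross section.  `results` would come from repeated runs of a scattering
# code with an increasing model size; the numbers here are invented.

n_states = np.array([5, 10, 20, 40, 80])              # size of the close-coupling model
results = np.array([3.42, 3.21, 3.12, 3.09, 3.08])    # cross section (arb. units)

best = results[-1]                       # largest calculation as best estimate
# Heuristic: uncertainty from the change over the last doubling step,
# inflated by an assumed safety factor to guard against slow, monotonic convergence.
last_step = abs(results[-1] - results[-2])
uncertainty = 2.0 * last_step

print(f"sigma = {best:.2f} +/- {uncertainty:.2f} (model-convergence estimate)")
```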

  22. Error Propagation and Sensitivity Analysis: Uncertainties in "Data" and "Data Processing Toolbox"? Data-processing chain: fundamental data (atomic structure and collision codes; experimental data) → processed data (rate coefficients; velocity distribution from a Boltzmann solver, a Maxwellian, or Monte Carlo) → CR transition matrix A = A_excit + A_radiative + A_ionis + A_cx + A_recomb + ... (linear algebra, ODE solvers) → effective rates, population coefficients, cooling rates, beam-stopping rates, ... Sensitivity and error propagation to the final model results (PDEs, integro-differential equations) are left to modelers and spectroscopists.
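
One simple way to probe how uncertainties in the fundamental data propagate through the data-processing toolbox is Monte Carlo sampling: perturb the rate coefficients within assumed uncertainties, rebuild the collisional-radiative (CR) rate matrix, re-solve for the level populations, and inspect the spread. The 3-level rates and the 20 % relative uncertainty below are invented for illustration.

```python
import numpy as np

# Minimal sketch of Monte Carlo error propagation through a CR rate matrix.
# The 3-level rates are invented; a real application would perturb evaluated
# rate coefficients within their stated uncertainties.

rng = np.random.default_rng(1)

# Nominal rates (s^-1): R[i, j] is the rate from level j to level i, i != j.
R_nominal = np.array([[0.0,   5.0e3, 1.0e2],
                      [1.0e4, 0.0,   2.0e3],
                      [1.0e2, 8.0e3, 0.0]])
rel_unc = 0.20            # assumed 20 % relative uncertainty on every rate

def steady_state(R):
    """Steady-state populations (normalized to 1) from off-diagonal rates R."""
    A = R.copy()
    np.fill_diagonal(A, 0.0)
    A -= np.diag(A.sum(axis=0))          # losses from each level on the diagonal
    A[-1, :] = 1.0                       # replace one balance equation by sum(n) = 1
    b = np.zeros(A.shape[0]); b[-1] = 1.0
    return np.linalg.solve(A, b)

samples = []
for _ in range(2000):
    factors = rng.lognormal(mean=0.0, sigma=rel_unc, size=R_nominal.shape)
    samples.append(steady_state(R_nominal * factors))
samples = np.array(samples)

n0 = steady_state(R_nominal)
for i, (nom, mu, sd) in enumerate(zip(n0, samples.mean(axis=0), samples.std(axis=0))):
    print(f"level {i}: n = {nom:.3f}, MC mean = {mu:.3f}, MC std = {sd:.3f}")
```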

  23. Conceptual Changes of Uncertainties. ~1984: a measure of the possible error in the estimated value of the measurand as provided by the result of a measurement. ~1992: an estimate characterizing the range of values within which the true value of a measurand lies (VIM 1). ~2007: parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand (VIM 2). 2008~: parameter that characterizes the dispersion of the quantity values being attributed to a measurand, based on the information used (VIM 3).

  24. Uncertainty Evaluation: variation of each parameter. [Cause-and-effect (fishbone) diagram of uncertainty sources contributing to the Result: Analyst, Sampling, Data treatment, Calibration, Laboratory, Reagent, Instrument, with sub-factors including ability, experience, assumptions, extraction, stability, matrix, reference material, resolution, drift, automation, temperature, humidity, pressure, purity, impurity.] Guide to the Expression of Uncertainty in Measurement (GUM), 1993: BIPM, IEC, IFCC, ISO, IUPAC, IUPAP, OIML.
