
Coordinated Activities On Data Evaluation And Recommendation

H.-K. Chung and B. J. Braams, Atomic and Molecular Data Unit, Nuclear Data Section, Division of Physical and Chemical Sciences, IAEA, Vienna. March 20th, 2013.


Presentation Transcript


  1. Coordinated Activities On Data Evaluation And Recommendation. H.-K. Chung and B. J. Braams, Atomic and Molecular Data Unit, Nuclear Data Section, Division of Physical and Chemical Sciences, IAEA, Vienna. March 20th, 2013. 3rd Research Coordination Meeting of the CRP on Light Element Atom, Molecule and Radical Behaviour in the Divertor and Edge Plasma Regions.

  2. Network Collaboration for A+M/PSI Data for Fusion
  • Goals: review progress and achievements of the A+M/PSI data for fusion programme; stimulate international cooperation in the measurement, compilation and evaluation of A+M/PSI data for fusion.
  • Data Users (fusion laboratories): ITER; EFDA; JET, UKAEA; ASDEX-Upgrade, IPP; TEXTOR, Jülich, FZJ; KSTAR, NFRI; NIFS; JAEA; PPPL; ORNL.
  • Data Centre Network (data centres and evaluators): ADAS, H. Summers; CRAAMD, Y. Jun; IAEA, B. J. Braams; JAEA, T. Nakano; KAERI, Y. Rhee; Kurchatov, Yu. Martynenko; NIFS, I. Murakami; NIST, W. L. Wiese; NFRI, J. Yoon; ORNL, D. R. Schultz.
  • Code Centre Network (data producers): Curtin Univ., I. Bray; Kitasato Univ., F. Koike; Univ. Autonoma de Madrid, I. Rabadan; Univ. P&M. Curie, Paris, A. Dubois; Univ. of Bari, M. Capitelli; Kurchatov Institute, A. Kukushkin; Lebedev Institute, L. Vainshtein; FZJ, D. Reiter; Ernst-Moritz-Arndt Univ., R. Schneider; NIST, Y. Ralchenko; PPPL, D. Stotler; LANL, J. Abdallah Jr.; IAEA, B. J. Braams; HULLAC, M. Klapisch; CNEA, P. D. Fainstein.
  • IAEA coordination: CRPs, publications, knowledgebase, databases, meetings.

  3. Code Centre Network Meeting (October 2010)
  • The CCN is organized to improve online code capabilities for providing the data needed by data users, particularly plasma modellers
  • Online codes generate too many data sets without quality information
  • Data users need complete sets and/or recommended data (participating data users: D. Coster, D. Reiter, R. Schneider, D. Elder)
  • The IAEA network finds a critical need for evaluated and recommended data

  4. Typical edge transport code runtime (for the same model, same equations, same grid size):
  • ITER (R = 6.2 m), Cadarache, FRA: 3 months
  • JET (R = 2.96 m), Oxford, UK: 1-2 weeks
  • TEXTOR (R = 1.75 m), Jülich, GER: 1 day
  The runtime grows with machine size because plasma chemistry becomes more important (increased non-linearity and non-locality in the sources).

  5. Data Centre Network Meeting (September 2011)
  • Data evaluation tasks are difficult:
  • Lack of manpower: experts are retiring or leaving the field
  • Evaluation requires multiple data sets: there are often too many or too few
  • Very few benchmark experiments exist for collisional data
  • Even fewer uncertainty estimates exist for theoretical data
  • Recommendations:
  • Data should first be collected and made available for evaluation
  • Evaluation activities should be organized in the community
  • Evaluation guidelines should be established in the community
  • A list of recommended data sets should be available as a final product

  6. Coordination Meetings for Evaluation: http://www-amdis.iaea.org/DCN/Evaluation/

  7. IAEA-NFRI TM on Data Evaluation: http://www-amdis.iaea.org/meetings/NFRI2012/
  • Presentation sessions (focused on reaction data):
  • Current evaluated databases (Kramida, Landi, Mason)
  • Evaluation methods and experiences (Itikawa, Kumar, Cho, Karwasz)
  • Error propagation and sensitivity analysis (O'Mullane, Ballance, Reiter, Krstic)
  • Theoretical data evaluation (Aggarwal, Liang, Takagi, Song)
  • Experimental data evaluation (Nakamura, Buckman, Shevelko, Imai)
  • Data centres' evaluation activities (Yoon, Murakami, Mason, Chung)
  • More than 20 participants from Australia, China, Germany, India, Japan, Korea, Poland, Russia, UK, USA and the IAEA.

  8. Summary of Discussions
  Community consensus:
  • Involve the community in data evaluation
  • Engage the young generation for transfer of knowledge
  • Define the terminology and vocabulary used in evaluation
  • Define common workflow guidelines
  Technical issues:
  • Assessment of theoretical data
  • Assessment of experimental data
  • Error propagation and sensitivity analysis

  9. Community Role: Consensus Building
  • Change of notion: from databases to data research
  • Show the young generation (in early career) that data evaluation is a critical part of scientific work
  • Disseminate materials to train students and researchers in "critical analysis skills"
  • Disseminate the standard definitions of terminology adopted by international organizations (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.):
  • VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures) 2007
  • GUM (Guide to the Expression of Uncertainty in Measurement) 2008
  • Agree on the procedure of evaluation towards standard reference data

  10. Uncertainty Approach
  • Terminology in metrology:
  • VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures) 2007
  • GUM (Guide to the Expression of Uncertainty in Measurement) 2008
  • Measurement and uncertainty: the objective of a measurement is to determine the value of the measurand; in general, a measurement has imperfections that give rise to an error in its result.
  • Error approach: Error = measurement result - true value, where the true value is the value consistent with the definition of the given particular quantity.
  • Uncertainty approach: Error = measured value - reference value, where the reference value (assigned value) is either a true quantity value of the measurand, in which case the measurement error is unknowable, or an appropriate known quantity value such as a conventional quantity value or a specified target quantity value to be realized in a production process.
  [Diagram: a measured value together with its uncertainty interval]
  A minimal numerical sketch of the uncertainty approach follows below.
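
To make the uncertainty approach concrete, the sketch below applies the GUM rule for combining independent standard uncertainties in quadrature and reporting an expanded uncertainty with a coverage factor. The cross-section example and all numbers are hypothetical, not taken from the presentation.

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style combination for uncorrelated inputs:
    u_c = sqrt(u_1^2 + u_2^2 + ...)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical cross-section measurement with three independent
# uncertainty components (all values are illustrative only).
value = 3.42e-16          # measured cross section, cm^2
u_statistics = 0.05e-16   # counting statistics
u_calibration = 0.08e-16  # detector calibration
u_pressure = 0.03e-16     # target gas pressure

u_c = combined_standard_uncertainty([u_statistics, u_calibration, u_pressure])
k = 2  # coverage factor, roughly a 95% coverage interval
print(f"sigma = {value:.2e} +/- {k * u_c:.2e} cm^2 (k = {k})")
```

In GUM terms the reported interval is the expanded uncertainty U = k * u_c around the measured value; the standard reference data of the next slide is precisely the statement that the true value is believed to lie within such an interval.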

  11. Standard Reference Data based on VIM & GUM: the true value lies within the uncertainty range.

  12. The procedure of evaluation towards Standard Reference Data (NFRI): Data compilation → 1st evaluation (experts) → Final evaluation (panel decision).

  13. Evaluation by the Community
  • Community consensus with an endorsement from the IAEA or other international authorities
  • Group evaluation: 4-5 panelists, junior and senior, with broad backgrounds (experimentalists, theoreticians, producers and users), organized like the editorial board of a journal
  • Self-evaluation: in some cases data producers with deep knowledge evaluate their own data; this may work better for theoretical data sets
  • Establishment of evaluation guidelines: these will evolve with time and experience, with broad collaboration from the community
  • Merits of group evaluation:
  • It facilitates the transfer of knowledge to the younger generation
  • Review papers can be written about the evaluation work

  14. Common Workflow Guidelines
  Example: the NIST workflow for critical evaluation of data on wavelengths and energy levels.
  • Advantages: easier to expand the evaluators' network to include early-career researchers; introduces more rigorous procedures for evaluation and increases the dependability of the evaluation.
  • Disadvantages: the quality of the evaluation depends critically on the experience of the evaluators; different people may reach different conclusions using the same guidelines, so the results may not be reproducible.
  • Solutions: collaboration can help reduce the disadvantages. Evaluation activities with scientific advisors and editorial panels will be a good mechanism for producing the evaluated data library.

  15. Experimental Data Evaluation
  • Check list:
  • Uncertainty estimates or error assessments are critical
  • Self-consistency checked (see the sketch after this slide)
  • Experimental techniques evaluated
  • Reputation of the data producer considered
  • Anomalies in some experimental processes (ro-vibrational / metastable states) considered
  • Wish list:
  • Evaluation by a group of "established" experts with broad expertise
  • Engage the community doing relative cross-section measurements
  • Provide recommended values where possible
  • Include a comparison with theory and an assessment of the overall status
  • Evaluation will lead to an understanding of the gaps in the field
  • Establish "benchmarks" where possible; a benchmark carries a stronger connotation than a recommendation
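
One simple, standard form of the self-consistency check above is a weighted mean across independent measurements of the same quantity together with a Birge-ratio (reduced chi-squared) test: a Birge ratio well above 1 signals that the scatter between data sets exceeds their stated uncertainties. This is a minimal sketch with hypothetical numbers, not a procedure prescribed by the talk.

```python
import math

def weighted_mean_consistency(values, uncertainties):
    """Weighted mean of independent measurements plus the Birge ratio
    R = sqrt(chi^2 / (N - 1)); R >> 1 means the data sets disagree
    beyond their stated uncertainties."""
    weights = [1.0 / (u * u) for u in uncertainties]
    mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    u_mean = 1.0 / math.sqrt(sum(weights))
    chi2 = sum(w * (x - mean) ** 2 for w, x in zip(weights, values))
    birge = math.sqrt(chi2 / (len(values) - 1))
    return mean, u_mean, birge

# Three hypothetical measurements of the same cross section (arb. units),
# each with its reported standard uncertainty.
mean, u_mean, birge = weighted_mean_consistency([4.1, 3.9, 4.6],
                                                [0.20, 0.15, 0.25])
print(f"weighted mean = {mean:.2f} +/- {u_mean:.2f}, Birge ratio = {birge:.2f}")
```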

  16. Theoretical Data Evaluation
  • There are no established criteria for assessing theoretical data
  • Guidelines are needed for uncertainty estimates of theoretical data
  • One should not try to give a single recipe for assessing uncertainties; however, there are several starting points:
  • There are prescriptions such as energy grids for resonances and partial waves
  • One may study convergence within a model and estimate uncertainties based on assumptions within the model (a minimal sketch follows after this slide)
  • Comparison with experiments: this can be dangerous
  • Comparison of different theories: if one theory is demonstrably better than the others, it may be given benchmark status
  • For scattering data we should perhaps aim for "ideas" or "suggestions" rather than "guidelines"
  • Theoreticians may already have an idea of the uncertainty estimates
  • Journal policies can change the culture: the PRA policy on uncertainty estimates (G. Drake)
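
The convergence idea above can be illustrated with a minimal sketch: recompute a quantity at increasing refinement levels (basis size, number of partial waves, grid density) and take the change between the last two levels as a crude, model-internal uncertainty estimate. The function and the sequence of results are hypothetical, not from the talk.

```python
def convergence_uncertainty(results):
    """Crude model-internal uncertainty estimate: take the value at the
    finest refinement level and use the change from the previous level
    as the uncertainty. Assumes the sequence is actually converging,
    which the evaluator must verify."""
    if len(results) < 2:
        raise ValueError("need results at two or more refinement levels")
    return results[-1], abs(results[-1] - results[-2])

# Hypothetical cross sections computed with an increasing number of
# partial waves (or basis states); the sequence appears to converge.
results_by_level = [5.92, 5.31, 5.12, 5.06, 5.04]  # arb. units
best, u_est = convergence_uncertainty(results_by_level)
print(f"converged value ~ {best} +/- {u_est:.2f} (model-internal estimate)")
```

Such an estimate covers only the convergence error within the chosen model; errors in the model itself must still be judged separately, for example by comparing different theories as the slide notes.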

  17. Error Propagation and Sensitivity Analysis: uncertainties in the "data" and in the "data processing toolbox"?
  • Fundamental data: atomic structure and collision codes; experimental data
  • Velocity distribution: Maxwellian or Boltzmann solver; Monte Carlo
  • CR transition matrix: A = A_excit + A_radiative + A_ionis + A_cx + A_recomb + ...; linear algebra, ODE solvers
  • Processed data (rate coefficients): effective rates, population coefficients, cooling rates, beam-stopping rates, ...
  • Sensitivity and error propagation to the final model results (PDEs, integro-differential equations, ...): left to modellers and spectroscopists
  (A Monte Carlo propagation sketch follows after this slide.)
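
The slide's question, how uncertainties in fundamental rates propagate through the collisional-radiative matrix to processed quantities, is commonly answered by Monte Carlo sampling. Below is a minimal sketch on a hypothetical two-level system; all rates and uncertainties are made up for illustration and the real CR matrix is of course much larger.

```python
import random

def population_ratio(c_exc, c_dex, a_rad, n_e):
    """Steady-state upper/lower population ratio of a two-level system:
    n_e * C_exc * n1 = (A_rad + n_e * C_dex) * n2."""
    return n_e * c_exc / (a_rad + n_e * c_dex)

# Nominal rates and relative (1-sigma) uncertainties; all values are
# illustrative only, not real atomic data.
n_e = 1.0e13                      # electron density, cm^-3
c_exc, rel_u_exc = 2.0e-9, 0.30   # excitation rate coefficient, cm^3/s
c_dex, rel_u_dex = 5.0e-10, 0.30  # de-excitation rate coefficient, cm^3/s
a_rad, rel_u_rad = 1.0e6, 0.05    # radiative decay rate, 1/s

# Monte Carlo propagation: perturb each rate with a lognormal factor
# (keeps rates positive) and collect the derived quantity's distribution.
random.seed(1)
samples = sorted(
    population_ratio(c_exc * random.lognormvariate(0.0, rel_u_exc),
                     c_dex * random.lognormvariate(0.0, rel_u_dex),
                     a_rad * random.lognormvariate(0.0, rel_u_rad),
                     n_e)
    for _ in range(10000))
n = len(samples)
median = samples[n // 2]
lo, hi = samples[int(0.16 * n)], samples[int(0.84 * n)]
print(f"population ratio ~ {median:.3e}, 68% interval [{lo:.3e}, {hi:.3e}]")
```

The same sampling approach scales to full CR matrices: perturb all input rates within their uncertainties, rebuild the matrix, solve, and read the spread of the effective rates off the resulting distribution.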

  18. Future Activities: the community's feedback and support are needed.

  19. Near-term Actions
  • Develop an evaluators' network; key people have been identified
  • Sketch out guidelines for the uncertainty assessment of theoretical data:
  • Code Centre Network Meeting in May 2013
  • IAEA-ICTP-ITAMP workshop in 2014
  • Organize a workshop (SUP@VAMDC)
  • Joint meeting with the eMOL group in May 2013 (water evaluation)
  • NFRI to organize a data evaluation group for demonstration:
  • The 1st group evaluation meeting in January 2013
  • The 2nd group evaluation meeting in June 2013
  • Inventorise the data sets that are now used by fusion plasma modellers:
  • European Fusion Development Association (EFDA) Integrated Transport Modelling Task Force (ITM-TF) workshop in March 2013
  • The ITER project should recognize the need for standard reference data (SRD) for the A+M/PSI processes used in the design

  20. Long-term goal: a global network towards an internationally agreed data library for fusion and other plasma applications.
  [Diagram: a cycle linking Data Users (data needs), Data Producers (experimental and theoretical data production) and Data Evaluators (data compilation, evaluation and recommendation)]

  21. Summary
  • The series of IAEA meetings, including the joint IAEA-NFRI TM on Data Evaluation, was highly successful in building consensus among participants on coordinated, community-driven data evaluation activities:
  • Disseminate the concepts of VIM3 (International Vocabulary of Metrology), GUM (Guide to the Expression of Uncertainty in Measurement) and "critical assessment skills"
  • Engage the younger generation in the process
  • Collaborate with colleagues in the community
  • Change the culture of data research through publications
  • The IAEA A+M Data Unit will actively participate in organizing and coordinating the community effort in data evaluation activities, ultimately towards a standard data library for fusion applications:
  • Assess the needs of the user communities
  • Collaborate with SUP@VAMDC
  • We urge you, the community, to join us in the data evaluation activities that will benefit data users, producers and evaluators in the future.
