1 IAEA International Atomic Energy Agency
Coordinated Activities on Evaluation of Collisional Data for Fusion Applications
H.-K. Chung and B. J. Braams
Atomic and Molecular Data Unit, Nuclear Data Section, Division of Physical and Chemical Sciences, IAEA, Vienna
ICAMDATA-8, Gaithersburg, MD, October 1-4, 2012

2 IAEA Contributors to this presentation (direct): I. Murakami, D. Kato, T. Nakano, Y. Itikawa, Y. Nakamura, A. M. Imai, H. Takagi, T. Kato, F. Koike (Japan), G.W.F. Drake, D. R. Schultz, A. Kramida, P. Krstic, E. Landi, C. Ballance (USA), J.-S. Yoon, M.-Y. Song, H. Cho, C.-G. Kim, J.-O. Choi (Korea), J. Yan, G. Liang (China), M. O'Mullane, N. Mason, K. Aggarwal (UK), D. Reiter, D. Coster, J. Roth (Germany), V. Kumar (India), S. Buckman (Australia), V. Shevelko (Russia), G. Karwasz (Poland), S. Lisgo (ITER)

3 IAEA Outline
- IAEA Atomic and Molecular Data Unit
- Needs of Evaluated Data for Fusion Applications
- IAEA Coordinated Activities on Evaluation of Collisional Data for Fusion Applications
- Future Activities

4 IAEA IAEA ATOMIC AND MOLECULAR DATA UNIT Who we are and why we are here

5 IAEA "Atomic energy for peace, health and prosperity"
The IAEA mission: accelerate and enlarge the contribution of atomic energy to peace, health and prosperity. The Agency assists its Member States, in the context of their social and economic goals, in planning for and using nuclear science and technology for various peaceful purposes, including the generation of electricity. 155 Member States, about 2200 staff.
The IAEA A+M Data Unit was formed in 1977, following a 1976 meeting at Culham Laboratory, UK, in order to:
- Review progress and achievements of A+M/PSI data for the fusion programme
- Stimulate international cooperation in the measurement, compilation and evaluation of A+M/PSI data for fusion

6 IAEA International Coordination of A+M/PSI Data Research for Fusion (overview)
- Data production: theories and measurements, fusion plasma modelling
- Data compilation and data evaluation
- Coordination mechanisms: Coordinated Research Projects (CRP), Consultants Meetings (CM), Technical Meetings (TM)
- Databases (AMBDAS, GENIE, ALADDIN, Wiki…) and publications (INDC, APID, Bulletin)

7 IAEA Online AM/PSI Data Services: http://www-amdis.iaea.org

8 IAEA Network Collaboration for AM/PSI Data for Fusion
Data Centre Network: ADAS (H. Summers), CRAAMD (Y. Jun), IAEA (B. J. Braams), JAEA (T. Nakano), KAERI (Y. Rhee), Kurchatov (Yu. Martynenko), NIFS (I. Murakami), NIST (W. L. Wiese), NFRI (J. Yoon), ORNL (D. R. Schultz)
Fusion Laboratories: ITER, EFDA JET (UKAEA), ASDEX-Upgrade (IPP), TEXTOR (Jülich, FZJ), KSTAR (NFRI), NIFS, JAEA, PPPL, ORNL
Code Centre Network: Curtin Univ. (I. Bray), Kitasato Univ. (F. Koike), Univ. Autonoma de Madrid (I. Rabadan), Univ. P. & M. Curie, Paris (A. Dubois), Univ. of Bari (M. Capitelli), Kurchatov Institute (A. Kukushkin), Lebedev Institute (L. Vainshtein), FZJ (D. Reiter), Ernst-Moritz-Arndt Univ. (R. Schneider), NIST (Y. Ralchenko), PPPL (D. Stotler), LANL (J. Abdallah Jr.), IAEA (B. J. Braams), HULLAC (M. Klapisch), CNEA (P. D. Fainstein)
IAEA coordination links data producers, data centres and evaluators, and data users through CRPs, publications, the knowledgebase, databases and meetings.

9 IAEA Data Users Need: Evaluated and Recommended Data
Code Centre Network Meeting (October 2010)
- The CCN is organized to improve online code capabilities in order to provide the data needed by data users, particularly plasma modellers
- Data users (D. Coster, D. Reiter, R. Schneider, D. Elder) participated in order to interact with the code producers
Discussions
- Online codes generate too many data sets without quality information
- Data users need complete sets and/or recommended data
At every meeting, data evaluation and data quality were an issue: IAEA meetings, VAMDC (compilation & distribution), SUP@VAMDC.

10 IAEA NEEDS OF EVALUATED DATA FOR FUSION APPLICATIONS From the users' perspective

11 IAEA Typical edge transport code runtime (for the same model, same equations, same grid size):
- TEXTOR (R=1.75 m), Jülich, GER: 1 day
- JET (R=2.96 m), Oxford, UK: 1-2 weeks
- ITER (R=6.2 m), Cadarache, FRA: 3 months
The increase is due to the greater importance of plasma chemistry (increased non-linearity and non-locality in the sources).

12 IAEA Requirement for Plasma Modelling: reliable data sets for AMNS data
- Data needs to be "verified" by an expert
- Data needs to be "robust"
- Data needs to be comprehensive
- Data needs to be easy to use
- Version control: know which data were used for a particular run, and which runs used a particular version of the data (a sketch of one way to do this follows below)
- Needs to be efficient
- Needs to be able to address special needs
This requires a recommended and internationally agreed library for Atomic, Molecular, Nuclear and Surface data.
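The version-control requirement above can be met in many ways; as a loose, hypothetical illustration (not a tool described in the presentation), one could record a checksum of the AMNS data file in a small metadata log for each modelling run, so that it is always possible to tell afterwards which data version a run used. The file names and log format below are invented.

```python
# Hypothetical sketch (not from the presentation): record which AMNS data
# file, identified by its SHA-256 checksum, was used for each modelling run.
import hashlib
import json
import pathlib


def data_fingerprint(path: str) -> str:
    """Checksum that uniquely identifies the exact content of a data file."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()


def record_run(run_id: str, data_file: str, log_file: str = "run_metadata.json") -> None:
    """Append the run id, data file name and its checksum to a JSON log."""
    log_path = pathlib.Path(log_file)
    entries = json.loads(log_path.read_text()) if log_path.exists() else []
    entries.append({"run": run_id,
                    "data_file": data_file,
                    "data_sha256": data_fingerprint(data_file)})
    log_path.write_text(json.dumps(entries, indent=2))


# Example (file name invented): record_run("iter_edge_case_A", "amns_rates_v3.dat")
```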

13 IAEA IAEA ACTIVITIES ON EVALUATION AND RECOMMENDATION A series of meetings to organize community efforts for the evaluation and recommendation of A+M/PSI data

14 IAEA Data Centre Network Meeting (2011)
Data evaluation tasks are difficult:
- Lack of manpower: experts are retiring or leaving the field
- Few young people enter the field (no publications, no funding)
- Evaluation requires multiple data sets: often there are too many or too few
- Very few benchmark experiments exist for collisional data
- Even fewer uncertainty estimates exist for theoretical data
Conclusions:
- Data should first be collected and made available for evaluation
- Evaluation activities should be organized within the community
- Evaluation guidelines should be established by the community
- A list of recommended data sets should be available as a final product
Current status:
- NIST: critical evaluation of atomic structure and transition probabilities
- NFRI/KRISS: national efforts to establish standard data sets
- NIFS/JAEA: evaluated data libraries, collaboration
- IAEA/ORNL: ALADDIN, individual consultancies

15 IAEA Coordination Meetings for Evaluation: http://www-amdis.iaea.org/DCN/Evaluation/
- Feb 2012: CM on Procedures for Evaluation of AM/PMI Data for Fusion (Japan). Current status and future coordination; 14 participants from Korea, Japan and China
- Jun 2012: CM on Data Evaluation & Establishment of a Standard Library of AM/PMI Data for Fusion (IAEA). 7 participants: journal editors, data users, producers and evaluators
- Sep 2012: TM on Data Evaluation for AM/PSI Processes in Fusion (Korea). 6 topics, 25 participants from 10 countries and the IAEA

16 IAEA IAEA-NFRI TM on Data Evaluation: http://www-amdis.iaea.org/meetings/NFRI2012/
24 presentations (2.5 days) and 1 day of technical discussions. Topics (focused on reaction data):
- Current evaluated databases (Kramida, Landi, Mason)
- Evaluation methods and experiences (Itikawa, Kumar, Cho, Karwasz)
- Error propagation and sensitivity analysis (O'Mullane, Ballance, Reiter, Krstic)
- Theoretical data evaluation (Aggarwal, Liang, Takagi, Song)
- Experimental data evaluation (Nakamura, Buckman, Shevelko, Imai)
- Data centres' evaluation activities (Yoon, Murakami, Mason, Chung)
Participants from Australia, China, Germany, India, Japan, Korea, Poland, Russia, UK, USA and the IAEA.

17 IAEA Summary of Discussions
Community role:
- Involve the community in data evaluation
- Engage the young generation for transfer of knowledge
- Define the terminology and vocabulary used in evaluation
- Define common workflow guidelines
Technical issues:
- Assessment of theoretical data
- Assessment of experimental data
- Error propagation and sensitivity analysis

18 IAEA Community Role: Consensus Building
- Change of mindset: from databases to data research
- Engage the young generation (in early career)
  - The culture should teach that data evaluation is a critical part of scientific work
  - Publication issues: review publications are good for career development
  - Funding issues: evaluation exposes gaps in the understanding of the field and identifies interesting problems
- Disseminate materials to train students and researchers in critical analysis skills
- Disseminate the standard definitions of terminology adopted by international organizations (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.)
- Agree on the procedure of evaluation towards standard reference data

19 IAEA Define Terminology: the Uncertainty Approach
It is NOT an ERROR but an UNCERTAINTY.
Terminology in metrology:
- VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007
- GUM (Guide to the Expression of Uncertainty in Measurement), 2008
Measurement and uncertainty:
- The objective of a measurement is to determine the value of the measurand (GUM)
- In general, a measurement has imperfections that give rise to an error in its result
- Error approach: error = measurement result - true value, where the true value is the value consistent with the definition of the given particular quantity
- Uncertainty approach: error = measured value - reference value. The reference (assigned) value can be a true quantity value of the measurand, in which case the measurement error is unknowable, or an appropriate known quantity value such as a conventional quantity value or a specified target quantity value to be realized in a production process.
A measurement result is therefore reported as a value together with its uncertainty (see the formula below).
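For reference (this formula is not on the slide), the GUM's uncertainty approach reports the measured value together with a combined standard uncertainty obtained by propagating the input standard uncertainties through the measurement model y = f(x_1, ..., x_N); for uncorrelated input quantities the standard statement is:

```latex
% Error in the uncertainty approach, and the GUM law of propagation of
% uncertainty for uncorrelated input quantities; k is a coverage factor
% (often k = 2 for approximately 95 % coverage).
e = y_{\text{meas}} - y_{\text{ref}}, \qquad
u_c(y) = \sqrt{\sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)},
\qquad \text{result reported as } y \pm k\,u_c(y).
```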

20 IAEA Any measurement has an uncertainty: the result is reported as a value and an uncertainty. Influencing factors: time, method, place, person, procedure.

21 IAEA Uncertainty Approach based on VIM & GUM: The true value lies within the uncertainty range

22 IAEA Common Workflow Guidelines for the Evaluation of Collisional Data
Advantages:
- Easier to expand the evaluators' network to include early-career researchers
- More rigorous procedures for evaluation increase the dependability of the evaluation
Disadvantages:
- The quality of the evaluation still depends critically on the experience of the evaluators
- Different people may reach different conclusions using the same guidelines, so the results may not be reproducible
Solutions:
- Collaboration can help reduce the disadvantages
- Evaluation by scientific advisors and editorial panels would be a good mechanism for producing the evaluated data library
(Figure: workflow of critical evaluation of data on wavelengths and energy levels, NIST)

23 IAEA Evaluation by an Editorial-Board-Style Panel
Evaluation and recommendation require community consensus with an endorsement from the IAEA or other international authorities.
- Establishment of evaluation guidelines: these will evolve with time and experience, with broad collaboration from the community
- Group evaluation: 4-5 panelists, including young and senior people, like the editorial board of a journal, with broad backgrounds (experimentalists, theoreticians, producers and users)
- Self-evaluation: data producers with deep knowledge, in some cases; this may work better for theoretical data sets
Merits of group evaluation:
- Facilitates knowledge transfer to the younger generation
- Review papers can be written based on the evaluation work
- Makes the data research project visible to the community

24 IAEA An example of evaluation towards standard reference data (NFRI): data compilation, then a first evaluation by experts, then a final evaluation by panel decision.

25 IAEA Summary of Discussions
Community role:
- Involve the community in data evaluation
- Engage the young generation for transfer of knowledge
- Define the terminology and vocabulary used in evaluation
- Define common workflow guidelines
Technical issues:
- Assessment of theoretical data
- Assessment of experimental data
- Error propagation and sensitivity analysis

26 IAEA Theoretical Data Evaluation
There are at present no agreed criteria for assessing theoretical data. A critical need: guidelines for uncertainty estimates of theoretical data.
- One should not try to give a single recipe for assessing uncertainties, but there are several starting points. There are prescriptions such as energy grids for resonances and partial waves. One may vary a model to check convergence and estimate uncertainties from the assumptions within the model (see the sketch after this list).
- Comparison with experiments: this can be dangerous.
- Comparison among different theories: if one theory is demonstrably better than the others, it may be given benchmark status.
- For scattering data, perhaps we should aim for "ideas" or "suggestions" rather than "guidelines"; theoreticians may already have an idea of the uncertainty estimates.
- Journal policies can change the culture: e.g. the PRA editorial policy on uncertainty estimates in theoretical papers.
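As a loose illustration of the convergence idea in the first bullet above (not code from the presentation), the spread of the last few members of a convergence sequence, e.g. cross sections computed with ever larger expansions or partial-wave sums, can serve as a crude model-based uncertainty estimate. All numbers below are invented.

```python
# Hypothetical sketch: use the convergence of a sequence of model results
# (e.g. cross sections from ever larger expansions) as a crude uncertainty
# estimate for the final theoretical value.  All numbers are invented.

def convergence_uncertainty(values):
    """Take the converged value as the last entry and half the spread of the
    final few entries as a rough uncertainty estimate."""
    tail = values[-3:] if len(values) >= 3 else values
    return values[-1], (max(tail) - min(tail)) / 2.0


# Invented cross sections (10^-16 cm^2) for increasing model size.
sigma_sequence = [1.32, 1.21, 1.17, 1.155, 1.151]
best, unc = convergence_uncertainty(sigma_sequence)
print(f"sigma = {best:.3f} +/- {unc:.3f}  (model-convergence estimate)")
```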

27 IAEA Experimental Data Evaluation
Checklist:
- Uncertainty estimates or error assessment are critical
- Check self-consistency
- Evaluate the experimental techniques
- Consider the reputation of the data producer
- Watch for anomalies in some experimental conditions (ro-vibrational excitation, metastable populations)
Wish list:
- Evaluation by a group of established experts with broad expertise
- Provide recommended values where possible
- Include a comparison with theory and an assessment of the overall status
- Evaluation will lead to an understanding of the gaps in the field
- Establish benchmarks where possible

28 IAEA Error Propagation and Sensitivity Analysis: uncertainties in the "data" and in the "data processing toolbox"?
Data chain: atomic structure codes, collision codes and experimental data provide the fundamental data, which are turned into processed data (rate coefficients); the collisional-radiative (CR) transition matrix A = A_excitation + A_radiative + A_ionization + A_recombination + A_charge-exchange + … then yields effective rates, population coefficients, cooling rates, beam stopping rates, ….
Processing toolbox: velocity distributions (Boltzmann solver or Maxwellian), linear algebra, ODE solvers, Monte Carlo sampling (see the sketch below).
Sensitivity analysis and error propagation to the final model results (PDEs, integro-differential equations, …) are left to the modellers and spectroscopists.
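As an illustration of the Monte Carlo option in the processing toolbox above (a sketch with invented rates, not code from any of the groups mentioned), one can sample the fundamental rate coefficients within assumed uncertainties, solve the steady-state population balance for each sample, and quote the spread of a derived quantity:

```python
# Illustrative sketch only: Monte Carlo propagation of rate-coefficient
# uncertainties through a toy 2-level collisional-radiative balance.
# All rate values and the 20% uncertainty are invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)

ne = 1.0e13                      # electron density, cm^-3
k_exc, k_dex = 2.0e-9, 6.0e-9    # excitation / de-excitation rate coeff., cm^3/s
a_rad = 1.0e7                    # radiative decay rate, s^-1
rel_unc = 0.20                   # assumed 20% (1-sigma) relative uncertainty

samples = []
for _ in range(2000):
    # Perturb each fundamental rate log-normally within its uncertainty.
    kx = k_exc * rng.lognormal(0.0, rel_unc)
    kd = k_dex * rng.lognormal(0.0, rel_unc)
    ar = a_rad * rng.lognormal(0.0, rel_unc)
    # Steady state: n1*ne*kx = n2*(ne*kd + ar); normalize n1 + n2 = 1.
    ratio = ne * kx / (ne * kd + ar)     # n2 / n1
    n2 = ratio / (1.0 + ratio)
    # Derived quantity: photon emission rate per ion (population * A).
    samples.append(n2 * ar)

samples = np.asarray(samples)
print(f"emissivity per ion: {samples.mean():.3e} "
      f"+/- {samples.std():.3e} s^-1  (Monte Carlo, N={samples.size})")
```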

29 IAEA FUTURE ACTIVITIES Need community feedback and support

30 IAEA Near-term goals
Priorities for evaluation:
- Electron scattering on Be^q+ (IAEA CRP)
- Electron scattering on CH4 (NFRI group)
- Charge exchange and electron loss for H on Be^q+
Action items:
- Develop an evaluators' network: key people have been identified
- Inventorise the data sets that are now used by fusion plasma modellers
- Sketch out guidelines for the uncertainty assessment of theoretical data
- Organize a workshop (SUP@VAMDC)
- NFRI to organize a data evaluation group for a demonstration
- The ITER project should recognize the need for standard reference data (SRD) for the A+M processes used in the design

31 IAEA Long-term goal: a global network towards an internationally agreed data library for fusion and other plasma applications, linking data users (data needs), data producers (experimental and theoretical data production) and data evaluators (data compilation, evaluation and recommendation).

32 IAEA The development of a standard library: document the data sets used by the fusion community. This yields a priority list of critically needed data and a database of data available for evaluation, forming the basis of an evaluated data library.

33 IAEA Evaluation of evaluated data sets: a prototype of an evaluated data library

34 IAEA Summary
The series of IAEA meetings, including the joint IAEA-NFRI TM on Data Evaluation, was highly successful in drawing consensus from the participants on coordinated data evaluation activities by the community:
- Disseminate the concepts of VIM3 (International Vocabulary of Metrology), GUM (Guide to the Expression of Uncertainty in Measurement) and critical assessment skills
- Engage the younger generation in the process
- Collaborate with colleagues in the community
- Change the culture around data research through publications
The IAEA A+M Data Unit will actively participate in organizing and coordinating the community effort in data evaluation activities, ultimately towards a standard data library for fusion applications:
- Assess the needs of the user communities
- Collaborate with SUP@VAMDC
We urge you, the community, to join us in the data evaluation activities, which will benefit data users, producers and evaluators in the future.

