Workshop on research assessment in CERIF Stephen Grace, Brigitte Jörg, Aija Kaitera, Maximilian Stempfhuber.


1 Workshop on research assessment in CERIF Stephen Grace, Brigitte Jörg, Aija Kaitera, Maximilian Stempfhuber

2 Workshop on research assessment in CERIF
- Three ten-minute presentations on systems in the UK, Germany and Finland
- One-hour discussion on common data elements across different national boundaries
- Map them to CERIF
- Submit to the CERIF Task Force

3 Questions to ask
- WHO does it? Is it local or national?
- WHEN and WHERE does it take place? One-off or regular activity?
- WHY is it done? What is the purpose of the exercise – to allocate money, to improve performance, to encourage co-operation etc.?
- WHAT information/data is needed? What types of information are needed?
- HOW do universities handle the process? Systems, staff involved?

4 Assessment in the UK (Stephen Grace)
Research Assessment Exercise (RAE):
- WHO does it? Funding Councils
- WHEN? Every six years
- WHERE? From university to Funding Council
- WHY? To identify research excellence in order to allocate funding
- WHAT? Information on researchers, their outputs, sources of funding, studentships, measures of esteem
- HOW? Gather data from internal systems to submit to the Funding Council using an XML schema or online form filling (see the sketch after this list)
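Purely to illustrate what the HOW step implies in practice, the sketch below gathers a few records from an internal system and serialises them to XML; the element names (Submission, Researcher, Output, etc.) are invented for the example and are not the actual RAE/REF submission schema.

```python
# Hedged sketch: turning internal research records into an XML submission.
# Element and attribute names are illustrative only, not the real funder schema.

import xml.etree.ElementTree as ET

# Toy records as they might come from an internal research information system.
researchers = [
    {"staff_id": "A123", "name": "Jane Doe",
     "outputs": [{"title": "A study of X", "year": "2008", "type": "journal-article"}]},
]

root = ET.Element("Submission")              # hypothetical root element
for r in researchers:
    person = ET.SubElement(root, "Researcher", attrib={"staffId": r["staff_id"]})
    ET.SubElement(person, "Name").text = r["name"]
    outputs = ET.SubElement(person, "Outputs")
    for o in r["outputs"]:
        out = ET.SubElement(outputs, "Output", attrib={"type": o["type"]})
        ET.SubElement(out, "Title").text = o["title"]
        ET.SubElement(out, "Year").text = o["year"]

# The resulting file would then be validated against the funder's XML schema.
ET.ElementTree(root).write("submission.xml", encoding="utf-8", xml_declaration=True)
```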

5 Research Excellence Framework (REF)
- The Research Excellence Framework (REF) aims to be simpler to administer than the RAE
- Introduces an Impact element (25% of marks) to cover benefits to the economy, society, culture, public policy and services, health, the environment, international development and quality of life
- Still working out what will be asked, and the form of submissions, in the area of Impact

6 Assessment in Germany (Maximilian Stempfhuber)
Research Rating:
- WHO does it? German Science Council (Wissenschaftsrat)
- WHEN? Since 2007, pilot phase for selected disciplines
- WHERE? At universities and research centres jointly funded by the federal ministries and states
- WHY? To rate and compare research performance
- WHAT? Information on research groups: quality, impact / effectiveness, promotion of young scientists, transfer
- HOW? Questionnaire for all information except publications; self-reporting system for publications or use of a commercial reference database (depending on discipline)

7 Rating (not ranking)
- Pilot studies in Chemistry and Sociology (2007/08), with different modes of data collection
- Current: Electrical Engineering & Information Technology; planned: Humanities
- Currently not used for performance-based funding
- In parallel: University Excellence Initiative (Wissenschaftsrat & DFG) and CHE University Ranking (CHE & Die Zeit)

8 1: Quality of Research
- Citations: per selected publication; per selected publication, normalized to journal
- No. of reviewed publications
- Publications: list of publications per publication type; 5 most important publications
- List of patents

9 1: Quality of Research (cont.)
- Total external funding spent: public funding; commercial funding
- List of research awards etc.
- Management of national / international research networks

10 2: Impact / Effectiveness
- Publications: no. of reviewed publications
- Patents: no. of patents, total of earned licensing fees
- Total external funding spent: public funding; commercial funding
- Percentage of externally funded staff
- Management of national / international research networks

11 2: Impact / Effectiveness (cont.)
- Citations of selected (important) publications
- Citations of selected (important) publications, normalized to journal
- No. of funded visiting researchers
- No. of distinguished positions in external scientific organizations, learned societies etc.
- Chair / co-chair at conferences
- Ratio: reviewed publications / scientific staff

12 3: Promotion of young scientists
- PhDs: no. of successful PhD studies / female applicants
- List of (structured) PhD programmes
- No. of advised and completed PhD theses
- No. of persons receiving grants
- List of first-time professorships
- No. of habilitations / female applicants

13 4: Knowledge / Technology Transfer
- No. of patents, total of earned license fees
- Total external funding from industry
- List of spin-offs
- Involvement in standards bodies
- List of positions relevant for transfer
- List of activities in knowledge transfer

14 Assessment in Finland (Aija Kaitera)
- WHO does it? Academy of Finland / Ministry of Education
- WHEN? Every three years
- WHERE? From individual committees and the universities
- WHY? The state and quality of scientific research in Finland; funding allocation
- WHAT? Publications, international mobility, development of research training and research careers, interaction with society
- HOW? Bibliometrics, targeted questionnaires; evaluation of disciplines and research fields (more continuous)

15 Assessment in Finland (Aija Kaitera)

16 Assessment Activities in Finland (Aija Kaitera)
- National level: research system evaluations by the Academy of Finland
  - The state and quality of scientific research in Finland (every 3 years; bibliometrics, targeted questionnaires)
  - Evaluation of disciplines and research fields (more continuous)
- Organisation level: by Scientific Advisory Boards
  - International evaluation of the organisation's activities (every 3 or 6 years)
- Operational level: by the organisation's internal committees
  - Allocation of basic funding (from ministry to universities every three years; from universities to faculties every year)
  - Allocation of quality funding
  - Evaluation of grant applications (continuous)
  - Evaluation of research programmes (continuous)

17 Assessment Activities in Finland (Aija Kaitera)
Data collection processes:
- From researcher to funding bodies
- From researcher to university
- From university to funding body
- From university to ministry

18 Assessment Activities in Finland (Aija Kaitera)
Methods:
- New national data warehouse (2013?)
  - Research: project, grant decision, person, person's education, university or polytechnic, address, research group, space, publications, innovations, other research activities, results
  - Also person, employment and academic data
- New national publication database (2012?)
  - New national publication classification
  - Open access is recorded in national reporting (is / is not); separate service for full texts
  - Will include publication quality assessment ("Norwegian model"; see the sketch after this list)
- National annual reporting
- Questionnaires
- Grant reports
- SABs (Scientific Advisory Boards)
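Purely as an illustration of how a "Norwegian model" publication quality assessment can be computed, here is a minimal sketch; the publication types, channel levels and weights below are assumptions about the general scheme, not the official Finnish classification or its coefficients.

```python
# Illustrative sketch of a "Norwegian model" style publication indicator.
# The type/level weights are assumed for illustration; the Finnish
# classification defines its own levels and coefficients.

from dataclasses import dataclass

# Assumed weights per (publication type, channel level); not authoritative.
WEIGHTS = {
    ("journal_article", 1): 1.0,
    ("journal_article", 2): 3.0,
    ("monograph", 1): 5.0,
    ("monograph", 2): 8.0,
}

@dataclass
class Publication:
    pub_type: str          # e.g. "journal_article" or "monograph"
    channel_level: int     # quality level of the journal/publisher (1 or 2)
    n_authors: int         # total number of authors
    n_local_authors: int   # authors from the reporting institution

def publication_points(pub: Publication) -> float:
    """Weight by type and channel level, fractionalised by author share."""
    weight = WEIGHTS[(pub.pub_type, pub.channel_level)]
    return weight * pub.n_local_authors / pub.n_authors

if __name__ == "__main__":
    pubs = [
        Publication("journal_article", 2, n_authors=4, n_local_authors=2),
        Publication("monograph", 1, n_authors=1, n_local_authors=1),
    ]
    print(sum(publication_points(p) for p in pubs))  # 1.5 + 5.0 = 6.5
```

The point of such a model is that the same type of output contributes different amounts depending on the quality level of its channel, which is what makes the classification usable for funding allocation.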

19 Assessment Activities in Finland (Aija Kaitera): Publication classification in Finland, 2010 onwards

20 CERIF4REF schema – 3: Tables generated in CERIF
- Base entities: cfOrgUnit-CORE, cfPers-CORE, cfPersName-ADD, cfFundProg-2ND, cfResPubl-RES
- Link entities: cfOrgUnitId_OrgUnit-LINK, cfPersName-OrgUnit-LINK, cfOrgUnit_ResPubl-LINK, cfPers_ResPubl-LINK, cfResPubl_Class-LINK, cfOrgUnit_FundProg-LINK, cfPers_Pers-LINK, cfPers_ExpSkills-LINK
- Multilingual entities: cfResPublBiblNote-LANG, cfResPublAbstr-LANG, cfClassTerm-LANG, cfOrgUnitResAct-LANG, cfOrgUnitName-LANG, cfExpSkillsDescr-LANG
The entity / link-entity pattern behind these tables is sketched after this list.
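As a rough illustration of the pattern behind these table names (base entities connected through typed link entities), here is a small sketch; the attribute set is simplified and does not reproduce the full CERIF definitions.

```python
# Simplified sketch of the CERIF entity / link-entity pattern.
# Attributes are abbreviated; real CERIF tables carry more fields
# (class schemes, fractions, multilingual variants, timestamps, etc.).

from dataclasses import dataclass

@dataclass
class CfPers:               # cfPers-CORE: a researcher
    cfPersId: str

@dataclass
class CfOrgUnit:            # cfOrgUnit-CORE: an organisational unit
    cfOrgUnitId: str

@dataclass
class CfResPubl:            # cfResPubl-RES: a research publication
    cfResPublId: str
    cfResPublDate: str

@dataclass
class CfPers_ResPubl:       # cfPers_ResPubl-LINK: person <-> publication
    cfPersId: str
    cfResPublId: str
    cfClassId: str          # role of the link, e.g. an "author" classification

@dataclass
class CfOrgUnit_ResPubl:    # cfOrgUnit_ResPubl-LINK: org unit <-> publication
    cfOrgUnitId: str
    cfResPublId: str
    cfClassId: str

# Example: one researcher at one organisation authoring one assessed output.
pers = CfPers(cfPersId="pers-001")
org = CfOrgUnit(cfOrgUnitId="org-001")
publ = CfResPubl(cfResPublId="publ-042", cfResPublDate="2010-06-01")

links = [
    CfPers_ResPubl(pers.cfPersId, publ.cfResPublId, cfClassId="author"),
    CfOrgUnit_ResPubl(org.cfOrgUnitId, publ.cfResPublId, cfClassId="originates-from"),
]
```

Relationships in CERIF are expressed through such link entities rather than by embedding references inside the base records, which is why most of the tables generated for CERIF4REF are -LINK tables.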

21 Expert Group on Assessment of University-Based Research (EU, 2010) http://ec.europa.eu/research/science-society/document_library/pdf_06/assessing-europe-university-based-research_en.pdf

22 Multidimensional research assessment matrix
Indicator dimensions: research productivity; quality and scholarly impact; innovation and social benefit; sustainability and scale; research infrastructure. Indicators by purpose:
- Allocate resources: research output / bibliometric data; citation data; peer review; keynotes, awards, etc.; research income; 'research active' staff as a percentage of total academic staff; libraries, equipment, etc.
- Drive research mission differentiation: research output / bibliometric data; output per research academic; peer review; self-evaluation; ratio of research income to teaching income; external research income; ratio of undergraduate to master's/PhD students
- Increase regional / community engagement: publications, policy, reports, etc.; end-user reviews; keynotes, media awards, etc.; percentage of funding from end-users; patents, licences, spin-offs; number of collaborations and partnerships
- Improve research performance: research output / bibliometric data; citation data; number and percentage of publications in top-ranked, high-impact journals; peer review
- Assess value-for-money or cost-benefit of research: research output / bibliometric data; output per research academic; peer review and/or citation data; commercialisation data; social, economic, cultural and environmental impact/benefit indicators; external research income; employability of PhD graduates; number of collaborations and partnerships
- Encourage international co-operation: research output / bibliometric data with a focus on European and international collaborations; percentage of research income from international sources; number of collaborations and partnerships
- Increase multidisciplinary research: research output / bibliometric data with a focus on interdisciplinary fields; peer review; self-evaluation; new research fields, interdisciplinary teaching programmes, etc.; research conducted by people from different disciplines

23 Purposes of research assessment
- Allocate resources
- Drive research mission differentiation
- Increase regional / community engagement
- Improve research performance
- Assess value-for-money or cost-benefit of research
- Encourage international co-operation
- Increase multidisciplinary research

24 Next steps
- Report back to this conference
- Report to the CERIF Task Group, etc.

25 Thank you
Stephen Grace, Centre for e-Research, King's College London
http://www.kcl.ac.uk/iss/cerch/
stephen.grace@kcl.ac.uk

26 Workshop Results
- Many countries deal with the same problems
- Different stages of progress towards implementation
- Currently, most go for output measurement rather than impact (because it is measurable)

27 Workshop Results
- CERIF offers quite powerful means of representation with respect to assessment requirements (e.g. the REF)
- The best way to utilise CERIF is through best practice
- Qualitative / quantitative data requirements (across European activities)

28 Workshop Results
- Formalization and measurement are necessary in order to do analysis
- The measurement of soft qualities also has to be formalized
- CRISs are the place to store even soft facts

29 Workshop Results
- CRISs are the place to document / archive soft facts (impacts) for reporting / strategy / planning …

