Main Panel A Criteria and Working Methods
Cardiff School of Biosciences
Ole H Petersen, Chair of the Biological Sciences Sub-Panel, REF2014

Main Panel A

Main Panel A covers six Units of Assessment:
- Clinical Medicine
- Public Health, Health Services and Primary Care
- Allied Health Professions, Dentistry, Nursing and Pharmacy
- Psychology, Neuroscience and Psychiatry
- Biological Sciences
- Agriculture, Veterinary and Food Sciences

Main Panel A Criteria and Working Methods: Multiple Submissions
Main Panel A does not expect to receive requests for multiple submissions and encourages HEIs instead to subdivide a single submission into research groups. Any request for a multiple submission must meet the criteria in the Guidance on Submissions and must also make the case that it is not feasible to present a single submission subdivided into research groups. Main Panel A oversees the consistent application of these criteria across its sub-panels.

Assessment Criteria: Outputs
The Panel Criteria specify in detail:
- a series of indicators of quality for research outputs, e.g. scientific rigour with regard to design, method, execution and analysis;
- an indicative list of eligible research outputs;
- the types of output that do not normally meet the REF definition of research, e.g. non-research care studies/case studies.

Assessment Criteria: Types of Outputs

Assessment Criteria: Co-authored Outputs
The panel will give equal weight to individual and collaborative/team efforts. Where a co-authored output is submitted, the submitting author is required to justify their material contribution to the output (up to 50 words). The sub-panel will determine whether there has been a material contribution: if not, the output is Unclassified; if so, the sub-panel will assess the quality of the output with no further regard to the contribution.

Assessment Criteria: Outputs
Co-authored outputs submitted more than once in a submission: the sub-panels consider that the fullest and most favourable impression of research is gained when co-authored outputs are submitted only once. Exceptionally, however, co-authored outputs from a very substantial piece of research may be submitted by up to two staff in a submission. Each staff member would need to justify the scale of the research and the separate material contributions of the authors (up to 50 words).

Assessment Criteria: Double-weighted Outputs
Sub-panels will consider double weighting where the scale of the academic investment and the intellectual scope of an output are far greater than normal (e.g. monographs and other such outputs). Submitting staff are required to justify a claim for double weighting (up to 50 words). Where the case is not accepted, the 'missing' output will be assessed as Unclassified. No reserve outputs are permitted.
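To make the weighting arithmetic concrete, here is a minimal sketch of how an accepted or rejected double-weighting claim could feed into an output quality sub-profile. The 0-4 star representation and the profiling function are illustrative assumptions, not the panel's published scoring procedure.

```python
from collections import Counter

# Hypothetical model: each output is rated 0-4 stars. An accepted
# double-weighting claim counts the output twice; a rejected claim leaves
# the 'missing' slot Unclassified (0 stars), since Main Panel A permits
# no reserve outputs.

def output_subprofile(outputs):
    """outputs: list of (stars, double_weighted, claim_accepted) tuples.
    Returns the percentage of weighted slots at each star level."""
    slots = []
    for stars, double_weighted, claim_accepted in outputs:
        if double_weighted:
            slots.extend([stars, stars] if claim_accepted else [stars, 0])
        else:
            slots.append(stars)
    counts = Counter(slots)
    return {s: 100.0 * counts[s] / len(slots) for s in range(5)}

# One accepted double-weighted 4* monograph plus two 3* papers:
print(output_subprofile([(4, True, True), (3, False, False), (3, False, False)]))
# -> {0: 0.0, 1: 0.0, 2: 0.0, 3: 50.0, 4: 50.0}
```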

Assessment Criteria: Citation Data
"Citations game is a quick fix but unlikely to deliver" (Ole Petersen, 15 December 2006). The Royal Society, responding to the consultation on the reform of research assessment and funding, recommended that "the primary assessment method for all subjects must be peer judgment" and that "research indicators... should do no more than inform peer review panels". Ole H. Petersen chaired the Royal Society's working group on the research assessment exercise.

Assessment Criteria: Citation Data
Where available, sub-panels will make use of citation data provided (by Elsevier) specifically for REF2014; for practical reasons, early submission will be helpful. Citation data are only one element informing peer-review judgements, not a primary tool in the assessment. They are a positive indicator of the extent to which the academic community has made use of the output.
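As a purely illustrative sketch of citation data informing rather than determining judgements, the function below expresses a raw citation count relative to an assumed field-and-year benchmark. Both the normalisation and the example figures are assumptions; they are not part of the REF method or the Elsevier data specification.

```python
# Illustrative only: contextualise a citation count against an assumed
# average for the output's field and publication year. The benchmark is
# an invented input, not an official REF or Elsevier figure.

def relative_citation_rate(citations, field_year_average):
    """Citations relative to an assumed field/year average; None when no
    benchmark is available, leaving judgement wholly to peer review."""
    if not field_year_average or field_year_average <= 0:
        return None
    return citations / field_year_average

# An output with 30 citations against an assumed average of 12 has been
# used by the community at 2.5x the field norm.
print(relative_citation_rate(30, 12.0))  # -> 2.5
```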

Assessment Criteria: Range of Impacts
The impact of research in Main Panel A UOAs is broad. Sub-panels welcome case studies describing impacts that have provided benefits to one or more areas of health, society, culture, public policy and services, production, the economy, the environment, international development or quality of life. Impacts can be manifest in a wide variety of ways, and a list of possible examples is provided as a guide. The list is not exhaustive, exclusive or ranked.

Assessment Criteria: Evidence of Impact
The REF impact pilot is a useful reference for HEIs. An extensive range of indicators that could be used to evidence case studies is provided to assist institutions in compiling them. Sub-panels will consider all appropriate evidence and place particular emphasis on verifiable elements: factual statements from external, non-academic organisations, but not testimonials. Impact may be described at any stage of development, though early-stage or interim impacts may not score as highly as mature impacts.

Assessment Criteria: Impact Template
The Impact Template has four sections, and its assessment will comprise 20% of the impact quality sub-profile:
- Context
- Approach to fostering impact
- Strategies and plans
- Relationship to submitted case studies
Main Panel A's criteria explain a range of indicators and evidence for each heading; these are not exhaustive lists. A sketch of the 20% weighting follows.
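As a minimal sketch of the stated split, assume the case studies share the remaining 80% equally and each element contributes at its assessed star level; the equal sharing and the star-profile representation are illustrative assumptions, not the published profiling method.

```python
# Sketch of a 20% template / 80% case-study split; equal sharing between
# case studies is assumed for illustration.

def impact_subprofile(case_study_stars, template_stars):
    """case_study_stars: star ratings (0-4), one per case study;
    template_stars: star rating for the impact template.
    Returns the percentage of the sub-profile at each star level."""
    profile = {s: 0.0 for s in range(5)}
    per_case = 80.0 / len(case_study_stars)
    for stars in case_study_stars:
        profile[stars] += per_case
    profile[template_stars] += 20.0
    return profile

# Two case studies rated 4* and 3*, with the template rated 3*:
print(impact_subprofile([4, 3], 3))
# -> {0: 0.0, 1: 0.0, 2: 0.0, 3: 60.0, 4: 40.0}
```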

Assessment Criteria: Impact
The sub-panels will apply the following criteria to assess impact:
- Reach: the spread or breadth of influence or effect on the relevant constituencies.
- Significance: the intensity of the influence or effect within the period 1 January 2008 to 31 July 2013.

Assessment Criteria: Environment
Outstanding research can be undertaken in a wide variety of research structures and environments. Submissions may choose to define groups; these may be departments or research groups/units, which may or may not be cognate, and they must be defined consistently across the submission, as set out in the Panel Criteria. Sub-panels will provide written qualitative feedback to institutions at research-group level where appropriate. Quantitative data, as defined in the Guidance on Submissions, will be used to support assessment of the Environment Template.

Assessment Criteria: Environment
The Panel Criteria give examples of possible evidence and indicators under each of the template headings:
- Overview
- Strategy
- People (covering staffing strategy and staff development, and research students)
- Income, infrastructure and facilities
- Collaboration and contribution to the discipline

Assessment Criteria: Environment
The word length of the Environment Template is linked to the number of FTEs submitted. Sub-panels will combine Overview and Strategy and assess the template as four equally weighted components, as sketched below.
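A minimal sketch of that weighting, with Overview and Strategy already combined into one component; the equal 25% shares follow the slide, while the 0-4 scoring scale per component is an assumption.

```python
# Four equally weighted Environment Template components; the component
# names follow the Panel Criteria headings, and the 0-4 scale is assumed.

ENV_COMPONENTS = (
    "Overview and Strategy",
    "People",
    "Income, infrastructure and facilities",
    "Collaboration and contribution to the discipline",
)

def environment_score(component_stars):
    """component_stars: one 0-4 score per component, in ENV_COMPONENTS
    order. Returns the equally weighted (25% each) mean score."""
    assert len(component_stars) == len(ENV_COMPONENTS)
    return sum(component_stars) / len(ENV_COMPONENTS)

print(environment_score([4, 3, 4, 3]))  # -> 3.5
```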

Assessment Criteria: Working Methods
Interdisciplinary and multidisciplinary research is welcomed and treated equally. Sub-panel members have been selected for their broad-ranging experience, enabling assessment of such work and of work that crosses UOA boundaries. Additional assessors (both academic and user) will be appointed to each sub-panel to assist with the assessment phase where required. User members and user assessors will contribute significantly to the assessment of impact.

Assessment Criteria: Working Methods
Sub-panel requirements for assessors will be informed by HEIs' submission intentions. All outputs will be examined in sufficient detail to inform robust judgements. Cross-referrals to other main panels and sub-panels will be made where necessary.

Assessment Criteria: Working Methods
The Main Panel will work with the sub-panels to ensure adherence to the assessment criteria and consistent application of standards; the details are defined in the Panel Criteria. Sub-panels will ensure that submissions are assessed using appropriate expertise, following the approaches defined in the Panel Criteria.

Further information
Guidance on Submissions (July 2011)
Draft Panel Criteria and Working Methods (July 2011)

Main Panel A Criteria and Working Methods
The bottom line: in December 2006, I wrote in THE: "Real scientists know that the only way to assess a colleague's research performance is to read their original papers, judging their importance, reliability and novelty. Which is exactly what the RAE does." This is also essentially what REF2014 will do.