Main Panel D Criteria and Working Methods

Main Panel D Criteria and Working Methods
Main Panel D covers:
- Area Studies
- Modern Languages and Linguistics
- English Language and Literature
- History
- Classics
- Philosophy
- Theology and Religious Studies
- Art and Design: History, Practice and Theory
- Music, Drama, Dance and Performing Arts
- Communication, Cultural and Media Studies; Library and Information Management

Main Panel D Criteria and Working Methods
Requests for multiple submissions need to meet the criteria specified in the Guidance on Submissions. Requests are expected in:
- Area Studies
- Modern Languages
- Art and Design
- Music, Drama, Dance and Performing Arts
- Communication, Cultural and Media Studies; Library and Information Management

Assessment Criteria: Outputs
Eligible outputs:
- Any type of output embodying research as defined for the REF may be submitted.
- Sub-panels will not privilege any one kind of output above another.
- Journal rankings will not be used!

Assessment Criteria: Outputs
Eligible outputs:
- The output should be submitted without additional material where it is in itself deemed to constitute sufficient evidence of the research.
- Additional information for practice-based outputs: words describing the research imperatives, research process and research significance.
- Portfolio: in cases where the research output is ephemeral, is one in a series of inter-connected outputs (e.g. performances) or cannot fully represent its scholarly dimensions through the evidence provided above.

Assessment Criteria: Outputs
Co-authored, co-edited and collaborative outputs:
- May be listed by more than one author from within a single submitting unit, or across submissions from different units.
- In all cases submissions are required to provide an explanation of the nature and scale of the author's contribution, not expressed as a percentage.
- Panels may judge that significant differences in the quality of the respective contributions should be taken into account in the final grades awarded.

Assessment Criteria: Outputs
Double-weighting:
- To recognise outputs of extended scale and scope, a double-weighted output counts as two outputs.
- No particular type of output will automatically be double-weighted.
- Institutions may identify up to two outputs per individual author which they consider worthy of double weighting, and must submit a supporting statement.
- Panels will assess the claim for double weighting separately from the quality of the output (i.e. double-weighting does not necessarily result in 2 x 4*).
- For each claim, institutions may submit a reserve output, which will be assessed only if the claim for double weighting is not justified.

Assessment Criteria: Impact
Definitions of the criteria for assessing impact:
- Reach: the extent and/or diversity of the organisations, communities and/or individuals who have benefitted from the impact.
- Significance: the degree to which the impact enriched, influenced, informed or changed the policies, practices, understanding and awareness of organisations, communities and/or individuals.

Assessment Criteria: Impact
- The Main Panel believes that the impact of research conducted in its disciplines is powerful, pervasive and ubiquitous, challenging imaginations and enriching lives economically, culturally, spiritually and educationally.
- As illustration, it has provided a range of areas of impact to help institutions think about what case studies in the arts and humanities might look like. These are: civil society, cultural life, economic prosperity, education, policy making, public discourse and public services.
- There is no expectation that case studies should be classified in this way; indeed, case studies may well cross the boundaries of these areas or go well beyond them.

Assessment Criteria: Impact
Examples of impacts:
- A short list of examples of impact is provided in the panel criteria.
- These are drawn from lengthy lists put together by the sub-panels, which we would like to publish in due course as an aid to the sector.
[Note to sub-panel chairs: you may wish to invite discussion of the informal list of examples relevant to your UOA. This discussion could be useful in refining the list for later publication.]

Assessment Criteria: Impact
Evidence of impact:
- Main Panel D acknowledges that not all potential records of evidence may be available, and that the integrity, coherence and clarity of the narrative will be essential to the panels in forming their judgements; nonetheless, key claims made in the narrative should be capable of corroboration.
- Narratives should articulate the relationship between the underpinning research and the impact, as well as the reach and significance of the impact itself.
- An extensive range of types of evidence that could be used to support case studies is provided in the criteria to assist institutions in compiling their case studies.

Assessment Criteria: Impact
The impact template has four sections and will comprise 20% of the impact sub-profile:
- Context
- Approach to impact
- Strategies and plans
- Relationship to case studies
Main Panel D has explained in its criteria the kinds of information it would like to see under these headings; these are not exhaustive lists. In particular, it recognises that there is not always a planned, causal link between research and its subsequent impact, and that pathways to impact may be diffuse and non-linear.

Assessment Criteria: Environment
Definitions of the criteria for assessing environment:
- Vitality: the extent to which the research environment supports a research culture characterised by intellectual vigour, innovation and positive contribution to the discipline.
- Sustainability: the extent to which the research environment ensures the future health and well-being of the unit and the discipline.

Assessment Criteria: Environment
Data required [REF4a/b/c]:
Data requirements have been reduced since RAE2008 to the following three datasets (by year, for the period 1 August 2008 to 31 July 2013):
- Doctoral awards
- Research income by source
- Research income-in-kind
Main Panel D has not asked for any additional data to be submitted. These data will be considered alongside the information provided in the environment template.

Assessment Criteria: Environment
Environment template [REF5] (equivalent to RA5). Headings:
- Overview
- Strategy
- People (covering staffing strategy and staff development, and research students)
- Income, infrastructure and facilities
- Collaboration and contribution to the discipline
The panel criteria specify the kinds of information sub-panels would like to see under these headings; these are not exhaustive lists. Word lengths are linked to the number of FTEs submitted.

Assessment Criteria: Working Methods
- Interdisciplinary and multidisciplinary research is welcomed and will be treated equally.
- Sub-panel members have been selected for broad-ranging experience, to enable assessment of such work and of work that crosses UOA boundaries.
- Within Main Panel D, cross-referral will be characterised by dialogue between the relevant sub-panels.
- Cross-referrals to other Main Panels will be made if necessary.

Assessment Criteria: Working Methods
- Additional assessors (both academic and user) will be appointed to each sub-panel to assist with the assessment phase.
- Sub-panels will review institutional statements of submission intentions to identify gaps in expertise, or areas where the workload will be significantly heavier than anticipated.
- The appointments process will take due regard of advice received from subject associations and other professional bodies.

Assessment Criteria: Working Methods
- The Main Panel will work with sub-panels to ensure adherence to the assessment criteria and consistent application of standards; details are defined in the panel criteria.
- Sub-panels will ensure that submissions are assessed using appropriate expertise; approaches are defined in the panel criteria.
- User members and user assessors will contribute significantly to the assessment of impact.

Assessment Criteria: Working Methods
Reviewing outputs: “In every submission, all outputs will be examined with a level of detail sufficient to contribute to the formation of a reliable quality profile for all the outputs in that submission.”

Further information:
- Guidance on submissions (July 2011)
- Draft panel criteria and working methods (July 2011)