
1 The REF: the Good, the Bad and the Future
Professor Julia Black, Pro Director for Research, London School of Economics and Political Science
HEPI Conference, 31st March 2015

2 Outline
– What's the purpose of the REF, and for whom?
– REF: the good
– REF: the bad
– REF: the future?

3 Why have the REF?
For government:
– To justify an unequal distribution of unrestricted research funding (QR)
– Possibly: to 'showcase' the UK HE sector as part of UK plc?
For HE institutions:
– To gain a share of unrestricted research funding
– To 'showcase' their own success in order to attract students, recruit staff and gain grants
– As a benchmarking exercise
– As an internal management tool, for good or ill, particularly where the REF rules allow selectivity in who is submitted
For HE users and/or potential partners or funders:
– As an indication of quality when deciding where to commit resources (student fees, research funding, philanthropy etc.)

4 REF: the good
For government:
– Provides a basis for the unequal allocation of funding which is generally trusted by the HE sector and others
– Enables some showcasing of UK plc, but only to a limited extent, since the REF is a national comparison
For HEIs:
– Peer-based assessment of research quality across HEIs which is (more or less) trusted by the academy, and so provides a (more or less) justifiable basis for unequal QR allocation
– New: recognition of the impact of research raises awareness of the success and importance of HEI research beyond academia
For HE users and/or potential partners or funders:
– For funders: provides a reasonably robust assessment of quality, preventing adverse selection in the allocation of research funding
– For students: gives a 'badge' to a degree, potentially increasing its marketability, though note the role of other indicators, notably the NSS, in students' selection decisions

5 REF: the bad
1. Panels and UOAs
UOAs vs interdisciplinarity:
– HEIs have to fit themselves into UOA blocks, which cuts across RCUK policy of promoting interdisciplinarity
– Insufficient norms of cross-referring between panels, and poor understanding of what the practices are
– Should an individual be able to be submitted to more than one UOA?
– Merging of blocks leads to odd combinations (eg Geography and Archaeology)
– Some disciplines omitted (eg Social Psychology)
– A number of individuals were not submitted because of poor 'fit'; that should not have to happen

6 REF: the bad
1. Panels and UOAs (cont.)
Panel composition:
– Panel members chosen using national societies; no international input into the composition of panels
– Query whether panels always had the necessary expertise to assess outputs (eg lack of quantitative skills for some UOAs in Panel C)
– Query whether sufficient cross-referring occurred

7 REF: the bad
2. Selecting individuals for submission
The principle of rewarding excellence wherever it is found is laudable, but selectivity over submissions:
– Places a significant cost burden on HEIs in determining whom to submit
– Has potentially significant career implications for those who are not submitted, so a certain minimum level of due process is required in making assessments (eg obtaining an external view, allowing appeals)
– Encourages gaming of census data, eg 'hiding' people by putting them on different contracts
– Use of non-normalised and normalised tables led to significant gaming of league tables, sending mixed signals as to quality, contrary to one of the purposes of the REF

8 REF: the bad
3. Impact case studies
A positive feature, but the 'wildcard' for REF2014:
– Guidance was broad, but panels' interpretations show that impacts that were narrow but clearly linked to research were preferred to more diffuse impacts
– Panels had unrealistic expectations of the extent to which the path from research to impact can be documented and rendered auditable, particularly in SSHR, leading to a high level of very detailed audit queries
– No scope for recognising impacts that arise from the expertise of academics in a general area rather than from specific publications
– Purpose of the impact template unclear

9 REF: the bad
4. Red tape
Needless complexity:
– Deadlines: multiple eligibility deadlines (for impact, census, publication) created unnecessary scope for confusion, and the submission deadline falling before the publication deadline for outputs created needless uncertainty
– Census data: unnecessary confusion in the compilation of HESA data; why can't HEFCE produce its own?
– Unnecessary ambiguity on 'complex circumstances', creating significant and intrusive audit queries
– 'Salami slicing' release of results by HEFCE and HESA was absurd and unnecessarily painful; did commercial publishers get the information before HEIs?
– Environment template: what does it add? It has no relation to the outputs score; just an exercise in essay writing?

10 REF – looking forward
Submission:
– ICSs should stay; keep the ratio to outputs the same, but with some flexibility (eg 1 ICS per 8–12 submitted for outputs)
– Require submission of all staff on research contracts (though recognising gaming will still occur)
– Allow individuals to be submitted to more than one UOA and/or clarify rules and practices on cross-referring
– Eligibility of outputs and open access: the deadline for submission to a repository should be based on the date of publication, not the date of acceptance

11 REF – the future?
Assessment:
– The metrics debate: for outputs, do we need the same approach across all panels? Disciplinary publication and citation norms vary hugely
– Abolish the environment statement and rely just on metrics, eg research funding, research student numbers, other?
– Need for clarity on metrics for impact: assessments in 2014 were highly subjective, with no clearly accepted norms
– Outputs: with so many 4* outputs, do we now need a 5* or even 6* category?

12 REF – looking forward
QR funding is vital to ensure that pure, curiosity-driven research survives, but do we need such a fine-grained assessment for the purposes of distributing funding?
– Estimated costs of submission are c.£500m–£1bn across the HE sector, though the figures are unclear: is that proportionate given the level of funding being distributed?
Shrinking funding:
– In a growing sector, will overall cuts in funding mean SSHR lose their critical mass?
– Increasing reliance on student fees to fund research: implications if regulated fees are cut, and international competition for students is intensifying

13 Conclusions: making the REF fit for purpose(s)
For government:
– Use a less intensive exercise, not just focusing on metrics for assessing outputs but across the whole of the REF exercise
– Increase the international element of peer review to 'showcase' the UK HE sector
For HE institutions:
– Maintain trust in the process of allocating unrestricted research funding, particularly if there is a move to metrics
– Moderate gaming of league tables by using normalisation as standard
– Moderate the pernicious elements of using the REF as an internal management tool by requiring all staff to be submitted
– Restructure UOAs and allow greater flexibility in submissions and cross-referring to enable appropriate benchmarking
– Cut red tape and complexity: the REF needs its own 'red tape challenge'
For HE users and/or potential partners or funders:
– Reduce gaming opportunities so as to give clear indications of quality across disciplines and HEIs

