ENQA workshop, Sigtuna, Sweden, 7-8 October 2009
The UK approaches to quality in e-learning – as seen from the HE Academy/JISC benchmarking programmes


The UK approaches to quality in e-learning – as seen from the HE Academy/JISC benchmarking programmes – and more recent developments including Re.ViCa and the DL benchmarking club
Professor Paul Bacsich, Matic Media Ltd

Topics
1. Introduction, disclaimers and acknowledgements
2. The four phases of the UK HE Benchmarking Programme
3. More recent developments in UK HE benchmarking of e-learning
4. Implications for schemes on quality of e-learning

Introduction, disclaimers and acknowledgements

Disclaimer: this talk is not on behalf of any institution, agency or ministry – it is a personal expert view.
Thanks to the HE Academy, JISC, the EU Lifelong Learning Programme, Manchester Business School and the University of Leicester for support – apologies to any others omitted.

Re.ViCa (Review of Virtual Campuses)
A project supported by the European Union under the Lifelong Learning Programme (Erasmus/Virtual Campus), with an International Advisory Committee.
Database of countries, agencies and programmes (500); nine case studies.
A set of 17 Critical Success Factors developed after wide international consultation – embedded in the Pick&Mix scheme.
Organised post-secondary e-learning initiatives are found across the G-100 (all countries except the Least Developed Countries).

The four phases of the UK HE Benchmarking Programme – an overview

Benchmarking e-learning
At national level, started in the UK and New Zealand
– soon spread to Australia
– not closely linked initially to the quality agenda
At European level, developments include E-xcellence and UNIQUe
– some earlier work from OBHE, ESMU etc., but not in public-criterion mode
– later, developments in other projects
– increasingly, links made to the quality agenda

Benchmarking e-learning (UK)
Foreseen in the HEFCE e-learning strategy of 2005; the Higher Education Academy (HEA) oversaw it.
Four phases – 82 institutions – 5 methodologies.
Two consultant teams – BELA and OBHE.
Justified entry to the HEA Pathfinder and Enhancement national initiatives – and useful for JISC initiatives also (Curriculum Design etc.).
Can be leveraged into an update of the learning and teaching strategy (e.g. the University of Leicester).

Documentation – very good
HE Academy reports on benchmarking.
Evaluator reports on each phase; consultant team reports on each phase.
Conference papers (EADTU/ICDE each year – and ALT-C etc.).
Definitive book chapter (to appear).
HE Academy blog and wiki (Web 2.0); specific HEI blogs and some public reports.
Bibliography_of_benchmarking

UK: benchmarking e-learning
“Possibly more important is for us [HEFCE] to help individual institutions understand their own positions on e-learning, to set their aspirations and goals for embedding e-learning – and then to benchmark themselves and their progress against institutions with similar goals, and across the sector.”

Methodologies in UK HE
Five methodologies were used in the UK, but only two now have public criteria, are routinely updated and are available for single institutions (to use outside consortia):
Pick&Mix
– used under HEA auspices in 24 UK institutions, including 4 diverse institutions in Wales
– now being used in a further UK HEI and one in Australia
– about to be used by the 7-institution Distance Learning Benchmarking Club (UK, Sweden, Australia, Canada, New Zealand)
eMM – as used in New Zealand and Australia

Pick&Mix overview
Focussed on e-learning, not general pedagogy.
Draws on several sources and methodologies – from the UK and internationally (including the US) and from the college sector.
Not linked to any particular style of e-learning (e.g. distance, on-campus or blended).
Oriented to institutions with notable activity in e-learning.
Suitable for desk research as well as in-depth studies.
Suitable for single- and multi-institution studies.

Pick&Mix history
Initial version developed in early 2005 in response to a request from Manchester Business School for an international competitor study.
Since then, refined by literature search, discussion, feedback, presentations, workshops, concordance studies and four phases of use – fifth and sixth phases are now under way.
Forms the basis of the current wording of the Critical Success Factors scheme for the EU Re.ViCa project.

Pick&Mix: criteria and metrics

Criteria
Criteria are statements of practice which are scored into a number of performance levels, from bad/nil to excellent.
It is wisest if these statements are in the public domain – to allow analysis and refinement.
The number of criteria is crucial.
Pick&Mix currently has a core of 20 – based on analysis from the literature (ABC, BS etc.) and experience in many senior-management scoring meetings.

Pick&Mix: 20 core criteria
Removed any not specific to e-learning, including those in general quality schemes (QAA in the UK).
Careful about any which are not provably success factors.
Left out of the core were some criteria where there was not yet UK consensus.
Institutions will wish to add some criteria to monitor their own KPIs and objectives – no more than 6 are recommended.
– Pick&Mix now has over 70 supplementary criteria to choose from; more can be constructed or taken from other schemes.
These 20 have stood the test of four phases of benchmarking with only minor changes of wording (originally 18 – two were split to make 20).

Pick&Mix scoring
Uses a 6-point scale (1-6): 5 levels (cf. Likert, MIT90s levels) plus 1 more for excellence.
Contextualised by a scoring commentary.
There are always issues of judging progress, especially best practice.
The 6 levels are mapped to 4 colours in a traffic-lights system: red, amber, olive, green.
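The 6-levels-to-4-colours mapping above can be sketched in a few lines. Note that the exact banding below is an assumption for illustration only – the real assignment of scores to colours is defined by the methodology itself, not stated on this slide.

```python
def traffic_light(score: int) -> str:
    """Map a Pick&Mix score (1-6) to one of the four traffic-light colours.

    The banding used here (1=red, 2=amber, 3-4=olive, 5-6=green) is a
    hypothetical illustration, not the official Pick&Mix assignment.
    """
    if not 1 <= score <= 6:
        raise ValueError("Pick&Mix scores run from 1 to 6")
    bands = {1: "red", 2: "amber", 3: "olive", 4: "olive",
             5: "green", 6: "green"}
    return bands[score]
```

A scored criterion can then be displayed by colour, e.g. `traffic_light(4)` gives `"olive"` under this assumed banding.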

Pick&Mix system: summary
Has taken account of best-of-breed schemes.
Output- and student-oriented aspects.
Methodology-agnostic, but uses underlying approaches where useful (e.g. Chickering & Gamson, Quality on the Line, MIT90s).
Requires no long training course to understand.

Institutional competences
The University of Leicester used Pick&Mix in the very first phase of the HEA programme – and in two phases of re-benchmarking.
Other universities with strong competence (with approved HEA consultants) are the University of Derby and the University of Chester.
Several other universities have done excellent work and produced public papers and reports (e.g. Northumbria, Worcester).

Pick&Mix: three sample criteria

P01 Adoption (Rogers)
1. Innovators only
2. Early adopters taking it up
3. Early adopters adopted; early majority taking it up
4. Early majority adopted; late majority taking it up
5. All taken up except laggards, who are now taking it up (or retiring or leaving)
6. First wave embedded, second wave under way (e.g. m-learning after e-learning)

P10 Training
1. No systematic training for e-learning
2. Some systematic training, e.g. in some projects and departments
3. University-wide training programme but little monitoring of attendance or encouragement to go
4. University-wide training programme, monitored and incentivised
5. All staff trained in VLE use, training appropriate to job type – and retrained when needed
6. Staff increasingly keep themselves up to date in a “just in time, just for me” fashion, except in situations of discontinuous change

P05 Accessibility
1. VLE and e-learning material are not accessible
2. VLE and much e-learning material conform to minimum standards of accessibility
3. VLE and almost all e-learning material conform to minimum standards of accessibility
4. VLE and all e-learning material conform to at least minimum standards of accessibility, much to higher standards
5. VLE and e-learning material are accessible, and key components validated by external agencies
6. Strong evidence of conformance with the letter and spirit of accessibility in all countries where students study

Other methodologies
Members of the BELA team have run three other methodologies – MIT90s, eMM and ELTI – for the HE Academy.
They have also analysed most others: most US and European methodologies were analysed (QoL, E-xcellence, BENVIC, OBHE).
Insights from other methodologies are fed into Pick&Mix to improve it.

National indicators
Pick&Mix is mapped to the HEFCE Measures of Success (England).
Similar mappings were done for the Welsh Indicators of Success (draft and final) and for the Becta Balanced Scorecard (for colleges).

Comparative work
A databank of scores from 10 HEIs is public in anonymised form.
Because each criterion is stable in concept, longitudinal comparisons (across time) are also possible.
– Old criteria are withdrawn if no longer relevant and new criteria introduced (e.g. for Web 2.0 and work-based learning).
– Several HEIs have done re-benchmarking.
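Because criteria are stable in concept, a re-benchmarking round can be compared with a baseline criterion by criterion. The sketch below illustrates this; the criterion names and scores are invented for illustration and are not actual Pick&Mix data.

```python
def score_deltas(baseline: dict, rebenchmark: dict) -> dict:
    """Return the per-criterion score change between two benchmarking rounds.

    Only criteria present in both rounds are compared: criteria withdrawn
    or newly introduced between rounds are skipped, mirroring how old
    criteria are retired and new ones (e.g. for Web 2.0) are added.
    """
    common = baseline.keys() & rebenchmark.keys()
    return {c: rebenchmark[c] - baseline[c] for c in common}

# Hypothetical scores for one institution across two phases.
phase1 = {"P01 Adoption": 2, "P05 Accessibility": 3, "P10 Training": 2}
phase2 = {"P01 Adoption": 4, "P05 Accessibility": 3, "P10 Training": 3,
          "Web 2.0 pedagogy": 2}  # criterion introduced after the baseline

deltas = score_deltas(phase1, phase2)
```

Under these invented scores, `deltas` records a two-level gain on adoption, one level on training, and no change on accessibility, while the newly introduced criterion is excluded from the comparison.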

Carpets

Supplementary criteria – examples
IT reliability
Market research, competitor research
IPR
Research outputs from e-learning
Help Desk
Management of student expectations
Student satisfaction
Web 2.0 pedagogy

Local criteria
Institutions can track their own local criteria, but this is rarely done – it is actually very hard to craft good criterion statements.

Slices (departments etc.)
As well as benchmarking the whole institution, it is wise to look at a few slices: Schools, Faculties, Programmes…
Useful to give a context to scores – but do not do too many.
Slices need not be organisational:
– distance learning…
– thematic or dimensional slices like HR, costs…
Most other systems also now use this approach.

Evidence and process: iterative self-review for public criterion systems

The Iterative Self-Review Process
For all the methodologies we deployed, we use an Iterative Self-Review Process.
The methodologies do NOT require it – it was what our UK institutions desired, for all the public criterion systems; there was strong resistance to documentary review.
It encourages a more senior level of participation from the institution: the result is theirs, not the assessors'.
It allows them to get comfortable with the criteria as they apply to their institution, and to move directly to implementation of change.
But it selects against complex methodologies, and requires more effort from assessors.

Iterative Self-Review details
Introductory meeting
Initial collection of evidence
Selection of supplementary criteria
Mid-process meeting
Further collection of evidence
Scoring rehearsal meeting
Final tweaks on, and chasing of, evidence
Scoring meeting
Reflection meeting – to move to change

How to handle evidence
Have a file for each criterion.
Institutions normally group criteria according to their own L&T strategy or in terms of owning departments.
– We also supply some standard groupings, e.g. based on MIT90s, but few use these.

Peer review
Peer review exists in the Iterative Self-Review model:
– Specialist assessors (normally two nowadays) have experience in the sector.
– Often, the benchmarking is done in a benchmarking cohort, and the leaders of each HEI in the cohort form a peer group.

Distance Learning Benchmarking Club
A work package in the JISC Curriculum Delivery project DUCKLING at the University of Leicester.
Seven institutions in the UK and beyond will be benchmarked this year – and again next year (Sept-Oct 2010).
The aim is to baseline and then measure incremental progress in e-learning.

Members
University of Leicester (UK)
University of Liverpool (UK)
University of Southern Queensland (Australia)
Massey University (NZ)
Thompson Rivers University (Canada)
Lund University (Sweden)
KTH (Sweden)

Process
Institutions will work in a virtual cohort using teleconferencing.
Pick&Mix will be used – with an adjusted set of core criteria to take account of:
– updated analysis of earlier benchmarking phases
– Critical Success Factors for large dual-mode institutions
– the need for expeditious working

Implications for QA in e-learning – my thoughts

Too many concepts
Benchmarking
Standards?
Quality
Accreditation / approval / kitemarking
Critical Success Factors
E-learning is only a small part of the quality process – how can agencies and assessors handle five variants of the concept across many separate methodologies?

My view – the pyramid
[Pyramid diagram: Critical Success Factors at the top (leadership level); benchmarking and quality in the middle (senior managers); detailed pedagogic guidelines at the base.]
Criteria are placed at different layers in the pyramid depending on their level.

Benchmarking frameworks
It is implausible that there will be a global scheme, or even continent-wide schemes, for benchmarking.
But common vocabulary and principles can be enunciated – e.g. for public criterion systems:
– Criteria should be public, understandable, concise and relatively stable – and not politicised or fudged.
– Criteria choice should be justified from field experience and the literature.
– Core and supplementary criteria should be differentiated for each jurisdiction.
– Core criteria should be under 40 in number.
– The number of scoring levels should be 4, 5 or 6.

Concordances
Mappings between systems are hard and rarely useful (Bacsich and Marshall, passim).
Concordances of systems are easier and helpful – e.g. to reduce the burden of benchmarking with a new methodology.
– Such approaches will be used in the Distance Learning Benchmarking Club, for E-xcellence+/ESMU and ACODE.

Experience on methodologies
Methodologies do not survive without regular updating by a design authority – this is difficult in a leaderless group context.
Forking of methodologies needs to be dealt with by folding updates back into the core system – otherwise survival is affected.
Complex methodologies do not survive well.
A public criterion system allows confidence, transparency, and grounding in institutions.

References
A key paper on the international aspects is “Benchmarking e-learning in UK universities: lessons from and for the international context”, in Proceedings of the ICDE conference M-2009.
A specific chapter on the UK HE benchmarking programme methodologies is “Benchmarking e-learning in UK universities – the methodologies”, in Mayes, J.T., Morrison, D., Bullen, P., Mellar, H. and Oliver, M. (Eds.), Transformation in Higher Education through Technology-Enhanced Learning, York: Higher Education Academy, 2009 (expected late 2009).