Evaluating E-Learning Efficacy – University of York, UK – Wayne Britcliffe and Simon Davis – Edinburgh Napier Learning and Teaching Conference, 14th June 2012.


Evaluating E-Learning Efficacy
Wayne Britcliffe and Simon Davis, University of York, UK
Edinburgh Napier Learning and Teaching Conference, 14th June 2012

What's going on?

Drivers
 Initially approached by our Department of Health Sciences:
– Health Sciences have extensive and embedded online support for teaching activity
– Large student numbers and a number of CPD courses as well as UG and PG courses
– Devolved VLE administration / acknowledged VLE Coordinator
 Approach backed by Health Sciences senior management
– Their primary goal was to establish the level of online engagement from staff and students, and to get an idea of its impact
 Language and Linguistics, Law, Economics, Environment, Management
– Improve dialogue and inform departmental strategy, baseline development, etc.

Evaluation challenges
 Students range from undergraduates on campus through to CPD distance learners
 Large number of modules to ‘audit’ (e.g. HS = 217)
 How to broadly classify and indicate activity without being judgemental about the effectiveness of that activity?
 Establishing actual impact on learning, or quantifying where value has been added, can be very difficult
 E-learning evaluation not built into course design, and the focus of evaluation endeavours is quite wide
 Difficulty in getting good survey coverage (survey fatigue) and focus group coverage across the whole offering

Process – Step 1
 Consultation
– Departmental VLE coordinator liaised with departmental management
 Establish reporting requirements
 Get senior backing for survey/report activity
– E-Learning Team liaised with the departmental VLE coordinator to develop an evaluation plan
 Scoping of evaluation aims
 Identify useful review and performance indicators
 Establish data collection methods

Evaluation plan
Focus
– Establish the level of online activity per course, focussed around usage of each course’s discrete VLE module
– Try to establish the impact of the online support/activity on learning
Key questions
– How has each VLE module been used by teaching staff?
– How has each VLE module been used and received by the students?
Stakeholders
– For feedback: the department’s teaching staff, the department’s students
– For evaluation: ELDT, the department’s support staff and management
Time scale / dependencies
– Need established late 2011
– HS draft report required by mid-March 2012
Instruments and methods
– Statistical querying of the VLE database
– Survey of staff users
– Survey of student users
– Focus groups with students
– Systematic audit of current VLE sites

Process – Step 2
 Data capture
– Statistical querying of the VLE database
 Queries written by the VLE administrator, based on the statistical data deemed useful (see the query sketch below)
– Survey of staff users
 Develop survey questions
– Survey of student users
 Develop survey questions
– Focus groups with students/staff
 Develop focus group questions
– Systematic audit of all current academic year online sites
 Decide on ‘positioning’ criteria and also on what to look at qualitatively
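To illustrate the kind of statistical querying described above, here is a minimal sketch of a per-module usage query. The database file, table and column names (vle_stats.db, activity_log, module_id, user_role, user_id, academic_year) are hypothetical stand-ins: the real queries were written by the departmental VLE administrator against the institution's own VLE database, whose schema is not described in this presentation.

```python
import sqlite3

# Minimal sketch only: the database file, table and column names below are
# illustrative assumptions, not the actual VLE schema.
DB_PATH = "vle_stats.db"  # hypothetical export of VLE activity data

QUERY = """
    SELECT module_id,
           user_role,                        -- e.g. 'staff' or 'student'
           COUNT(*)                AS hits,  -- total logged actions
           COUNT(DISTINCT user_id) AS users  -- distinct active users
    FROM activity_log
    WHERE academic_year = ?
    GROUP BY module_id, user_role
    ORDER BY module_id, user_role;
"""

def usage_by_module(academic_year: str):
    """Return per-module activity counts, split by user role."""
    with sqlite3.connect(DB_PATH) as conn:
        return conn.execute(QUERY, (academic_year,)).fetchall()

if __name__ == "__main__":
    for module_id, role, hits, users in usage_by_module("2011/12"):
        print(f"{module_id}: {role:<8} {hits:>6} actions by {users} users")
```

Output of this kind gives a first, non-judgemental indication of where online activity is concentrated, and feeds into the module audit and the survey follow-up.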

Data Capture – Closer look at module audit – 3E
 Systematic audit of all current academic year online sites (an illustrative classification sketch follows)
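The audit positioned each module against the 3E benchmark levels (Enhance / Extend / Empower; Smyth et al., 2011, see References). As a rough sketch of that positioning logic, the snippet below maps simple audit observations onto an indicative 3E level. The ModuleAudit fields and the decision rules are invented simplifications for illustration; the actual audit applied qualitative judgement against the published benchmark descriptors.

```python
# Sketch of positioning audited modules against the 3E levels
# (Enhance / Extend / Empower). Criteria are illustrative only.
from dataclasses import dataclass

@dataclass
class ModuleAudit:
    code: str
    has_content: bool          # lecture notes, readings, links
    has_communication: bool    # announcements, discussion, feedback
    has_collaboration: bool    # wikis, group work, student-led activity

def three_e_level(audit: ModuleAudit) -> str:
    """Assign an indicative 3E level without judging teaching quality."""
    if audit.has_collaboration:
        return "Empower"
    if audit.has_communication:
        return "Extend"
    if audit.has_content:
        return "Enhance"
    return "Baseline"

if __name__ == "__main__":
    sample = ModuleAudit("HS-101", has_content=True,
                         has_communication=True, has_collaboration=False)
    print(sample.code, "->", three_e_level(sample))  # HS-101 -> Extend
```

The point of such a rule of thumb is to classify the breadth of VLE use without judging its effectiveness, in line with the "non-judgemental" aim noted under the evaluation challenges.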

Enhance – Health Sciences Example

Extend – Health Sciences Example

Empower – Health Sciences Example 1

Empower – Health Sciences Example 2

Process – Step 3 – Analysis and recommendations: EARL (reading lists)

Process – Step 3 – Analysis and recommendations
 Some survey feedback from students:
– “Better training for teaching staff, so they are able to offer the different opportunities, for example group files/file exchange/place to save articles and other documents to make them accessible to the whole group. This would decrease the amount of emails sent between groups and make communication easier.”
– “The Wiki section of the VLE could use clear directions on conventions and getting started.”
– “I don't consider myself computer illiterate, however the VLE in my opinion is extremely difficult to negotiate.”

Process – Step 3 – Analysis and recommendations
 Some survey feedback from staff:
– “Departmental strategy for implementation of VLE in teaching should be developed (and adhered to) - based on needs/requirements of both staff and students - this could (should) be tied into staff development so that awareness of what the VLE can do and how it can support their pedagogical framework and desired course outcomes is increased.”
– “VLE sites are generally set up by someone else, and I'm not sure if they consult about structure, nomenclature or branding.”
– “I find it difficult to invest in developing and maintaining the sites due to workload and technical skills.”

Process – Step 3 – Analysis and recommendations
 In brief: early draft recommendations
– Staff support and training / student support
 Work with staff (targeting programmes rather than individuals) to help improve understanding and integration of e-learning delivery. Improve student induction to online tools/content and support conduits
– Template development
 Develop templates that suit specific programmes (not one size fits all) and make sure staff understand how to use them

Moving forward
 Refine evaluation processes
– Bespoke survey questions / focus group questions
– Create a transferable framework
 Staff support and training
– PGCAP: the 3E framework forms the basis of the “Intro to E-Learning” session
– Bespoke departmental training: “Going further with VLE”
– Position local exemplars within the framework
– Build evaluation into module design so it is easier to establish actual impact

Questions?
Wayne Britcliffe and Simon Davis, University of York, UK

References
Smyth et al. (2011) Benchmark for the use of technology in modules. Edinburgh Napier University.
Higher Education Academy (2008) Challenges and Realisations from the Higher Education Academy/JISC Benchmarking and Pathfinder Programme: An End of Programme Review by the Higher Education Academy. Evaluation and Dissemination Support Team, September 2008. Available at: content/uploads/2008/09/Bench_and_PathFinalReview.pdf
ACODE (2007) ACODE benchmarks for e-learning in universities and guidelines for use. June 2007.