The issue of retention in distance learning: a strategic approach to using data to inform retention interventions. Alison Gilmour and Avinash Boroowa, The Open University.

Presentation transcript:

The issue of retention in distance learning: a strategic approach to using data to inform retention interventions Alison Gilmour and Avinash Boroowa The Open University

Struggling students want to be noticed.

Overview: our retention project. Proactive student support; predictive modelling; studying more than 120 credits. Focus of the session: a retention project which draws together, and is informed by, our previous experiences of proactive student support (in particular the pre-course calls), as well as the use of predictive modelling to target resources most effectively. We are keen to share our approach and interim findings, and to get your views on the nature of this type of student support and how it compares with work done previously in OUiS. We also want to open up discussion on some of the difficult topics we have encountered as part of this project, to seek your views. We will return to predictive modelling and the nature of student support later in the presentation. The key point is our project objective: to improve the retention rate of 'at risk' students, as defined by the Strategy and Information Office Predictive Model (SIOPM).

Overview: multiple project drivers

Project Team: various staff from The Open University in Scotland, including ALs, Educational Advisors and Learning Development; Strategy & Information Office; Learning & Teaching Innovation. A key aspect of this project, and one we feel is quite distinct from previous retention projects at the OUiS, is the collaborative approach, which has been a key strength in shaping all aspects from the design and methodology through to the collection and analysis of data and next steps. It draws on different types of expertise, whether that be statisticians, data analysts, student advisors, associate lecturers or learning development staff. Another key element of the project is the collation of evidence (not just related to the outcomes for students, but also operational data associated with the interventions). Although this is being run as a pilot project in Scotland, we are feeding into the body of evidence being generated as part of the Early Alert Indicators / Data Analytics projects.

Phase 1: SIOPM. What are the student probabilities? Overarching: there are three phases to the project, and we are currently in phase 2. Phase 1 took place between February 2016 and June 2016, when the Strategy and Information Office compared the actual performance of the entire Scottish cohort in 2014J with the predictive model run against the same cohort. The project team felt confident that the predictive model was fit for use with a Scottish cohort. Correctly identifying more non-completers: within IET, statistical models have been trialled to identify the expected pass rate for cohorts of students taking particular modules. These models use information relating to module design, and both the demographic characteristics and study history of module students, to yield a predicted probability that each student will pass the module. The Open University is interested in ensuring that all students receive appropriate and timely support. Models of this kind could, in principle, be used to identify students who are at risk of dropping out or failing their modules, so that they could be targeted with appropriate interventions by the relevant Student Support Team. Pilot work is being undertaken to determine the usefulness of this approach. Probabilities are generated for: retention to the above milestones; module completion; module pass; and return in the following academic year.
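The slides do not describe the SIOPM's internals, but a minimal sketch of how a model of this kind can turn student features into a per-student pass probability is a logistic model. Everything here is hypothetical: the feature names, coefficients and bias are illustrative stand-ins, not the real SIOPM inputs or weights.

```python
import math

def pass_probability(features, weights, bias):
    """Logistic model: predicted probability that a student passes.

    The features and weights are hypothetical; the real SIOPM draws on
    module design, demographic characteristics and study history, none
    of which are published in these slides.
    """
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical student: [prior modules passed, credit load / 120, new-student flag]
student = [2, 0.5, 0]
weights = [0.8, -1.2, -0.6]  # illustrative coefficients only
print(pass_probability(student, weights, bias=0.3))
```

A real model would of course be fitted to historical cohort outcomes rather than using hand-picked weights; the point is only that the output is a probability per student, which is what the band-based selection later in the talk operates on.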

Phase 1: the SIOPM. How are the student probabilities generated? Different resilience factors influence the model at different points in the module.

Phase 1 (February – June 2016): testing the application of predictive modelling with a Scottish cohort. Our aim is to reduce the number of non-completions. How accurate was the SIOPM in predicting non-completions for students in Scotland in 2014J? SIOPM October 2014 completion predictions were compared with actual completion status at the end of the module. How does using the SIOPM to predict non-completion compare with using a single variable? Correctly identifying more non-completers: within IET, statistical models have been trialled to identify the expected pass rate for cohorts of students taking particular modules. These models use information relating to module design, and both the demographic characteristics and study history of module students, to yield a predicted probability that each student will pass the module. Predictive modelling is not currently used to enhance the attainment or the experience of individual students. However, the Open University is interested in ensuring that all students receive appropriate and timely support. Models of this kind could, in principle, be used to identify students who are at risk of dropping out or failing their modules, so that they could be targeted with appropriate interventions by the relevant Student Support Team. Pilot work is being undertaken to determine the usefulness of this approach.

Identifying ‘at risk’ students Step 1: Comparing Selection Methods

Identifying ‘at risk’ students Step 2: Further Refinement of the Selection

Identification of 'at risk' students. Step 2: further refinement of the selection. The lower the range selected, the higher the percentage of non-completers within the selection. Student probabilities allow users to refine the selection, by narrowing the selection range, to identify a number of students that suits the capacity and resources available. The example shows the number of students in each probability band; this can be used to focus on one or more bands to suit both the target number of students and the target range.
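The banding-and-selection step described above can be sketched in a few lines. This is an illustration only: the field name `p_complete`, the band width and the example records are all hypothetical, standing in for whatever the SIOPM actually outputs.

```python
from collections import Counter

def band_counts(probabilities, band_width=0.1):
    """Count students in each completion-probability band, e.g. (0.3, 0.4).

    Narrowing the selected band trades the number of students contacted
    against the concentration of likely non-completers within it.
    """
    counts = Counter()
    for p in probabilities:
        low = min(int(p / band_width), int(1 / band_width) - 1) * band_width
        counts[(round(low, 2), round(low + band_width, 2))] += 1
    return dict(counts)

def select_at_risk(students, low, high):
    """Students whose predicted completion probability falls in [low, high)."""
    return [s for s in students if low <= s["p_complete"] < high]

# Hypothetical records; a real run would use SIOPM probabilities.
cohort = [{"id": 1, "p_complete": 0.35},
          {"id": 2, "p_complete": 0.82},
          {"id": 3, "p_complete": 0.48}]
print([s["id"] for s in select_at_risk(cohort, 0.3, 0.5)])
```

In practice a team would inspect the band counts first, then choose the range whose student numbers match the support capacity available, exactly as the slide describes.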

Phase 2: Retention Intervention Nature of Intervention Rationale: To ensure students who were identified as ‘at risk’ were contacted to offer additional support if required.

Interim data. The first interim analysis highlights a statistically significant positive impact on retention at day 14 for students contacted via the project (the Intervention Group) compared with the Control Group. Retention to the 50% fee liability point (31st December) for the Intervention and Control Groups again shows a statistically significant positive difference in favour of the Intervention Group. Caution: these are only active withdrawals, and the figures include all students receiving the intervention, including those who only received SMS and email. Not only is actual retention to the 50% point better for the Intervention Group; this also converts into an increased projected number of completions and passes (though we need to be cautious, as these are still predictions). In terms of the student number gain, the sizes of the two start groups are aligned, so if we assume the Control and Intervention Groups were of the same size, this is what the gain would be.
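The slides report statistical significance but not which test was used. One standard way to compare retention proportions between an intervention and a control group is a two-proportion z-test, sketched below. The counts are purely illustrative, not the project's actual figures.

```python
import math

def two_proportion_z(retained_a, n_a, retained_b, n_b):
    """Two-proportion z-test comparing retention rates of two groups.

    Returns the z statistic and a two-sided p-value. Illustrative only;
    the slides do not state which test the project actually applied.
    """
    p_a, p_b = retained_a / n_a, retained_b / n_b
    pooled = (retained_a + retained_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 420/500 retained (intervention) vs 380/500 (control)
z, p = two_proportion_z(420, 500, 380, 500)
print(z, p)
```

With a p-value below the usual 0.05 threshold, the difference in retention would be reported as statistically significant, which is the kind of claim the interim analysis makes.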

Next steps and Phase 3: some caution, but encouraging signs in the interim data. Still to come: data on completion (July 2017); disaggregation of students within the Intervention Group (a difference between those contacted by SMS/telephone and those contacted by SMS/email); and consideration of students who moved out of the band. We are currently exploring the potential of broader contextual evidence that would allow us to better understand student behaviour: mining of student records for both Control and Intervention Groups (to consider specific module interventions and the broader intervention landscape), and analytics such as VLE behaviour. Planning for Phase 3: changing the intervention? Running additional interventions with the same band? Running the same intervention with another band?

Contact Alison Gilmour, The Open University in Scotland alison.gilmour@open.ac.uk Broader Project Team includes: Avinash Boroowa, Lucy Macleod, Hannah Jones, Galina Naydenova, Rebecca Ward and Christothea Herodotou.