Advance HE Surveys Conference

Advance HE Surveys Conference, 8th May 2019
Using MEQs to inform teaching excellence
Dr Tim Linsey, Head of Academic Systems & Evaluation
Academic Systems & Evaluation, Directorate for Student Achievement, Kingston University
t.linsey@Kingston.ac.uk

Background – Reintroduction of MEQs
Decision taken in January 2017 to reintroduce MEQs
MEQ Working Group: 10 quantitative + 2 qualitative questions
March 2017 – University using Blue and paper surveys
November 2017 to July 2018 – primarily online surveys
September 2018 – all online surveys (with option for paper)
MEQ environment: Blue from Explorance

Orchestrated approach
Briefing guide and PowerPoint for all module leaders
Set of agreed statements to be conveyed to students
Student-created video introducing MEQs
Staff asked to find a slot in class
Staff requested to leave the class for 15 minutes
Use of course representatives

VLE Integration
"My Module Evaluations"

Processes & Timing MEQs run all year but two main survey windows (16 days) Automatic publishing of MEQs in to each module in the VLE Reports automatically Published into the VLE within a few hours of an MEQ completing Systems – mostly automated Integration of Blue with the SIS and VLE Tableau Dashboards Aiming for full automation for 2019/20

2018/19 (to March)
832 MEQ reports generated (exceeding the minimum threshold of 4 responses)
76% of student responses contained qualitative feedback
38% of students completed one or more MEQs
47% completed via mobile devices
Communications: plasma screens, university buses, emails, VLE, intranet

Module Reports
Staff and student reports are similar, except that the student version excludes comments and comparisons (department and faculty averages)

Qualitative feedback: "Best things" and "Improve" (the two open-ended questions)

Further Reports
Department, Faculty and University aggregate reports
Summary reports for each Faculty
Reports on modules with zero responses or below the minimum threshold
Custom reports

Summary Report for all Modules, 2016/17
Summary table ranking all modules by their mean overall score
Colour coded: scores ≥ 4.5 and ≤ 3.5 highlighted

Summary Report for all Modules, 2017/18
Colour coding was problematic
Staff suggestion: rank modules by their deviation from the overall university mean, in standard deviations (see the sketch below)
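A minimal sketch of that staff-suggested ranking, assuming a simple per-module table; the file and column names (module_means.csv, module_code, mean_score) are illustrative, not the actual report schema:

```python
# Rank modules by how far their mean MEQ score sits from the overall
# university mean, measured in standard deviations (a z-score-style rank).
import pandas as pd

modules = pd.read_csv("module_means.csv")  # hypothetical export

uni_mean = modules["mean_score"].mean()
uni_sd = modules["mean_score"].std()

modules["sd_from_mean"] = (modules["mean_score"] - uni_mean) / uni_sd
ranked = modules.sort_values("sd_from_mean", ascending=False)
print(ranked[["module_code", "mean_score", "sd_from_mean"]].head())
```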

Additionally: comparison of 2016/17 vs 2017/18

Statistical Analysis
Wilcoxon test used to compare aggregate data between 2017 and 2018 (at mixed and Faculty-aggregated levels)
Weak but significant negative correlation between module size and mean MEQ score (Spearman's rank)
Weak but significant positive correlation between mean score and completion percentage (Spearman's rank)
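A sketch of these tests using scipy, assuming one row per module with 2017 and 2018 aggregate scores; all file and column names here are hypothetical:

```python
import pandas as pd
from scipy import stats

meq = pd.read_csv("meq_aggregates.csv")  # hypothetical: one row per module

# Paired Wilcoxon signed-rank test comparing 2017 and 2018 aggregate scores
stat, p = stats.wilcoxon(meq["score_2017"], meq["score_2018"])
print(f"Wilcoxon: statistic={stat:.1f}, p={p:.4f}")

# Spearman's rank correlations (rank-based, so appropriate for ordinal data)
rho, p_rho = stats.spearmanr(meq["module_size"], meq["mean_score"])
print(f"module size vs mean score: rho={rho:.2f}, p={p_rho:.4f}")

rho, p_rho = stats.spearmanr(meq["mean_score"], meq["completion_pct"])
print(f"mean score vs completion %: rho={rho:.2f}, p={p_rho:.4f}")
```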

We noted
Care is needed with aggregated data and the inferences drawn from it
An individual MEQ report is informative for a module team that knows the local context, but it should be read alongside trends and other metrics
Significant churn in MEQ module rankings between 2017 and 2018

Summary Report for all Modules, 2018/19
Reviewed our approach to address issues raised in the literature:
Comparisons between modules of different types, levels, sizes, functions, or disciplines
Averaging of ordinal-scale data
Bias
Internal consistency
(e.g. Boring, 2017; Clayson, 2018; Hornstein, 2017; Wagner et al., 2016)

November 2018 Summary Report
Sorted by Faculty, level and response rate

Statistical confidence
Methodology: Dillman, D., Smyth, J. & Christian, L. (2014) Internet, Phone, Mail and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.
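One common way to express statistical confidence in the spirit of Dillman et al. (2014) is a margin of error for a proportion with a finite population correction, where N is the module cohort and n the number of respondents. This is a generic illustration, not necessarily the exact calculation used:

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite population correction."""
    fpc = (N - n) / (N - 1) if N > 1 else 0.0
    return z * math.sqrt(p * (1 - p) / n * fpc)

# e.g. 20 respondents from a module of 60 students
print(f"margin of error: +/-{margin_of_error(20, 60):.1%}")
```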

Ranking by % Agree
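A sketch of computing "% Agree" per module, assuming a 5-point scale where responses of 4 and 5 count as agreement (the usual convention; the MEQ's exact coding is not stated in the slides). The file and column names are invented:

```python
import pandas as pd

df = pd.read_csv("meq_responses.csv")  # hypothetical: module_code, response (1-5)

pct_agree = (
    df.assign(agree=df["response"] >= 4)   # assume 4/5 = agree/strongly agree
      .groupby("module_code")["agree"]
      .mean()
      .mul(100)
      .sort_values(ascending=False)
)
print(pct_agree.round(1).head())
```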

Frequency Distributions
Staff are asked to also review the frequency distribution of their responses
Is the distribution bimodal, and if so, why?
(Example shown: a distribution with mean = 2.9)
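To illustrate why the distribution matters, these two invented response sets share a mean of 2.9 but tell very different stories:

```python
import numpy as np

clustered = [2, 3, 3, 3, 3, 3, 3, 3, 3, 3]  # most students answer 3
bimodal   = [1, 1, 1, 1, 1, 4, 5, 5, 5, 5]  # a split cohort

print(np.mean(clustered), np.mean(bimodal))  # both 2.9
```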

Aggregating Questions to Themes
Teaching / Assessment / Academic Support / Organisation
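A sketch of the theme aggregation, with an invented question-to-theme mapping (the real mapping is not given in the slides; the averaging caveats noted in the literature review still apply):

```python
import pandas as pd

# Invented mapping of the 10 quantitative questions to the four themes
theme_map = {
    "Q1": "Teaching", "Q2": "Teaching", "Q3": "Teaching",
    "Q4": "Assessment", "Q5": "Assessment",
    "Q6": "Academic Support", "Q7": "Academic Support",
    "Q8": "Organisation", "Q9": "Organisation", "Q10": "Organisation",
}

scores = pd.read_csv("question_scores.csv")  # hypothetical: module_code, question, mean_score
scores["theme"] = scores["question"].map(theme_map)

theme_scores = scores.groupby(["module_code", "theme"])["mean_score"].mean()
print(theme_scores.head())
```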

Data Warehouse
Raw data passed to the KU Data Warehouse
Tableau dashboards (Strategic Planning and Data Insight Department)
Dashboards accessible by all staff, including the top 5 and bottom 5 modules at each level
Data aggregated, with the ability to drill down to module level

Annual Monitoring and Enhancement Process
MEQ results are pre-populated into Module Enhancement Plans
Course Metrics dashboard

Issues & Developments When should the MEQ be distributed? – Focus Group feedback Staff being named in qualitative feedback & issues of etiquette Students concerned about anonymity GDPR 47% students completing MEQs via Mobile Devices Automation – Administration & Analysis Response rates – followed up with modules with high response rates. Feedback to Students Demographic analysis

Collaborative
Led by the Academic Systems & Evaluation Team, with:
Information & Technology Services
Strategic Planning and Data Insight
Academic Registry
Faculties, via the MEQ Working Group
Student course representatives
Explorance

Any Questions?