A Structured Conversation: Enabling and Measuring Responsive Pedagogy
Dr Christine Couper & Dr Cathy Molesworth
Planning and Statistics, January 2018

Presentation transcript:

A Structured Conversation: Enabling and Measuring Responsive Pedagogy
Dr Christine Couper & Dr Cathy Molesworth
Planning and Statistics, January 2018
A HEFCE funded project

Aims
[Diagram linking the elements of the project: survey questions; survey responses & learning analytics; a structured conversation; applying change; outcomes (engagement & success)]

What we did
- Created a survey of teaching methods for staff and summarised the outcomes.
- Linked teaching survey responses to the average grade and fail rate per module using statistical analyses (a sketch of this linkage step follows below).
- Streamlined the links between JISC learning analytics and EvaSys module evaluation surveys.
- Used statistical analysis to link available module-related information, such as level of study, to student satisfaction rates measured with EvaSys surveys.
- Created a "pulse survey" to measure students' engagement, views of learning gain, and views of teaching, then summarised the outcomes.
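The slides describe the linkage step only at a high level. The following is a minimal sketch of what it might look like in Python with pandas; the file names and column names are invented for illustration and are not the project's actual schema.

```python
import pandas as pd

# Hypothetical inputs, one row per module in each file.
# All file and column names here are illustrative assumptions.
survey = pd.read_csv("staff_survey_responses.csv")   # teaching-methods survey
outcomes = pd.read_csv("module_outcomes.csv")        # average mark, fail counts
evasys = pd.read_csv("evasys_satisfaction.csv")      # proportion dissatisfied

# Join the three sources on a shared module code so that each row
# holds a module's survey answers alongside its outcome measures.
linked = (
    survey
    .merge(outcomes, on="module_code", how="inner")
    .merge(evasys, on="module_code", how="left")
)

print(linked[["module_code", "average_mark", "n_failures"]].head())
```

A linked table of this kind is what the regression models described on the later slides would be fitted to.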

Staff Survey Overview
This summary focuses on 537 survey responses, one per module, from 275 staff who stated either that they were the module leader or that they were familiar with the most recent presentation of the module.
[Chart: percentage of the 537 survey responses relating to the 2015/16 vs. the 2016/17 academic session]

Introduction: Methods
[Illustrative example of the multiple-choice and free-text question types]
Questions were compulsory multiple-choice questions; most also offered a free-text option for optional comment. For an example, see the illustration above.

Predictors of average grade per module & fail rate per module
Topics we asked staff survey questions on:
- Academic session
- No. of staff on academic contracts
- No. of staff on hourly-paid contracts
- Active inquiry: team based; collaborative; peer; situated; flipped

Predictors of average grade per module & fail rate per module
Topics we asked staff survey questions on (continued):
- Research: taught; latest findings; trained; prepared; project (others'); project (student's own)
- Co-design / co-production
- Diagnostic tests
- Novel assessments
- MMP / MMA
- TESTA
- Employability
- Moodle enhancements

Predictors of average grade per module & fail rate per module
Topics we asked staff survey questions on (continued):
- Audio/video: recording sessions; assignment briefings; assignment feedback; self-assessment; recording teaching; recorded advice
- EvaSys used
- EvaSys feedback

Predictors of average grade per module & fail rate per module
Module information that we gathered:
- No. of students
- Level
- Credit
- Teaching: scheduled hours; independent hours; placement hours
- Assessment: coursework %; written work %; practical work %
- JACS subject area codes (16 subject areas)
- Proportion dissatisfied (EvaSys)

Statistically significant predictors of average mark per module (linear regression)
[Table: regression results. Reference categories in bold italics; predictors with a lower average mark than the reference shown in blue. Blue is bad!]
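The slide names linear regression with reference categories; as a hedged illustration of that approach, here is a sketch in Python using statsmodels, reusing the hypothetical linked table from the earlier sketch. The predictor names and reference levels below are assumptions, not the project's actual variables.

```python
import statsmodels.formula.api as smf

# Linear regression of average mark per module on categorical and
# numeric predictors. Treatment coding fixes the reference category
# (rendered in bold italics on the slide); a negative coefficient on a
# category corresponds to a "blue" (lower-than-reference) cell.
# All variable names below are illustrative assumptions.
ols_model = smf.ols(
    "average_mark"
    " ~ C(team_based_inquiry, Treatment(reference='Never'))"
    " + C(level, Treatment(reference='Level 4'))"
    " + independent_hours + coursework_pct",
    data=linked,
).fit()

print(ols_model.summary())  # coefficients, p-values, fit statistics
```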

Statistically significant predictors of failure rate per module (negative binomial regression)
[Table: regression results. Reference categories in bold italics; predictors with a larger failure rate than the reference shown in blue. Blue is bad!]
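For the failure-rate model the slide names negative binomial regression. One common way to set this up, sketched below under the same assumptions about variable names, is to model the count of failures per module with the module's student count as an exposure term, so that the coefficients act on the failure rate rather than the raw count.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Negative binomial GLM (log link) for the number of failures per module.
# Passing the student count as `exposure` turns the count model into a
# rate model; a positive coefficient on a category means a larger failure
# rate than the reference ("blue is bad"). Variable names are illustrative.
nb_model = smf.glm(
    "n_failures"
    " ~ C(own_research_project, Treatment(reference='All students'))"
    " + C(testa_used, Treatment(reference='Yes'))"
    " + written_exam_pct",
    data=linked,
    family=sm.families.NegativeBinomial(),  # dispersion alpha left at default
    exposure=linked["n_students"],
).fit()

print(nb_model.summary())
```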

From a different analysis: statistically significant predictors of failure rate per module, for written work (negative binomial regression)
[Table: regression results. Reference categories in bold italics; predictors with a larger failure rate than the reference shown in blue. Blue is bad!]

What helps students?
- Not having team-based active inquiry all or most of the time
- Taking subjects within certain JACS codes (with biological sciences as the reference category): physical sciences; languages; business & administrative studies; mathematical sciences; mass communications & documentation; creative arts & design
- Having 100 hours or more of independent work
- Having a higher percentage of assessment by coursework (35% or more)

What hinders students?
- Not having reached Level 6 yet
- Students not conducting their own research project at all
- Not implementing TESTA
- Increases in student numbers (the effect is small)
- Having a high percentage of assessment as written exams (70% or more)

Caveats
- Some findings are difficult to interpret at this stage. For example, missing data is a significant predictor, and having one member of staff on an academic contract predicts a higher grade than having two, but a lower grade than having none.
- Some of the sample sizes for the sub-categories are too small to be reliable.
- The statistical analyses pick up patterns of association, not causality. Does having a higher percentage of coursework really help, or does it merely signify the absence of written exams, which are problematic for some students?

Staff Survey January 2018