College of Science and Engineering
Evaluation of the Learning and Teaching Strategy: The Way Forward?
Velda McCune, Centre for Teaching, Learning and Assessment

The current evaluation strategy
Questionnaire 1 - given at the start of the semester:
- responsible learning
- possible influences on responsible learning from the students
Questionnaire 2:
- responsible learning
- perceptions of the course
Qualitative data:
- group interviews in selected courses
(See the sketch below for how the two questionnaire waves might be joined per student.)
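
The two-wave design lends itself to a simple matched data structure. The following is a minimal Python/pandas sketch of how responses from the two questionnaires might be joined per student; all column names and values here (student_id, responsible_learning_t1, and so on) are hypothetical placeholders, not the actual instrument items.

    import pandas as pd

    # Hypothetical sketch of the two-wave design; values are illustrative.
    q1 = pd.DataFrame({                        # first wave, start of semester
        "student_id": ["s01", "s02", "s03"],
        "responsible_learning_t1": [3.4, 4.1, 2.8],
        "intrinsic_motivation": [3.9, 4.5, 2.6],
    })
    q2 = pd.DataFrame({                        # second wave
        "student_id": ["s01", "s03"],          # s02 did not return the second form
        "responsible_learning_t2": [3.6, 3.0],
        "informative_feedback": [4.0, 3.2],
    })

    # Only students who completed both waves can be tracked over the semester.
    matched = q1.merge(q2, on="student_id", how="inner")
    print(matched)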

Advantages of the present strategy
Allowed us to demonstrate correlations between (see the sketch below):
- responsible learning and student influences (e.g. intrinsic motivation positively related to RL)
- responsible learning and students' perceptions of courses (e.g. informative feedback positively related to RL)
Qualitative data offered richer insights into the students' experiences, e.g. the value of opportunities to test understanding (and there is more to come from these data)
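
As an illustration of the correlational analyses above, the sketch below computes a Pearson correlation between two scale scores. The data are simulated purely for demonstration (the positive association is built in by construction); a real analysis would use the questionnaire scale scores.

    import numpy as np
    from scipy.stats import pearsonr

    # Simulated scores for illustration only: responsible learning is
    # constructed to covary positively with intrinsic motivation.
    rng = np.random.default_rng(42)
    motivation = rng.normal(3.5, 0.6, size=200)
    responsible_learning = 0.5 * motivation + rng.normal(1.5, 0.5, size=200)

    r, p = pearsonr(motivation, responsible_learning)
    print(f"r = {r:.2f}, p = {p:.3f}")  # positive r here by construction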

Advantages of the present strategy (cont'd)
- Provided an indication that the courses implementing the strategy were typically well received
- Provided a positive view of students' motivation
- Allowed some specific feedback to course teams (but there is more to do here)
- Could be integrated with a College-wide evaluation process

Disadvantages of the present strategy
- There are often a lot of missing data from the questionnaires
- It is difficult to infer causality because:
  - courses are typically only modified in modest ways
  - we typically don't have access to pre- and post-intervention data
  - courses not officially involved in the strategy are often quite in line with it
  - missing data make comparing the same student across courses problematic (illustrated below)
- Struggling or failing students are the least likely to complete questionnaires
- Qualitative data are really needed to make sense of the questionnaire findings, but they are time-consuming and costly to collect and analyse
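
A minimal illustration of the missing-data point above, with made-up data and hypothetical column names: a within-student comparison across courses can only use students who returned questionnaires in both courses, so non-response shrinks the usable sample quickly.

    import pandas as pd

    # Made-up long-format responses: one row per student per course.
    responses = pd.DataFrame({
        "student_id": ["s01", "s01", "s02", "s03", "s03", "s04"],
        "course":     ["A",   "B",   "A",   "A",   "B",   "B"],
        "rl_score":   [3.4,   3.7,   4.1,   2.8,   3.0,   3.9],
    })

    # Reshape to one row per student; gaps appear wherever a form is missing.
    wide = responses.pivot(index="student_id", columns="course", values="rl_score")
    paired = wide.dropna()  # only complete cases allow a within-student comparison
    print(f"{len(wide)} students responded at least once, "
          f"but only {len(paired)} can be compared across both courses")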

Disadvantages of the present evaluation strategy (cont'd)
- The Learning and Teaching Strategy covers a wide range of possible changes to courses, which makes it difficult to have a clear-cut picture of its impact
- It is rare to show clear-cut findings with such measurement instruments in large first-year courses
- The current questionnaire does not cover all of the areas in the National Student Survey
- We do not have the staffing levels to scale up this evaluation much more, and our turnaround time on the data could be better
- There is a question as to whether more data collection of the same kind will really add to what we know

Alternative 1 - The same overall approach focused on fewer courses
- Would allow more complete quantitative data to be collected (e.g. turning up in person to collect it, following up non-response)
- Would allow richer qualitative data to be collected (e.g. more interviews with students, interviews with staff)
- Would allow a more reasonable turnaround time on the data
- Might allow us to do additional types of analysis (e.g. looking at grades or prior qualifications)

Alternative 2 - In-depth development and evaluation casework in very few courses
- Would allow a very detailed picture of a small number of courses, which could be written up to show their potential relevance to other courses
- Would allow a more integrated approach to educational development and evaluation work, making successful substantive changes more likely
- The process can be tailored and iterative (e.g. questionnaires can be designed to tap into particular innovations)
- Could only be done in very few courses, e.g. one per semester

Alternative 3 - Focus on the experiences of particular groups of students
- A focus on the most successful students would help us to understand the roots of responsible learning (e.g. in their prior experiences or attitudes)
- Data from successful students could be worked up into guidelines for all students
- A focus on struggling or failing students would help us to understand what might get in the way of responsible learning
- It would be very important to look at the effects of courses through the eyes of these students (and not to assume a 'good' student / 'bad' student model)

Alternative 4 - Focus on the experiences of academic staff
- It may be that the most measurable impacts of the strategy will be on staff attitudes and expectations
- Academics' views of themselves as teachers are fundamental to how they approach their teaching
- We could collect data about what supports or hinders educational change

Alternative 5 - Focus on a specific topic for a semester (e.g. feedback)
- Would allow an in-depth approach to a topic, which might support faster progress
- Data collection could focus on courses of particular interest for the topic
- Would allow time for a wider literature search
- Would allow time to make more of examples of good practice from other HEIs