To prove, to improve or to learn? Lessons for evaluating technical, practical and vocational educational initiatives from an analysis of STEM evaluations.

Presentation transcript:

To prove, to improve or to learn? Lessons for evaluating technical, practical and vocational educational initiatives from an analysis of STEM evaluations Edge Research Conference 2012 Friday November 16th

Aims: Drawing on an analysis of evaluations in STEM education, we will: identify some of the problems with such evaluations; examine some potential solutions for the future; and examine the applicability to wider TVPL.

Introduction
– CSE and CEIR
– Science and Innovation Observatory
– Priorities: research, intelligence, evaluation, polemics
– Informing and influencing
– Independent body
– Reports, think-pieces, associates

Background: what is STEM and how does it relate to TVPL?
– In the UK, STEM developed from SET for Success (Roberts, 2002)
– The STEM Framework was launched, with a set of 11 'Action Programmes'
– Each of these contains further projects, many of which were evaluated; these form the bulk of our analysis
– Technical and vocational routes are part of some Action Programmes, but the main focus was academic

What do we know about STEM evaluations? Analysis of 20 STEM evaluations:
– 13 evaluations of projects, activities or programmes
– 4 event evaluations
– 2 evaluations of organisations
– 1 CPD evaluation

Examined:
– Aims
– Timings
– Methods
– Evaluation models
– Use of prior evidence
– Results and outcomes
– Impact on policy and practice
– Limitations
– Contribution to knowledge

Key points from the review
– Evaluation aims were not always explicitly stated.
– Timings did not always appear to match the purposes of the initiative being evaluated.
– Robust counterfactuals were rarely used.
– Explicit evaluation models were used in only a small number of cases.
– Reviews of literature, policy or similar initiatives were not usually presented.

Key points from the review (continued)
– Negative results were not usually presented in the same depth as positive results.
– Few evaluations looked to make recommendations beyond the project at hand.
– Evaluations tended not to make their limitations explicit.
– Contributions to a developing STEM knowledge base were very rare in the evaluations we looked at.
Conclusion: the potential for learning from these evaluations is severely limited.

Linked to the key points: the purposes of evaluation
– Controlling: to understand whether the project is going to plan
– Proving: to understand whether the project is achieving what was intended
– Improving: to understand how to modify the initiative to make it work better
– Learning: to provide transferable insights that help build a body of knowledge beyond the project at hand

Responses 1: A single evaluation framework? E.g. Stake (1996), Stufflebeam (2002), Cronbach (1982). Each of these organises the focuses of evaluation into three broad areas: context [antecedent, context, unit of focus/setting]; process [transaction, input/process, treatment]; and outcome [outcome, product, observations/outcomes].

Responses 1 (continued): Guskey?
– Reactions
– Learning
– Organisational support and change
– Use of new knowledge and skills
– Student outcomes
No support for this idea: one approach could not be designed that would be appropriate to the aims of every STEM project or evaluation. A multiplicity of approaches allows greater fit, flexibility and creativity, and hence is more likely to lead to transferable learning.

Responses 2: Theory-based approaches. There are a number of well-established 'theory-based' approaches, e.g. Realist Evaluation and Theory of Change. These develop hypotheses about the social world and test them out using a variety of means, an approach close to the scientific method.

Example: interventions aimed at directly improving students' attitudes to STEM subjects. Example theory: using interesting, innovative opportunities to learn improves attitudes to STEM, and hence improves learning outcomes and interest in STEM careers (e.g. After School Science and Engineering Clubs; the Engineering Education Scheme).
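To give a concrete flavour of what testing such a theory against a counterfactual could look like, the minimal sketch below (not part of the original slides) computes a simple difference-in-differences estimate of an intervention's effect on attitude scores, using a comparison group to stand in for the counterfactual. All variable names and survey figures are hypothetical.

```python
# Hypothetical sketch: difference-in-differences estimate of an intervention's
# effect on attitude-to-STEM scores. All data below are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """Change in the participant group minus change in the comparison group.

    The comparison group's change stands in for the counterfactual:
    what would likely have happened to participants without the intervention.
    """
    return (mean(treat_post) - mean(treat_pre)) - (mean(comp_post) - mean(comp_pre))

# Invented attitude-survey scores (1-10 scale), before and after a club ran for a term.
club_pre = [5.1, 6.0, 4.8, 5.5, 6.2]
club_post = [6.4, 6.9, 5.9, 6.6, 7.1]
comparison_pre = [5.3, 5.8, 5.0, 5.6, 6.1]
comparison_post = [5.6, 6.0, 5.2, 5.9, 6.3]

effect = diff_in_diff(club_pre, club_post, comparison_pre, comparison_post)
print(f"Estimated effect on attitude score: {effect:.2f}")
```

The point of the sketch is the design choice rather than the arithmetic: without the comparison group, the participants' raw improvement could simply reflect maturation or other concurrent changes, which is why the review flags the rarity of robust counterfactuals.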

Next steps for STEM:
1. Development of effective use of theory-based approaches to evaluation.
2. Systematic mining of current evaluation and research to develop a bedrock of evidence of the theoretical bases for initiatives, and their effectiveness in various contexts.
3. A commitment to using and building the evidence base through evaluation and research.

Next steps for technical, practical and vocational learning (TVPL)? Questions:
– Is there evidence of a similar lack of impact of evaluations in relation to TVPL?
– What analysis needs to be done to help answer this question?
– What needs to be done in TVPL to improve evaluation, and to what extent do the prescriptions in this paper for STEM evaluation apply to TVPL?

Want to get involved? Contact us: Mike Coldwell, Ken Mannion