Evaluation and Attitude Assessment
BME GK-12 Fellows
12/1/10, 4-5pm, 321 Weill Hall
Tom Archibald, PhD Student, Adult & Extension Education
GRA, Cornell Office for Research on Evaluation

Quick overview of evaluation
– merit and worth
– program improvement
– knowledge generation
Very similar to research, with some differences.

CORE’s “touchstone” statement on the goal of evaluation: To obtain accurate, useful insights about the answers to evaluation questions in a manner that is feasible, is credible to relevant stakeholders, makes strategic use of limited time and resources, and contributes to our general knowledge, to future evaluations and to program evolution.

A systems approach for evaluation planning
Programs are viewed (and modeled) as:
– parts of larger systems
– dynamic and evolving
– related to other programs in the present
– connected to past and future programs
– being perceived differently by different stakeholders and systems
Evaluation plan development takes this into account and should help programs and systems evolve.

Step: Program and Lifecycle Analysis
[Figure: program phase and evaluation phase mapped across the lifecycle stages Initiation, Development, Maturity, and Translation & Dissemination, illustrated with the RET, REU, and Module-Based Education programs]

Step: Pathway Modeling

The key elements of Evaluation Plan quality are:
1. Consistency with a high-quality program model
2. Fitness of evaluation plan elements to the program and program context
3. Internal alignment of the evaluation plan elements
4. Holistic coherence

1. Consistency with a high-quality program model
– A "high-quality model"…
  – is grounded in knowledge of the program
  – incorporates perspectives of multiple stakeholders
  – shows causal pathways (program logic)
  – reflects careful thought about program "boundaries"
  – includes program assumptions and key elements of context
  – is connected to the program evidence base (relevant research)
– "Consistency with…" means
  – evaluation questions can be located in terms of model elements
  – the evaluation "scope" makes sense

2. Fitness of evaluation questions and other evaluation plan elements to the program and program context
– Evaluation questions are "mined" from the model
– Evaluation questions are appropriate for the program's maturity and stability, existing state of knowledge, and program needs
– Evaluation focus, methods, and tools meet the needs of key stakeholders
– Evaluation plan makes efficient and strategic use of program and evaluation resources

3. Internal alignment of the evaluation plan elements
– Measures fit the constructs
– Each measure is the most strategic option among those that fit
– Design is appropriate for the lifecycle stage
– Design can support the claims implied in the purpose statement
– Sampling and analysis plans can generate the evidence needed

4. Holistic coherence… the elusive element
– Evaluation planning requires myriad decisions about multi-faceted tradeoffs
– Making these decisions well requires a holistic comprehension of the program and the environment and systems that embed it
– These decisions may be invisible in the written plan, so some of the resulting quality can be ascertained from the plan and some cannot

Getting to Measures Worksheet
1. Starting Points:
(a) Copy here what you want to know about your program for this Evaluation Question.
(b) Copy here your formal Evaluation Question for this inquiry.
2. Clarify the "constructs" in your Evaluation Question above – what exactly do you mean by the activity and/or outcome you are focusing on?
Note: revisit your Evaluation Question above and see if it needs to be revised to capture this precision.

3. Using this sharper definition of what's being evaluated, work through the following sequence of questions:
(a) What would this "look like" or consist of in practice? (List as many options as you can. Be creative – think outside the "usual" options.)
(b) What might serve as evidence? For the most promising candidates in (a), write down various kinds of evidence that would be informative about whether this has occurred.
(c) Review the strengths and weaknesses of the options, taking into account several aspects that matter: "closeness" to the real thing; accuracy and reliability. Write down a short list of the most promising candidates from (b).

4. How could you gather this evidence? Again, think of as many options as you can and list them here. (Possibilities might include video-taping a demonstration, live observation during class sessions, directly asking participants, asking people who know the participants, testing the end-products, etc.) Identify the ways that seem best, taking into account accuracy, feasibility, and fit with the program context and target population. Indicate the top choices among those on your list. Once again, take a moment to revisit your Evaluation Question phrasing in 1(b) and see if it needs updating.
5. What types of measure would be needed? From your answer(s) to 4, ask yourself what type(s) of measure would allow you to gather evidence in this way. Here is a list of possible measure types from which to choose: case study, interview, observation, group assessment (e.g. focus group), expert or peer review, portfolio review, testimonial, test of knowledge or skill, photograph, slide or video, diary or journal, log, document analysis, action cards, simulation, problem story, creative expression, unobtrusive measures.
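To make the worksheet flow concrete, here is a minimal, hypothetical sketch in Python (not part of the original workshop materials) of one completed pass through steps 1–5 for an invented GK-12 evaluation question about student attitudes; every question, construct, and choice shown is an illustrative assumption, not an actual evaluation plan.

# Hypothetical pass through the "Getting to Measures" worksheet, recorded as a
# simple data structure. All entries below are invented for illustration only.
worksheet = {
    # 1(a)/(b): what we want to know, and the formal Evaluation Question
    "want_to_know": "Do students become more positive about engineering careers?",
    "evaluation_question": ("To what extent do students' attitudes toward engineering "
                            "change over a semester of fellow-led BME modules?"),
    # 2: clarified constructs
    "constructs": ["attitude toward engineering careers"],
    # 3(a)-(c): what this looks like in practice, candidate evidence, short list
    "looks_like_in_practice": ["students voluntarily talk about engineering",
                               "students say they would consider an engineering major"],
    "candidate_evidence": ["pre/post interviews", "classroom observation notes",
                           "student journals"],
    "short_list": ["pre/post interviews", "student journals"],
    # 4: how the evidence could be gathered
    "data_collection": ["interview a sample of students in weeks 1 and 15",
                        "collect journal entries after each module"],
    # 5: measure types drawn from the list on the slide above
    "measure_types": ["interview", "diary or journal", "observation"],
}

for step, entry in worksheet.items():
    print(f"{step}: {entry}")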

For more explanation of measure types, see: Taylor-Powell, E., & Steele, S. (1996). Collecting Evaluation Data: An Overview of Sources and Methods. Cooperative Extension Publications, University of Wisconsin, Madison, WI. (Evaluation-Data-An-Overview-of-Sources-and-Methods-P1025C237.aspx)

Find vs. write
Summary checklist (questions to help make the "find" versus "write" decision):
1. Is the evaluation question assessing an outcome? If yes, do you need to make a strong claim of effectiveness?
2. Do you need to compare your evaluation results to those of others or to some external norm?
3. Do stakeholders require a pre-validated measure?
4. Are staff resources available to draft measures in accordance with best practices? To conduct a high-quality search for measures?
5. After a quick scan, does it appear you are likely to be able to find a measure that fits your construct through peers, the Netway, your program system, program funders, or professional groups related to your program?
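As a rough illustration only, the checklist can be read as a simple decision aid. The following Python sketch is an assumption about how one might weigh the yes/no answers; the scoring rule and the answer names are invented here, not taken from the CORE materials.

# Hypothetical decision aid for the "find" versus "write" checklist.
# The weighting below is illustrative; the slides do not prescribe a scoring rule.
def find_or_write(answers: dict) -> str:
    """Suggest 'find' or 'write' from yes/no answers to the five checklist questions."""
    find_signals = [
        answers.get("needs_strong_effectiveness_claim", False),        # Q1
        answers.get("needs_external_comparison", False),               # Q2
        answers.get("stakeholders_require_validated_measure", False),  # Q3
        answers.get("existing_measure_looks_findable", False),         # Q5
    ]
    can_draft_in_house = answers.get("staff_can_draft_measures", False)  # Q4
    if sum(find_signals) >= 2 or not can_draft_in_house:
        return "lean toward finding an existing measure"
    return "lean toward writing your own measure"

# Example use with made-up answers:
print(find_or_write({
    "needs_strong_effectiveness_claim": True,
    "needs_external_comparison": False,
    "stakeholders_require_validated_measure": True,
    "staff_can_draft_measures": True,
    "existing_measure_looks_findable": True,
}))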