Collecting High Quality Outcome Data: Part 1 Copyright © 2012 by JBS International, Inc. Developed by JBS International for the Corporation for National & Community Service

Collecting High Quality Outcome Data: Part 1 Learning Objectives
By the end of this module, you will be able to:
- Recognize the benefits of collecting high-quality data
- Use your theory of change to think about measurement
- Identify and evaluate the merits of data sources and instruments
- Describe some uses of data collection methods and evaluate their merits

Collecting High Quality Outcome Data: Part 1 Module Overview
- Determining what information is needed: theory of change as a guide to measurement
- Collecting data that answers the measurement question: data source, method, instrument
- Summary of key points; additional resources
A note about terminology: programs and projects are used interchangeably.

Collecting High Quality Outcome Data: Part 1 What Do We Mean By Data?
- Data: information collected to answer a measurement question; also known as evidence
- Data collection is a planned process that involves recording information in a consistent way
- Instruments aid in collecting consistent data

Collecting High Quality Outcome Data: Part 1 Ensuring Data Quality: Reliability, Validity, Bias
- Reliability is the ability of a method or instrument to yield consistent results under the same conditions.
- Validity is the ability of a method or instrument to measure accurately.
- Bias involves systematic distortion of results stemming from how data are collected and how instruments are designed.
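As a concrete illustration of checking reliability (not part of the original module), the sketch below computes Cronbach's alpha, a standard statistic for the internal consistency of a multi-item instrument. The pilot data, item count, and 1-4 rating scale are hypothetical assumptions.

```python
# A minimal sketch of a reliability check using Cronbach's alpha.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: rows = respondents, columns = survey items (numeric ratings)."""
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: four attitude items rated 1-4 by five students.
responses = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # about 0.91 here; 0.7+ is a common rule of thumb
```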

Collecting High Quality Outcome Data: Part 1 Benefits of Collecting High-quality Data
- Sound basis for decision making
- Improved service quality and service outcomes
- Increased accountability
- A way to tell the story of program achievements

Collecting High Quality Outcome Data: Part 1 Theory of Change – Review
- A theory of change is the cause-and-effect relationship between a community problem/need and an outcome intended through a specific intervention.
- A program or project's theory of change identifies the outcome that will be measured to gauge the success of the intervention in meeting the community problem/need.
- Diagram: Community Problem/Need → Specific Intervention → Intended Outcome, with evidence that guides the choice of intervention and supports the cause-effect relationship.

Collecting High Quality Outcome Data: Part 1 Measurement Question Implied by Theory of Change
- Community Problem/Need: Students with poor attitudes towards school are at risk of failing academically.
- Specific Intervention: Individualized mentoring to promote positive attitudes towards school.
- Intended Outcome: Students improve attitudes towards school.
- Measurement question: "Did students in the mentoring program improve their attitudes towards school?"

Collecting High Quality Outcome Data: Part 1 More Measurement Questions
- Attitude (Individuals increase interest in volunteering): "Did individuals who attended info sessions become more interested in volunteering?"
- Knowledge (Students improve reading skills): "Did students in the literacy tutoring program improve reading skills?"
- Behavior (Children improve exercise habits): "Did children in the fitness program improve exercise habits?"
- Condition (Organization recruits more volunteers): "Did capacity building activities allow our organization to recruit more volunteers?"

Collecting High Quality Outcome Data: Part 1 Identifying a Data Source
- Data source: the person, group, or organization that has information to answer the measurement question
- Identify possible data sources; list the pros and cons of each
- Identify a preferred data source; consider its accessibility
- Consider whether alternative data sources can give you the same or comparable data

Collecting High Quality Outcome Data: Part 1 Data Source and Type of Outcome
- The choice of data source depends partly on the type of change you want to measure: attitude, knowledge, behavior, or condition.
- Data on changes in attitudes or knowledge usually come directly from the persons experiencing these changes.
- Data on changes in behavior or conditions can come either from the persons experiencing these changes or from other observers.

Collecting High Quality Outcome Data: Part 1 Comparing Data Sources
Measurement question: How did mentored students' feelings towards teachers change over time?
- Students. Pros: in the best position to describe how they feel about their teachers. Cons: may not be open about their feelings towards teachers.
- Teachers. Pros: may know how students feel towards them. Cons: may not know how students feel about other teachers; may only spend one class period with students.
- Mentors. Pros: may know how students feel about a wide range of issues, including teachers. Cons: depends on students' willingness to share feelings with mentors; students and mentors may not discuss this issue much.

Collecting High Quality Outcome Data: Part 1 Next, Consider Choice of Methods
Method: the process or steps taken to systematically collect data.
- Survey: written questionnaire completed by the respondent
- Interview: interviewer poses questions and records responses; face-to-face or via telephone
- Observation: observer records behavior or conditions using a checklist or other form
- Standardized test: used to assess knowledge of academic subjects (reading, math, etc.)

Collecting High Quality Outcome Data: Part 1 Consider Choice of Methods (continued)
- Tracking sheet: used to document service delivery; used primarily to track outputs
- Focus group: facilitator leads a small group through an in-depth discussion of a topic or issue
- Diaries, journals: respondent periodically (e.g., daily) records information about his/her activities or experiences
- Secondary data: data gathered by other agencies that can be used to assess program performance

Collecting High Quality Outcome Data: Part 1 Consider Feasibility of Methods
Method: ease/difficulty of use and of data analysis.
- Survey: may be difficult to find or create; very easy to use and analyze
- Interview/observation: requires trained, skilled personnel; can provide data that cannot be gathered through surveys
- Tracking sheet: easy to develop and use; may not be completed consistently
- Focus group: difficult to implement; generates a large volume of qualitative data that are difficult to summarize
- Diaries, journals: require commitment on the part of subjects; data can be challenging to interpret and analyze

Collecting High Quality Outcome Data: Part 1 Method and Outcome Type: Attitude and Knowledge
- Attitude/Belief. Definition: thoughts, feelings. Example: attachment to school (academic engagement). Generally preferred data source/method: student, via survey or interview.
- Knowledge/Skill. Definition: understanding, know-how. Example: becoming a better reader. Generally preferred data source/method: learner, via standardized test.*
* Use of standardized tests is mandated for certain performance measures in the Education Focus Area. Other types of knowledge (e.g., financial literacy) can be measured using other methods.

Collecting High Quality Outcome Data: Part 1 Method and Outcome Type: Behavior and Condition
- Behavior. Definition: action, conduct, habits. Example: exercising more frequently. Generally preferred data source/method: beneficiary, via exercise log.
- Condition/Status. Definition: situation or circumstances. Example: improving stream banks. Generally preferred data source/method: land manager, via observation checklist or rubric.

Collecting High Quality Outcome Data: Part 1 Where to Find Instruments
- For CNCS priorities and performance measures, look for instruments organized by goal and focus area.
- Programs and projects can look anywhere they like to find instruments:
  - Use Internet search engines
  - Talk to others within your professional network to find out what they are using
  - Look at the evidence for the intervention: how was it measured before?

Collecting High Quality Outcome Data: Part 1 Evaluating Instruments
- Pre-post measurement is preferable to post-only measurement (a scoring sketch follows below).
- Can the instrument measure the outcome? Is it appropriate for your intervention and for your beneficiaries?
- How many questions measure the outcome? A single question tends to yield low-quality data, while a series of questions may become too long or complex; the instrument should not exceed 2 pages.
- Do the questions cover all relevant aspects of your intervention? Can questions not specific to your intervention be removed?
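The slide above recommends pre-post measurement; the sketch below (not part of the original module) shows one simple way pre and post composite scores might be compared to report how many beneficiaries improved. The student IDs, scores, and 1-4 scale are hypothetical.

```python
# Hypothetical pre/post composite attitude scores (1-4 scale) for the same students.
pre_scores  = {"s01": 2.0, "s02": 2.5, "s03": 3.0, "s04": 1.5}
post_scores = {"s01": 2.8, "s02": 2.5, "s03": 3.5, "s04": 2.5}

# Count students whose post score exceeds their pre score.
improved = [sid for sid in pre_scores if post_scores[sid] > pre_scores[sid]]
pct_improved = 100 * len(improved) / len(pre_scores)

print(f"{len(improved)} of {len(pre_scores)} students improved ({pct_improved:.0f}%)")
```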

Collecting High Quality Outcome Data: Part 1 Define Outcome Dimensions
Outcome dimensions: the main aspects, features, or characteristics that define an outcome and that should be taken into account for measurement to be valid.
Example: increased attachment to school includes
- Feelings about being in school
- Feelings about doing school work
- Feelings towards teachers
- Feelings towards students

Collecting High Quality Outcome Data: Part 1 Outcomes Often Consist of Multiple Dimensions
- Transitioned to housing (O11): safe, healthy, affordable housing
- Increased physical exercise: frequency, intensity, duration of exercise
- Increased attachment to school (ED27): feelings about being in school and doing school work, feelings towards teachers and students

Collecting High Quality Outcome Data: Part 1 Example: Dimensions of Attachment to School
a. Feelings about being in school
b. Feelings about doing school work
c. Relations with other students
d. Relations with teachers
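The sketch below (not from the original slides) illustrates how an instrument's questions could be grouped and scored by dimension so that each dimension of attachment to school contributes to the result. The question IDs, groupings, and 1-4 scale are hypothetical.

```python
# A minimal sketch of scoring an instrument by outcome dimension.
from statistics import mean

# Map each hypothetical dimension to the survey items (question IDs) that measure it.
DIMENSIONS = {
    "feelings_about_being_in_school": ["q1", "q2"],
    "feelings_about_school_work": ["q3", "q4"],
    "relations_with_students": ["q5", "q6"],
    "relations_with_teachers": ["q7", "q8"],
}

def dimension_scores(responses: dict) -> dict:
    """Average the 1-4 item ratings within each dimension."""
    return {dim: mean(responses[q] for q in items) for dim, items in DIMENSIONS.items()}

# One student's hypothetical responses on a 1-4 agreement scale.
student = {"q1": 3, "q2": 4, "q3": 2, "q4": 3, "q5": 4, "q6": 3, "q7": 2, "q8": 2}
scores = dimension_scores(student)
print(scores)                                        # one sub-score per dimension
print("overall:", round(mean(scores.values()), 2))   # composite across all dimensions
```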

Collecting High Quality Outcome Data: Part 1 Summary: Identifying Outcome Dimensions
- For the national performance measures, look at the performance measurement instructions
- Look at your theory of change (community problem/need, specific intervention, intended outcome, and the evidence that guides the choice of intervention and supports the cause-effect relationship)
- Talk to stakeholders and program staff
- Build up a list of dimensions; look for repeated themes

Collecting High Quality Outcome Data: Part 1 Instrument Design Issues
- Crowded layout
- Double-barreled questions
- Biased or leading questions
- Questions that are too abstract
- Questions that use unstructured responses inappropriately
- Response options that overlap or contain gaps
- Unbalanced scales

Collecting High Quality Outcome Data: Part 1 Crowded Layout
Problem: a crowded layout squeezes the response options together, making them hard to read and mark.
Most of the time, how do you feel about doing homework? I usually hate doing homework I usually don't like doing homework I usually like doing homework I usually love doing homework
Solution: don't use crowded layouts; give each response option its own line.
Most of the time, how do you feel about doing homework?
I usually hate doing homework
I usually don't like doing homework
I usually like doing homework
I usually love doing homework

Collecting High Quality Outcome Data: Part 1 Double-barreled Question
Problem: asking two questions in one.
How do teachers and students at your school feel about the mentoring program? (They strongly like it / They like it / They are undecided / They dislike it / They strongly dislike it)
Solution: break the questions out separately, each with its own response scale.
How do teachers at your school feel about the mentoring program?
How do students at your school feel about the mentoring program?

Collecting High Quality Outcome Data: Part 1 Biased or Leading Question
Problem: biased or leading questions.
Has the mentoring program improved how you feel about going to school? (Yes / No / No opinion)
Solution: use neutral questions.
How has the mentoring program affected how you feel about going to school? (I feel better about going to school / I feel worse about going to school / I feel about the same about going to school / No opinion)

Collecting High Quality Outcome Data: Part 1 Abstract or Broad Question
Problem: questions that are too abstract or broad.
Did you enjoy the mentoring program? (Yes / No / Not sure)
Solution: make questions more concrete and specific.
Would you recommend the mentoring program to other students? (Yes / No / Not sure)

Collecting High Quality Outcome Data: Part 1 Not Using Structured Responses
Problem: using unstructured responses when structured responses are appropriate.
How much do your grades matter to you?
Solution: provide structured responses when appropriate.
How much do your grades matter to you? (Not at all / A little / Somewhat / A lot)

Collecting High Quality Outcome Data: Part 1 Response Options with Overlaps or Gaps
Problem: response options that overlap or contain gaps.
Approximately how many hours a day do you typically spend doing homework? (Less than 1 hour / 0 to 2 hours / 4 to 5 hours / More than 5 hours)
Solution: use a scale with no overlaps or gaps.
Approximately how many hours a day do you typically spend doing homework? (Less than 1 hour / About 1 hour / About 2 hours / About 3 hours / About 4 hours / More than 4 hours)
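As an illustration (not part of the original slides), the sketch below checks a list of numeric response ranges for overlaps and gaps; the ranges mirror the flawed example above, and the helper function itself is a hypothetical aid, not a CNCS tool.

```python
# A minimal sketch that flags overlapping or gapped numeric response options.
def check_ranges(options: list[tuple[float, float]]) -> list[str]:
    """options: (low, high) bounds of each response option, listed in order."""
    problems = []
    for (lo1, hi1), (lo2, hi2) in zip(options, options[1:]):
        if lo2 < hi1:
            problems.append(f"overlap: ({lo1}-{hi1}) and ({lo2}-{hi2})")
        elif lo2 > hi1:
            problems.append(f"gap: nothing covers {hi1}-{lo2}")
    return problems

# "Less than 1 hour", "0 to 2 hours", "4 to 5 hours", "More than 5 hours"
flawed = [(0, 1), (0, 2), (4, 5), (5, float("inf"))]
print(check_ranges(flawed))  # flags the 0-2 overlap and the 2-4 gap
```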

Collecting High Quality Outcome Data: Part 1 Unbalanced Scales
Problem: using unbalanced scales, such as Poor / Average / Good / Very Good / Excellent, which offers three positive options but only one negative option.
Solution: use balanced scales, such as Very Poor / Poor / Average / Good / Very Good.
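As a brief illustration (not part of the original slides) of why balance matters when responses are later analyzed, a balanced scale can be coded with symmetric numeric values around a neutral midpoint; the codes and responses below are hypothetical.

```python
# A minimal sketch of coding a balanced rating scale for analysis.
BALANCED_SCALE = {
    "Very Poor": -2,
    "Poor": -1,
    "Average": 0,
    "Good": 1,
    "Very Good": 2,
}

ratings = ["Good", "Average", "Very Good", "Poor", "Good"]  # hypothetical responses
codes = [BALANCED_SCALE[r] for r in ratings]
print("mean rating:", sum(codes) / len(codes))  # above 0 leans positive, below 0 leans negative
```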

Collecting High Quality Outcome Data: Part 1 What Else to Look for in Selecting an Instrument
- Can the instrument work in your context?
- Does the instrument use simple and clear language?
- Is the instrument appropriate for the age, education, literacy, and language preferences of respondents?

Collecting High Quality Outcome Data: Part 1 What Else to Look for in Selecting an Instrument (continued)
- Does the instrument rely mostly on multiple-choice questions?
- Is the instrument ready for use, or does it need to be modified?
- How will you extract information from the instrument to address performance measurement targets?

Collecting High Quality Outcome Data: Part 1 Summary of Key Points
- The benefits of collecting high-quality data include providing a sound basis for decision making, improving service quality and outcomes, increasing accountability, and telling your story in a more compelling way.
- Your theory of change, and the key measurement question embedded in it, is a useful guide to measurement.
- The type of outcome to be measured influences decisions about data sources, methods, and instruments.

Collecting High Quality Outcome Data: Part 1 Summary of Key Points (continued)
- Knowing the pros and cons of a data source is helpful for choosing one and for designing an appropriate measurement process.
- CNCS provides sample instruments for most national performance measures. In addition, programs are permitted to look anywhere to find instruments that meet their needs.
- High-quality outcome measurement often requires an instrument that can capture multiple dimensions of the outcome. Instruments should also be free from other design problems.

Collecting High Quality Outcome Data: Part 1 Additional Resources
- CNCS Performance Measurement
- Instrument Formatting Checklist: opment_Checklist_and_Sample.pdf
- Practicum Materials: curriculum