Principles and Strategies of KEYS 2.0 Data Analysis and Interpretation
GAE Training, January 2009
Jacques Nacson, Senior Policy Analyst, NEA New Products and Programs

Objectives
• To examine how KEYS fits within the context of continuous school improvement
• To review key statistical terms
• To review basic research concepts
• To consider basic principles of data analysis
• To examine and learn to interpret the KEYS 2.0 school and district reports

The continuous improvement cycle: Problem or Opportunity Identified → Data Collection → Diagnosis & Refinement of Problem → Action Planning → Action Taking → Implementation & Evaluation, then back to the start.

Statistical Terms
• Mean, median, mode
• 90th percentile
• Standard deviation
• Factor analysis
• Regression analysis
• Correlation analysis
• Statistical significance (.05 level, .01 level)
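To make these terms concrete, here is a minimal sketch using Python's standard library, with invented 1-5 survey ratings; none of the numbers come from KEYS. Factor analysis, regression, and significance testing require a full statistics package and are omitted.

```python
import statistics

# Hypothetical 1-5 ratings from ten respondents on a single survey item
ratings = [4, 3, 5, 4, 2, 4, 3, 5, 4, 4]

mean = statistics.mean(ratings)      # arithmetic average
median = statistics.median(ratings)  # middle value when sorted
mode = statistics.mode(ratings)      # most frequent value
stdev = statistics.stdev(ratings)    # spread of scores around the mean

# 90th percentile: the score below which roughly 90% of values fall
p90 = statistics.quantiles(ratings, n=10)[-1]

# Correlation between two paired measures (Python 3.10+); the achievement
# scores are invented solely to show the call
achievement = [78, 70, 85, 80, 60, 82, 72, 90, 79, 81]
r = statistics.correlation(ratings, achievement)

print(f"mean={mean:.2f} median={median} mode={mode} stdev={stdev:.2f}")
print(f"90th percentile={p90:.1f} correlation r={r:.2f}")
```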

Data Collection: "How much information do we need?"
• At the most: Is the information collected compelling enough to convince any skeptic?
• At the least: Will the information collected at least create cognitive dissonance for the resisters?

Triangulation of Data
• Compensates for the imperfections of data-gathering instruments.
• When multiple measures yield the same results, confidence in those results increases.
• When multiple measures fail to yield the same results, important follow-up questions are raised.
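As a hedged illustration of the last two points, with invented data: if two independent measures of the same construct track each other, confidence rises; if they diverge, the divergence is itself worth investigating. The 0.7 cutoff is an arbitrary rule of thumb for this example, not a KEYS threshold.

```python
import statistics

# Hypothetical paired measures of one construct across eight schools: a survey
# indicator and an independent observation rating, both on a 1-5 scale
survey      = [4.2, 3.1, 4.8, 2.5, 3.9, 4.4, 2.9, 3.6]
observation = [4.0, 3.3, 4.6, 2.8, 3.7, 4.5, 3.1, 3.4]

r = statistics.correlation(survey, observation)  # Python 3.10+
if r >= 0.7:  # arbitrary illustrative cutoff
    print(f"r={r:.2f}: the measures converge; confidence in the results increases")
else:
    print(f"r={r:.2f}: the measures diverge; raise follow-up questions before acting")
```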

Data Collection: "What must we keep in mind?"
1. Do the instruments and methods we plan to use measure what we claim they measure? (Validity)
2. Do the instruments and methods we plan to use measure consistently, yielding stable results across respondents and over time? (Reliability)
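Reliability is commonly estimated as internal consistency, for instance with Cronbach's alpha. These slides do not say how the KEYS instruments were validated, so the following is only a generic sketch with invented item responses on the 1-5 scale.

```python
def cronbach_alpha(item_scores):
    """Internal-consistency reliability for k items answered by n respondents.

    item_scores: list of k lists, each holding one item's scores across respondents.
    """
    def variance(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(item_scores)
    item_var_sum = sum(variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Three hypothetical items meant to measure one indicator, six respondents each
items = [
    [4, 3, 5, 4, 2, 4],
    [4, 4, 5, 3, 2, 4],
    [5, 3, 4, 4, 3, 5],
]
print(f"alpha = {cronbach_alpha(items):.2f}")  # values near 1 suggest high consistency
```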

Data Gathering Techniques
• Interviews
• Checklists
• Diaries
• Logs
• Questionnaires
• Audio or video tapes
• Photographs
• Consultative advice

General Principles of Data Interpretation Before Looking at Your KEYS 2.0 Results

General Principles to Consider Specifically Related to Your KEYS 2.0 Data

Understanding the Graph for Each KEY
Horizontal axis: the measure of quality, on a 5-point scale; the left side is Disagree (low value), the right side Agree (high value).
Vertical axis: the indicators (groups of questions that measure the same concept).
The example graph shows KEY 1, Shared Understanding and Commitment to High Goals, for respondents who provide direct instruction to students.

Understanding the Graphs for Each KEY
Data points:
• School average (black)
• All-schools average (red)
• 90th percentile score (yellow)
• Length of the horizontal bar (blue/purple): one standard deviation above and one below the school average (a measure of agreement, or consensus)
The goals for your school, in terms of continuous improvement for each indicator:
• Move the school average continuously toward the right side (Agree, the high value of quality for that indicator)
• At the same time, reduce the standard deviation (narrow the horizontal bar, meaning greater agreement among respondents)

KEYS Vocabulary
Hierarchical organization: Keys → Indicators → Items
Rating: the degree to which respondents believe the indicator accurately describes the school, computed as the average of all respondents' answers to the questions that make up the indicator.
• 1 = strong belief that the indicator does not describe the school
• 2 = some belief that the indicator does not describe the school
• 3 = neutral
• 4 = some belief that the indicator describes the school
• 5 = strong belief that the indicator describes the school

KEYS Vocabulary (Cont'd)
Consensus: the level of agreement among respondents on the rating for each indicator.
• Low consensus = wide band
• High consensus = narrow band
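Putting this vocabulary together, here is a minimal sketch of how the numbers behind one indicator's bar could be computed, assuming hypothetical 1-5 responses; the actual KEYS scoring procedure is not spelled out in these slides.

```python
import statistics

# Hypothetical responses for one indicator: five respondents x three items, each 1-5
responses = [
    [4, 4, 3],
    [5, 4, 4],
    [3, 3, 4],
    [4, 5, 4],
    [2, 3, 3],
]

# Rating: average over all respondents of the items that make up the indicator
per_respondent = [statistics.mean(r) for r in responses]
rating = statistics.mean(per_respondent)

# Consensus: the bar spans one standard deviation below and above the school
# average; a narrow bar means high consensus, a wide bar low consensus
sd = statistics.stdev(per_respondent)
bar_left, bar_right = rating - sd, rating + sd

# Across many schools, the yellow marker is the 90th percentile of school averages
school_averages = [3.2, 3.8, 4.1, 2.9, 3.6, 4.4, 3.9, 3.3, 4.0, 3.7]
p90 = statistics.quantiles(school_averages, n=10)[-1]

print(f"rating={rating:.2f}, bar {bar_left:.2f}-{bar_right:.2f}, 90th pct={p90:.2f}")
```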

Demo School Report with Links to Resources

Demo School Report with Links to Resources (Drilling Down)

Demo School Report with Links to Resources (Drilling Down Further)

Example of a District Report: Aggregate Scores for Key 6

Example of a District Report: Distribution of School Scores for Indicator 6.5 (Interventions)

Example of a District Report: Aggregate Scores for Key 5 (Resources)

Example of a District Report: Distribution of School Scores for Indicator 5.4 (Safe & Healthy Learning Environment)

PROCESSING THE DATA
Looking at your group's assigned Key graph, answer these questions:
• On which indicators is there the greatest agreement (the shorter bars)?
• On which indicators is there the least agreement (the longer bars)?
• Why might that variability of perspective exist?

PROCESSING THE DATA (Cont'd)
• How might you come to greater consensus on this?
• Which of the KEYS indicators have both a high mean score and a shorter bar? (Strengths)
• Which of the indicators have both a low mean score and a wide degree of variability? (Areas for improvement)
• Which items have the highest correlation with student achievement?
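A hedged sketch of the sorting these questions imply, assuming each indicator has already been reduced to a mean rating and a standard deviation; the indicator names and cutoffs below are invented for illustration, not taken from KEYS.

```python
# Hypothetical per-indicator summaries: (name, mean rating on 1-5, standard deviation)
indicators = [
    ("Indicator A", 4.3, 0.4),
    ("Indicator B", 2.6, 1.1),
    ("Indicator C", 3.1, 0.5),
    ("Indicator D", 4.0, 1.2),
]

HIGH_MEAN, NARROW_SD = 3.5, 0.6  # illustrative cutoffs, not KEYS-defined

strengths = [n for n, m, s in indicators if m >= HIGH_MEAN and s <= NARROW_SD]
improvements = [n for n, m, s in indicators if m < HIGH_MEAN and s > NARROW_SD]

print("Strengths (high mean, short bar):", strengths)
print("Areas for improvement (low mean, wide bar):", improvements)
```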

Analyzing and Interpreting KEYS 2.0 Data: The Logic Sequence
1. Are the results surprising, or do they confirm what you already know or believe?
2. If the results are surprising, what might have caused the difference?
3. How might you go about checking further to determine the validity of the findings?
4. If you determine that the findings are valid and the issue is important and relevant to your particular situation, what might be the reasons for such findings?
5. What is the most likely cause? The underlying or root cause? How did you arrive at this conclusion?

Analyzing and Interpreting KEYS 2.0 Data: The Logic Sequence (Cont'd)
6. What are some possible actions/solutions that you and your colleagues might take to alter, reverse, or improve the condition?
7. What is the best possible solution given your particular context or situation?
8. What steps must you plan and take in order to implement the best possible solution?
9. What resources and skills are needed to implement the solution successfully?
10. How would you know whether your actions/solutions succeeded in ameliorating the condition?

Focus Group Questions
• Do the KEYS data support or refute other data previously collected?
• What professional development opportunities are indicated by the data?
• What barriers to achieving previously stated goals surfaced during your discussion?
• What other learning does your focus group wish to share with the larger group?

Steps a School Might Take Once KEYS Preliminary Analyses are Completed
1. GAPS: Decide on one indicator, or a group of indicators, where gap(s) exist.
2. RELEVANCE: Reflect with the "team" on the relevance, importance, and priority of the selection. (Get feedback from stakeholders.)
3. DATA COLLECTION-VALIDATION: Consider the need to collect additional data to validate the KEYS findings.
4. DATA COLLECTION-DIAGNOSIS AND REFINEMENT: Examine the "root" cause of the problem.

Steps a School Might Take Once KEYS Preliminary Analyses are Completed (Cont'd)
5. THEORY OF ACTION: Identify and select the most appropriate solution for your context. (Get feedback from stakeholders.)
6. ACTION PLANNING: Set SMART goals and develop specific action/project plans. (Get feedback and commitments from stakeholders.)
7. IMPLEMENTATION: Action plans must be implemented for improvement to occur.
8. DATA COLLECTION-EVALUATION: Both process and product evaluations are necessary for learning to happen.
9. BACK TO STEP 1: Repeat the cycle.