Decision Making for Results
Part One: Objectives
– Develop a deeper understanding of the Decision Making for Results: Data-Driven Decision Making process
– Increase awareness of the relevance of data and its impact on leadership, teaching, and learning
– Reinforce the importance of collecting both cause and effect data
Objectives
– Apply the Decision Making for Results: Data-Driven Decision Making process to monitor leadership, teaching, and learning
– Implement the Decision Making for Results: Data-Driven Decision Making process to monitor school improvement
Principles of Decision Making for Results
– Antecedents
– Collaboration
– Accountability
Seminar Overview
– Introduction
– Building the foundation
– Process and application
– Action planning
Becoming Data Driven
How are you currently embracing a data-driven decision-making process that leads to results?
Results-Driven Schools
Where is the proof?
– 90/90/90 Schools, Reeves, 2003
– Education Trust, 2002
– NCREL, 2000
– Consortium for Policy Research in Education, 2000
– EdSource, 2005
– Northern Illinois University Center for Governmental Studies, 2004
Reflection
“The value of the data emerges only when analysis provides insights that direct decisions for students.” – S. White, 2005
Part Two: Building the Foundation
– Cause data and effect data
– Continuous improvement cycle
– Principles and processes of Decision Making for Results: Data-Driven Decision Making
“Only by evaluating both causes and effects in a comprehensive accountability system can leaders, teachers, and policymakers understand the complexities of student achievement and the efficacy of teaching and leadership practices.” – Reeves, 2006
Definitions and Examples
– Effect data: outcomes or results
– Cause data: professional practices that create specific effects or results
The Leadership & Learning Matrix
Effects/Results (student outcomes) plotted against Antecedents/Cause Data (adult actions):
– Lucky: high results, low understanding of antecedents; replication of success unlikely
– Leading: high results, high understanding of antecedents; replication of success likely
– Losing Ground: low results, low understanding of antecedents; replication of failure likely
– Learning: low results, high understanding of antecedents; replication of mistakes unlikely
PIM: Planning, Implementation, Monitoring
– Planning: needs assessment, inquiry, goals
– Implementation: strategies, professional development, parental involvement
– Monitoring: frequency, evaluation
Part Three: Process and Application
Ocean View Elementary School
A Look at Collaboration
The Process for Results (a continuous cycle)
– Inquiry: develop questions
– Conduct a Treasure Hunt
– Analyze to prioritize
– Establish SMART Goals
– Select specific strategies
– Determine results indicators
– Monitor and evaluate results
Inquiry
“Data-driven decision making begins by asking fundamental questions.” – Doug Reeves
– What questions do you have about teaching and learning in your school?
– What data sources are you using to gather the specific information?
Step 1: Conduct a Treasure Hunt
Why? To gather and organize data in order to gain insights about teaching and learning practices
Considerations:
– Measures of data
– Disaggregation
– Triangulation
– Reflection
Measures of Data
– Student learning
– Demographics
– Perceptions
– School processes: behaviors within our control, such as instructional and leadership strategies, programs and resources, and organization
Disaggregation
To separate something into its component parts, or break apart.
“Disaggregation is not a problem-solving strategy. It is a problem-finding strategy.” – Victoria Bernhardt, Data Analysis, 1998
Think, pair, share: What data do you disaggregate, and how do you use the information?
Triangulation: A Look at Learning
– DRA
– Benchmark
– Running Records
Case Study
Read the case study.
– Part 1: How did they categorize the different data sets and record their observations?
– Part 2: What did they discover?
Conduct a Treasure Hunt – Application
1. Review inquiry questions
2. Conduct a “Treasure Hunt”
3. Organize data on templates
4. Use rubric to monitor and evaluate your work
Can You Identify with This?
“It is not so much a lack of data, but an absence of analysis, and an even greater absence of actions driven by the data.” – White, 2005
Step 2: Analyze Data to Prioritize Needs
Data Analysis at Northside Middle School
Analyze Data to Prioritize Needs
Why? To identify causes for celebration and to identify areas of concern
Considerations:
– Strengths
– Needs
– Behavior
– Rationale
Quality Prioritization
Why? To take immediate action on the most urgent needs
Quality prioritization requires a thorough understanding of:
– The student population
– Curriculum and Power/Priority Standards (leverage, readiness)
– Antecedents affecting student achievement
– Quality of program implementation
(White, 2005)
Case Study
Review the case study.
– What insights did you gain after reading the analysis of student performance?
– Make a recommendation: What is the most urgent need?
Review, Analyze, and Prioritize – Application
1. Review data from Step 1
2. Conduct analysis using the guiding questions
3. Prioritize urgent needs using the suggested criteria
4. Record your work on the templates
5. Use rubric to monitor and evaluate your work
Step 3: Establish SMART Goals
Why? To identify our most critical goals for student achievement based on the challenges that were identified through the inquiry process
SMART: Specific, Measurable, Achievable, Relevant, Timely
Establish Your SMART Goals – Application
1. Review prioritized needs
2. Review Treasure Hunt baseline data
3. Apply the SMART goal formula; use templates to record your work
4. Use rubric to monitor and evaluate your work
Share Your Findings with Colleagues
– Meet in the middle of the room
– Be prepared to share your findings from Steps 1–3
– Highlight one celebration from a colleague
Step 4: Select Specific Strategies
Let’s watch Lake Taylor High School as they discuss strategies.
Select Specific Strategies
Why? Adult actions will impact student achievement
Strategies are:
– Action-oriented
– Measurable/accountable
– Specific
– Research-based
Considerations: instructional, organizational, leadership, programmatic
Research-Based Strategies
– Reeves, D.B. (2003). 90/90/90 schools. Retrieved from www.LeadandLearn.com
– Reeves, D.B. (2006). Ten things high schools can do right now to improve student achievement.
– Learning 24/7 Observation Study (2005). What’s happening in schools? Or not?
Additional Evidence in Support of Research-Based Strategies
– Zemelman, S., Daniels, H., & Hyde, A. (2005). Best practice. Portsmouth, NH: Heinemann.
– Marzano, R. (2007). The art & science of teaching. Alexandria, VA: ASCD.
– Barr, R., & Parrett, W.H. (2007). The kids left behind. Bloomington, IN: Solution Tree.
– Marzano, R., Waters, T., & McNulty, B. (2005). School leadership that works. Alexandria, VA: ASCD.
Let’s Do It! Guided Practice
Case Study
Revisit the case study analysis.
– What types of strategies (instructional, organizational, leadership, programmatic) did they select?
– How will the strategies help students overcome the obstacles?
Select Your Specific Strategies
1. Revisit your prioritized needs
2. Research the best possible strategies to meet the learner needs
3. Group by type of strategy: instructional, organizational, programmatic, and leadership
4. Use rubric to monitor and evaluate your work
Step 5: Determine Results Indicators
Why? To monitor the degree of implementation and evaluate the effectiveness of the strategies
Results Indicators – Considerations
– Serve as an interim measurement
– Used to determine effective implementation of a strategy
– Used to determine if a strategy is having the desired impact
– Help to determine midcourse corrections
Case Study
Review the case study.
– How will their results indicators serve as an interim measurement?
– How clearly will the results indicators help to monitor implementation and impact?
Results Indicator Application
1. Revisit strategies (Step 4)
2. Develop results indicators
3. Use rubric to monitor and evaluate your work
“Improvement cycles require leadership follow-up and relentless efforts to maintain the focus on data if decisions are truly going to be driven by informed data.” – White, 2005
Step 6: Monitor and Evaluate Results
Why? To engage in a continuous improvement cycle that:
– Identifies midcourse corrections where needed
– Adjusts strategies to assure fidelity of implementation
Case Study
Review the case study.
– How did they monitor strategies?
– Was there any evidence of midcourse corrections?
Develop Your Monitoring Plan
Review your work, from developing questions through determining results indicators, then decide how you will monitor the strategies.
When you create your monitoring plan, consider:
– Teacher or administrator teams
– Monitoring cycles
– Goals
– Strategies
– Impact on student and adult behavior
– Ability to make midcourse corrections
Educators Matter
“Many people live their lives aspiring to make a difference and lead a life that matters. There need be no such uncertainty in the life of an educator or school leader. Every decision we make, from daily interactions with students to the most consequential policies at every level of government, will influence leadership and learning… After all these words, statistical analyses, and graphs… What we do matters.” – Reeves, 2006
53
Questions and Discussion Your ideas and reflections are important to us. Please take time to complete the short evaluation form that we reviewed at the beginning of this seminar. The Leadership and Learning Center 866.399.6019 LeadandLearn.com