Based on the work of Vicki Bernhardt ITF January 27, 2009.


 Replace hunches with facts concerning changes
 Facilitate a clear understanding of the gaps between where the school is and where it needs to be, and identify root causes rather than treating symptoms
 Provide information to eliminate ineffective practices
 Ensure efficient use of funds
 See if school goals and objectives are being accomplished
 Ascertain if school staffs are implementing their visions
 Generate answers and effectively educate the community
 Predict and prevent failures
 Predict and ensure successes
 Improve instruction
 Provide students with feedback on their performance

 Gain an understanding of what quality is and how close we are to achieving it
 Make sure students are not “falling through the cracks”
 Get to the “root causes” of problems
 Guide curriculum development and revision
 Meet state and federal requirements
 Promote accountability

 In contrast to business, the work culture in education usually doesn’t focus on data
 Few staff are trained in data analysis, and few see it as part of their jobs
 Schools don’t have data warehousing solutions that make it easier
 Teachers are generally trained to be subject oriented, not data oriented
 There’s a perception that data is gathered for other people, not for ourselves
 The legislature keeps changing the rules! (e.g. NHEIAP, NECAP)

 Data analysis should be:
 Systematic and systemic: covering all of the processes and procedures that contribute to learning — the whole, and the interrelationships of the parts of the whole to each other
 Continuous: measuring and evaluating processes on an ongoing basis to identify and implement improvement: “Upstream process improvement, not downstream damage control” (Teams and Tools, 1991)

 One measure gives useful information, but…
 Comprehensive, multiple measures give much richer information

 Demographics (enrollment, dropout rate, gender, ethnicity)
 Perceptions (values/beliefs, attitudes)
 Student Learning (standardized tests, authentic assessments, teacher observations)
 School Processes (how we teach children to read, our math scope and sequence through the years)
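To make the four measures concrete, here is a minimal sketch of a single student record carrying one field for each measure. This is not from the original presentation; every field name and value is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """One row of a multiple-measures dataset (all field names are illustrative)."""
    # Demographics: who the student is
    grade: int
    gender: str
    # Perceptions: what the student believes about school
    likes_school: bool
    # Student Learning: how the student performs
    test_score: float
    # School Processes: what program the student experiences
    reading_program: str

record = StudentRecord(grade=6, gender="F", likes_school=True,
                       test_score=82.5, reading_program="Reading Recovery")
```

Keeping all four measures on one record is what makes the deeper intersection levels described below possible: each question is just a different slice of the same rows.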

Each of the four measures, by itself, gives valuable information. But the deeper we dig and the more levels we use, the more useful the resulting information becomes.

 How many students are enrolled at MSS this year? (Demographics)
 How satisfied are parents, students, and/or staff with the learning environment at HMS? (Perceptions)
 How did students at Hopkinton High School score on the SATs this year? (Student Learning)
 What new courses are being offered by the high school this year? (School Processes)

 How has enrollment changed throughout the HSD over time? (Demographics)
 How have student perceptions of the A.P. program changed over time? (Perceptions)
 Are there differences in boys’ 6th grade math scores on the NHEIAP over time? (Student Learning)
 What programs have been consistent at MSS over the last five years? (School Processes)

Looking at more than one type of data within each of the circles…
 What percentage of students in the HSD are fluent speakers of languages other than English, and are there equal numbers of males and females? (Demographics)
 Are staff, student, and parent perceptions of the culture within HMS in agreement? (Perceptions)
 Are students’ scores on the 6th grade NHEIAP consistent with K-6 assigned grades and performance assessment rubrics? (Student Learning)
 What are the major instructional strategies used by the Math and Science departments at the high school? (School Processes)

 Similar to Level 3, but the analysis is done over time…
 For example: How has the enrollment of non-English-speaking children changed in the last three years? (Demographics)

 Do students who have perfect attendance perform better on the NHEIAP than students who miss more than five days per month? (Demographics by Student Learning)
 Is there a gender difference in students’ perceptions of the Senior Project? (Perceptions by Demographics)
 Do students in classrooms using differentiated instruction perform better on the NHEIAP than students in classrooms that do not employ this strategy? (Student Learning by School Processes)
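The first question above (attendance crossed with test performance) can be sketched in a few lines of standard-library Python. This is a sketch only; the sample data, field names, and five-day threshold are invented for illustration.

```python
from statistics import mean

# Invented sample data: days missed per month and a test score per student.
students = [
    {"missed_days": 0, "score": 88},
    {"missed_days": 0, "score": 92},
    {"missed_days": 7, "score": 71},
    {"missed_days": 9, "score": 65},
]

def mean_score_by_attendance(rows, threshold=5):
    """Intersect a demographic measure (attendance) with a learning measure (score):
    mean score for students at or below the threshold vs. above it."""
    good = [r["score"] for r in rows if r["missed_days"] <= threshold]
    poor = [r["score"] for r in rows if r["missed_days"] > threshold]
    return mean(good), mean(poor)

good_attendance_mean, poor_attendance_mean = mean_score_by_attendance(students)
```

Each two-measure question on this slide follows the same pattern: pick one field to split the rows, then summarize a second field within each split.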

Same as Level 6, just over time; for example…
 How have students of different ethnicities scored on the SAT over the past three years? (Demographics by Student Learning)

 Is there a difference in students’ reports of what they like most about MSS by whether or not they participate in extracurricular activities? Do these students have higher grades than those who do not participate? (Perceptions by Student Learning by School Processes)
 Which program at the high school is making the biggest difference in achievement for at-risk students this year, and is one group of students responding better to the processes? (School Processes by Student Learning by Demographics)

Same as Level 7, but over time; for example…
 What programs throughout the district do all types of students like the most every year? (Demographics by Perceptions by School Processes)

This is the ultimate analysis. It allows us to answer questions such as:
 Are there differences in NWEA scores for 8th grade girls and boys who report they like school, by the type of program in which they are enrolled? (Demographics by Perceptions by School Processes by Student Learning)

Same as Level 9, but over time; for example…
 Based on whom we have as students, how they prefer to learn, and what programs they are in, are all students learning at the same rate? (Student Learning by Demographics by Perceptions by School Processes)

Demographics, Perceptions, Student Learning, and School Processes

 Community: location, history, population trends…
 School District: description and history, number of schools…
 School: grants and awards received, class sizes, facilities

 Students, over time and by grade level: attendance, gender, race…
 Staff, over time: years of experience, ethnicity, certifications…
 Parents: educational levels, involvement with child’s learning…

The separation of data into subgroups. Why is it important?
 Disaggregation shows whether there are differences among subgroups; there should be few differences.
 It helps us find subgroups that are not responding to our techniques.
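Disaggregation can be sketched as grouping scores by a subgroup label and comparing the group means; a large gap flags a subgroup that is not responding. The sample data and gap measure below are invented for illustration, not part of the original presentation.

```python
from collections import defaultdict
from statistics import mean

# Invented sample data: (subgroup label, test score) per student.
scores = [
    ("female", 84), ("female", 78),
    ("male", 80), ("male", 74),
]

def disaggregate(rows):
    """Mean score per subgroup, plus the largest gap between any two subgroups."""
    groups = defaultdict(list)
    for subgroup, score in rows:
        groups[subgroup].append(score)
    means = {g: mean(v) for g, v in groups.items()}
    gap = max(means.values()) - min(means.values())
    return means, gap

subgroup_means, achievement_gap = disaggregate(scores)
```

The same function works for any subgroup label — gender, ethnicity, program, attendance band — which is why disaggregation is the workhorse of the analysis levels above.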

 A generally held view
 A belief stronger than an impression and less strong than positive knowledge

 Teachers  Parents  Students  Administrators  Community members

 Give parents a chance to experience the approach, e.g. a parent night or open house
 Cognitive dissonance: the discomfort one feels when two thoughts, opinions, or ideas are inconsistent

 Questionnaires
 Interviews
 Focus groups
(All three were done as part of the SPEDMIP report)

They must be:  Valid  Reliable  Understandable  Quick to complete

 Interview (face to face)
 Telephone interview (person to person)
 Mailed
 Paper
 Online (recommended, e.g. SurveyMonkey.com)

 Standardized tests: norm-referenced, criterion-referenced
 Authentic assessments
 Teacher-made tests
 Teacher-assigned grades
 Performance assessments
 Standards-based assessments
 Diagnostic testing

 Compares the test performance of a school, group, or individual with the performance of a norming group (e.g. CAT, Iowa)

 Compares an individual’s performance to a set of standards, not to the performance of other test takers (e.g. NHEIAP, NECAP)
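The difference between the two score interpretations can be sketched in a few lines. The norming data and cut score below are invented for illustration; real instruments publish their own norms and standards.

```python
def percentile_rank(score, norming_scores):
    """Norm-referenced: percent of the norming group scoring below this score."""
    below = sum(1 for s in norming_scores if s < score)
    return 100 * below / len(norming_scores)

def meets_standard(score, cut_score):
    """Criterion-referenced: compare to a fixed standard, not to other test takers."""
    return score >= cut_score

# Invented norming group of ten scores.
norming_group = [55, 60, 65, 70, 75, 80, 85, 90, 95, 100]
```

Note that the same raw score yields two very different statements: a percentile rank that shifts with the norming group, and a pass/fail judgment that stays fixed as long as the standard does.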

 Tests given to students to identify the nature of a student’s difficulty, but not necessarily the cause of that difficulty (e.g. special education testing)

 Measures skill and knowledge directly; e.g. if you want students to learn to write, assess it with a writing assignment
 Advantages: measures process, can match state standards, is a learning tool in and of itself, provides reflection for students…

1. Identify desired results
2. Determine acceptable evidence
3. Plan learning experiences & instruction
(from UbD PPT)

 The evidence should be credible and helpful
 Implications: the assessments should
 Be grounded in real-world applications, supplemented as needed by more traditional school evidence
 Provide useful feedback to the learner, be transparent, and minimize secrecy
 Be valid and reliable, aligned with the desired results of Stage 1 (and fair)
(from UbD PPT)

 We need patterns that overcome inherent measurement error
 Sound assessment (particularly of state standards) requires multiple pieces of evidence over time: a photo album vs. a single snapshot
(from UbD PPT)

 Varied types, over time:
 authentic tasks and projects
 academic exam questions, prompts, and problems
 quizzes and test items
 informal checks for understanding
 student self-assessments
(from UbD PPT)

 Programs: Reading Recovery, Write Traits, UbD
 Practices: HBA, Mini-Mentors, Student Councils
 Instructional strategies: Differentiated Instruction, Cooperative Learning

 What teachers and administrators do to achieve the vision of the school
 What do we want students to know and be able to do?
 How are we enabling students to learn, in terms of instructional strategies, learning strategies, instructional time, student-teacher ratio, and philosophies of classroom management?
 How will we know if any given approach works?
 What will we do with students who don’t learn this way?
 How will the parts of the curriculum relate?
 What learning strategies do successful learners use?

 Solid training in UbD
 Conducting a School Portfolio
 Setting up a systematic process for data collection and analysis
 Selecting a data warehouse
 Training the staff in general