
1 Haywood County Schools February 20, 2013 http://region8wnc.ncdpi.wikispaces.net/

2 Becky Pearson Professional Development Consultant Region 8 becky.pearson@dpi.nc.gov 828.803.8315

3 Data Literacy Module https://center.ncsu.edu/nc

4 Pre-Assessing Our Knowledge of EVAAS Take a dot from your table. Place your dot on the consensogram in the area that best describes your level of knowledge about EVAAS reporting:  I have no knowledge of EVAAS.  I know how to access and read EVAAS reports.  I know how to read and analyze EVAAS reports.  I know how to make instructional decisions based on EVAAS data.

5 Navigating EVAAS

6 BOOKMARK IT!

7

8 Step 2: Click on “Account.” Click on “hide.”

9 Step 3: If You Get Lost… Click on “home”

10 Step 4: Use the Blue Bar to…

11 Reports: School Value Added; School; Any Subgroup; % of Students; Select Subgroups

12 EVAAS The “BIG” Picture

13 What is EVAAS? So What Does It Do?

14 What About Reports?

15 Answers the question of how effective a schooling experience is for a student. Produces reports that... o Predict student success o Show the effects of schooling at particular schools o Reveal patterns in subgroup performance

16 Underlying Philosophy of EVAAS All students deserve opportunities to make appropriate academic progress every year. There is no “one size fits all” way of educating students who enter a class at different levels of academic achievement. Adjustments to instruction should be based on the academic attainment of students, not on socio-economic factors.

17 Underlying Philosophy of EVAAS Given reliable information on past effectiveness, educators can make appropriate adjustments to improve student opportunities. One of the most important things educators can know is with whom they are effective and where they need to develop new skills.

18 How Can EVAAS Help Me?

19 EVAAS Focus is on STUDENT PROGRESS (Growth). Student Achievement: Where are we? Highly correlated with demographic factors. Student Growth: How far have we come? Highly dependent on what happens as a result of schooling rather than on demographic factors.

20 End of School Year Proficient

21 End of School Year Proficient Start of School Year Not Proficient Change over time

22 How is this fair?

23 No one is doomed to failure.

24 Scenario 1: A 5th grader begins the year reading at a 1st grade level and ends the year reading at a 4th grade level. Proficient: No. Growth: Yes.
Scenario 2: A 5th grader begins the year reading at a 7th grade level and ends the year reading at the 7th grade level. Proficient: Yes. Growth: No.

25 We are not trying to get our students to reach a score on a test. Every student can grow even if they are not proficient. If we concentrate on growth, proficiency will come. No matter where a student is when they enter your class, they can still grow. Every student matters. We have to meet students at their “level” and help them grow from there.

26 Changing the CULTURE of your School  Every school is different: what works in one school may not work in another!  EVAAS creates a LEVEL playing field.  EVAAS can build and encourage you as a professional educator.  EVAAS can simplify the process as you strive to be data savvy and data driven.  EVAAS can help eliminate blame, excuses, and finger-pointing.  EVAAS encourages the change necessary to improve teaching and learning.

27 Given a specific set of circumstances… …what’s the most likely outcome?

28 Given this student’s testing history, across subjects… …what is the student likely to score on an upcoming test, assuming the student has the average schooling experience?

29  Expectations based on what we know about this student and about other students who have already taken this test:  the student’s prior test scores (EOC/EOG), across subjects;  other students’ scores on the test we’re projecting to.
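
A minimal illustrative sketch of the projection idea above, using made-up numbers and ordinary least squares. This is not the actual EVAAS statistical model; it only shows how prior test scores from students who have already taken the target test can be used to project a likely score for a new student with a similar testing history.

```python
# Illustrative sketch only -- NOT the EVAAS model. All scores are hypothetical.
import numpy as np

# Prior EOG scores (e.g., reading, math) for students who already took the target test.
prior_scores = np.array([
    [148.0, 152.0],
    [155.0, 158.0],
    [160.0, 163.0],
    [151.0, 149.0],
])
# Their scores on the test we're projecting to.
target_scores = np.array([150.0, 157.0, 162.0, 151.0])

# Fit a simple least-squares relationship: intercept plus one weight per prior score.
X = np.column_stack([np.ones(len(prior_scores)), prior_scores])
coef, *_ = np.linalg.lstsq(X, target_scores, rcond=None)

# Project an upcoming score for a new student from that student's own testing history,
# assuming the student has the average schooling experience.
new_student_priors = np.array([1.0, 153.0, 151.0])
print(f"Projected score: {new_student_priors @ coef:.1f}")
```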

30 Projections are NOT about predicting the future. They ARE about assessing students’ academic needs TODAY.

31  What are this student’s chances for success?  What goals should we have for this student this year?  What goals should we have for this student in future years? What can I do to help this student get there?

32  Identify students  Assess the level of risk  Plan schedules  Identify high-achievers  Assess the opportunities  Inform

33  Reflective Assessments  Proactive Assessments

34

35 Use this report to evaluate the overall effectiveness of a district on student progress. Compares each district to the average district in the state for each subject tested in the given year. Indicates how a district influences student progress in the tested subjects.

36

37 The School Value Added Report compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects.

38  If the Mean NCE Gain is greater than or equal to zero, the average student in this school has achieved at least a year’s worth of academic growth in a year.  If the Mean NCE Gain is less than zero, the average student in this school has achieved less growth than expected.
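
A small sketch of the reading rule on this slide, stated as code. The zero cutoff comes from the slide; the function name and the example gains are illustrative.

```python
# Interpret a school's Mean NCE Gain using the rule above (zero cutoff from the slide).
def interpret_mean_nce_gain(mean_nce_gain: float) -> str:
    if mean_nce_gain >= 0.0:
        return "Average student achieved at least a year's worth of growth in a year."
    return "Average student achieved less growth than expected."

print(interpret_mean_nce_gain(1.3))   # gain above zero
print(interpret_mean_nce_gain(-0.8))  # gain below zero
```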

39 District Diagnostic Reports

40  This report is intended for diagnostic purposes only and should not be used for accountability.

41

42  Quintiles  Green Zero Line  Previous Cohort(s)  Confidence Band  Whiskers  2 Standard Errors

43

44  Clickable Information  Reference  Gain  Standard Error

45  Use to identify patterns or trends in progress among students predicted to score at different performance levels, as determined by their scores on NC tests.  Students are assigned to Projected Performance Levels based on their predicted scores.  Shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level.

46 Interpreting the Pie Chart (pie-chart legend colors: yellow, green, light red)

47

48

49 Diagnostic Reports Looking for Patterns

50

51

52

53

54

55 What would an ideal pattern on a Diagnostic Report look like for closing the achievement gap?

56

57

58 With a partner: Look at your school diagnostic reports for your subject area(s). What patterns do you see? How does this information influence future instructional practices and student support?

59

60

61 Teacher Value-Added Report

62  Beginning with your 2013 report, it becomes part of your evaluation.  Standard 6 – Teachers contribute to the academic success of their students. (Measurable Progress)  Standard 4 – Teachers facilitate learning for their students. ▪ Teachers plan instruction appropriate for their students. ▪ Teachers use data for short- and long-range planning.  Standard 5 – Teachers reflect on their practice. ▪ Teachers analyze student learning.

63 You care about your students.

64 Student Progress – How far have I come?  Highly dependent on what happens as a result of schooling rather than on demographic factors.

65  Focus on progress  Educators can influence this  Minimum expectation = one year of academic gain

66  The projection report looks at past testing information and projects how a student will perform, based on:  the student’s own past performance;  the performance of students who have taken the test previously.  Students must have three prior test scores to be included in the teacher’s predictive report.  The whole cohort of students is analyzed.

67 EVAAS + Local Knowledge & Expertise → Improve the Education Program

68  State Growth Standard/State Average = 0.0  Standard Error = a measure of uncertainty  Usually, the more data you have, the smaller the standard error.  Index = Teacher Estimate divided by Standard Error
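
A quick sketch of the arithmetic on this slide: the index is simply the teacher estimate divided by its standard error. The numbers are made up for illustration.

```python
# Index = Teacher Estimate / Standard Error (values below are hypothetical).
teacher_estimate = 1.8   # estimated gain relative to the state average of 0.0
standard_error = 0.9     # measure of uncertainty; usually shrinks as more data is available

index = teacher_estimate / standard_error
print(f"Index = {index:.1f}")  # -> Index = 2.0
```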

69

70  Exceeds Expected Growth:  Teachers whose students are making substantially more progress than the state average  Index is 2 or greater

71  Meets Expected Growth:  Teachers whose students are making the same amount of progress as the state average  Index is equal to or greater than -2 but less than 2

72  Does Not Meet Expected Growth:  Teachers whose students are making substantially less progress than the state average  Index is less than -2
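
Putting the three levels from the preceding slides in one place: a short sketch that assigns a growth status from an already-computed index. The -2 and 2 cutoffs come from the slides; the function name is illustrative.

```python
# Assign a growth status from the index, using the cutoffs given on the slides.
def growth_status(index: float) -> str:
    if index >= 2:
        return "Exceeds Expected Growth"       # index of 2 or greater
    if index >= -2:
        return "Meets Expected Growth"         # index from -2 up to (but not including) 2
    return "Does Not Meet Expected Growth"     # index less than -2

print(growth_status(2.4))   # Exceeds Expected Growth
print(growth_status(0.5))   # Meets Expected Growth
print(growth_status(-2.7))  # Does Not Meet Expected Growth
```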

73 Index: Teacher Estimate divided by Standard Error. Courses included in the calculation. Statewide distribution of teacher status.

74

75

76  Teacher Estimate: How much progress did this teacher’s students make compared to other students across the state?  Index: Teacher estimate divided by the standard error. Index is the basis by which teachers are assigned to effectiveness levels.

77

78

79

80 Supplemental Information Table

81

82

83 Teacher Diagnostic Report

84

85

86  What generalizations can we make?  What do we not know?  How do we find out?

87

88

89

90 What generalizations can we make? What do we NOT know? Based on what you have learned about Kathleen Joseph, what types of questions would help her reflect on how to make instructional changes?

91 In light of what you have learned about Kathleen Joseph, let’s look at YOUR data. What steps will you take based on what the data tell you?

92 The report shows growth for the lowest, middle, and highest achieving students within the chosen group. The report can be used to explore the progress of students with similar educational opportunities. Like all diagnostic reports, this report is for diagnostic purposes only. A minimum of 15 students is needed to create a Student Pattern Report.
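
A hedged sketch of the grouping idea behind the Student Pattern Report: split a chosen group of students into lowest, middle, and highest achieving thirds by prior achievement and compare each third's average growth. The grouping rule and the data are illustrative, not the EVAAS implementation; note the 15-student minimum mentioned above.

```python
# Illustrative grouping only -- not the EVAAS implementation.
import statistics

# (prior_achievement, growth) pairs for a hypothetical group of students.
students = [(35, 4.0), (42, 3.1), (55, 1.8), (61, 0.9), (48, 2.5),
            (70, 0.4), (66, 1.1), (39, 3.6), (52, 2.0), (58, 1.5),
            (44, 2.8), (73, 0.2), (37, 3.9), (63, 0.7), (50, 2.2)]

if len(students) < 15:
    raise ValueError("A minimum of 15 students is needed for a pattern report.")

ranked = sorted(students, key=lambda s: s[0])  # order by prior achievement
third = len(ranked) // 3
groups = {"lowest": ranked[:third],
          "middle": ranked[third:-third],
          "highest": ranked[-third:]}

for name, group in groups.items():
    avg_growth = statistics.mean(growth for _, growth in group)
    print(f"{name} achieving third: average growth {avg_growth:.1f}")
```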

93 Student Pattern Report

94

95 Key Questions

96 Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours?

97 Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? Rerun the report with new criteria. YES!

98 Student Pattern Report – Next Steps: compare the 16 students who attended for 40+ hours with all 31 students in the program.

99 Less Informed Conclusion: We need to change the selection criteria for this program. More Informed Conclusion: We need to adjust the recommended hours for participants.

100

101 Custom Student Report HANDOUT

102  Reports  Academic At-Risk Report

103 Academic At-Risk Reports: 3 categories. At Risk: students at risk of not meeting the expected academic indicators. Graduation at Risk: reports for students at risk of not scoring at Level III on EOC subjects required for graduation. Other at Risk: reports for students at risk of not scoring at Level III on other EOC subjects.

104 Making Data Driven Decisions

105

106 Insights Questions What’s Next?

