October 2012 NCDPI NCRESA EVAAS Training Refresher Session.

1 October 2012 NCDPI NCRESA EVAAS Training Refresher Session

2 Before We Begin… Visit: http://region1rttt.ncdpi.wikispaces.net/
Add the Region 1 wikispace to your favorites. Click “Region 1 Events” found on the left menu. Click “NCRESA EVAAS Training” to access the presentation. Click “presentation” to access the PPT (Section A).

3 Housekeeping Virtual Parking Lot http://www.stixy.com/guest/212656
Breaks Lunch NCSU Survey AGENDA: Sign-in, Parking Lot, and Penzu completed while visiting the wiki. Breaks are scheduled throughout the session; please take breaks as needed. Evaluation – your input is essential and valued! The evaluation will be an electronic Google form (plus, delta, and future directions) and is also available in hard copy. This activity uses digital and data literacy (formative assessment via sign-in Google form). FreeDigitalPhotos.net

4 Appreciation for one another Exchange ideas freely
Can We Agree? Appreciation for one another Exchange ideas freely Influence what we can Opportunity to reflect Unite in purpose Beth

5 Learning Targets I will be able to create and identify patterns in District and School Value Added Reports. I will be able to create Academic Preparedness and Academic At-Risk Reports, identify trends in them, and have conversations about them. I will be able to have critical conversations with my School Improvement Team and other educators about students’ academic preparedness.

6 Pre-Assessment “What do you know about EVAAS?”
The following slides are PollEverywhere slides to get a feel for what participants already know. The PollEverywhere questions are hyperlinked on the agenda on the wiki. Facilitators should clear poll results after the presentation and/or check to see whether the group before them cleared their results.

7 I am very familiar with EVAAS Reports.
Strongly Agree Agree Disagree Strongly Disagree

8 I use EVAAS to make instructional decisions in my class or school.
Strongly Agree Agree Disagree Strongly Disagree

9 I am able to analyze the metrics in EVAAS reports for instructional planning.
Strongly Agree Agree Disagree Strongly Disagree

10 I know how to collect evidence from EVAAS to assess student achievement potential.
Strongly Agree Agree Disagree Strongly Disagree

11 I know how to interpret the following reports: At-Risk Report, Customized Student Reports, Value Added Reports, and Diagnostic/Performance Reports.
Strongly Agree Agree Disagree Strongly Disagree

12 I am able to communicate the findings of EVAAS reports to initiate conversations about student achievement. Strongly Agree Agree Disagree Strongly Disagree

13 A Quick Review of Resources

14

15 Virtual Professional Development
It may be a good idea to watch this…

16 Data Literacy Module https://center.ncsu.edu/nc
Data Resource Guide This module provides an introduction to data literacy. It includes information on types of data, strategies for analyzing and understanding data, and processes for determining how these can influence instructional practices. This module aims to provide learning experiences that develop or enhance abilities to find, evaluate, and use data to inform instruction. The purpose of the Data Resource Guide is to provide information and resources to help administrators, data coaches, teachers, and support staff with data-driven decision making for their schools and districts. Districts and charter schools are encouraged to use this guide to design and implement training for data teams with the goal of increased data literacy and student achievement.

17 Articles below can be located on the Agenda, Section E.
Supporting Articles Articles below can be located on the Agenda, Section E. The X Factor is ‘Why’ by Anne Conzemius A Playbook for Data Learning Forward, 2012

18 Making Connections with the NCEES
Take responsibility for the progress of all students. Use data to organize, plan, and set goals. Use a variety of assessment data throughout the year to evaluate progress. Analyze data.
STANDARD I: Teachers demonstrate leadership. STANDARD IV: Teachers facilitate learning for their students. STANDARD V: Teachers reflect on their practice. STANDARD VI: Teachers facilitate academic growth.
Use data for short- and long-range planning. The work of the teacher results in acceptable, measurable progress for students based on established performance expectations, using appropriate data to demonstrate growth. Collect and analyze student performance data to improve effectiveness.
Let’s have a quick review of the standards before we delve into them more deeply.
Standard 1 – Leadership in and beyond the classroom. Teachers take responsibility for all students’ learning and use data to drive instruction. They establish a safe learning environment. They strive to lead the profession and serve as advocates for students.
Standard 2 – Teachers provide an environment in which each child has a positive, nurturing relationship with caring adults. They embrace diversity, treat students as individuals, adapt their teaching, and work collaboratively.
Standard 3 – Teachers align their instruction with the NCSCOS, know their content, and teach relevant, connected lessons.
Standard 4 – Teachers understand learning, and they know the appropriate levels of intellectual, physical, social, and emotional development of their students.
Standard 5 – Teachers analyze student learning and think critically about learning in their classrooms. They seek appropriate PD that matches their professional goals, and they are active learners.
When you are finished sharing the posters, you can talk about Standard 6. Use this if you would like: This is the newest standard, and there is much more work underway to determine how this standard will be measured.
While principals should, of course, always be looking for evidence of student learning, there will be no “observation-based” component to Standard Six. Some are concerned that the sixth standard means more work for principals. It really doesn’t, since the rating will be based squarely on data that are collected and aggregated by the state.
Here is some help for responding if there are questions: (Tread carefully here, and remind principals not to shoot the messenger!) This info was in the Superintendent’s update from Dr. Atkinson, and I think it provides a good “blurb” to include somewhere in our training.
Teacher Effectiveness: Sixth Standard Update – Effective this school year, the State Board of Education has added a sixth standard to the Teacher Evaluation Instrument. A teacher’s rating on the sixth standard will be based on whether a teacher’s students meet growth expectations, exceed growth expectations, or fail to meet growth expectations. An average of three years of student growth information will be used to determine the teacher’s rating. Only teachers with three or more years of data will receive a formal rating on the sixth standard, although principals are encouraged to discuss any student growth information with teachers. For the school year, there will be no state-mandated consequences for teachers based on their sixth standard rating.
The sixth standard requires no change to the evaluation process during the school year. The standard will be automatically populated through the use of three years of data points on student growth. Principals and other classroom observers do not need to take additional action during the year to ensure that data is included. Detailed webinar information will be included in the next communication. For additional information please contact Jenn Preston, Race to the Top Project Coordinator for Teacher Effectiveness, at

19 Why is using Data to Make Decisions Important?

20 Benefits and Considerations for Teachers
Understand academic preparedness of students before they enter the classroom. Monitor student progress, ensuring growth opportunities for all students. Modify curriculum, student support, and instructional strategies to address the needs of all students.
Talk about how PD is imperative. We have to teach the teachers HOW to use the data. Having DATA CONVERSATIONS is an imperative. Knowledge is power, and your role as a principal is to prepare teachers for how to handle the data. Understand that the purpose of EVAAS is for us to support student understanding and to make appropriate instructional, logistical, and professional decisions to support student achievement and growth. You know your teachers and know who may not be able to handle access.

21 What is Data? Data can be defined as information organized for analysis or used to make decisions.

22 Understanding needed to: Find Evaluate Utilize to inform instruction.
What is Data Literacy? Understanding needed to: Find Evaluate Utilize to inform instruction. Data literacy refers to the understanding needed to find, evaluate, and utilize data to inform instruction. A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making.

23 A Data Literate Person…
Possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making. Turn and talk: Do you consider yourself a data literate teacher? If so, how do you know? Data literacy refers to the understanding needed to find, evaluate, and utilize data to inform instruction. A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short and long-term decision-making.

24 Data Categories Achievement Demographic Program Perception
Achievement: Teachers use this data at the beginning of the school year to determine the entry level of performance and instructional effectiveness.
Demographic: Teachers use this data to determine the subset of students and their grades, or to determine outside factors that affect student performance.
Program: Teachers collect this data to identify the instructional effectiveness of the strategies that were implemented.
Perception: Teachers may collect this data from students to determine how the students feel about their school.

25 What Data Do You Have? How can you access this data?
With your table, identify as many data sources as you can that might belong to the following categories: Achievement, Demographic, Program, Perception. Record one data source per sticky note. Post your data sources on the chart found on the wall. How can you access this data? What data can you use on a daily basis? Once you have the data, what can you do with it to adjust your daily instruction?

26 D.R.I.P. What does it mean? Data Rich Information Poor
Check to see if any participants have heard of this acronym. (Then advance the slide to show what it stands for…) Image from Microsoft Online Images

27 Achievement vs. GROWTH

28 Student Achievement End of School Year Proficient
A focus on achievement or proficiency looks like this… Student is able to meet specific standards Students fall into a limited range or band of achievement Does not account for change outside of that range Does not account for student ability before they came to class “I can do it” End of School Year

29 Student Growth Change over time Start of End of School Year
Proficient Change over time A focus on student growth or progress looks like… Takes into account student achievement within range or beyond that range Compares student achievement to how they were predicted to achieve Discerns between teacher impact and student ability Accounts for student ability before they came to class “Improvement or progression” Not Proficient Start of School Year End of School Year

30

31

32

33 A Quick Review of EVAAS

34 What is EVAAS? E V A A S Education Value Added Assessment System
EVAAS measures the progress students make within your district, school, or classroom, compared to the progress students make, on average, statewide. It is available to all schools and districts in North Carolina. So What Does It Do? Copyright © 2010, SAS Institute Inc. All rights reserved.

35 SAS EVAAS Analyses LOOKING BACK Evaluating Schooling Effectiveness:
Writing SAT/ACT End of Course End of Grade LOOKING AHEAD Planning for Students’ Needs Student Projections to Future Tests LOOKING BACK Evaluating Schooling Effectiveness: Value Added & Diagnostic Reports ACT data for this year is 2010 data. Copyright © 2010, SAS Institute Inc. All rights reserved.

36 Past Program Effectiveness Incoming Student Needs
How can EVAAS Help Me? Improve the Education Program EVAAS: Looking Back Past Program Effectiveness Local Knowledge & Expertise EVAAS: Looking Ahead Incoming Student Needs EVAAS helps by allowing educators to: Analyze past program performance for trends Make informed projections for current and incoming students

37 Education Value Added Assessment System
Answers the question: How effective a schooling experience is. Produces reports that: predict student success, show the effects of schooling at particular schools, and reveal patterns in subgroup performance.
EVAAS was created by a university professor in Tennessee – originally TVAAS – “value-added, effectiveness data.” EVAAS looks at the district, school, teacher, and student. It is often difficult for us to come out of the AYP/ABC box, but it is important to understand that EVAAS data is very different. In the broad sense, EVAAS takes into consideration the fact that students come to us with very different ability levels. Instead of just measuring each student against a proficiency score, EVAAS measures the results of a student’s classroom experience – in essence, EVAAS measures the EFFECT of schooling on student learning. EVAAS extracts data AFTER DPI collects data through the secure shell. DPI runs processes and checks for validity. Once DPI has completed their processes with the data, they present to the SBE. At this point, data is sent to EVAAS.

38 (School, Subject, Grade Level)
Big Picture (8x10) – School, Subject, Grade Level: Value Added, Diagnostic, Performance Diagnostic
Smaller Shot (5x7) – Teacher
Pocket Photo – Student: School Academic Preparedness Report, Custom Student Report, Student Pattern Reports, Student Search, Academic At-Risk Report

39 2012-13 Changes in Reporting
2011-12 color coding and descriptors:
Above (Green) – students in the district made significantly more progress in this subject than students in the average district in NC. Progress was at least two standard errors above average.
Not Detectably Different (Yellow) – not detectably different from students in the average district: less than two standard errors above average and no more than two standard errors below it.
Below (Light Red) – students in the district made significantly less progress in this subject than students in the average district in NC. Progress was more than two standard errors below average.
2012-13 color coding and descriptors:
Exceeds Expected Growth (Blue): estimated mean NCE gain is above the growth standard by at least 2 standard errors.
Meets Expected Growth (Green): estimated mean NCE gain is no more than 2 standard errors below the growth standard and less than 2 standard errors above it.
Does Not Meet Expected Growth (Red): estimated mean NCE gain is below the growth standard by more than 2 standard errors.
The descriptors in EVAAS now match the Standard 6 ratings.
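For facilitators who want to check their reading of the 2012-13 descriptors, the two-standard-error rule can be sketched in a few lines of Python. This is an illustrative sketch only, not an EVAAS function; the gain and standard-error values are made up:

```python
def growth_rating(estimated_gain, std_err):
    """Classify an estimated mean NCE gain against the growth standard (0.0)
    using the two-standard-error rule from the 2012-13 descriptors.
    Illustrative only -- the report's own descriptors are authoritative."""
    if estimated_gain >= 2 * std_err:
        return "Exceeds Expected Growth"        # blue
    elif estimated_gain >= -2 * std_err:
        return "Meets Expected Growth"          # green
    else:
        return "Does Not Meet Expected Growth"  # red

# A gain of 1.0 NCE with a standard error of 0.4 is at least 2 SEs above 0.0:
print(growth_rating(1.0, 0.4))   # Exceeds Expected Growth
print(growth_rating(0.5, 0.4))   # Meets Expected Growth
print(growth_rating(-1.0, 0.4))  # Does Not Meet Expected Growth
```

Note how the same gain can land in different categories depending on the standard error, which is why small groups (with large standard errors) are harder to flag as above or below expected growth.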

40 District Value Added Report With this Report You Can:
Observe the average progress of students in a district. Compare a district’s progress rate for a grade to the Growth Standard. Compare a district’s achievement level to the state’s average achievement.
We will look at three kinds of reports – value-added, diagnostic, performance diagnostic – at the district level and review how to read them, because this is the same way you will read your school data. The reports have elements in common, and once you can interpret the district reports, you’ll be able to read your school reports easily. The growth standard is the average growth for students statewide.

41

42

43 Interpreting Value Added Reports
Scores from the EOG tests are converted to State NCEs (Normal Curve Equivalent scores) for the purpose of these analyses. NCE scores have the advantage of being on an equal-interval scale, which allows for a comparison of students’ academic attainment level across grades. NCE scores remain the same from year to year for students who make exactly one year of progress after one year of instruction, even though their raw scores would be different. Their NCE gain would be zero. You do not have to have 3 data points – a 4th-grade score can be added. The Resource link has an NCE tutorial.
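The zero-gain idea can be illustrated with a tiny sketch (illustrative only; EVAAS’s actual gain estimates are model-based, not simple year-over-year differences):

```python
def nce_gain(prior_nce, current_nce):
    """NCE gain as a year-over-year change in a student's NCE score.
    A student who makes exactly one year of progress keeps the same NCE,
    so the gain is zero even though the raw score rises."""
    return current_nce - prior_nce

print(nce_gain(55.0, 55.0))  # 0.0 -> exactly one year of progress
print(nce_gain(48.0, 52.0))  # 4.0 -> more than one year of progress
```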

44 Student Achievement Levels
Student achievement levels appear at the bottom of the report in the Estimated School Mean NCE Scores section. The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. Compare the estimated grade/year mean for a school to the NCE Base. If the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
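The comparison to the NCE Base can be sketched as follows (an illustrative helper, not an EVAAS function; the example mean is made up):

```python
NCE_BASE = 50.0  # statewide average attainment level, by definition

def achievement_vs_state(school_mean_nce):
    """Compare an estimated school mean NCE to the statewide base of 50.0."""
    diff = school_mean_nce - NCE_BASE
    if diff > 0:
        return f"above the state average by {diff:.1f} NCE points"
    if diff < 0:
        return f"below the state average by {-diff:.1f} NCE points"
    return "at the state average"

print(achievement_vs_state(53.2))  # above the state average by 3.2 NCE points
print(achievement_vs_state(50.0))  # at the state average
```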

45 School Value Added Report with this Report You Can:
Observe the average progress of students in a school Compare a school’s progress rate for a grade to the Growth Standard Compare a school’s achievement level to the state’s average achievement

46 Value-Added Reports Use to evaluate the overall effectiveness of a school on student progress. Compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. Has to be more than -1.8 to be below; more than 2 standard errors to be above.
(Provide definition of value-added.) Some things to note are that “like” students are in a subgroup. Show how to read the report. Explain that 0 is the equivalent of one year of growth. Looking at the bottom of the page at the green, yellow, and red descriptors, explain that there is a blue descriptor when there is not enough data to make a distinction. If your LEA uses DIBELS, the reports have similarities in design with the colors. Have the participants look at the data and talk about what you can see here. Look at trends in the same grade. Look at the students that moved from 6th grade in 2008 to 7th grade in 2009 and to 8th grade in 2010. Talk about what you can take away from this report.

47 School Value Added Report Activity
What does this report tell us about your math program? Look at the trends in the same grade. Discuss: What can you take away from this report? Students in 6th grade in 2011. Students in 7th grade in 2012. Students in 8th grade in 2012.

48 District Diagnostic Reports
Use to identify patterns or trends of progress among students expected to score at different achievement levels Use this report for diagnostic purposes only and not for accountability Caution: subgroup means come from “a liberal statistical process” that is “less conservative than estimates of a district’s influence on student progress in the District Value Added Report”

49 District Performance Diagnostic Reports
Use to identify patterns or trends of progress among students predicted to score at different performance levels as determined by their scores on NC tests. Students are assigned to Projected Performance Levels based on their predicted scores. Shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level.
Mean Differences: The mean of the difference between the students’ observed test performance and their predicted performance appears for each Projected Performance Level, along with the standard error associated with the mean. The standard error allows you to establish a confidence band around the mean. A large negative mean indicates that students within a group made less progress than expected. A large positive mean indicates that students within a group made more progress than expected. A mean of approximately 0.0 indicates that a group has progressed at an average rate in the given subject. When the means among groups vary markedly, districts may want to explore ways to improve the instruction for students making less progress.

50 Diagnostic Report Use this report to identify patterns or trends of progress among students expected to score at different achievement levels. This report is intended for diagnostic purposes only and should not be used for accountability.

51 Diagnostic Reports Interpreting the Chart
In this diagram, the two bars have the same height, but the whiskers extend to different lengths. On the left, the whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓). On the right, the whiskers contain the green line, so the group represented by the bar made average progress (-). The more students in the subgroup, the shorter the whiskers: they represent the margin of error, and the mean falls in the middle of them.
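The whisker rule can be sketched in code. Note the assumption: this sketch treats the whiskers as spanning one standard error on either side of the subgroup mean (consistent with the one-standard-error language on the pie-chart slide, but the report’s own legend is authoritative), and the function is illustrative, not part of EVAAS:

```python
def whisker_rating(mean_gain, std_err, reference=0.0):
    """Interpret a diagnostic-report bar. Assumes the whiskers span one
    standard error on either side of the subgroup mean; if they contain
    the reference (green) line, the subgroup made average progress."""
    lower, upper = mean_gain - std_err, mean_gain + std_err
    if upper < reference:
        return "below average progress (down arrow)"
    if lower > reference:
        return "above average progress (up arrow)"
    return "average progress (-)"

print(whisker_rating(-1.5, 0.5))  # below average progress (down arrow)
print(whisker_rating(-0.3, 0.5))  # average progress (-)
```

The same mean gain with a larger subgroup (smaller standard error) can move from “average” to “below average,” which is the point of the whiskers.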

52 District Performance Diagnostic Reports can tell you even more when you click on % of Students
Click on the underlined number in the Mean or Nr of Students row for a subgroup to see the names of the students assigned to the subgroup. Click on the % of Students for the current year or for Previous Cohort(s) to see the data in pie chart format.
The Reference Line in the table indicates the gain necessary for students in each Prior-Achievement Subgroup to make expected progress, and it reflects the growth standard. When gain is reported in NCEs, as it is here, the growth standard is 0.0. The Gain is a measure of the relative progress of the school’s students in each Prior-Achievement Subgroup compared to the Growth Standard. Standard errors appear beneath the Gain for each Prior-Achievement Subgroup. The standard error allows the user to establish a confidence band around the estimate. The smaller the number of students, the larger the standard error.
A student becomes a member of a Prior-Achievement Subgroup based on the average of his or her current and previous year NCE scores. A single student score contains measurement error; using the average of two years allows a more appropriate assignment. The Nr of Students row shows the number of students in a subgroup. Some subgroups may contain more students than others because students are assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state.

53 Interpreting the Pie Chart
Yellow: students in this group progressed at a rate similar to that of students in the average district in the state. Light Red: students in the group made more than one standard error less progress in this subject than students in the average district in the state. Green: the progress of students in this group was more than one standard error above that of students in the average district in the state.

54 Break: 10 Minutes

55 Value Added and Diagnostic Reports: Let’s Practice

56 1. Go to ncdpi.sas.com 2. BOOKMARK IT! 3. Secure & Convenient
Online Login Copyright © 2010, SAS Institute Inc. All rights reserved.

57 Do you see this? Copyright © 2010, SAS Institute Inc. All rights reserved.

58 Let’s Practice! Reports School Value Added School Any Sub Group
% of Students Or Select Subgroups
These reports may be used to determine local policy for providing targeted intervention and support to students who are at risk of not meeting future academic milestones. At-Risk reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level 3 range; the range for writing is 0-80%. The reports are presented in 3 categories:
AYP At Risk – at risk of not meeting the academic indicators for AYP: EOG Math and Reading, grades 4-8; EOC Algebra I and English I. For EOG tests, students with at least 3 prior data points (or test scores) will have projections in Math and Reading in the next grade. These scores are not content specific. Projections for Algebra I and English I may be made as early as 6th grade with sufficient data.
Graduation At Risk – reports for students at risk of not making a Level 3 on EOC subjects such as Algebra II, Chemistry, Geometry, Physical Science, and Physics. Students that have taken these tests but have not scored at least Level 3 will still have projections to these subjects.
Under Reports, click Academic At Risk Reports. These are reports that you will want to spend some time really poring over.

59 Task 1: Value-Added and Diagnostic Reports
Directions for this activity can be found on the agenda (Section G) or Locate the task directions on your table

60 Overview of School Value Added
What did you find? Interesting Patterns Insights Areas of Concern Areas of Celebration

61 Key Points To Remember:
The report shows growth for the lowest, middle, and highest achieving students within the chosen group. The report can be used to explore the progress of students with similar educational opportunities. Like all diagnostic reports, this report is for diagnostic purposes only. A minimum of 5 students is needed to create a Student Pattern Report. Enables you to see how effective the school has been with lowest, middle, and highest achieving students, with at least 15 with predicted and observed scores.

62 Lunch

63 A Few Strategies for Helping Teachers to Improve
First Priority! Determine if an issue is a teacher problem or a program/curricular problem Partner teachers with other teachers who complement their strengths Identify students who are not making sufficient progress and design intervention plan Customize professional development based on student growth patterns Stimulate discussions during the school year about ongoing measures of student growth Pair teachers with students with whom they are most successful

64 Value-Added Results Reveal the Effectiveness of C & I in Your School
Look for overarching patterns of strength in your data How can teachers leverage effective practices? Look for student achievement groups (quintiles) that are producing the most growth. Why is this? How can you use this knowledge to help other students? How can teachers leverage their strengths? Who is producing the most overall growth? With which student achievement groups? What can you learn from each other? How can teachers leverage the strengths of their team?

65 School Improvement Conversations
How do your aggregate-level results compare to the aggregate-level results of your team? Is this subject an area of strength or an area of challenge for you relative to your team? If this subject is an area of strength for you, how can you provide support for other members of your team? If this subject is a challenge for you, who on your team can you call on for support? How do the trends in the School Diagnostic Report compare to your District/State Diagnostic Report? Which subgroups show growth and greatest challenge?
1. How do your aggregate-level results compare to the aggregate-level results of your team? (Compare your value-added estimate and standard error to the value-added estimate and standard error in the school value-added report.)
2. Is this subject an area of strength or an area of challenge for you relative to your team?
3. If this subject is an area of strength for you, how can you provide support for other members of your team?
4. If this subject is a challenge for you, who on your team can you call on for support?
5. How do the trends in the School Diagnostic Report compare to your District/State Diagnostic Report? What contributed to these trends?
6. Which subgroups show growth? Are there team members who could benefit from your support with these subgroups?
7. Which subgroups present challenges? Are there team members who could provide you support with these subgroups?

66 Factors That Influence Instructional Effectiveness
Instructional Practices: Was student progress measured over time? Are the learning, the feedback, and the assessments focused on the right goals?
Instructional Arrangements: How were the data used to determine student assignments to courses and to create flexible groupings?
Focused Professional Development: Data-driven? Embedded, with collaborative time centered around student learning? Teachers introduced to protocols for examining student work? Focused on “vital” behaviors?

67 Academic At-Risk Reports Let’s Practice

68 Academic At-Risk Reports
Three Categories:
AYP at Risk – at risk of not meeting the academic indicators for AYP
Graduation at Risk – reports for students at risk of not making a Level III on EOC subjects required for graduation
Other at Risk – reports for students at risk of not making Level III on other EOC subjects
Same report

69 Academic at Risk Reports Be Proactive Use these reports to determine local policy for providing targeted intervention and support to students who are at risk of not meeting future academic milestones. Reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range, or 0-80% for writing.
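The probability cutoffs above can be sketched as a simple check. This is illustrative only; the function and subject names are hypothetical, not part of the EVAAS interface:

```python
def is_at_risk(prob_level3, subject):
    """Apply the at-risk cutoffs described above: students with a 0-70%
    probability of scoring at Level III (0-80% for writing) are flagged.
    Illustrative sketch; 'subject' values here are hypothetical labels."""
    cutoff = 0.80 if subject == "writing" else 0.70
    return prob_level3 <= cutoff

print(is_at_risk(0.65, "math"))     # True
print(is_at_risk(0.75, "math"))     # False
print(is_at_risk(0.75, "writing"))  # True
```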

70 Task 2: Academic Preparedness and Academic At Risk Reports
Directions for this activity can be found on the agenda (Section H) or Locate the task directions on your table

71 Student Reports and Projections

72 What Are Projections Anyway?
Given this student’s testing history, across subjects… …what is the student likely to score on an upcoming test, assuming the student has the average schooling experience?

73 What’s the Value of the Projections?
Projections are NOT about predicting the future. They ARE about assessing students’ academic needs TODAY. Although projections indicate how a student will likely perform on a future test, their real value lies in how they can inform educators today. By incorporating the projections into their regular planning, teachers, administrators, and guidance counselors can make better decisions about how to meet each student’s academic needs now. Copyright © 2010, SAS Institute Inc. All rights reserved.

74 Assessing Students’ Needs
What are this student’s chances for success? What goals should we have for this student this year? What goals should we have for this student in future years? What can I do to help this student get where they need to be? When assessing students’ academic needs, educators will want to keep these key questions in mind. Copyright © 2010, SAS Institute Inc. All rights reserved.

75 Using Projections to Take Action
Identify students Assess the level of risk Plan schedules Identify high-achievers Assess the opportunities Inform Identify students who need to participate in an academic intervention Assess the level of risk for students who may not reach the Proficient mark Plan schedules and resources to ensure that you can meet students’ needs Identify high-achievers who will need additional challenges Assess the opportunities for high-achieving students who are at risk of not reaching Advanced Inform course placement decisions

76 Making Data Driven Decisions
Have participants access their Academic At-Risk Report. Select a grade level and subject to view achievement probability. These students will need support and intervention to provide them with a better-than-average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.

77 EVAAS Projections What are they based on?
Expectations based on what we know… about this student and other students who have already taken this test prior test scores (EOC/EOG), across subjects their scores on the test we’re projecting to

78 Task 3: Student Reports Directions for this activity can be found on the agenda (Section I) or Locate the task directions on your table

79 Collegial Conversations
Count off by 9s and locate your assigned table. Locate the “Scenario” document on your table. Ask one person to read the scenario aloud. Have a ten-minute discussion about the scenario, then ... play devil’s advocate and ask your table members some tough questions to give each person some practice in addressing questions/situations that may arise. When prompted, move to the next table. You will only have time for 3 stations, 15 minutes per station. Use timer. (Next slide shows topics.) (PD leads: use your discretion on the number of table tents/topics to place based on your location’s size/needs.) Image from Microsoft online gallery

80 Table Reflection Scenario
Discuss the following at your table: Why do you think students did not make the progress expected/the progress you’d like to have seen last year? Why have students in the chosen achievement group(s) not made the progress the teacher/you would like them to have made in the past?
This is probably the most important question you can ask, because it lets you into the mind of the teacher. Some teachers may have a clear idea of what needs to be changed to improve the progress of their students, but others may not. Your classroom observations and your personal knowledge provide you with the perspective to suggest productive changes if the teacher is unable to do so. Once a possible reason for lack of progress is agreed upon, you can move on to finding a solution.

81 Return to your seat in 10 Minutes
NCSU Survey and Break While we take a break, please complete the NCSU session survey by visiting the link below. Return to your seat in 10 minutes.

82 Custom Student Reports Let’s Practice

83

84

85 Task 4: Custom Student Reports
Directions for this activity can be found on the agenda (Section M) or Locate the document on your table

86 Questions

87 Contact Information Abbey Futrell, PD Consultant, Region 1 (252) Beth Edwards, PD Consultant, Region 1 (252) Dianne Meiggs, PD Consultant, Region 1 (252) Change this to reflect your PD leads.

