# PVAAS (The Pennsylvania Value-Added Assessment System)


PVAAS IS:

- Another tool we can use to look at our school data
- A way to determine whether students made one year's worth of growth
- When combined with PSSA scores, a clearer picture of school achievement

PVAAS is NOT:

- Another test; the data are taken from the PSSA
- A measure for teacher accountability

PVAAS is not another test: it uses existing assessment data, specifically the PSSA, in its statistical models. While other states include a methodology to investigate teacher effectiveness, Pennsylvania has not done so. No data are collected, and no methodology is used, that would allow a measure of teacher effectiveness or teacher accountability.

Things to Keep in Mind

- PVAAS does not take into account a student's starting point; it looks at growth over a year.
- PVAAS measures growth, not AYP level.
- A scatter plot graph combines both.

The Scatter Plot Coordinate System
[Figure: a blank scatter plot. Horizontal axis: PVAAS Growth Measure, with the Growth Line at 0. Vertical axis: PSSA Percent Proficient or Advanced, with the AYP Percent Proficient Target marked.]

The horizontal axis displays the PVAAS Growth Measure. Recall that a Growth Measure of 0 means that the cohort of students has met the Growth Standard for its grade and has maintained its position in the distribution of all scores in that grade across the Commonwealth relative to the previous year. We have highlighted the Growth Standard boundary of 0. If the growth is positive, that is, to the right of 0, we have an indication that the cohort exceeded the Growth Standard and increased its position in the statewide distribution of scores. Similarly, if the growth value is negative, the cohort's growth was below the Growth Standard and its position in the statewide distribution of scores decreased during the recently completed academic year.

The vertical axis displays the percent proficient or advanced for each cohort. The AYP Proficiency Target for math is currently 45%; we have highlighted that target in the graph. Note that the target for reading is different and that all targets will increase in 2008.

The AYP Target level and the PVAAS Growth Standard split the scatter plot into four quadrants, numbered just as in Algebra 1: first, second, third, and fourth.
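The quadrant logic described above can be sketched in a few lines of code. This is an illustrative sketch only, not official PVAAS software; the 45% math target is the figure cited on this slide, and the function name is our own.

```python
# Illustrative sketch (not official PVAAS code): classify a school into one
# of the four scatter-plot quadrants from its PVAAS growth measure and its
# PSSA percent proficient/advanced.

AYP_TARGET = 45.0       # percent proficient or advanced (math target cited above)
GROWTH_STANDARD = 0.0   # a growth measure of 0 means the standard was met

def scatter_quadrant(growth_measure: float, pct_proficient: float) -> int:
    """Return the quadrant number, using the Algebra 1 convention."""
    above_target = pct_proficient >= AYP_TARGET
    met_growth = growth_measure >= GROWTH_STANDARD
    if above_target and met_growth:
        return 1   # high achievement, growth at or above standard
    if above_target and not met_growth:
        return 2   # high achievement, growth below standard
    if not above_target and not met_growth:
        return 3   # low achievement, growth below standard
    return 4       # low achievement, growth at or above standard

print(scatter_quadrant(2.5, 60.0))   # quadrant 1
print(scatter_quadrant(-1.2, 30.0))  # quadrant 3
```

A school in quadrant 4, for example, is below the AYP target but growing, which the combined view makes visible where the PSSA score alone would not.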

Scatter Plot Quadrants and Their Meanings

What does PVAAS show?

- Looking Back/Evaluation… Value-Added Growth Reports, for cohorts of students
- Looking Forward/Planning… PVAAS Projection Reports, for individual students and cohorts of students

The two methodologies really look at two different issues. The Value-Added, or Growth, methodology looks back: it helps schools evaluate how much students have gained. How much growth did students make in the past school year? The Projection methodology looks forward: it helps schools plan for the future. Are students on a path to proficiency? The two serve different purposes, and both are equally important for continuous school improvement.

Looking back: Comparison to a State Growth Standard. The Growth Standard specifies the minimal designated academic gain from grade to grade for a cohort of students. The use of a Growth Standard creates the possibility that ALL schools can demonstrate appropriate growth. In the three pilot phases, students were compared to the average school in their testing pool. Comparison to averages guarantees that some schools must be below average; therefore, all schools could not be at or above the expected growth target. Now that Pennsylvania is implementing PVAAS statewide, the data set yields a more defined and accurate picture of growth in schools. The introduction of a Growth Standard sets a target that ALL schools can achieve, since growth depends only on the performance of that school and not on the average performance of schools in a testing pool.

Looking back: The ratings on the PVAAS School Report and on the District Value-Added Report are color coded to assist with quick recognition of the rating.

- Green (Favorable Indicator): estimated gain at or above the growth standard. Students in this cohort have made at least one year of growth. All schools can achieve this rating.
- Yellow (Caution Indicator): estimated gain below the growth standard, but by less than one standard error. Students in this cohort have grown less than the standard.
- Light Red (Stronger Caution): estimated gain below the growth standard by more than one but less than two standard errors. Students in this cohort have fallen behind their peers.
- Red (Strongest Warning): estimated gain below the growth standard by more than two standard errors. Students in this cohort have made little progress.

Emphasize that the Value-Added side of PVAAS is about the growth of cohorts of students, not about the growth of individual students.
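The rating rule above is a simple threshold on how far the estimated gain falls below the standard, measured in standard errors. A minimal sketch, assuming a growth standard of 0 (the function name and inputs are ours, not PVAAS's):

```python
# Illustrative sketch (not official PVAAS code): map a cohort's estimated
# gain and its standard error to the color-coded rating described above.

def pvaas_rating(estimated_gain: float, std_error: float,
                 growth_standard: float = 0.0) -> str:
    """Return the color rating for a cohort's estimated gain."""
    shortfall = growth_standard - estimated_gain  # how far below the standard
    if shortfall <= 0:
        return "Green"      # at or above the growth standard
    if shortfall < 1 * std_error:
        return "Yellow"     # below, but by less than one standard error
    if shortfall < 2 * std_error:
        return "Light Red"  # below by one to two standard errors
    return "Red"            # below by more than two standard errors

print(pvaas_rating(0.5, 1.0))   # Green
print(pvaas_rating(-0.5, 1.0))  # Yellow
print(pvaas_rating(-1.5, 1.0))  # Light Red
print(pvaas_rating(-2.5, 1.0))  # Red
```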

PVAAS School Report Looking back
This is an example of a School Value-Added Report, which is available at both the district and school level. It shows growth by grade level, and the chart at the bottom of the page shows the growth of the same cohort over time. The top circle indicates the gain in 2007 for each grade level, 4 through 8. In this example, grades 4, 5, and 7 received a "green" rating, indicating that students in those grades made at least one year of growth; these cohorts met or exceeded the growth standard. However, students in grades 6 and 8 fell behind their peers. In fact, the red indicator tells us there is significant evidence that this cohort of students made little progress during the past school year.

PVAAS Performance Diagnostic Report
Looking back

This chart is an example of a School Performance Diagnostic Report, which shows the growth of groups of students. The green line is at zero; this is the growth line. Everything at or above the line shows that the students made one year's worth of growth. Everything below the green growth line shows that the students slipped from last year (they did not remain in the same position as last year).

In the district/school diagnostic report, students' gains can be disaggregated by quintiles or by PSSA performance levels based on their prior achievement. This allows school personnel to assess the growth of these subgroups quickly and easily. Since this part of the report is to be used only for diagnostic purposes, one standard error is used in interpreting the significance of the results.

The chart provides a visual representation of the district/school's diagnostic report, showing the most recent information as well as the previous cohort's. The power of this report is in looking at patterns to reflect on local practices. The report does not tell us WHY progress is or is not being made; it tells us the amount of progress made by different groups of students. This particular report suggests:

- The students who were predicted to be Advanced or Proficient in math did not meet the Growth Standard: their confidence bands are below the green line, so we can conclude that these students decreased their positions in the performance distribution between 7th and 8th grades.
- The students who were predicted to be Basic exceeded the Growth Standard: their confidence band was above the green line, indicating that these students increased their position in the performance distribution of all students between the end of 7th grade and the end of 8th grade.
- The students who were predicted to be Below Basic met the Growth Standard: their confidence band contains 0, indicating that this cohort maintained its position. However, since these are Below Basic students, maintaining position is not the desired outcome.

Diagnostic reports are generated for subgroups of 5 or more students; since there are only 2 students in the lowest quintile, analysis for that quintile is not provided. The goal of the district or school is to have significantly positive bars for all students disaggregated by statewide quintiles. This type of diagnostic report is also available for subgroups based on the demographic data submitted by districts.
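The three interpretations above all follow from one rule: form the band as the mean gain plus or minus one standard error and compare it to zero. A minimal sketch, with hypothetical subgroup numbers chosen only to echo the three cases described (PVAAS does not publish these exact values here):

```python
# Illustrative sketch: the diagnostic interpretation uses one standard error,
# so a subgroup's confidence band is (gain - SE, gain + SE). A band entirely
# above 0 means the subgroup exceeded the Growth Standard; a band containing
# 0 means it met the standard; a band entirely below 0 means it fell short.

def diagnostic_call(mean_gain: float, std_error: float) -> str:
    lower, upper = mean_gain - std_error, mean_gain + std_error
    if lower > 0:
        return "exceeded the Growth Standard"
    if upper < 0:
        return "below the Growth Standard"
    return "met the Growth Standard (band contains 0)"

# Hypothetical subgroup gains, echoing the three cases in the report above:
print(diagnostic_call(1.8, 0.6))   # predicted-Basic group: exceeded
print(diagnostic_call(-1.4, 0.5))  # predicted-Proficient group: below
print(diagnostic_call(0.2, 0.7))   # predicted-Below Basic group: met
```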

More Detail – Diagnostic Report
Looking back By clicking the hot link, "% of Students," on the Diagnostic Report, users can see the same information in a more detailed pie chart. This diagnostic pie chart displays the percent of students predicted to be in each of the four performance levels. In this case the majority of students (54.5%) were predicted to perform at the Proficient level, while another 39.7% were predicted to be Advanced. The color of each piece of the pie indicates that group's progress: green indicates the group of students met or exceeded the growth standard; yellow indicates the group grew less than the standard; rose, or pink, indicates the group did not meet the growth standard, meaning they are falling behind their peers.

Looking at a subgroup… Looking back
Diagnostic Reports for PSSA subgroups are also available. These include Economically Disadvantaged, Special Education, ELL, Gifted, Tutoring Eligible, Tutoring Received, NOT Enrolled Full Academic Year (helpful for people who assume the issue is transient students), and Title I, as well as several other subgroups from the PSSA file. As schools begin to analyze their PVAAS data, teachers and administrators need to look at their effectiveness across the entire continuum of students. To assist with this analysis, schools may view diagnostic reports for specified subgroups of students by clicking "yes" in response to subgroup reporting on the Diagnostic Report and selecting the desired subgroups. Each category must contain at least five students to be represented on the graph. For a more targeted line of inquiry, schools may wish to use the "Student Search" report, comparing the reports on the targeted subgroups with the general student population. This may elicit a rich discussion about curriculum, assessment, and instruction.

Looking back

Go to: Login: PVAAS.Training, Password: PVAAS

To view School Value-Added Reports:
- Under Reports, select School Value Added Reports.
- Using the tabs at the top of the page, you can change the subject or school.

To view School Performance Diagnostic Reports:
- Under Reports, select School Performance Diagnostic.
- Select Subgroups (an optional filter to view the diagnostic report for specific subgroups of students).

What does PVAAS show? Looking Forward/Planning…
PVAAS Projection Reports, for individual students and cohorts of students. Having looked back at the Value-Added Growth Reports for cohorts of students, we now turn to the Projection methodology, which looks forward: it helps schools plan for the future. Are students on a path to proficiency?

Student Projections

Looking ahead Wouldn't it be great to know the likelihood that a student will be proficient on a future PSSA? PVAAS student projections provide exactly that: the probability of a student's performance on a future PSSA.

PVAAS Student Projection Report
Looking ahead This is what a Student Projection Report looks like. The red dots show the student's observed performance in math in grades 5, 6, and 7. The yellow square to which the red arrow points indicates the projection to Proficient for 8th grade math: given Freddie's history and his achievement pattern (taking both reading and math into account), Freddie is projected to score 38 on the 8th grade math assessment. The circle highlights the probability of proficiency, 91.3%, if Freddie has the average schooling experience for 8th grade. (The purple line indicates the state percentile necessary to reach this level of proficiency on the PSSA.) Some districts share these reports with parents; PDE has a template parent letter available to accompany this report.
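PVAAS's actual projection model is more sophisticated than anything shown here, but the basic idea of turning a projected score and its uncertainty into a probability of proficiency can be illustrated with a normal approximation. The cutoff score and standard error below are entirely hypothetical, chosen only for illustration:

```python
import math

# Illustrative sketch only (not the PVAAS model): approximate the
# probability that a student's actual score will reach the proficiency
# cutoff, treating the projection error as normally distributed.

def prob_proficient(projected_score: float, std_error: float,
                    proficiency_cutoff: float) -> float:
    """P(score >= cutoff) under a normal model for the projection error."""
    z = (projected_score - proficiency_cutoff) / std_error
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical numbers: a projection of 38 against an assumed cutoff of
# 32.5 with an assumed standard error of 4 yields a high probability.
print(prob_proficient(38.0, 4.0, 32.5))
```

A projection exactly at the cutoff would give a probability of 0.5; the further the projection sits above the cutoff relative to its uncertainty, the closer the probability climbs to 1.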

PVAAS Projection Reports for Groups of Students
Looking ahead If schools are interested in targeting a student or particular groups of students, PVAAS allows schools to create lists by filtering on specific subgroups and/or projected proficiency levels. Participants may identify specific demographic information or proficiency levels, and PVAAS will search for students meeting that profile. The Search for Students also allows teachers and administrators to generate a list of students whose probabilities of achieving a selected proficiency level fall within a specified range. This very useful feature assists in identifying students who may need instructional interventions. For example, a search might return all of the economically disadvantaged students who were last tested in 5th grade and whose probability of scoring Proficient on the 6th grade PSSA math test is between 70% and 100%. This is particularly helpful when looking for patterns of performance across a school or district and when identifying students with a small probability of achieving a Proficient rating; for example, restricting the search to students with a 0 to 50% chance of being Proficient on the next PSSA provides an "at risk" list. Specifying a probability range of 0% to 100% generates a list of ALL students and their likelihood of reaching the selected performance category on the next PSSA exam in the specified subject.
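The search behaves like a filter over student records: match the chosen subgroup, then keep students whose probability falls in the requested range. A minimal sketch with made-up student data (the record fields and names are ours, not the PVAAS file format):

```python
# Illustrative sketch (hypothetical data, not the PVAAS interface): filter
# a student list by subgroup and by a probability range, mimicking the
# "Search for Students" report described above.

students = [
    {"name": "Student A", "econ_disadvantaged": True,  "prob_proficient": 0.85},
    {"name": "Student B", "econ_disadvantaged": True,  "prob_proficient": 0.40},
    {"name": "Student C", "econ_disadvantaged": False, "prob_proficient": 0.95},
]

def search(records, lo, hi, econ_disadvantaged=None):
    """Names of students whose probability of proficiency is in [lo, hi]."""
    hits = []
    for s in records:
        if econ_disadvantaged is not None and s["econ_disadvantaged"] != econ_disadvantaged:
            continue  # subgroup filter, applied only when requested
        if lo <= s["prob_proficient"] <= hi:
            hits.append(s["name"])
    return hits

print(search(students, 0.70, 1.00, econ_disadvantaged=True))  # ['Student A']
print(search(students, 0.00, 0.50))  # the "at risk" list: ['Student B']
```

Setting the range to 0.00–1.00 with no subgroup filter returns every student, matching the "ALL students" search described above.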

Identified Students Meeting Search Requirements
Looking ahead When the parameters of the search have been submitted, a report is generated listing the students who meet the criteria outlined in the student search. In this example, the report lists the economically disadvantaged students whose probability of a Proficient rating on the 6th grade PSSA math exam is between 70% and 100%. All demographic information on the listed students is also provided; these are the demographics from the PSSA file. Note also that each column is sortable: clicking a column name sorts the report by that column's contents, allowing the user to group or cluster students based on demographic information.