Www.GeorgiaEducation.org Data Analysis This presentation is intended to accompany the Georgia School Council GuideBook.


1 www.GeorgiaEducation.org Data Analysis This presentation is intended to accompany the Georgia School Council GuideBook.

2 www.GeorgiaEducation.org Why so many tests? Assessments are a central part of the accountability system required by the federal No Child Left Behind Act and the Georgia A+ Reform Act of 2000. Both laws require annual testing of all students in specific grades and subjects. See p. 2.15 of the Georgia School Council GuideBook for a chart of all required assessments in Georgia.

3 www.GeorgiaEducation.org Purpose of Assessments To identify those students performing below grade level, at grade level and above grade level in order to tailor instruction to individual student needs. To provide teachers with information to guide instruction. To assist schools and school systems in setting priorities.

4 www.GeorgiaEducation.org Two Kinds of Assessments Criterion-referenced tests (CRTs) measure how well the student has learned a particular curriculum. A group of experts decides how many questions must be answered correctly for a student to pass or to receive a score in a particular category (e.g., Pass, Pass Plus). The number of correct answers required is called the cut score. The Criterion-Referenced Competency Tests (CRCTs), end-of-course tests (EOCTs), and graduation tests are criterion-referenced assessments used in Georgia.

5 www.GeorgiaEducation.org Two Kinds of Assessments Norm-referenced tests (NRTs) measure how well students perform compared to all the other students taking the test. Scores are reported in percentiles from 1 to 99. The 50th percentile is the median: half of the students did better; half did worse. The Iowa Test of Basic Skills (ITBS) and the Stanford 9 are the two most commonly used NRTs in Georgia.

6 www.GeorgiaEducation.org Compare and Contrast Georgia’s criterion-referenced tests show how well students have mastered the curriculum adopted by the State Board of Education. Norm-referenced tests, such as the Iowa Test of Basic Skills (ITBS), compare our students’ mastery of general knowledge to other students in the nation.

7 www.GeorgiaEducation.org Discuss Some criticize the requirement for annual assessments saying it leads to teachers “teaching to the test.” Do you agree? Does your answer depend on whether a CRT or NRT is used? If assessments were not used, how would student achievement be measured and monitored?

8 www.GeorgiaEducation.org The National Test: NAEP The National Assessment of Education Progress (NAEP) is sometimes referred to as the nation’s report card. Only certain students in certain grades in certain schools take the test. Results are provided only at the state level. Federal law requires states to participate in NAEP reading and math assessments in grades 4 and 8 every other year.

9 www.GeorgiaEducation.org NAEP Is A Check and Balance Each state adopts its own criterion-referenced test and sets the standards to be met. No Child Left Behind requires NAEP to be given to ensure states do not use easy tests or low standards. Each state uses its own tests and standards to determine whether or not it meets the goals (Adequate Yearly Progress) of NCLB. Comparing the percentage of students who meet the standards on the state test to the percentage of those who meet the NAEP standards can reveal the rigor of the state test.

10 www.GeorgiaEducation.org NAEP vs. CRCT Results NAEP scores are reported as Advanced, Proficient, Basic, and Below Basic. The expectation is that students will be Proficient or Advanced. CRCT scores are Exceeding the Standard, Meeting the Standard, and Did Not Meet the Standard. Because NAEP has four score categories and the CRCT has three, it is difficult to compare them directly. The fairest way is to compare the percent Below Basic on NAEP with the percent Not Meeting the Standard on the CRCT.
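The comparison described on this slide can be sketched in a few lines of Python. The subject names and percentages below are hypothetical placeholders, not the actual 2003 results; substitute the real NAEP and CRCT figures for your state and year.

```python
# Hypothetical percentages for illustration only, not actual 2003 results.
naep_below_basic = {"4th Grade Math": 28, "4th Grade Reading": 40}
crct_did_not_meet = {"4th Grade Math": 26, "4th Grade Reading": 13}

# A small gap suggests similar rigor; a large gap suggests the state test is easier.
gaps = {s: naep_below_basic[s] - crct_did_not_meet[s] for s in naep_below_basic}
for subject, gap in gaps.items():
    print(f"{subject}: NAEP 'Below Basic' exceeds CRCT 'Did Not Meet' by {gap} points")
```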

11 www.GeorgiaEducation.org Comparing the percent of students who scored “Below Basic” on NAEP and “Does Not Meet Standard” on CRCT in 2003

12 www.GeorgiaEducation.org NAEP vs. CRCT Results The closer the results are, the more likely it is that the tests have a similar level of difficulty and a similar cut score. 4th Grade Math has very similar results on the two tests. 8th Grade Math is not quite as close. The largest disparity is in 4th Grade Reading. 8th Grade Reading also has a large difference in the results. It appears that the NAEP Reading tests are more rigorous than the Georgia CRCT in Reading.

13 www.GeorgiaEducation.org Where To Find Georgia Assessment Data Office of Student Achievement: www.gaosa.org Georgia Department of Education: www.gadoe.org Georgia School Council Institute: www.GeorgiaEducation.org Georgia Public Policy Foundation: www.gppf.org

14 www.GeorgiaEducation.org Where To Find National Data Standard & Poor's: www.schoolmatters.org Education Trust: www.edtrust.org NAEP: http://nces.ed.gov/nationsreportcard/

15 www.GeorgiaEducation.org Resources Pp 2.3 – 2.10 in the Georgia School Council GuideBook are a step-by-step guide to analyzing test scores. The GuideBook and this presentation draw on data from the Georgia School Council Institute's website. Test scores are found in the Center for School Performance section at www.GeorgiaEducation.org.

16 www.GeorgiaEducation.org Purpose of Data Analysis Are all students learning what we expect them to know? Which students are not succeeding? How do we improve the achievement of all students? "That which is not measured cannot be improved."

17 www.GeorgiaEducation.org Three Levels of Test Data There are three levels of test data available to the public: State Level System Level School Level Individual schools have class level and student level results.

18 www.GeorgiaEducation.org Analyzing Test Data Begin with state level data.

19 www.GeorgiaEducation.org Begin with State Level Data Understanding the state statistics helps put your school and system data into perspective. Learning the terminology helps you identify what is relevant. First, look at the Profile Report to see the demographics of the state and the changes that are occurring.

20 www.GeorgiaEducation.org State Level Profile Report

21 www.GeorgiaEducation.org When you are provided data using percentages, always be clear as to whether you are looking at a change in percent or a change in percentage points.

22 www.GeorgiaEducation.org Understanding the Trends When changes in percentages are listed, it is a change in percentage points, not the percent change itself. There was a 3 percentage point increase in the share of Hispanic students (4% in 2000 and 7% in 2004). 48,366 more Hispanic students is an 87 percent increase from 2000 to 2004. 7% of the students in 2004: 1,486,125 × .07 = 104,029. 4% of the students in 2000: 1,391,579 × .04 = 55,663. Percent increase: 48,366 / 55,663 = 87%
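The percentage-point versus percent-change distinction can be checked with a few lines of arithmetic, using the enrollment totals given on the slide:

```python
# Enrollment totals from the slide; the Hispanic share grew from 4% to 7%.
total_2000, total_2004 = 1_391_579, 1_486_125
hispanic_2000 = total_2000 * 0.04   # roughly 55,663 students
hispanic_2004 = total_2004 * 0.07   # roughly 104,029 students

point_change = 7 - 4                # change in percentage points
percent_change = (hispanic_2004 - hispanic_2000) / hispanic_2000 * 100
print(f"{point_change} percentage points, a {percent_change:.0f}% increase in students")
```

The same raw numbers yield a small-looking point change and a large percent change, which is why the distinction matters when reading reports.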

23 www.GeorgiaEducation.org Understanding the Trends Percentage point changes tell only part of the story. What are the demographics of the students moving into the schools? If Georgia gained 94,546 students in five years and 48,366 were Hispanic, then 51% (48,366 / 94,546) of the new students were Hispanic. It is as important to know what the population trends are in your school as it is to know the demographic percentages.
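The share of enrollment growth attributable to Hispanic students, as computed on this slide, works out as follows:

```python
# Enrollment growth from 2000 to 2004, per the slide.
total_growth = 1_486_125 - 1_391_579   # 94,546 new students over five years
hispanic_growth = 48_366
hispanic_share = hispanic_growth / total_growth * 100
print(f"{hispanic_share:.0f}% of the new students were Hispanic")
```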

24 www.GeorgiaEducation.org Discuss The number of Hispanic students has grown tremendously, but the percent of students in the Limited English Proficient (LEP) program has remained steady. What conclusions can you draw? What impact on student achievement could changing demographics have?

25 www.GeorgiaEducation.org Pop Quiz 43% of all students (1,391,579) were eligible for free or reduced lunch (FRL) in 2000. In 2004, 46% of the students (1,486,125) were eligible for FRL. How many more students are eligible in 2004? What is the percent increase in those eligible?

26 www.GeorgiaEducation.org Answers In 2000: 598,379 students were eligible for FRL. (1,391,579 × .43 = 598,379) In 2004: 683,618 were eligible. (1,486,125 × .46 = 683,618) 85,239 more students were eligible for FRL. (683,618 - 598,379 = 85,239) That is a 14% increase in the number of students eligible. (85,239 / 598,379) × 100 = 14%
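The pop quiz arithmetic, worked out in code from the totals and eligibility rates on the slides:

```python
# Free/reduced lunch eligibility, computed from the slide's totals and rates.
frl_2000 = round(1_391_579 * 43 / 100)   # eligible students in 2000
frl_2004 = round(1_486_125 * 46 / 100)   # eligible students in 2004
increase = frl_2004 - frl_2000
percent_increase = increase / frl_2000 * 100
print(f"{increase:,} more students eligible, a {percent_increase:.0f}% increase")
```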

27 www.GeorgiaEducation.org Analyzing Test Data Next, look at test scores for all the students at the state level.

28 www.GeorgiaEducation.org State Level Test Scores Report

29 www.GeorgiaEducation.org What do the numbers say? The percent Exceeding the Standard should be going up, and the percent Not Meeting the Standard should be going down. If the percent Meeting the Standard drops, which category is increasing? Are more students moving to a higher level or a lower level? Are there changes in the number of students tested? Has improvement been greater in one subject? In one particular grade? Is there a reason?

30 www.GeorgiaEducation.org Keep in Mind Trend information is more important than comparing one year to another. The same group of students is not being compared. One year's results alone do not indicate a trend.

31 www.GeorgiaEducation.org Look for Achievement Gaps After looking at scores for all students, look at the scores for subgroups of students. This is called disaggregating the data. Federal and state law require the disaggregation and reporting of scores by ethnicity, gender, socioeconomic status, disability and migrant status.

32 www.GeorgiaEducation.org Look for Achievement Gaps On the Test Scores Report, click on a subject under “Achievement Gap Analysis” to see the current year’s scores of each subgroup in graph form.

33 www.GeorgiaEducation.org Look for Achievement Gaps On the Test Scores Report, click on the box labeled “View Scores by Group” and select a subgroup. You will see the disaggregated data by year. This allows you to see the trends in the scores of the subgroup.

34 www.GeorgiaEducation.org Analyzing Achievement Gaps Compare the scores of the different groups of students. Differences in scores reveal the achievement gaps. Which group has the highest proportion of students performing below grade level? Are some groups doing better than others? Are the differences the same in every subject? In every grade?

35 www.GeorgiaEducation.org Exercise Jones Elementary is excited because the percent of fourth grade students Exceeding the Standard in reading has increased by 16 percentage points in the last four years. Some members of the school council are concerned because the percent of students who Did Not Meet the standard in reading did not decrease this year. Is this grounds for concern? What should the school council look at to answer this question? Is additional data needed?

36 www.GeorgiaEducation.org What did you decide? Results over time (4 years) are more significant than one year's results. Look at the number of students tested overall and in each subgroup. If more students were tested, it is not necessarily grounds for concern that there was no change in the percent not meeting the standard. If some subgroups improved, that is a positive change. If some subgroups lost ground, this may be something for the school council to keep in mind as they look at future data. Think of it as a caution light rather than a red flag.

37 www.GeorgiaEducation.org Analyzing test scores is more than just comparing numbers. It is comparing numbers in a way that puts them in perspective and gives them meaning.

38 www.GeorgiaEducation.org System and School Analysis Do the same kind of analysis for the school system and your school. Start with the Profile Report. Look for changes in the student population. Look at Test Score Reports for all students. In each subject area, check the change in the percent Exceeding and the percent Not Meeting the standard. Has the number of students being tested changed? Compare scores to the system and state. Check for achievement gaps.

39 www.GeorgiaEducation.org Comparing Schools or Systems The unique part of GeorgiaEducation.org is the ability to compare test scores of schools and systems that are demographically similar. The “Similar Systems” and “Similar Schools” Reports will give you this information.

40 www.GeorgiaEducation.org Comparing Schools You will first see a listing of your school, the state, and schools with similar demographics.

41 www.GeorgiaEducation.org Comparing Test Scores Click “Test Scores Comparisons” to see the test scores of all the schools.

42 www.GeorgiaEducation.org Comparing Test Scores A disaggregation box is available. The list can be sorted by clicking on “M/E” (Meets and Exceeds) to see the schools ranked by the highest percentage of students meeting and exceeding the standard. Clicking on “DNM” (Did Not Meet) puts the lowest achieving at the top. Each subject can be sorted by achievement.

43 www.GeorgiaEducation.org Comparing Test Scores

44 www.GeorgiaEducation.org Comparing in Graph Form If you prefer to look at a graph, click on a subject area below “Target schools by subject.”

45 www.GeorgiaEducation.org Comparing in Graph Form The graph will include your school and the five most similar schools.

46 www.GeorgiaEducation.org The SAT Debate

47 www.GeorgiaEducation.org SAT and ACT Scores: What Do They Tell Us? Both tests are used for college admissions, but they test different skills. The SAT is more of a critical thinking and problem-solving test designed to measure a student's potential to learn. The ACT is a more curriculum-based test designed to measure what a student has learned.

48 www.GeorgiaEducation.org SAT Scores The SAT was designed to predict how well any given student would perform in his or her freshman year of college. Because the SAT is taken by students in all 50 states, SAT scores are used by the media to rank the quality of public education in the 50 states. Within the state, SAT scores are used to rank high schools. Georgia’s low ranking is often attributed to the high percent of students taking the test. Is that a valid argument? What happens if only states with similar demographics and similar participation rates are compared?

49 www.GeorgiaEducation.org Comparison of 2004 Georgia SAT Scores to States with Similar Participation and Demographics

State  Participation Rate  Average Score  Rank  % Asian  % Black  % Hispanic  % White  % Other
MD     68%                 1026           34    7%       27%      5%          58%      3%
VA     71%                 1024           35    7%       19%      5%          66%      4%
NJ     83%                 1015           38    9%       13%      10%         63%      4%
NY     87%                 1007           39    8%       13%      12%         62%      6%
NC     70%                 1006           41    3%       23%      3%          68%      3%
DE     73%                 999            46    4%       19%      4%          70%      3%
FL     67%                 998            47    4%       15%      18%         57%      5%
GA     73%                 987            49    5%       28%      3%          61%      3%
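One quick way to read the comparison is to place Georgia's average against the mean of the whole group of demographically similar states. The scores below are taken from the table on this slide:

```python
# Average SAT scores from the slide's table; Georgia compared to the group mean.
scores = {"MD": 1026, "VA": 1024, "NJ": 1015, "NY": 1007,
          "NC": 1006, "DE": 999, "FL": 998, "GA": 987}
group_avg = sum(scores.values()) / len(scores)
ga_gap = group_avg - scores["GA"]
print(f"Georgia is {ga_gap:.0f} points below the group average of {group_avg:.0f}")
```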

50 www.GeorgiaEducation.org Are our students prepared for the SAT? If the purpose of the SAT is to determine college readiness, students taking it should be on the college prep track. Is the number of students who receive a college prep diploma similar to the number who take the SAT? If the percent of students eligible for a HOPE scholarship is used to estimate grade point average, what does that indicate about the preparedness of Georgia’s students?

51 www.GeorgiaEducation.org Students appear to be well-prepared to take the SAT.

52 www.GeorgiaEducation.org SAT Conclusions Without a student information system, the state cannot know for sure that the students receiving college prep diplomas are the ones taking the SAT, but the close correlation of the numbers suggests it. It is also not clear that only those eligible for HOPE are taking the SAT. Taken together, though, it appears that the students taking the SAT should be considered well-prepared.

53 www.GeorgiaEducation.org How does Georgia compare on the ACT? 38th 41st 45th 47th

54 www.GeorgiaEducation.org Graduation Rates The goal of the K-12 system is to graduate students prepared for postsecondary work, whether that is technical school, college, or employment. Graduation rates and the credentials awarded tell how well that goal was accomplished. The credentials awarded may be a diploma or a certificate of attendance. Students who earn the required number of Carnegie units but do not pass the graduation test receive a certificate of attendance.

55 www.GeorgiaEducation.org Graduation Rates There are different ways to determine a graduation rate. Without a student information system the state cannot track individual students. Looking at the number of 9 th graders and the number of graduates four years later gives an approximation of the graduation rate. Independent analyses which use a statistical variation of this cohort method yield a graduation rate within 3 percentage points of this simple method.
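The simple cohort approximation described above divides the number of completers by the 9th grade enrollment four years earlier. The enrollment counts below are hypothetical, chosen only to illustrate the calculation, not actual Georgia figures:

```python
# A minimal sketch of the simple cohort approximation; the enrollment
# counts here are hypothetical, not actual Georgia figures.
ninth_graders_fall_2000 = 120_000
completers_spring_2004 = 78_000
approx_grad_rate = completers_spring_2004 / ninth_graders_fall_2000 * 100
print(f"Approximate graduation rate: {approx_grad_rate:.0f}%")
```

As the slide notes, more sophisticated cohort methods typically land within a few percentage points of this simple ratio.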

56 www.GeorgiaEducation.org 9 th Grade Enrollment vs. Completion 4 Years Later

57 www.GeorgiaEducation.org Credentials Awarded in 2004
College Preparatory Diploma: 46.3%
Both College Prep and Technology: 18.6%
Technology/Career Diploma: 23.7%
Certificate of Attendance: 6.9%
Special Education Diploma: 4.5%

58 www.GeorgiaEducation.org Data Analysis Summary Don't analyze data in a vacuum; context is critical. Analyzing the data should lead to additional questions. Consider what other information might be needed to explain it more fully. Effective data analysis can guide improvement in student learning, classroom instruction, and the school environment. Always look for the meaning behind the data. Don't take it at face value. Watch for "spinning" of the data.

59 www.GeorgiaEducation.org Summary Data analysis is just the beginning of the improvement process. It is how we interpret and use the data that can make a difference.

60 www.GeorgiaEducation.org Data Analysis This presentation is intended to accompany the Georgia School Council GuideBook.

