EXAMINING OPTIONS FOR HIGH SCHOOL NCLB TESTING Deanna L. Morgan, Moderator The College Board.


1 EXAMINING OPTIONS FOR HIGH SCHOOL NCLB TESTING Deanna L. Morgan, Moderator The College Board

2 This Session
 Representatives from four states will discuss the model they use for high school NCLB testing and the advantages and disadvantages of each:
 Michael Hock, Vermont Department of Education
 Carol Crothers, Nevada Department of Education
 John Jesse, Utah Department of Education
 Dan Hupp, Maine Department of Education
 Tim Crockett from Measured Progress, the contractor for each state, will compare the four models and provide some information for consideration.
 Time for questions will be available at the end of the session.

3 STRENGTHS AND CHALLENGES OF A GENERAL ACHIEVEMENT SURVEY BATTERY FOR ASSESSING HIGH SCHOOL STUDENTS
New England Common Assessment Program: Grade 11 Tests of Reading, Writing & Mathematics
Michael Hock, Vermont Department of Education, NCSA 2010

4 About NECAP
 Fall administration to Grade 11 students in New Hampshire, Rhode Island and Vermont
 Total Grade 11 students assessed: 33,290
 Reading, Writing and Mathematics
 Multiple choice, short answer, constructed response and extended response (writing)
 Fully aligned with Grade Expectations for end of grade 10

5 Grade 11 Reading Test Design
Two sessions, 60–90 minutes each, containing:
 3 long passages, each with 4 MC + 1 CR
 3 short passages, each with 4 MC + 1 CR
 6 stand-alone vocabulary MC
Passages are literary and informational. MC = multiple choice; CR = constructed response.

6 Distribution of Emphasis for Reading

Reading Strands                                 | 3rd  | 4th  | 5th  | 6th  | 7th  | 8th  | 11th
Word Identification                             | 20%  | 15%  | --   | --   | --   | --   | --
Vocabulary Strategies & Breadth                 | 20%  | 20%  | 20%  | 20%  | 20%  | 20%  | 20%
Subtotals                                       | 40%  | 35%  | 20%  | 20%  | 20%  | 20%  | 20%
Initial Understanding of Literary Text          | 20%  | 20%  | 20%  | 20%  | 20%  | 20%  | 15%
Initial Understanding of Informational Text     | 20%  | 20%  | 20%  | 20%  | 20%  | 20%  | 20%
Subtotals                                       | 40%  | 40%  | 40%  | 40%  | 40%  | 40%  | 35%
Analysis & Interpretation of Literary Text      | 10%  | 15%  | 20%  | 20%  | 20%  | 20%  | 25%
Analysis & Interpretation of Informational Text | 10%  | 10%  | 20%  | 20%  | 20%  | 20%  | 20%
Subtotals                                       | 20%  | 25%  | 40%  | 40%  | 40%  | 40%  | 45%
Totals                                          | 100% | 100% | 100% | 100% | 100% | 100% | 100%

7 Grade 11 Mathematics Test Design
No Calculator or Math Tools:
 16 multiple choice (one point each)
 8 short answer (one point each)
 5 short answer (two points each)
Calculator and Math Tools Permitted:
 16 multiple choice (one point each)
 8 short answer (one point each)
 4 short answer (two points each)
 3 constructed response (four points each)

8 Distribution of Emphasis for Mathematics

Mathematics Strands              | 3rd  | 4th  | 5th  | 6th  | 7th  | 8th  | 11th
Number and Operations            | 55%  | 50%  | 50%  | 45%  | 30%  | 20%  | 20%
Geometry and Measurement         | 15%  | 20%  | 20%  | 25%  | 25%  | 25%  | 25%
Algebra and Functions            | 15%  | 15%  | 15%  | 15%  | 30%  | 40%  | 40%
Data, Statistics and Probability | 15%  | 15%  | 15%  | 15%  | 15%  | 15%  | 15%
Totals                           | 100% | 100% | 100% | 100% | 100% | 100% | 100%

9 Grade 11 Writing Test Design
Two sessions, 60–90 minutes:
 Session 1, common prompt: response to literary text, response to informational text, report, procedure, persuasive writing, OR reflective essay
 Session 2, matrixed prompt: response to literary text, response to informational text, report, procedure, persuasive writing, OR reflective essay

10 Distribution of Emphasis for Writing

Writing Clusters                                      | 5th                              | 8th                                  | 11th
Structures of Language                                | Less emphasis                    | Less emphasis                        | Less emphasis
Response to Literary or Informational Text            | Greater emphasis                 | Greater emphasis                     | Greater emphasis
Expressive Writing: Narrative, Reflective Essay       | Greater emphasis (narrative only) | Less emphasis (narrative only)      | Greater emphasis (reflective essay only)
Informational Writing: Report, Procedure, Persuasive  | Greater emphasis (report only)   | Greater emphasis (report & persuasive) | Greater emphasis (report, procedure, persuasive)
Conventions                                           | Less emphasis                    | Less emphasis                        | Less emphasis

11 Model Strengths
 Directly aligned with standards and grade expectations
 Linked directly to expectations for lower grades; part of a coherent learning progression
 Relevant and appropriate distribution of emphasis
 High standards for technical adequacy
 An engine for standards-based reform
 Convenient and efficient for contracting

12 Model Challenges
 Difficulty selecting the appropriate grade for administration: end of grade or end of sequence?
 Significant intervals between learning and assessment: what are reasonable standards for retention of skills and concepts?
 Difficulty identifying core expectations: what do ALL students need to know?
 Student engagement: how can we make the test relevant for students?

13 Dear the State of Vermont Michael Hock CCSSO 2010 E-Mail from an 11 th Grade CTE Student: I am writing this letter to apologize to you for what I did to my NECAP test. I realize now that what I wrote in it was wrong. I understand that I should have taken the test a lot more seriously because the Center for Technology relies on our outcomes. If we do well than they benefit from it. Tech is really an awesome program and I took advantage of it. Most kids in high school would not get the chance to do something like this. Tech prepares you for the future and gives you great opportunities to work in whatever field you choose. If I had written and drew those kinds of pictures in the actual working world it would be completely unethical and I could potentially have gotten fired. I don’t want to throw excuses at you because what’s done is done but I did not mean anything by what I wrote. It was lyrics from some songs that had been stuck in my head. So I apologize again and hope you accept it.

14 Dear the State of Vermont My response: Thank you for your e-mail. I am sorry that you didn’t take the test more seriously. Just a guess, but your note shows a lot of intelligence and maturity, so I suspect that your “real” test scores would have been a valuable addition to the Center for Technology’s results. However, as you wrote, what’s done is done, and it seems like you learned something from the experience, so that’s a plus. I’m particularly glad that you now see that the test is important because it can help a good school like yours get even better. So, in that way, taking the test seriously can be a legacy you leave for the next group of students to come along, or your younger brothers and sisters, or even the children you might have some day. We have a particularly hard time getting high school students to see that the test is important. I wonder if you have any ideas about how we might get that message across to next year’s students? Oh, by the way, your apology is accepted.

15 Dear the State of Vermont (The student’s suggestions) I know for a lot of students this test can be difficult because we don't take regular curriculum classes. We learn more about the jobs we are trying to reach out of tech. I can't speak for every student in tech but from what i know and understand that is why students don't take the tests as seriously as we should. I don't know if this is a possibility but maybe students could take a different kind of test rather than the NECAP. This year when students were preparing to take the tests, teachers tried to explain how important it was and how much it could benefit tech. I heard a lot of students talking about how they'd take it more seriously if they got some kind of reward.

16 EXAMINING OPTIONS FOR HIGH SCHOOL NCLB TESTING The Nevada High School Proficiency Examination (HSPE) Meets No Child Left Behind Presented by Carol J. Crothers Nevada Department of Education June 20, 2010 National Student Assessment Conference Detroit, MI

17 Nevada HSPE  Passed by the Nevada Legislature in 1977  Became a requirement for graduation in 1990  Writing (Performance Assessment)  Reading (Multiple Choice)  Mathematics (Multiple Choice)  Science* (Multiple Choice) * Required for students beginning with class of 2010

18 Current Testing Opportunities  Reading, Math, & Science  Grade 10 (Spring)  Grade 11 (Fall, Spring)  Grade 12 (Fall, Spring, May, July)  Writing  Grade 11 (Fall, Spring)  Grade 12 (Fall, Spring, May, July)

19 Application to AYP
 For purposes of AYP calculations, students are allowed testing opportunities through spring of Grade 11
 11th-grade enrollment file is pulled as of mid-week during spring testing
 Student records are matched against past and present testing history

20 Business Rules for AYP
Proficiency:
 Math
 Writing & Reading combined for ELA calculation
Participation:
 Passed in any test administration, or
 Participated in most recent test administration (Spring, 11th grade)
 Writing & Reading combined for ELA calculation
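The business rules above can be sketched in a few lines of code. This is an illustrative sketch only: the record layout, the "Spring-G11" label, and the rule that ELA proficiency requires passing both reading and writing are assumptions, since the slide says only that writing and reading are combined for the ELA calculation.

```python
from dataclasses import dataclass


@dataclass
class TestRecord:
    """One HSPE attempt (hypothetical record layout)."""
    administration: str  # e.g. "Fall-G11", "Spring-G11"
    subject: str         # "math", "reading", or "writing"
    passed: bool


# The most recent administration counted for AYP (spring of grade 11).
MOST_RECENT = "Spring-G11"


def is_proficient(records: list[TestRecord], subject: str) -> bool:
    """Proficient if the student passed the subject in any administration
    through spring of grade 11."""
    return any(r.passed for r in records if r.subject == subject)


def participated(records: list[TestRecord], subject: str) -> bool:
    """Counts as participating if the student passed in any administration,
    or sat for the most recent administration."""
    return is_proficient(records, subject) or any(
        r.administration == MOST_RECENT for r in records if r.subject == subject
    )


def ela_proficient(records: list[TestRecord]) -> bool:
    """Writing and reading combine into one ELA result; this sketch assumes
    a student must pass both (an assumption, not Nevada's stated rule)."""
    return is_proficient(records, "reading") and is_proficient(records, "writing")
```

A student who failed in the fall but passed in the spring would count as both proficient and participating; one who only sat the spring administration without passing would count for participation but not proficiency.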

21 Challenges
 Not all students are required to pass the HSPE for graduation (Adjusted Diplomas are issued to students with disabilities who meet the requirements established in their IEPs)
 Some statutory or regulatory changes to graduation requirements affecting the HSPE are not compatible with AYP

22 Advantages
 High stakes for students result in strong motivation for testing
 No extra cost to create stand-alone tests for accountability purposes only

23 Contact Information Carol J. Crothers Director of the Office of Assessment, Program Accountability & Curriculum Nevada Department of Education 700 East Fifth Street Carson City, NV 89701 775-687-9180 ccrothers@doe.nv.gov

24 HIGH SCHOOL AYP – UTAH STYLE John Jesse Director of Assessment and Accountability

25 2009 AYP Workbook
 Language: 10th-grade Language Criterion-Referenced Test (end of course)
 Math: Algebra 1, Geometry (end of course)
 High School Graduation Exam (non-AYP): 10th grade

26 Positives:
 Focused curriculum/clear targets
 Specific teacher responsibility
 Course availability for remediation
Issues:
 Math samples the lowest-achieving segment
 Math AMO 40%; Language Arts AMO 82%
 Policy makers created an additional assessment

27 2010 AYP Workbook
 Algebra, 10th grade (score banking and retakes)
 Resolutions: all students sampled; sets a high school math standard
 Issues: students moving from out of state; students on an adjusted math curriculum schedule

28 MAINE’S SAT INITIATIVE USING A COLLEGE ADMISSIONS TEST AS A STATE’S HIGH SCHOOL NCLB ACCOUNTABILITY MEASURE DAN HUPP MAINE DEPARTMENT OF EDUCATION

29

30 A Brief History  After administering the Maine Educational Assessment (MEA) since 1985, the Maine Department of Education changed its required high school assessment to the SAT in the spring of 2006.

31 The Maine SAT Initiative has been made possible by a working collaborative consisting of dedicated members from the College Board, Measured Progress and the Maine Department of Education.

32 Why the Change?  Students lacked engagement and investment in the MEA  The results could not be used for grades  The results were not used by colleges

33 Why the Change? (continued) ….additionally,  The MEA results had been flat for the previous five years  MEA required much in-school testing time  MEA was developed specifically for Maine - no opportunity to share expenses or expertise

34 Why the Change? (continued)  The MEA was the school’s sole NCLB accountability measure yet many students did not put forth their best efforts.  Without maximum student effort, the resulting MEA scores were not a valid or accurate measure of actual student learning.  Therefore, decisions made from the analysis of the MEA data were debatable.

35 So Why Was the SAT Chosen? Because:  About 2/3 of Maine’s graduating classes were already taking the SAT at their own expense  It has relevance and meaning to students, parents and the educational community  It is widely recognized and accepted by academic institutions around the world

36 So Why Was the SAT Chosen? (continued)  There are multiple levels of student support: readiness and preparation  It fits into the Department’s vision of graduating all students college*, career and citizenship ready. * any post-secondary institution

37 Is there any hard data to support the claim of increased student engagement associated with the implementation of the SAT Initiative?

38 Yes:
 The percentage of students who took the SAT prior to the state initiative
 The number of home-schoolers now requesting the high school test
 The student questionnaire data supplied on the next slide

39 How important to you is your score on the Math-A and Science test you just completed? (SAT responses shown in parentheses)
 A. extremely important: 11% (52%)
 B. important: 37% (29%)
 C. somewhat important: 29% (9%)
 D. not very important: 17% (5%)
Results from the 2009 student questionnaire.

40 Were There Concerns About Adopting the SAT? –Yes. Two Basic Categories of Concerns:  The SAT was not the right test  The logistics of administering the SAT to all students would be impossible to implement

41 Concerns Each concern was:  taken seriously  examined thoroughly  addressed as completely as possible

42 Not the Right Test  It’s an “Aptitude Test” and does not measure academic content. -“Aptitude” was dropped in 1994; colleges use SAT results for placement decisions; alignment studies confirm the match.  It is extremely “coachable”; students from affluent families would be advantaged. -Any test with a stable blueprint is somewhat “coachable”; on-line prep for all.

43 Not the Right Test  It is not designed for all students - Recent studies show college and career skill set to be similar.  The USDOE’s NCLB review would not approve the test -Maine’s assessment system was approved on April 24, 2008.

44 Impossible Logistics  Students won’t come to school on a Saturday to take a test. -Commissioner declared the day a legal school day; state has achieved at least 95% participation rate each year.  Some students will have to travel many miles to a test center. -Every Maine high school becomes an approved SAT test center for the May administration.

45 Impossible Logistics  Transportation and operational expenses are an unfair burden on local schools. - All transportation costs incurred by schools are covered by the state.  The “other” students will disrupt the test. - To the amazement of some and the delight of others, no such incidents have occurred; on the contrary, those “other” students have stepped up to the challenge.

46 2010 MHSA Administration Dates
 The MHSA SAT administration date for the 2009-2010 school year is Saturday, May 1, 2010.
 The MHSA Science Tests must be administered during a 2-week window which begins Monday, March 29th and closes Friday, April 9th, 2010.

47 2010 MHSA/SAT Make-Up Dates
 Saturday, June 2nd for students wanting to receive traditional SAT college-reportable scores (taken at a nearby test center).
 Monday, May 3rd – Wednesday, May 12th for students wanting “Maine Purposes Only” scores (taken at the local high school during the school day).

48 Equity in Preparation Leveling the playing field for all students

49 Equity in Preparation  *#1 (by far) is quality daily instruction*  SAT “Question of the Day”  The Official SAT Online Course -WebEx Training for Maine Students and Educators -Regional Professional Development for Math and ELA

50 SAT: Student Readiness / Preparation
 As part of a multi-year agreement with the College Board, the Maine Department of Education is pleased to announce that effective immediately, all students enrolled in Maine public high schools (grades 9-12) have 24-hour, year-round access to The Official SAT Online Course. This opportunity also extends to all high school faculty and administrators.
 For technical assistance regarding The Official SAT Online Course, call 1-800-416-5137.
 SAT Online Course case studies: October edition of The Official SAT Online Course Educator Newsletter
 The WebEx training for Maine educators on SAT Online Course use is available at http://www.collegeboard.com/mainetraining

51 Equity in Preparation  The Official SAT Study Guide –student and teacher editions  PrepMe.com  Google “free SAT preparation material absolutely free”

52 SAT Data Release  By combining the Measured Progress Data Analysis Tool and the College Board released test form, schools are able to view how every student answered every question on the May SAT administration.

53

54

55

56 Challenges and Next Steps  To make all students, parents and educators aware of the resources that currently exist (on-going)  To provide an SAT item-level report to all students and schools (previous slide)  To provide professional development using that SAT data in combination with corresponding PSAT data (continue and improve)

57 Challenges and Next Steps
 To further simplify the MHSA student registration process
 To make fully transparent and understandable all aspects of the MHSA program
 To create a state-wide “best practices” user group

58 Maine’s SAT Initiative
All Maine High School Assessment (MHSA) information can be found on the Department’s web site at: http://www.maine.gov/education/sat_initiative/index.htm
Contact me directly at: dan.hupp@maine.gov

59 COMPARISONS AND CONSIDERATIONS Tim Crockett, Discussant Measured Progress

60 Model Comparison
Models compared: Survey Battery, Graduation Test, End of Course, College Placement
 High Stakes for Students (Motivation): X X X
 High Stakes for Schools: X X X X
 Re-tests/Make-ups Required: X X X
 Targeted Content (Subset of High School Coursework): X
 Specific Teacher Responsibility: X
 Constructed as Standards-Based (as opposed to NRT): X X X
 Very Rapid Turnaround (Potentially Leads to all MC): X X X

61 Other Considerations:
Survey Battery
 What grade to test?
 What is core for all students?
Graduation Test
 Significant file matching for past performances
 Are achievement standards as high as grades 3-8?
End of Course
 Varying student course-taking schedules
 Student mobility
College Placement
 Non-college-reportable administration required to allow for the full range of accommodations
 All items released and reported out

