
1 Setting Accuplacer Cut Scores for WritePlacer Plus and College Math
Becky J. Mussat-Whitlow, M.A., Ph.D., Director of Institutional Assessment, Winston-Salem State University
Robert Ussery, M.S., Assistant Vice Chancellor for Academic Affairs, North Carolina A&T State University
18th Annual Accuplacer National Conference, Fort Lauderdale, Florida, June 26, 2008

2 Introduction
Historical context
 Local bubble sheet test for math placement
 ETS’s online Criterion for English placement
Executive decision to use Accuplacer
 For math and English placement
 Single vendor

3 Project Timeline
November, 2007
 Assign project manager
 Complete project plan
December, 2007 – Secure funding
January, 2008
 Provost appoints Placement Committee
 Recruit cut score study consultant
February, 2008
 Appoint cut score study panels and chairs
 Conduct Round 1 of student testing
March, 2008
 Conduct cut score study during Spring Break (March 3–7)
 Round 2 of student testing
 Cut score recommendations to Placement Committee

4 Creating Defensible Cut Scores
The standard setting process should pay careful attention to:
1. Selection of panelists
2. Training
3. Aggregation of data into a final set of standards
4. Validation of performance standards
5. Careful documentation of the process
Reference: Hansche, L.N. (1998). Handbook for the development of performance standards. Bethesda, MD: US Department of Education, Council of Chief State School Officers.

5 Critical Question
At what point along the scale should the passing mark be set?

6 Classification Error
Cut scores split a continuous distribution of knowledge, skills, and abilities into separate regions. The panel must decide which classification error it prefers:
 Do you prefer to pass students who deserved to fail? OR
 Do you prefer to fail students who deserved to pass?
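The trade-off can be made concrete. Here is a minimal Python sketch, using entirely hypothetical scores and group labels (not data from this study), that compares the two error rates for a few candidate cut scores:

```python
import numpy as np

# Hypothetical placement scores for two groups whose "true" status is known
# from some external criterion (e.g., later course grades).
rng = np.random.default_rng(0)
needs_remediation = rng.normal(loc=35, scale=8, size=200)   # deserved to "fail"
ready_for_course  = rng.normal(loc=55, scale=8, size=200)   # deserved to "pass"

def classification_errors(cut_score):
    """Return (false-pass rate, false-fail rate) for a candidate cut score."""
    false_pass = np.mean(needs_remediation >= cut_score)  # passed, deserved to fail
    false_fail = np.mean(ready_for_course < cut_score)    # failed, deserved to pass
    return false_pass, false_fail

for cut in (40, 45, 50):
    fp, ff = classification_errors(cut)
    print(f"cut={cut}: false-pass rate={fp:.2f}, false-fail rate={ff:.2f}")
```

Raising the cut score lowers the false-pass rate at the cost of a higher false-fail rate; the panel's stated preference determines which way to shade the final number.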

7 History of Standard Setting
2000 BC – Chinese military selection
1800s – Integration of psychology and statistics
1970s – Mandated pupil proficiency testing
1978 – Journal of Educational Measurement
1982 – ETS Passing Scores publication

8 “People have been setting cut scores for thousands of years, but it is only since the middle of the 20th century that measurement professionals began to pay much attention.”
~ From: Cizek, G.J. (2001). Setting Performance Standards. Hillsdale, NJ: Lawrence Erlbaum Associates. (p. 20)

9 Overview of Standard Setting
All standard setting methods involve judgment
Performance standards (cut scores) may be set too high or too low
A reasonable process is needed to arrive at a decision

10 Two Categories of Setting Performance Standards
Test-based methods – methods in which panelists focus on a review of test content
Student-based methods – methods that focus on students

11 Guiding Principles to Ensure Fairness in Establishing Cut Scores
Those who set the standards should be thoroughly knowledgeable about the content domain that is to be assessed, the population of examinees who are to take the assessment, and the uses to which the results are to be put. (p. 316)
Proficiency classifications should have the same meaning for all sub-groups. (p. 317)
~ From: Bond, L. (1995). Ensuring fairness in the setting of performance standards. In Proceedings of the Joint Conference on Standard Setting for Large-Scale Assessments. Washington, DC: National Assessment Governing Board and National Center for Education Statistics.

12 Guiding Principles to Ensure Fairness in Establishing Cut Scores
If the assessment is to be used as a screen for future educational opportunities, the content of the assessment and the level of proficiency required should be demonstrably related to future success. (p. 317)
Attention must be paid to the consequences of particular uses of an assessment. (p. 318)
~ From: Bond, L. (1995). Ensuring fairness in the setting of performance standards. In Proceedings of the Joint Conference on Standard Setting for Large-Scale Assessments. Washington, DC: National Assessment Governing Board and National Center for Education Statistics.

13 Guiding Principles to Ensure Equity in the Process of Setting Cut Scores
Adequate notice of proposed actions
Ample provision of opportunities for participation
Adequate records of all discussions and decisions by the participants
Timely distribution of minutes and ballot results
Careful attention to minority opinions
~ From: Collins, B.L. (1995). The consensus process in standards development. In Proceedings of the Joint Conference on Standard Setting for Large-Scale Assessments. Washington, DC: National Assessment Governing Board and National Center for Education Statistics.

14 Common Problems
Use of ambiguous descriptions of performance standards
Failure to involve key stakeholders in the standard setting process
Failure to devote sufficient time to establish cut scores
Failure to document the process
Failure to validate the process

15 Questions to Consider
How many cut scores are needed?
Two (one to differentiate between remedial and regular placement, and one to differentiate between regular and advanced)
VERSUS
One (to differentiate between remedial and regular course placement)

16 Desirable Characteristics of Cut Scores
Should be understandable and useful for all stakeholders
Clearly differentiate among levels
Grounded in student work
Built by consensus
Focus on learning

17 Contrasting Group Method
A group of panelists qualified to assess the content domain and the students being assessed is asked to classify students into two groups (masters vs. non-masters).
Panelists first make judgments about the grouping of students; the performance of the examinees is then determined empirically.

18 Contrasting Group Method
Two distributions are created to represent the students’ actual (obtained) scores on the assessment separately: one for students judged by the standard setters to have acceptable skills, and another for students whose performance was judged unacceptable.
The point at which the two distributions intersect may be chosen as the cut score location.

19 Hypothetical Illustration of Contrasting Groups Distributions
[Figure: frequency (ƒ) of the non-masters distribution and the masters distribution plotted along the score scale, with the candidate cut score Cχ marked]
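A rough computational analogue of the illustration above, assuming hypothetical scores and approximately normal distributions for each judged group: fit a density to each group and scan the score scale for the point where the two densities cross, one conventional choice for the cut score location Cχ.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical obtained scores for students the panel judged to be
# non-masters and masters of the content domain.
rng = np.random.default_rng(1)
nonmasters = rng.normal(loc=38, scale=9, size=150)
masters    = rng.normal(loc=60, scale=9, size=150)

# Fit a normal density to each group's obtained scores.
mu_n, sd_n = nonmasters.mean(), nonmasters.std(ddof=1)
mu_m, sd_m = masters.mean(), masters.std(ddof=1)

# Scan the region between the two group means for the score where the
# densities intersect; that crossing point is a candidate cut score.
grid = np.linspace(mu_n, mu_m, 2001)
gap = np.abs(norm.pdf(grid, mu_n, sd_n) - norm.pdf(grid, mu_m, sd_m))
cut = grid[np.argmin(gap)]
print(f"Estimated cut score at the density intersection: {cut:.1f}")
```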

20 Modified Contrasting Group Method
Students were classified into three broad performance categories (average, below average, or above average) and administered the placement tests.
Score distributions were plotted to represent the students’ actual (obtained) scores on the assessment separately for each category.
The distribution plot was visually analyzed to identify an appropriate cut score.

21 Modified Contrasting Group Method
Panelists completed the placement test while role-playing as a student of average ability and as a student of above-average ability.
Score distributions were plotted to represent the panelists’ actual (obtained) scores.
The distribution plot was visually analyzed to help guide the establishment of an appropriate cut score.
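The visual analysis described on the last two slides amounts to overlaying the obtained-score distributions of the judged groups and looking for where adjacent groups separate. A minimal matplotlib sketch with hypothetical data and category labels:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical obtained scores for the three broad performance categories.
rng = np.random.default_rng(2)
groups = {
    "below average": rng.normal(32, 7, 120),
    "average":       rng.normal(48, 7, 200),
    "above average": rng.normal(64, 7, 120),
}

# Overlay the distributions; the regions where adjacent categories separate
# are the zones in which the panel looks for a defensible cut score.
for label, scores in groups.items():
    plt.hist(scores, bins=20, alpha=0.5, label=label)
plt.xlabel("Placement test score")
plt.ylabel("Number of students")
plt.legend()
plt.title("Score distributions by judged performance category")
plt.show()
```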

22 Multi-Step Approach to Establish Cut Scores
Panelists created performance level descriptions (PLDs) and mapped these PLDs to College Board proficiency level statements
Panelists reviewed student score distributions
Panelists completed the placement test and reviewed their own score distributions
Additionally, for English, panelists retrospectively categorized papers into two groups (remediation required vs. no remediation required) and used this information to guide cut score establishment

23 Three Day Panel Sessions
Panels met for 3-hour sessions on 3 consecutive days
Math Panel: 12 faculty
English Panel: 12 faculty

24 Panel Session I
 Identified the number of cut scores needed and the different courses into which students would be placed
 Developed clear and concise performance level descriptions for each placement level

25 Characteristics of Performance Level Descriptions
Written in positive terms
Parallel in organization, language, and style
Written in clear and concise language without using unmeasurable qualifiers (e.g., often, seldom, etc.)

26 Example of Worksheet

27 Writing Performance Level Description
FOCUS: maintains a consistent point of view/perspective and clearly communicates the main point and unity of purpose
ORGANIZATION: demonstrates understanding of the primary elements of the essay (introduction, body, conclusion), develops effective paragraphs, uses transitions correctly, and effectively sequences ideas
DEVELOPMENT AND SUPPORT: develops complete thoughts using appropriate vocabulary and contains relevant evidence and supporting details
SENTENCE STRUCTURE: demonstrates understanding of basic grammar and varied syntax
MECHANICS: demonstrates understanding of basic mechanical conventions such as spelling, punctuation, capitalization, and word choice

28 Math Performance Level Descriptions
MATH 101: Perform addition, subtraction, multiplication, and division of polynomials; identify factors, terms, constants, variables, exponents, and coefficients; and recognize graphs of polynomial functions.
MATH 110 or 111: Perform addition, subtraction, multiplication, and division on algebraic expressions involving integer exponents and radicals; solve linear and quadratic equations; and evaluate, graph, and find the domain and range of functions, including linear and quadratic functions.
MATH 131: Perform addition, subtraction, multiplication, division, factoring, composition, and the simplification of algebraic, trigonometric, exponential, and logarithmic expressions and functions; use algebraic skills to solve linear, quadratic, rational, and absolute value equations and inequalities; and evaluate, graph, and interpret functions as well as their transformations.

29 Panel Session II
 PLDs approved
 Initial round of cut score setting based upon the results of student testing and faculty role-play performance
 For English, student work categorized by panelists into two groups

30 Illustration of Score Distributions

31 Recommended Math Cut Scores
1st Cut Score = 30: students scoring less than 30 – Remedial Placement
2nd Cut Score = 42: students scoring from 30 to 41 – Math 101
3rd Cut Score = 65: students scoring from 42 to 64 – Math 110 or 111
Students scoring 65 or higher – Math 131

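These cut points translate directly into a placement rule. A minimal sketch of that rule in Python; the function name and interface are illustrative, not part of the study's implementation:

```python
def math_placement(score):
    """Map an ACCUPLACER math score to a course using the cut scores above."""
    if score < 30:
        return "Remedial placement"
    elif score < 42:       # 30-41
        return "Math 101"
    elif score < 65:       # 42-64
        return "Math 110 or 111"
    else:                  # 65 and higher
        return "Math 131"

# Spot-check the boundaries on either side of each cut score.
for s in (29, 30, 41, 42, 64, 65):
    print(s, "->", math_placement(s))
```
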
32 English Panelist Rating Information
[Table: mean and SD of scores for essays where remediation was indicated by two or fewer panelists vs. by three or more panelists]

33 English Panelist Rating Information
[Table: mean and SD of scores for essays where remediation was indicated by none or some panelists vs. by all panelists]

34 Panel Session III
 Cut scores were aligned across the various methods used

35 PLDs Mapped to Proficiency Statements
MATH 101
 PLD: Perform addition, subtraction, multiplication and division of polynomials; identify factors, terms, constants, variables, exponents, and coefficients; and recognize graphs of polynomial functions.
 Lower than ACCUPLACER 40
 Cut Score: 30

36 Alignment of Recommendations
Performance level descriptions written by NC A&T faculty were mapped to College Board proficiency level statements
Score distributions for the student performance groups were reviewed to determine cut scores
Cut scores indicated on the basis of the student data were considered in conjunction with the performance level descriptions

37 Example of Alignment Process
Math 110 or 111: based on the mapping to Accuplacer proficiency statements, the cut score should fall between 40 and 63.

38 Follow-Up Testing and Cut-Score Refinement
Additional students tested
Cut scores revised
Recommendations made by panelists

39 Project Timeline after Study
March, 2008
 Committee review and recommendations
 Executive review and policy
April, 2008
 Develop student score validation and upload procedure
May, 2008
 Train proctors and faculty advisors
 Final system test
June, 2008
 Go live with Accuplacer placement testing
 500 students per week
July, 2008
 Process evaluation and recommendations

40 Campus Collaboration
Admissions – New students
Student Affairs – New student orientation
Registrar – Validate student data
IT – Upload student data to campus system
Academic Affairs – Proctors and advisors
Institutional Research – Management and support

41 Accuplacer Support
Helpline
Online procedure guides

42 Early Results
Five (5) orientation sessions in June
Overall, the system works
Some Accuplacer interface weaknesses

43 WHERE TO FROM HERE
Develop alternative placement measures
Implement off-site testing

44 Thank you for attending! Questions? NCA&T Support Site

45 Proficiency Statements WritePlacer Plus

46 ACCUPLACER – Math Proficiency Statements
Total Right Score of About 40
Students scoring at this level can identify common factors, factor binomials and trinomials, and manipulate factors to simplify complex fractions.
Total Right Score of About 63
Students scoring at this level can demonstrate the following additional skills: work with algebraic expressions involving real number exponents; factor polynomial expressions; simplify and perform arithmetic operations with rational expressions, including complex fractions; solve and graph linear equations and inequalities; solve absolute value equations; solve quadratic equations by factoring; graph simple parabolas; understand function notation, such as determining the value of a function for a specific number in the domain; have a limited understanding of the concept of function on a more sophisticated level, such as determining the value of the composition of two functions; and have a rudimentary understanding of coordinate geometry and trigonometry.
Total Right Score of About 86
Students scoring at this level can demonstrate the following additional skills: understand polynomial functions; evaluate and simplify expressions involving functional notation, including composition of functions; and solve simple equations involving trigonometric, logarithmic, and exponential functions.
Total Right Score of About 103
Students scoring at this level can demonstrate the following additional skills: perform algebraic operations and solve equations with complex numbers; understand the relationship between exponents and logarithms and the rules that govern the manipulation of logarithms and exponents; understand trigonometric functions and their inverses; solve trigonometric equations; manipulate trigonometric identities; solve right-triangle problems; and recognize graphic properties of functions such as absolute value, quadratic, and logarithmic functions.

