Technical Documentation for Two Types of Alternate Assessment for Students with Significant Cognitive Disabilities


1 Technical Documentation for Two Types of Alternate Assessment for Students with Significant Cognitive Disabilities http://www.cde.state.co.us/cdesped/EAG.asp Patricia Almond, Sue Bechard, and Janet Filbin. April 14, 2005, NCME, Montreal, Canada

2 Alternate Assessment Collaborative (AAC) Grant S368A03000, US DOE OESE. Funded to Colorado by the USDOE Enhanced Assessment Grant, 2003-2005. Multi-state collaborative: 7 states (originally 9 states). Purpose: develop 2 types of alternate assessment (AA) and pilot test the AAs.

3 First Goal—Frame the Problem Very few students participate in alternate assessment (persistent small N size problem). As one psychometrician expresses it, these “students don’t fit the mathematical model” used in general assessment. Students do, however, receive test scores and are assigned performance levels. We need to be able to specify the interpretations that can be made based on alternate assessment scores.

4 Second Goal—Seek Assistance from the Measurement Community NCME members are the professionals with the necessary technical knowledge, experience, and expertise in measurement. We request (urge, invite, tempt) members to bring their knowledge base to the problem we are framing. We need your guidance and participation to determine and improve the technical adequacy of alternate assessments.

5 In service of these goals we are going to present: Students & Test Items: briefly describe the students and the structured performance-based tasks and activities employed in the AAC EAG Pilot Alternate Assessments. Approach: explain our approach and procedures for documenting technical adequacy. Results: present AAC EAG reports for technical documentation of the pilot alternate assessments in the areas of test development, reliability, and alignment.

6 Students and Test Items: Briefly describe students and structured performance-based tasks and activities employed in AAC EAG Pilot Alternate Assessments

7 Beta Student—Selected Response

8 Saige—Selected Response

9 Two types of alternate assessment Performance Task (PT): typically one session, one-on-one teacher and student. The teacher presents tasks in order (math 27, reading 23, writing 18, science 23) and rates and records the student response for each task. Instructionally Embedded Assessment (IEA): an instructional unit with 10 lessons over 10 days (approx 50 min each day), structured to resemble typical classroom instruction. Evidence of student performance is collected from instructional activity and scored by raters at a scoring site.

10 Performance Task Science Item Selected Response Set-Up: Arrange the Prediction Cards in front of the student. Describe each (the celery will wilt, the celery will grow, the cup will be empty, there will be no change). Prompt: Communicate: “You said living objects need water to live. Celery is a living object. Tell me what will happen if the celery does not have water.” The student may respond by saying, pointing, or using eye gaze to select a card.

11 Instructionally Embedded Assessment Science Lesson Instructional/Assessment Activities 1. Conduct data probe. 2. Review characteristics of living and nonliving things (song, sorted collections). 3. Introduce the culminating product and the presentation of findings. 4. Observe mealworms in terrarium; help the student make observations. 5. Choose a nonliving object and make observations. 6. Model the scientific notebook. Day 2: Assessment Types to be Collected Check off each type of student work sample as it is included. □ Probe data collected for day 2 □ Scientific Notebook Entry

12 Approach: Explain our approach and procedures for documenting technical adequacy

13 Imitate the Approach to Technical Documentation Used for General Assessment as our Guide: A Parallel Path; Another Side of the Same Coin; Comparable Questions; Evidence Fitted to Special Education; a Process Adjusted to Meet the Unique Aspects of Alternate Assessment

14 From NCME Training Course (Huynh, Meyer, Barton, April 2004)

15 “How to” determine/improve AA Technical Adequacy = Ultimate Question Selected Approach: Apply traditional or currently accepted practices used in technical documentation for large-scale assessment (Huynh, Meyer, Barton, April 2004). “Technical document is not a place to make judgments but rather a place to lay out the information—what was done and what happened.” Alternative to Consider: Consider an evidence-centered approach to technical documentation of assessment (McCall, Duran, Quellmalz, January 2005). “... the ECD approach resonates with part of what you’re working on, but is more statistical in nature...” Such an approach might be useful for determining technical adequacy in alternate assessment (Duran email, January 2005).

16 “We need judges because the laws are rules that don’t fit the exceptions” (paraphrased). Grant Wiggins used this quote from Aristotle.

17 Test Development Results: Present AAC EAG reports for technical documentation of the pilot alternate assessments in the areas of: Test Development, Reliability, and Alignment

18 Reliability Results: Present AAC EAG reports for technical documentation of the pilot alternate assessments in the areas of: Test Development, Reliability, and Alignment

19 Alignment Results: Present AAC EAG reports for technical documentation of the pilot alternate assessments in the areas of: Test Development, Reliability, and Alignment

20 An aligned system calls for alignment not only between standards and assessment but also between standards and instruction.

21 Content Alignment Adapted from Grisham-Brown and Kearns (2001)

22 Purpose of Study Determine the extent to which AAC alternate assessments represent the concepts and skills spelled out in the expanded benchmarks

23 Study 1: Expert Review Alignment between six alternate assessments and Expanded Benchmarks

24 Four experts were asked to rate alignment between: A) the expanded benchmarks and the assessment activity, B) the expanded benchmarks and the targeted student response, and C) the assessment activity and the student response.

25 Expert Judgments IEA: 92 percent of the days (events) were considered aligned to expanded benchmarks in all three pairings. PT: 70 percent of the steps (items) were considered aligned to expanded benchmarks in all three pairings.

26 Study 2: Alignment (Webb, 2001 & Tindal, 2005) List standards and objectives. Categorical Concurrence—count the standards with alternate assessment tasks and calculate the percentage. Determine Depth of Knowledge. Calculate Range of Knowledge (tasks/total no. = % in each standard). Balance of Representation—how evenly tasks are distributed across the objectives within each standard (a computational sketch follows below).
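The concurrence and range-of-knowledge figures reported on the next slides are simple counting indices. Below is a minimal sketch, in Python, of how such indices might be computed from a task-to-benchmark alignment table; the data structures, task names, standards, and counts are illustrative assumptions, not the AAC EAG data or the authors' actual procedure.

```python
from collections import defaultdict

# Hypothetical alignment data: task id -> (standard, objective/expanded benchmark)
task_alignments = {
    "math_task_01": ("Standard 1", "1.2"),
    "math_task_02": ("Standard 1", "1.4"),
    "math_task_03": ("Standard 3", "3.1"),
    "math_task_04": ("Standard 3", "3.1"),
}

# All standards and their objectives in the content framework (illustrative)
framework = {
    "Standard 1": ["1.1", "1.2", "1.3", "1.4"],
    "Standard 2": ["2.1", "2.2"],
    "Standard 3": ["3.1", "3.2", "3.3"],
}

# Categorical concurrence (as described on the slide): the percentage of
# standards that have at least one aligned assessment task.
standards_hit = {std for std, _ in task_alignments.values()}
concurrence_pct = 100 * len(standards_hit) / len(framework)
print(f"Categorical concurrence: {concurrence_pct:.0f}% of standards")

# Range of knowledge: within each standard, the proportion of its
# objectives addressed by at least one task ("tasks/total no.").
objectives_hit = defaultdict(set)
for std, obj in task_alignments.values():
    objectives_hit[std].add(obj)

for std, objectives in framework.items():
    pct = 100 * len(objectives_hit[std]) / len(objectives)
    print(f"Range of knowledge for {std}: {pct:.0f}%")
```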

27 Categorical Concurrence: % of Standards in Alternate Assessment (PT / IEA)
Science: 20%
Reading: 100%
Writing: 75% / 60%
Mathematics: 60% / 80%

28 Depth of Knowledge for Mathematics, Reading, Science, and Writing Expanded Benchmarks

29 Range of Knowledge for AAC Subject Domains

30 Balance of Representation Criterion: the degree to which one concept is given more emphasis on the assessment than another. An index of .7 or higher indicates that items are distributed among objectives; values between .6 and .7 indicate the criterion has only been “weakly” met. Results: the four subjects and two assessment formats did not meet the criterion; the index ranged between .47 and .59. The design of the assessments gave more emphasis to the PLT’s target concepts. (A sketch of the index computation follows below.)
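The criterion on this slide corresponds to Webb's Balance of Representation index. The sketch below assumes the commonly published form of that index (one minus half the summed deviation of each hit objective's item share from an even share); the item counts in the example are invented to illustrate an index in the reported .47 to .59 range and are not the AAC EAG results.

```python
def balance_of_representation(items_per_objective):
    """Webb-style balance index for one standard.

    items_per_objective: number of aligned items for each objective that
    was hit by at least one item (objectives with zero items are excluded).
    """
    hit = [n for n in items_per_objective if n > 0]
    total_items = sum(hit)
    num_objectives = len(hit)
    # Deviation of each objective's share of items from an even 1/O share
    deviation = sum(abs(1 / num_objectives - n / total_items) for n in hit)
    return 1 - deviation / 2


def interpret(index):
    # Thresholds as stated on the slide
    if index >= 0.7:
        return "criterion met (items distributed among objectives)"
    if index >= 0.6:
        return "criterion weakly met"
    return "criterion not met"


# Example: one objective receives most of the emphasis, producing an index
# in the range the slide reports for the pilot assessments.
counts = [7, 1, 1, 1]
index = balance_of_representation(counts)
print(f"Balance index: {index:.2f} -> {interpret(index)}")
```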

31 Answers to Alignment Questions, Broadly Speaking: YES—Categorical Concurrence: Does the alternate assessment address subject matter contained in the state Content Standards? NO—Range of Knowledge: Does the alternate assessment address every Content Standard? To what extent is each standard represented? YES—Depth of Knowledge: Does the alternate assessment address the standard at the level of difficulty (cognitive complexity) reflected in the content standard? NO—Balance of Representation: For each content standard (or indicator) represented in the alternate assessment, how often does it appear?

32 Conclusions Expanded Benchmarks provided a structure for aligning two types of alternate assessment to academic content standards. With the exception of science, the resulting assessments showed alignment with the content standards. Expanded benchmarks were generally at level one for Depth of Knowledge, representing recall or rote behavior.

33 Question What is the expected breadth of knowledge and range of knowledge for alternate assessments?

34 To Conclude

35 REMINDER OF BIGGER QUESTION: “How to” determine/improve AA Technical Adequacy = Ultimate Question A. Selected Approach: Apply traditional or currently accepted practices used in technical documentation for large-scale assessment (Huynh, Meyer, Barton, April 2004). “Technical document is not a place to make judgments but rather a place to lay out the information—what was done and what happened.” B. Alternative to Consider: Consider an evidence-centered approach to technical documentation of assessment (McCall, Duran, Quellmalz, January 2005). “... the ECD approach resonates with part of what you’re working on, but is more statistical in nature...” Such an approach might be useful for determining technical adequacy in alternate assessment (Duran email, January 2005). Which path will be most appropriate? A, B, other?

36 Second Goal—Seek Assistance from the Measurement Community NCME members are the professionals with the necessary technical knowledge, experience, and expertise in measurement. We request (urge, invite, tempt) members to bring their knowledge base to the problem we are framing. We need your guidance and participation to determine and improve the technical adequacy of alternate assessments.

37 Technical Documentation for Two Types of Alternate Assessment for Students with Significant Cognitive Disabilities http://www.cde.state.co.us/cdesped/EAG.asp Patricia Almond, Sue Bechard, and Janet Filbin. April 14, 2005, NCME, Montreal, Canada

