Dynamic Learning Maps Alternate Assessment Consortium
Laura Kramer
Center for Educational Testing and Evaluation, University of Kansas
March 3, 2011
Overview
- The Race to the Top Assessment competition focused on the general assessment.
- OSEP released a grant competition in mid-June for an alternate ("1%") assessment.
- KSDE approached CETE in late June about applying for the grant.
- Of the five applications, only two were funded.
Consortia Collaboration
- The other grantee under this competition is the National Center and State Collaborative.
- To maximize resources, the two consortia are exploring ways to work together.
- To maintain coherence in state assessment systems, the consortia will also explore working with the Race to the Top Assessment consortia (SBAC and PARCC).
State Participants
Iowa, Kansas, Michigan, Mississippi, Missouri, New Jersey, North Carolina, Oklahoma, Utah, Virginia*, West Virginia, Washington*, Wisconsin
Other Participants
- University of Kansas
  – Beach Center on Disability
  – Center for Educational Testing and Evaluation
  – Center for Research Methods and Data Analysis
  – Center for Research on Learning
  – Faculty in several departments
- AbleLink Technologies
- The ARC
- The Center for Literacy and Disability Studies at the University of North Carolina at Chapel Hill
- Edvantia
Major Tasks
- Common Core Essential Elements and creation of achievement level descriptors (ALDs)
- Development and validation of learning maps
- Creation of instructionally relevant item types
- Technology development
- Item and assessment development
- Standard setting
- Professional development
- Instructional consequences
- Family engagement and dissemination
Essential Elements and Learning Maps
- Use the Common Core State Standards as the starting point.
- Derive key concepts in an iterative fashion:
  – Break down skills in the CCSS and identify multiple (somewhat) hierarchical pathways.
  – Review nodes with educators and content experts to ensure that links to the CCSS are maintained.
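The map structure described above can be sketched as a directed graph whose nodes are skills and whose edges are the pathways between them. The skill names below are hypothetical illustrations, not actual DLM content, and real maps are far larger:

```python
# A minimal sketch of a learning map as a directed graph, using
# hypothetical skill names; real DLM maps contain many more nodes
# and are reviewed by educators and content experts.
learning_map = {
    "recognize_letters": ["decode_words"],
    "attend_to_print": ["decode_words", "track_text"],
    "decode_words": ["read_sentences"],
    "track_text": ["read_sentences"],
    "read_sentences": [],  # CCSS-linked target skill
}

def precursors(skill):
    """Return every skill that lies on some pathway into `skill`."""
    direct = {node for node, nexts in learning_map.items() if skill in nexts}
    found = set(direct)
    for node in direct:
        found |= precursors(node)
    return found
```

Here `precursors("read_sentences")` collects all four upstream skills, illustrating how multiple (somewhat) hierarchical pathways converge on a single CCSS-linked node.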
Purposes
- Establish consistency in expectations
- Emphasize skill similarities within diverse ways of performing
- Provide instructional guidance
- Connect instruction and assessment
- Accommodate diverse disabilities
Dynamic Learning Maps
- The goal: proficiency on the CCSS
  – Like driving from DC to San Francisco…
  – We won't always take the shortest or most direct route.
  – Scenic routes, small towns, occasional detours; we might spend extra time in one place or another (or go back and revisit a town).
  – But we keep going!
Dynamic Learning Maps
- Assessment integrated with instruction
- Multiple waypoints with detailed feedback
  – Between St. Louis and Denver vs. between Lawrence and Topeka
  – Identifying specific aspects of student mastery to pinpoint what has been achieved, or what still needs work
- Different paths
  – Driving conditions differ, which may lead you to choose a different route.
  – Students don't all travel the same roads, so DLM provides many routes for students to demonstrate mastery.
Dynamic Learning Maps
- Focus on what students CAN do
- Identify what students cannot do (yet)
- Provide standardized scaffolds to unpack which precursor skills are missing
- Offer logical next steps for instruction and skill-building
So Far…
- Developing Learning Maps and Essential Elements
- Developing new technology to deliver instructional tasks
- Identifying instructionally relevant and sensitive item types
- Preparing the first professional development modules
Coming Soon…
- From the Essential Elements, develop assessment achievement level descriptors, instructional achievement descriptors, and examples.
- Develop assessment tasks based on nodes of the Learning Maps, with reviews including cognitive labs, pilot testing, and field testing.
- Enhance assessment task delivery through the technology platform, including "built-in" accommodations and accessibility.
Item Development
- Development of instructionally relevant item types
  – Focus groups with master educators
  – Review by special education, assessment, and technology experts
- Use of evidence-centered design
  – Student model: What skills in the learning maps should be assessed?
  – Evidence models: What behaviors will provide the necessary evidence?
  – Task models: What tasks will elicit the behaviors required for evidence gathering?
Item Development
- Creation of item and item pool specifications
- Alignment of items to learning maps (not just a one-to-one correspondence!)
- Development of items and standardized scaffolds so that incorrect responses lead to further diagnostic inquiry
- Cognitive labs with students, teachers, and parents
- Review by educators with content expertise and special education expertise
- Other internal and external reviews (bias, sensitivity, editorial, etc.)
- Pilot testing
- Census field testing (in the last year of the grant)
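The scaffolding idea above can be sketched as a routing rule: a correct response yields mastery evidence, while an incorrect response drops back to a precursor skill for further diagnostic inquiry. The skill names and the precursor table here are illustrative assumptions, not actual DLM content:

```python
# Hypothetical sketch of standardized scaffolding: an incorrect
# response routes the assessment to a precursor skill so the gap
# can be located, rather than simply marking the item wrong.
PRECURSOR = {
    "read_sentences": "decode_words",     # illustrative precursor links
    "decode_words": "recognize_letters",
}

def next_target(item_skill, correct):
    """Choose the next diagnostic target after a scored response."""
    if correct:
        return item_skill  # evidence of mastery; continue forward
    # Drop back to a precursor skill; stay put if none is defined.
    return PRECURSOR.get(item_skill, item_skill)
```

This is one plausible reading of "standardized scaffolds": the drop-back rule is fixed in advance, so every student who misses the same item receives the same diagnostic follow-up.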
Next Generation Assessment System
Planned new assessment delivery features:
- A probabilistic model (Bayesian network-based) for item selection, added to the existing linear, item-level-adaptive, testlet-adaptive, and multidimensional IRT adaptive test models
- Constructed response and other new item types
- Item scoring:
  – Keyword lists
  – Numerical responses
  – Hot spots
  – Drag and drop
  – Others to be determined
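A toy sketch of probabilistic item selection: track a mastery probability per skill, Bayes-update it after each response, and administer the item whose mastery estimate is most uncertain. The slip/guess rates and skill names are illustrative assumptions; the planned DLM model is a full Bayesian network over map nodes, not this single-skill simplification:

```python
# Toy sketch of Bayesian mastery tracking for adaptive item selection.
SLIP, GUESS = 0.1, 0.2  # assumed P(wrong | mastered), P(right | not mastered)

def update(p_mastery, correct):
    """Bayes-update P(mastered) after one scored response."""
    p_right = p_mastery * (1 - SLIP) + (1 - p_mastery) * GUESS
    if correct:
        return p_mastery * (1 - SLIP) / p_right
    return p_mastery * SLIP / (1 - p_right)

def pick_item(beliefs):
    """Select the skill whose mastery estimate is most uncertain
    (probability nearest 0.5), a common adaptive-selection heuristic."""
    return min(beliefs, key=lambda s: abs(beliefs[s] - 0.5))
```

A correct response pushes the mastery estimate up and an incorrect one pushes it down, while `pick_item` keeps the assessment focused where it will learn the most about the student.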
Next Generation Assessment System
Planned new assessment delivery features:
- Enhanced accommodation/universal design capabilities, including (but not limited to):
  – Audio via sound files
  – American Sign Language video
  – Pop-up context-dependent dictionaries/glossaries
  – Text and image magnification
  – On-screen note taking
  – Color overlays
  – IntelliKeys™ keyboard accessibility
  – Masking
- Section 508, QTI, APIP, and SCORM compliance
- Proctoring and real-time results monitoring capabilities
Professional Development Topics
- Emphasis on instruction
- UDL principles as they relate to students with significant intellectual disabilities
- Integration of (extended) standards, maps, and the assessment process
- Relationship to goal setting and IEP development
THANK YOU!
For more information, please contact Neal Kingston or Alan Sheinker.