
1 Evidence-Centered Design and Cisco’s Packet Tracer Simulation-Based Assessment
Robert J. Mislevy, Professor, Measurement & Statistics, University of Maryland
with John T. Behrens & Dennis Frezzo, Cisco Systems, Inc.
December 15, 2009, ADL, Alexandria, VA

2 Simulation-Based & Game-Based Assessment
Motivation: cognitive psychology & technology
»Complex combinations of knowledge & skills
»Complex situations
»Interactive, evolving in time, constructive
»The challenge of technology-based environments

3 Outline
ECD
Packet Tracer
Packet Tracer & ECD

4 ECD

5 Evidence-Centered Assessment Design
Messick’s (1994) guiding questions:
What complex of knowledge, skills, or other attributes should be assessed?
What behaviors or performances should reveal those constructs?
What tasks or situations should elicit those behaviors?

6 Evidence-Centered Assessment Design
A principled framework for designing, producing, and delivering assessments.
Process model, object model, design tools.
Explicates the connections among assessment designs, inferences regarding students, and the processes needed to create and deliver these assessments.
Particularly useful for new / complex assessments.

7 Layers in the assessment enterprise (from Mislevy & Riconscente, in press):
Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
Assessment Implementation: Manufacturing the “nuts & bolts”: authoring tasks, automated scoring details, statistical models. Reusability.
Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.

8 Packet Tracer

9 Cisco’s Packet Tracer
Online tool used in the Cisco Networking Academies
Create, edit, configure, run, troubleshoot networks
Multiple representations in the logical layer
Inspection tool links to a deeper physical world
Simulation mode
»Detailed visualization and data presentation
Standard support for world authoring
»Library of elements
»Simulation of relevant deep structure
»Copy, paste, save, edit, annotate, lock

10 Instructors and students can author their own activities

11 Instructors and students can author their own activities

12

13 Explanation

14 Experimentation

15 Packet Tracer & ECD

16 Layers in the assessment enterprise (layers diagram repeated from slide 7; Mislevy & Riconscente, in press).

17 Layers diagram repeated from slide 7.

18 Layers diagram from slide 7, annotated with assessment argument structures and design patterns.

19 Layers diagram from slide 7. Application to familiar assessments:
»Upfront design of features of the task situation (static, implicit in the task, not usually tagged)
»Upfront design of features of response classes
»Fixed competency variables

20 Layers diagram from slide 7. Application to complex assessments:
»More complex evaluations of features of the task situation
»Multiple, perhaps configured-on-the-fly competency variables
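
A minimal sketch of this contrast, in Python with invented feature and competency names (nothing here is from Packet Tracer itself): a familiar assessment fixes its competency variables up front, while a complex assessment may assemble them on the fly from the features a given task exercises.

# Illustrative only: contrast fixed competency variables with ones
# configured on the fly from the features a task exercises.
FIXED_COMPETENCIES = ["ip_addressing", "routing", "troubleshooting"]  # set up front

# Hypothetical mapping from task features to the competencies they evidence.
FEATURE_TO_COMPETENCY = {
    "configure_interface": "ip_addressing",
    "set_static_route": "routing",
    "diagnose_fault": "troubleshooting",
}

def competencies_for_task(task_features):
    """Assemble the student-model variables this task can give evidence about."""
    return sorted({FEATURE_TO_COMPETENCY[f]
                   for f in task_features if f in FEATURE_TO_COMPETENCY})

# A dynamically configured task exercises only some competencies:
print(competencies_for_task(["configure_interface", "diagnose_fault"]))
# -> ['ip_addressing', 'troubleshooting']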

21 Layers diagram from slide 7, contrasting the application to familiar assessments with the application to complex assessments. In complex assessments, situated performance unfolds over time in an evolving, interactive situation, with macro and micro features of both the situation and the performance. Some features of the task situation, and of the performance or its effects, are designed up front; others are recognized as they arise (e.g., by agents).

22 Layers diagram from slide 7. Object models for representing:
»Psychometric models (including competencies/proficiencies)
»Simulation environments
»Task templates
»Automated scoring
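
A minimal sketch of what such object models might look like, in Python with invented class and field names (this is not the PADI or Packet Tracer API):

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ProficiencyVariable:
    """A student-model variable, e.g. skill at IP addressing."""
    name: str
    levels: List[str] = field(default_factory=lambda: ["low", "medium", "high"])

@dataclass
class TaskTemplate:
    """A reusable task shell: an initial simulation setup plus the work products it yields."""
    title: str
    initial_network: Dict[str, str]   # device name -> configuration snippet
    work_products: List[str]          # e.g. ["final_config_xml", "command_log"]

@dataclass
class EvidenceRule:
    """Maps a work product to an observable score bearing on a proficiency."""
    observable: str
    proficiency: ProficiencyVariable
    evaluate: Callable[[str], float]  # work product text -> score in [0, 1]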

23 Layers diagram from slide 7. PADI object model for task/evidence models.

24 Layers diagram from slide 7. The graphical representation of the network & its configuration is expressible as a text representation in XML format, for presentation and as a work product, to support automated scoring.
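
A sketch of that idea, with an invented XML schema (the actual Packet Tracer file format is not shown here): the drawn network serializes to text, so a scorer can read it as a work product.

import xml.etree.ElementTree as ET

# Invented work-product XML; not the real Packet Tracer format.
WORK_PRODUCT = """
<network>
  <device name="Router0" type="router">
    <interface name="FastEthernet0/0" ip="192.168.1.1" mask="255.255.255.0"/>
  </device>
  <device name="PC0" type="pc">
    <interface name="FastEthernet0" ip="192.168.1.10" mask="255.255.255.0"/>
  </device>
</network>
"""

def interface_table(xml_text):
    """Flatten the XML work product into (device, interface) -> ip pairs for scoring."""
    root = ET.fromstring(xml_text)
    return {(dev.get("name"), iface.get("name")): iface.get("ip")
            for dev in root.iter("device")
            for iface in dev.iter("interface")}

print(interface_table(WORK_PRODUCT))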

25 Layers diagram from slide 7.
»Authoring interfaces
»Simulation environments
»Re-usable platforms & elements
»Standard data structures (IMS/QTI, SCORM)

26 In Packet Tracer, the Answer Network serves as the base pattern for work-product evaluation.
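
A minimal sketch of evaluation against an answer network, continuing the flattened (device, interface) -> ip representation from the slide 24 sketch; the matching rule here (exact feature equality, equal weights) is an assumption, not Packet Tracer’s actual grading logic:

def score_against_answer(student, answer):
    """student/answer: dicts mapping (device, interface) -> ip address.
    Returns the fraction of answer-network features the student matched."""
    if not answer:
        return 0.0
    matched = sum(1 for key, ip in answer.items() if student.get(key) == ip)
    return matched / len(answer)

answer = {("Router0", "FastEthernet0/0"): "192.168.1.1",
          ("PC0", "FastEthernet0"): "192.168.1.10"}
student = {("Router0", "FastEthernet0/0"): "192.168.1.1",
           ("PC0", "FastEthernet0"): "192.168.1.99"}  # one feature wrong
print(score_against_answer(student, answer))  # 0.5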

27 Dynamic task models: Variable Assignment for the Initial Network. This works like the Answer Network tree, but when the activity starts, instead of using the initial network as the starting values, the activity configures the network with the contents of the variables.
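
A sketch of the variable-assignment idea, with an invented template format: each student gets a structurally identical initial network whose values are filled in from variables at activity start.

import random

# Hypothetical template: interface addresses written in terms of variables.
TEMPLATE = {("Router0", "FastEthernet0/0"): "{net}.1",
            ("PC0", "FastEthernet0"): "{net}.{host}"}

def instantiate(template, seed=None):
    """Bind the variables, then fill the template to produce the initial network."""
    rng = random.Random(seed)
    bindings = {"net": f"192.168.{rng.randint(1, 254)}",
                "host": str(rng.randint(10, 250))}
    return {key: value.format(**bindings) for key, value in template.items()}

print(instantiate(TEMPLATE, seed=42))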

28 Layers diagram repeated from slide 7.

29 Layers diagram from slide 7.
»Interoperable elements (IMS/QTI, SCORM)
»Feedback / instruction / reporting

30 Conclusion
Behavior in learning environments builds connections for performance environments.
Assessment tasks & features are strongly related to instructional/learning objects & features.
Re-use concepts and code in assessment,
»via arguments, schemas, and data structures that are consonant with instructional objects.
Use data structures that are
»share-able and extensible,
»consistent with delivery processes and design models.

31 Further information
Bob Mislevy home page
»http://www.education.umd.edu/EDMS/mislevy/
»Links to papers on ECD
»Cisco NetPASS
»Cisco Packet Tracer
PADI: Principled Assessment Design for Inquiry
»NSF project, collaboration with SRI et al.
»http://padi.sri.com

