
1 DIAGNOSTIC/PRESCRIPTIVE USES OF COMPUTER-BASED ASSESSMENT OF PROBLEM SOLVING
San-hui Sabrina Chuang
CRESST Conference 2007
UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

2 Overview
- TECHNOLOGY FOR ASSESSMENT & CRESST MODEL
- ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH
- WHAT'S NEXT

3 TECHNOLOGY FOR ASSESSMENT & CRESST MODEL

4 PRESUMED ADVANTAGES
- Provide consistent, high-quality assessment available on a large scale at remote sites
- Individualized testing
- Time/cost savings:
  - reduced testing time
  - quicker result reporting and system updates
  - rapid updating of testing materials
  - reduced reliance on highly skilled personnel

5 POSSIBLE PROBLEM AREAS
- Cost
- Equity
- Fidelity
- Program maintenance
- Teacher attitudes

6 PURPOSES OF TESTING AND ASSESSMENT
Individual/Team-Oriented:
- Individual/team certification
- Admissions and selection
- Placement
- Individual progress and student learning
- Diagnosis/prescription

7 CRESST MODEL OF LEARNING
Learning:
- Content Understanding
- Communication
- Collaboration
- Problem Solving
- Self-Regulation

8 PROBLEM-SOLVING DEFINITION
Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996).

9 PROBLEM SOLVING
- Content Understanding
- Domain-Dependent Problem-Solving Strategies
- Self-Regulation
  - Metacognition: Planning, Self-Monitoring
  - Motivation: Effort, Self-Efficacy

10 COLLABORATIVE PROBLEM SOLVING
- Content Understanding: environmental science domain knowledge
- Problem-Solving Strategies: browsing, searching, Boolean operator use, feedback accessing
- Self-Regulation: planning, self-checking, effort, self-efficacy
- Group Teamwork Processes: adaptability, coordination, decision making, interpersonal, leadership, communication

11 ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH

12 CRESST'S KNOWLEDGE MAPPER

13 COMPUTER-BASED ASSESSMENT
- Diagnosis: match the student map against an expert map in real time
- Prescription: the nature of the feedback is tied to the diagnosis
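The diagnosis/prescription cycle on this slide can be pictured in a few lines of Python. This is only a sketch under assumptions: the slides do not show the Knowledge Mapper's actual map representation, scoring rules, or cut points, so the proposition format, thresholds, and function names below are invented for illustration.

```python
# Illustrative sketch only; not the CRESST Knowledge Mapper implementation.
# A knowledge map is treated here as a set of (concept, link, concept) propositions.
EXPERT_MAP = {
    ("bacteria", "cause", "decomposition"),
    ("photosynthesis", "produces", "oxygen"),
    ("evaporation", "is part of", "water cycle"),
}

def diagnose(student_map, expert_map):
    """Per-concept agreement between the student map and the expert map."""
    concepts = {c for (c, _, _) in expert_map} | {c for (_, _, c) in expert_map}
    scores = {}
    for concept in concepts:
        expert_props = {p for p in expert_map if concept in (p[0], p[2])}
        scores[concept] = len(expert_props & student_map) / len(expert_props)
    return scores

def prescribe(scores):
    """Tie the nature of the feedback to the diagnosis (assumed cut points)."""
    feedback = {}
    for concept, score in scores.items():
        if score < 0.34:
            feedback[concept] = "A lot of improvement needed"
        elif score < 0.67:
            feedback[concept] = "Some improvement needed"
        else:
            feedback[concept] = "A little improvement needed"
    return feedback

student_map = {("photosynthesis", "produces", "oxygen")}
print(prescribe(diagnose(student_map, EXPERT_MAP)))
```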

14 THREE CHARACTERISTICS OF FEEDBACK
- Complexity of feedback: what information is contained in the feedback messages
- Timing of feedback: when the feedback is given to students
- Representation of feedback: the form in which the feedback is presented (text vs. graphics vs. audio)
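One way to see that the three characteristics vary independently is to treat them as a small configuration object. The field names and option values below are illustrative assumptions drawn from the studies summarized on later slides, not an actual CRESST data structure.

```python
# Sketch: the three feedback characteristics as a configuration object.
# Field names and option values are assumptions, not a CRESST data structure.
from dataclasses import dataclass
from typing import Literal

@dataclass
class FeedbackConfig:
    # Complexity: what information the feedback messages contain
    complexity: Literal["knowledge_of_response", "adapted", "task_specific"]
    # Timing: when the feedback is given to students
    timing: Literal["immediate", "user_requested", "after_action_review"]
    # Representation: the form in which the feedback is presented
    representation: Literal["text", "graphics", "audio"]

# Example: learner-requested, adapted feedback delivered as text
config = FeedbackConfig(complexity="adapted", timing="user_requested", representation="text")
print(config)
```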

15 ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH
- Improve the nature of the task
- Improve communication messages
- Improve scoring efficiency
- Provide more extensive, complex feedback:
  - feedback on collaboration/teamwork
  - feedback on problem solving (content understanding, domain-specific problem-solving strategies)

16 Schacter et al.
Used a simulated Internet Web space; information-seeking processes significantly increased content understanding (map) scores.

17 Hsieh & O'Neil
1. Improved the task to a "real group task"
2. Provided two different types of feedback

Knowledge of Response feedback:
Your map has been scored against an expert's map in environmental science. The feedback tells you how much you need to improve each concept in your map (i.e., A lot, Some, A little). Use this feedback to help you search to improve your map.
- A lot: Atmosphere, Bacteria, Decomposition, Sunlight
- Some: Climate, Carbon dioxide, Photosynthesis, Waste
- A little: Evaporation, Greenhouse gasses, Oxygen, Water cycle

18 Adapted Knowledge of Response
Adapted Knowledge of Response = the above Knowledge of Response + the following:
- Improvement: You have improved the "food chain" concept from needing "A lot of improvement" to the "Some improvement" category.
- Strategy: It is most useful to search for information for the "A lot" and "Some" categories rather than the "A little" category. For example, search for information on "atmosphere" or "climate" first, rather than "evaporation."
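A sketch of the two "adapted" additions described on this slide: an improvement message based on the previous diagnosis, and a search-strategy hint. The message wording, category labels, and helper names are assumptions for illustration, not the study's actual implementation.

```python
# Sketch of the "adapted" additions on top of Knowledge of Response feedback:
# an improvement message comparing the previous and current diagnosis, plus a
# search-strategy hint. Message wording and helper names are assumptions.

CATEGORY_ORDER = ["A lot", "Some", "A little"]  # most to least improvement needed

def improvement_messages(previous, current):
    """Report concepts that moved to a better category since the last scoring."""
    messages = []
    for concept, prev_cat in previous.items():
        curr_cat = current.get(concept, prev_cat)
        if CATEGORY_ORDER.index(curr_cat) > CATEGORY_ORDER.index(prev_cat):
            messages.append(
                f'You have improved the "{concept}" concept from needing '
                f'"{prev_cat} of improvement" to the "{curr_cat} improvement" category.'
            )
    return messages

def strategy_message(current):
    """Suggest searching the weaker categories ('A lot', 'Some') first."""
    weak = [c for c, cat in current.items() if cat in ("A lot", "Some")]
    if not weak:
        return ""
    return "It is most useful to search first for: " + ", ".join(weak)

previous = {"food chain": "A lot", "climate": "Some"}
current = {"food chain": "Some", "climate": "Some"}
print(improvement_messages(previous, current))
print(strategy_message(current))
```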

19 Hsieh & O'Neil
- Knowledge of Response feedback
- Adapted Knowledge of Response feedback: contains Knowledge of Response feedback plus an explanation of why it is correct or incorrect
- Adapted Knowledge of Response feedback was significantly better than Knowledge of Response feedback (moderate effect size)

20 Chuang & O'Neil
- User-defined feedback timing
- Adapted Knowledge of Response feedback
- Task-Specific Knowledge of Response feedback: adapted feedback plus task-specific strategies for searching with Boolean operators
- Task-Specific Knowledge of Response feedback was significantly better than Adapted Knowledge of Response feedback (moderate effect size)
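The task-specific strategy feedback mentioned here concerns searching with Boolean operators. A minimal sketch of turning the weakest concepts into such a query is below; the query format and function name are illustrative assumptions, not the wording of the actual feedback.

```python
# Sketch of a task-specific search hint: turning the concepts that still need
# work into a Boolean query. Query format and function name are assumptions.

def boolean_query(concepts, operator="AND"):
    """Join search terms with a Boolean operator, quoting multi-word terms."""
    terms = [f'"{c}"' if " " in c else c for c in concepts]
    return f" {operator} ".join(terms)

needs_work = ["atmosphere", "carbon dioxide"]
print(boolean_query(needs_work))         # atmosphere AND "carbon dioxide"
print(boolean_query(needs_work, "OR"))   # atmosphere OR "carbon dioxide"
```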

21 Chen et al.
Feedback via After-Action Review (AAR)

22 Chen et al.
- The AAR groups' overall communication messages were higher.
- The AAR group changed their searching strategies and performed significantly more Boolean searches after receiving the feedback.
- However, the After-Action Review had no significant effect on improving the teams' content understanding.
- Chen suggested investigating the relationship between this type of feedback and cognitive load.

23 Chuang & O'Neil
- Audio vs. text feedback (Kalyuga, Chandler, & Sweller, 2004; Mayer, 2001)
- After-Action Review feedback

24 WHAT’S NEXT

25 GENERAL LESSONS LEARNED
- Computer-based assessment is feasible and effective
- Need sub-models of process
- The role of the type of task and feedback may be critical for assessment of collaborative problem solving

26 WHAT'S NEXT
IN LAB SETTINGS:
- Tailor reports to individual differences (high vs. low prior knowledge)
IN FIELD SETTINGS:
- Scale up results to an applied Navy training environment
- The goal of the Navy training system is to reduce time in training while maintaining proficiency

27 DYNAMIC TESTING
- A reconceptualization of our line of research (Vygotsky, Feuerstein)
- Dynamic testing can be viewed as the quantification of one's learning potential
  - potential vs. actual
  - learn new things rather than demonstrate knowledge already acquired
- Differences between dynamic and static testing
  - emphasis on quantifying psychological process
  - role of feedback: a static test has no explicit feedback; a dynamic test provides feedback after each item, and the interaction is individualized
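The static vs. dynamic contrast on this slide comes down to whether explicit feedback is interleaved after each item. A minimal sketch under that reading follows; the item content and feedback wording are invented for illustration.

```python
# Sketch contrasting static and dynamic testing: a dynamic test interleaves
# individualized feedback after every item; a static test does not.
# Item content and feedback wording are invented for illustration.

items = [
    {"prompt": "Which gas do plants release during photosynthesis?", "answer": "oxygen"},
    {"prompt": "What process returns water vapor to the atmosphere?", "answer": "evaporation"},
]

def administer(items, responses, dynamic=True):
    score = 0
    for item, response in zip(items, responses):
        correct = response.strip().lower() == item["answer"]
        score += correct
        if dynamic:
            # Dynamic testing: explicit feedback after each item, tailored to the response
            print("Correct." if correct else f'Not quite - revisit "{item["answer"]}".')
    # A static test reports only this final score, with no per-item feedback
    return score

print(administer(items, ["oxygen", "condensation"], dynamic=True))
```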


