
1 COMPUTER-BASED ASSESSMENT OF COLLABORATIVE PROBLEM SOLVING
CRESST Conference, Los Angeles, CA, September 15, 2000 (v.3)
Harry O'Neil, University of Southern California & National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Gloria Hsieh, University of Southern California
Gregory K. W. K. Chung, UCLA/CRESST

2 CRESST MODEL OF LEARNING
[Diagram: Learning, linked to Content Understanding, Communication, Collaboration, Problem Solving, and Self-Regulation]

3 JUSTIFICATION: WORLD OF WORK
The justification for collaborative problem solving as a core demand can be found in analyses of both the workplace and academic learning.
–O'Neil, Allred, and Baker (1997) reviewed five major studies from the workplace readiness literature. Each of these studies identified the need for (a) higher order thinking skills, (b) teamwork, and (c) some form of technology fluency. In four of the studies, problem-solving skills were specifically identified as essential.

4 JUSTIFICATION: NATIONAL STANDARDS
New standards (e.g., National Science Education Standards) suggest new assessment approaches rather than multiple-choice exams:
–Deeper or higher order learning
–More robust knowledge representations
–Integration of mathematics and science
–Integration of scientific information that students can apply to new problems in varied settings (i.e., transfer)
–Integration of content knowledge and problem solving
–More challenging science problems
–Learning conducted in groups

5 MODELS: PROBLEM-SOLVING DEFINITION
Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996).
Problem-solving components:
–Domain-specific knowledge (content understanding)
–Problem-solving strategy: a domain-specific strategy in troubleshooting (e.g., malfunction probability, i.e., fix first the component that fails most often)
–Self-regulation: metacognition (planning, self-monitoring) + motivation (effort, self-efficacy)
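The malfunction-probability strategy named above can be sketched in a few lines; the component names and failure rates below are invented for illustration and are not from the CRESST task.

```python
# Sketch of the "malfunction probability" troubleshooting strategy:
# inspect the component with the highest historical failure rate first.
# Component names and rates here are hypothetical.
def troubleshooting_order(failure_rates):
    """Return component names sorted from most to least failure-prone."""
    return sorted(failure_rates, key=failure_rates.get, reverse=True)

order = troubleshooting_order({"fuse": 0.05, "pump": 0.30, "valve": 0.15})
# → ["pump", "valve", "fuse"]: the pump is checked first
```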

6 PROBLEM SOLVING
[Diagram: Problem Solving comprises Content Understanding, Domain-Dependent Problem-Solving Strategies, and Self-Regulation; Self-Regulation comprises Metacognition (Planning, Self-Monitoring) and Motivation (Effort, Self-Efficacy)]

7 COMPUTER-BASED PROBLEM-SOLVING TASK (CAETI)
Metacognition and motivation are assessed by a paper-and-pencil survey instrument (self-regulation)
Create a knowledge map on environmental science (content understanding)
Receive feedback on it
Using a simulated Web site, search for information to improve it (problem-solving strategy)
–Relevance, searches, browsing
Construct a final knowledge map
–Serves as the outcome content understanding measure

8 CRESST'S CONCEPT MAPPER

9 CORRELATION COEFFICIENTS: OUTCOME AND PROCESS VARIABLES (N = 38)

10 CONCLUSIONS
Computer-based problem-solving assessment is feasible
–Process/product validity evidence is promising
Allows real-time scoring/reporting to students and teachers
Useful for program evaluation and diagnostic functions of testing
What's next?
–Generalizability study
–Collaborative problem solving with group task

11 TEAMWORK MODEL

12 CRESST ASSESSMENT MODEL OF TEAMWORK
[Diagram components: simulation; pre-defined process taxonomy; union-management negotiation / networked concept map; real-time assessment and reporting; networked computers; pre-defined messages]

13 CORRELATION BETWEEN TEAM PROCESSES AND OUTCOME MEASURES (N = 26)

14 Nonparametric (Spearman) Correlations Between Team Processes and Post Outcome Measures for Concept Map (N = 14)
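For readers unfamiliar with the statistic on this slide: a Spearman correlation is a Pearson correlation computed on the ranks of the data, which is why it suits small-N, non-normal process/outcome scores like these. A minimal pure-Python sketch, with made-up scores (not the study's data):

```python
def ranks(xs):
    """Assign average ranks (1-based), splitting ties evenly."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly monotone made-up process/outcome scores give rho = 1.0
print(spearman([3, 1, 4, 2], [30, 10, 40, 20]))  # → 1.0
```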

15 PUZZLE
Unfortunately, the concept mapping study (Chung et al., 1999) found that team process did not predict team outcomes, unlike in the union-management negotiation task. We hypothesized that the lack of useful feedback in the concept mapping task and low prior knowledge may have influenced the results.

16 ONGOING RESEARCH
We changed the nature of the task to provide more extensive feedback and to create a real "group" task
Feedback will be knowledge of response feedback versus adaptive knowledge of response feedback
A group task is a task where
–no single individual possesses all the resources;
–no single individual is likely to solve the problem or accomplish the task objective without at least some input from others (Cohen & Arechevala-Vargas, 1987)
One student creates the concept map; the other student does the searches

17 KNOWLEDGE OF RESPONSE FEEDBACK (Schacter et al. study)
Your map has been scored against an expert's map in environmental science. The feedback tells you how much you need to improve each concept in your map (i.e., A lot, Some, A little). Use this feedback to help you search to improve your map.
–A lot: Atmosphere, Bacteria, Decomposition, Oxygen, Waste, Respiration, Nutrients, Food chain
–Some: Climate, Carbon dioxide, Photosynthesis, Sunlight, Water cycle, Oceans, Consumer, Producer
–A little: Evaporation, Greenhouse gasses
Adaptive Knowledge of Response (the above + the following)
–Improvement: You have improved the "food chain" concept from needing "A lot of improvement" to the "Some improvement" category.
–Strategy: It is most useful to search for information for the "A lot" and "Some" categories rather than the "A little" category. For example, search for information on "atmosphere" or "climate" first, rather than "evaporation."
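The feedback scheme on this slide amounts to bucketing each concept's score against the expert map and, for the adaptive variant, telling students to attack the weakest buckets first. A hedged sketch of that logic, in which the cut points, scores, and function names are illustrative assumptions rather than CRESST's actual scoring rules:

```python
def feedback_category(score, expert_max=10):
    """Bucket a concept's similarity-to-expert score into a feedback label.
    The 0.4 / 0.8 cut points are invented for illustration."""
    ratio = score / expert_max
    if ratio < 0.4:
        return "A lot"
    if ratio < 0.8:
        return "Some"
    return "A little"

def search_priority(scores):
    """Adaptive strategy hint: search 'A lot'/'Some' concepts before 'A little' ones."""
    cats = {c: feedback_category(s) for c, s in scores.items()}
    return [c for c, cat in cats.items() if cat != "A little"]

# Hypothetical per-concept scores against the expert map
hints = search_priority({"atmosphere": 2, "climate": 6, "evaporation": 9})
# → ["atmosphere", "climate"]: search these before "evaporation"
```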

18 GENERAL LESSONS LEARNED
Need a model of cognitive learning (the Big 5)
–Need submodels of process: problem solving is content understanding, problem-solving strategies, and self-regulation; teamwork is adaptability, coordination, decision making, interpersonal skill, leadership, and communication
For diagnostic low-stakes environments, need real-time administration, scoring, and reporting
The role of type of task and feedback may be critical for assessment of collaborative problem solving

19 BACK-UP SLIDES

20 ASSESSMENTS FOR TYPES OF LEARNING
Types of learning and the corresponding assessment methodology:
–Content Understanding (facts, procedures, concepts, principles): explanation tasks, concept mapping, multiple-choice, essays
–Problem Solving (domain-specific knowledge, domain-specific strategies, self-regulation, motivation [effort, self-efficacy, anxiety], search strategies): augmented concept mapping with search task, transfer tasks
–Teamwork and Collaboration (coordination, adaptability, leadership, interpersonal skill, decision making): collaborative simulation, self-report, observation
–Self-Regulation (planning, self-checking, self-efficacy, effort): self-report, observation, inference
–Communication (comprehension, use of conventions, expression, multimode): explanation scored for communication

21 DOMAIN SPECIFICATIONS EMBEDDED IN THE UNION/MANAGEMENT NEGOTIATION SOFTWARE

22 Domain Specifications Embedded in the Knowledge Mapping Software

23 BOOKMARKING APPLET

24 SAMPLE METACOGNITIVE ITEMS
The following questions refer to the ways people have used to describe themselves. Read each statement below and indicate how you generally think or feel. There are no right or wrong answers. Do not spend too much time on any one statement. Remember, give the answer that seems to describe how you generally think or feel.
Note. Formatted as in Section E, Background Questionnaire: Canadian version of the International Adult Literacy Survey (1994). Item a is a planning item; item b is a self-checking item. Kosmicki (1993) reported alpha reliabilities of .86 and .78 for 6-item versions of these scales, respectively.

25 TEAMWORK PROCESSES

26 SCREEN EXAMPLE

27 FEEDBACK FREQUENCY
Lowering the percentage of feedback
–slows down the acquisition of concepts
–but facilitates the transfer of knowledge

28 TIMING OF FEEDBACK
Delayed-retention effect (Delayed > Immediate)
Classroom or programmed instruction settings (Immediate > Delayed)
Developmental difference:
–Younger children: Immediate > Delayed
–Older children: Delayed > Immediate

29 THREE CHARACTERISTICS OF FEEDBACK
Complexity of feedback
–What information is contained in the feedback messages
Timing of feedback
–When the feedback is given to students
Representation of feedback
–The form of the feedback presented (text vs. graphics)

30 CORRELATIONS BETWEEN TEAMWORK PROCESS SCALES AND OUTCOME MEASURES FOR UNION PARTICIPANTS (N = 48)

31 THE NATURE OF TASKS
Interaction will be positively related to productivity under two conditions:
Group tasks
–No single individual possesses all the resources
–No single individual is likely to solve the problem or accomplish the task objectives without at least some inputs from others
Ill-structured problem
–No clear-cut answers or procedures for the problem

