CRESST Conference, Los Angeles, CA, September 15, 2000 (v.3)

COMPUTER-BASED ASSESSMENT OF COLLABORATIVE PROBLEM SOLVING

Harry O'Neil, University of Southern California & National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Gloria Hsieh, University of Southern California
Gregory K. W. K. Chung, UCLA/CRESST

CRESST MODEL OF LEARNING

Learning:
– Content Understanding
– Communication
– Collaboration
– Problem Solving
– Self-Regulation

JUSTIFICATION: WORLD OF WORK

The justification for collaborative problem solving as a core demand can be found in analyses of both the workplace and academic learning.
– O'Neil, Allred, and Baker (1997) reviewed five major studies from the workplace readiness literature. Each of these studies identified the need for (a) higher order thinking skills, (b) teamwork, and (c) some form of technology fluency. In four of the studies, problem-solving skills were specifically identified as essential.

JUSTIFICATION: NATIONAL STANDARDS

New standards (e.g., National Science Education Standards) suggest new assessment approaches rather than multiple-choice exams:
– Deeper or higher order learning
– More robust knowledge representations
– Integration of mathematics and science
– Integration of scientific information that students can apply to new problems in varied settings (i.e., transfer)
– Integration of content knowledge and problem solving
– More challenging science problems
– Learning conducted in groups

MODEL'S PROBLEM SOLVING DEFINITION

Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996).

Problem-solving components:
– Domain-specific knowledge (content understanding)
– Problem-solving strategy: domain-specific strategy in troubleshooting (e.g., malfunction probability [i.e., fix first the component that fails most often])
– Self-regulation (metacognition [planning, self-monitoring] + motivation [effort, self-efficacy])

PROBLEM SOLVING

– Content Understanding
– Domain-Dependent Problem-Solving Strategies
– Self-Regulation
  – Metacognition: Planning, Self-Monitoring
  – Motivation: Effort, Self-Efficacy

COMPUTER-BASED PROBLEM-SOLVING TASK (CAETI)

– Metacognition and motivation are assessed by a paper-and-pencil survey instrument (self-regulation)
– Create a knowledge map on environmental science (content understanding)
– Receive feedback on it
– Using a simulated Web site, search for information to improve it (problem-solving strategy): relevance, searches, browsing
– Construct a final knowledge map, which serves as the outcome content understanding measure

CRESST'S CONCEPT MAPPER
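The mapper represents a student's understanding as concept–link–concept propositions that can be compared against an expert's map. A minimal sketch of that idea (CRESST's actual scoring is more sophisticated; the overlap metric and all data below are illustrative assumptions):

```python
# Hypothetical sketch: scoring a student knowledge map against an expert map.
# Each map is modeled as a set of (concept, link_label, concept) triples.

def score_map(student, expert):
    """Return the fraction of expert propositions present in the student map."""
    if not expert:
        return 0.0
    return len(student & expert) / len(expert)

# Invented maps for illustration only
expert_map = {
    ("sunlight", "drives", "photosynthesis"),
    ("photosynthesis", "produces", "oxygen"),
    ("evaporation", "is part of", "water cycle"),
}
student_map = {
    ("sunlight", "drives", "photosynthesis"),
    ("oxygen", "causes", "climate"),  # not in the expert map, so no credit
}

print(round(score_map(student_map, expert_map), 2))  # 1 of 3 expert links matched
```

Scoring against an expert referent like this is what makes real-time feedback on each concept possible.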

CORRELATION COEFFICIENTS: OUTCOME AND PROCESS VARIABLES (N = 38)

CONCLUSIONS

Computer-based problem-solving assessment is feasible:
– Process/product validity evidence is promising
– Allows real-time scoring/reporting to students and teachers
– Useful for program evaluation and diagnostic functions of testing

What's next?
– Generalizability study
– Collaborative problem solving with a group task

TEAMWORK MODEL

CRESST ASSESSMENT MODEL OF TEAMWORK

– Simulation: union–management negotiation / networked concept map
– Pre-defined process taxonomy
– Pre-defined messages
– Networked computers
– Real-time assessment and reporting

CORRELATION BETWEEN TEAM PROCESSES AND OUTCOME MEASURES (N = 26)

Nonparametric (Spearman) Correlations Between Team Processes and Post Outcome Measures for Concept Map (N = 14)
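With N = 14 teams, a rank-based statistic like Spearman's rho is the appropriate choice. A self-contained sketch of the computation (the process and outcome values below are made up for illustration; the actual study data are not reproduced here):

```python
# Illustrative Spearman rank correlation, computed as Pearson's r on ranks.

def rank(xs):
    """Assign ranks (1-based), averaging the ranks of tied values."""
    sorted_xs = sorted(xs)
    return [sum(i for i, v in enumerate(sorted_xs, 1) if v == x) / sorted_xs.count(x)
            for x in xs]

def spearman(x, y):
    """Spearman rho between two equal-length sequences."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: a team-process count vs. a post concept-map score
process = [3, 1, 4, 2, 5]
outcome = [2, 1, 4, 3, 5]
print(round(spearman(process, outcome), 2))  # 0.9
```

Because the measure depends only on rank order, it is robust to the skewed, small-sample score distributions typical of team-process counts.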

PUZZLE

Unfortunately, unlike in the union–management negotiation task, the concept mapping study (Chung et al., 1999) found that team process did not predict team outcomes. We hypothesized that the lack of useful feedback in the concept mapping task and low prior knowledge may have influenced the results.

ONGOING RESEARCH

We changed the nature of the task to provide more extensive feedback and to create a real "group" task.
– Feedback will be knowledge of response feedback versus adaptive knowledge of response feedback.
– A group task is a task where (a) no single individual possesses all the resources, and (b) no single individual is likely to solve the problem or accomplish the task objective without at least some input from others (Cohen & Arechevala-Vargas, 1987).
– One student creates the concept map; the other student does the searches.

KNOWLEDGE OF RESPONSE FEEDBACK (Schacter et al. study)

Your map has been scored against an expert's map in environmental science. The feedback tells you how much you need to improve each concept in your map (i.e., A lot, Some, A little). Use this feedback to help you search to improve your map.

A lot          Some            A little
-------------  --------------  -----------------
Atmosphere     Climate         Evaporation
Bacteria       Carbon dioxide  Greenhouse gases
Decomposition  Photosynthesis
Oxygen         Sunlight
Waste          Water cycle
Respiration    Oceans
Nutrients      Consumer
Food chain     Producer

Adaptive Knowledge of Response (the above + the following):
– Improvement: You have improved the "food chain" concept from needing "A lot of improvement" to the "Some improvement" category.
– Strategy: It is most useful to search for information for the "A lot" and "Some" categories rather than the "A little" category. For example, search for information on "atmosphere" or "climate" first, rather than "evaporation."
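The three-category feedback above can be sketched as a simple thresholding rule over a per-concept "improvement needed" score, with the adaptive strategy hint implemented as sorting the neediest concepts first. The thresholds, scale, and scores below are invented assumptions, not CRESST's actual scoring:

```python
# Hypothetical sketch of the three-category knowledge-of-response feedback.

def feedback_category(improvement_needed):
    """Map a 0-1 'improvement needed' score onto the slide's three categories.

    Thresholds are invented for illustration; 1.0 means the concept is
    entirely missing from the student's map.
    """
    if improvement_needed > 0.66:
        return "A lot"
    if improvement_needed > 0.33:
        return "Some"
    return "A little"

# Invented per-concept scores
scores = {"atmosphere": 0.9, "climate": 0.5, "evaporation": 0.1}

# Adaptive strategy hint: attend to the neediest concepts first
for concept in sorted(scores, key=scores.get, reverse=True):
    print(f"{concept}: {feedback_category(scores[concept])}")
# atmosphere: A lot
# climate: Some
# evaporation: A little
```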

GENERAL LESSONS LEARNED

– Need a model of cognitive learning (the Big 5)
– Need submodels of each process:
  – Problem solving = content understanding + problem-solving strategies + self-regulation
  – Teamwork = adaptability + coordination + decision making + interpersonal skill + leadership + communication
– For diagnostic, low-stakes environments, need real-time administration, scoring, and reporting
– The type of task and feedback may be critical for assessment of collaborative problem solving

BACK-UP SLIDES

ASSESSMENTS FOR TYPES OF LEARNING

Content Understanding
– Types of learning: facts, concepts, procedures, principles
– Assessment methodology: explanation tasks, concept mapping, multiple-choice, essays

Problem Solving
– Types of learning: domain-specific content, domain-specific strategies, self-regulation, motivation (effort, self-efficacy, anxiety), search strategies
– Assessment methodology: augmented concept mapping with search task, transfer tasks

Teamwork and Collaboration
– Types of learning: coordination, adaptability, leadership, interpersonal skill, decision making
– Assessment methodology: collaborative simulation, self-report, observation

Self-Regulation
– Types of learning: planning, self-checking, self-efficacy, effort
– Assessment methodology: self-report, observation, inference

Communication
– Types of learning: comprehension, expression, use of conventions, multimode
– Assessment methodology: explanation scored for communication

DOMAIN SPECIFICATIONS EMBEDDED IN THE UNION/MANAGEMENT NEGOTIATION SOFTWARE

DOMAIN SPECIFICATIONS EMBEDDED IN THE KNOWLEDGE MAPPING SOFTWARE

BOOKMARKING APPLET

SAMPLE METACOGNITIVE ITEMS

The following questions refer to the ways people have used to describe themselves. Read each statement below and indicate how you generally think or feel. There are no right or wrong answers. Do not spend too much time on any one statement. Remember, give the answer that seems to describe how you generally think or feel.

Note. Formatted as in Section E, Background Questionnaire: Canadian version of the International Adult Literacy Survey (1994). Item a is a planning item; item b is a self-checking item. Kosmicki (1993) reported alpha reliabilities of .86 and .78 for the 6-item versions of these scales, respectively.
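The alpha reliabilities cited in the note are Cronbach's alpha, which compares the sum of per-item variances against the variance of each respondent's total score. A self-contained sketch of the computation (the item scores below are made up; they are not the Kosmicki data):

```python
# Illustrative Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item vars)/var(totals))

def cronbach_alpha(items):
    """items: list of k per-item score lists, each with one score per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[p] for item in items) for p in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Invented data: 3 items, 4 respondents
items = [
    [1, 2, 3, 4],
    [1, 2, 3, 4],
    [2, 2, 3, 3],
]
print(round(cronbach_alpha(items), 2))  # 27/29, i.e. 0.93
```

When items rise and fall together across respondents, as here, the total-score variance dwarfs the summed item variances and alpha approaches 1.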

TEAMWORK PROCESSES

SCREEN EXAMPLE

FEEDBACK FREQUENCY

Lowering the percentage of feedback:
– slows down the acquisition of concepts
– but facilitates the transfer of knowledge

TIMING OF FEEDBACK

– Delayed-retention effect (delayed > immediate)
– Classroom or programmed instruction settings (immediate > delayed)
– Developmental difference:
  – Younger children: immediate > delayed
  – Older children: delayed > immediate

THREE CHARACTERISTICS OF FEEDBACK

– Complexity of feedback: what information is contained in the feedback messages
– Timing of feedback: when the feedback is given to students
– Representation of feedback: the form in which the feedback is presented (text vs. graphics)

CORRELATIONS BETWEEN TEAMWORK PROCESS SCALES AND OUTCOME MEASURES FOR UNION PARTICIPANTS (N = 48)

THE NATURE OF TASKS

Interaction will be positively related to productivity under two conditions:
– Group tasks: no single individual possesses all the resources, and no single individual is likely to solve the problem or accomplish the task objectives without at least some input from others
– Ill-structured problems: no clear-cut answers or procedures for the problem