
Slide 1: CLASS - Collaborative Learning through Assessment
Michael Bieber, with help from Jia Shen, Dezhi Wu, and many others…
Information Systems Department, College of Computing Sciences, New Jersey Institute of Technology
http://web.njit.edu/~bieber
February 2006
Bieber et al., NJIT ©2006

Slide 2: Outline
- Motivation
- Introducing CLASS - collaborative learning through assessment
- A bit of theory
- Experimental results
- Interesting issues
- Opportunities for collaboration

Slide 3: Motivation
- To increase learning of course content
- Learning through active engagement:
  - involve students as active participants
  - with the full problem life-cycle
  - through peer evaluation
- Minimize overhead for instructors

Slide 4: How is this IS Research?
- Applying technology to make learning more effective
- Future research:
  - technology to support CLASS
  - opportunity to study task-technology fit & technology acceptance

Slide 5: Outline (section divider; next: Introducing CLASS - collaborative learning through assessment)

Slide 6: CLASS Process Example: a CLASS Exam
1. Each student creates 2 exam problems.
2. The instructor edits the problems if necessary.
3. Each student solves 2 problems.
4. Students evaluate (grade) the solutions to the problems they authored, writing detailed justifications.
5. Other students evaluate each problem a second time.
6. The instructor gives a final grade.
7. Optional: students can dispute their solution's grade by evaluating it themselves and writing detailed justifications; the instructor resolves the dispute.
Participatory if done as individuals; collaborative if done in teams.

Slide 7: CLASS Process Example: a CLASS Exam (continued)
(Same steps as slide 6.) All entries are posted on-line.

Slide 8: (Figure only; no transcript.)

Slide 9: Exam Process Flow
Instructor control process: course design → set up the on-line environment → assign IDs → edit questions → assign who answers each question → assign level-2 graders → determine final grades → resolve disputes.
Student learning process: make up problems → solve problems → Pass-1 and Pass-2 graders grade solutions → dispute the final grade → read other problems, other solutions, grade justifications, and disputes.
Learning comes from doing the CLASS activities, with additional learning from reading everything peers write.

Slide 10: Exam Process Flow (detail)
Same two-track diagram as slide 9 (instructor control process and student learning process), with one added student step: confirm ID and understand the process.
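The instructor-control steps above include assigning who solves each problem and who grades it at Pass 2; any such assignment must keep students away from their own work. The slides do not describe the actual assignment policy, so the following is only a hypothetical sketch of one simple approach (a shuffled rotation):

```python
import random

def assign_solvers(authors, per_student=2, seed=0):
    """Give each student `per_student` peers' problems to solve, never their own.

    A rotation guarantees this: after shuffling, student i is assigned the
    problems authored by students i+1 .. i+per_student (mod class size), so
    every problem is also solved exactly `per_student` times.
    """
    order = authors[:]
    random.Random(seed).shuffle(order)  # fixed seed for reproducibility
    n = len(order)
    return {
        order[i]: [order[(i + k) % n] for k in range(1, per_student + 1)]
        for i in range(n)
    }

assignments = assign_solvers(["ana", "bo", "cy", "di"])
# No student is ever assigned a problem they authored:
assert all(student not in authored for student, authored in assignments.items())
```

The same rotation, with a different offset or seed, could assign the Pass-2 graders so that the second grader differs from both the author and the solver.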

Slide 11: Evaluation (Grading)
Evaluation includes:
- A written critique or "justification" (positive or negative)
- Optional: separate sub-criteria to critique, e.g.:
  - Solution result is correct and complete (40%)
  - Solution was well explained (30%)
  - Solution demonstrated class materials well (10%)
  - Solution cited appropriate references (20%)
- A grade (optional; recommended to save instructor time)
The evaluation/grade may be disputed (optional); a student must re-evaluate their own solution when disputing.
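Combining the optional sub-criteria above into one grade is just a weighted sum. A minimal sketch using the slide's example weights (the function name and the 0-100 score scale are assumptions, not part of CLASS):

```python
# Example sub-criterion weights taken from the slide above.
WEIGHTS = {
    "correct_and_complete":   0.40,
    "well_explained":         0.30,
    "demonstrates_materials": 0.10,
    "cites_references":       0.20,
}

def weighted_grade(scores):
    """Combine per-criterion scores (each 0-100) into a single grade."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

grade = weighted_grade({
    "correct_and_complete": 90,
    "well_explained": 80,
    "demonstrates_materials": 100,
    "cites_references": 70,
})
# 0.4*90 + 0.3*80 + 0.1*100 + 0.2*70 = 84
```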

Slide 12: Instructor Should Provide…
- Detailed instructions and a timetable
- Solution expectations: what is expected
- Critiquing and grading guidelines

Slide 13: Outline (section divider; next: A bit of theory)

Slide 14: Constructivism (Learning Theory)
- The central idea is that human learning is constructed: learners build new knowledge upon the foundation of previous learning {learning throughout the exam process}.
- Two classic categorizations:
  - Cognitive constructivism (Piaget's theory)
  - Social constructivism (Vygotsky's theory)

Slide 15: Cognitive Constructivism (Piaget 1924)
- Knowledge is constructed and made meaningful through an individual's interactions with, and analyses of, the environment → knowledge is constructed in the mind of the individual.
- Knowledge construction is entirely student-centered.

Slide 16: Learning
- Learning is a constructivist, often social activity occurring through knowledge building (Vygotsky, 1978).
- Knowledge-building activities include contributing to, authoring within, discussing, sharing, exploring, and deploying a collective knowledge base (O'Neill & Gomez 1994; Perkins 1993).

Slide 17: Learning
- People learn as they navigate to solve problems (Koschmann et al., 1996) and design representations of their understanding (Suthers 1999).
- Learning requires cognitive flexibility (Spiro et al. 1991), and results from interaction with people having different experiences and perspectives (Goldman-Segall et al. 1998).

Slide 18: Expert-like Deep Learning
- Categorizing knowledge and constructing relationships between concepts are likely to promote expert-like thinking about a domain (Bransford 2000).
- To design appropriate problems for their peers, students must organize and synthesize their ideas and learn to recognize the important concepts in the domain. This results in deep learning (Entwistle 2000):
  - seeing relationships and patterns among pieces of information
  - recognizing the logic behind the organization of material
  - achieving a sense of understanding

Slide 19: Where is Knowledge Constructed in CLASS?
- In all CLASS stages: constructing problems, solutions, grade justifications, and dispute justifications
- When reading everything their peers write
  - Students are also motivated to learn more when peers will read their work (McConnell, 1999).

Slide 20: Assessment & Learning
- Main goals of tests:
  - To measure student achievement
  - To motivate and direct student learning
- The process of taking a test and discussing its grading should be a richly rewarding learning experience (Ebel and Frisbie 1986).
- Assessment should be a fundamental part of the learning process (Shepard 2000).

Slide 21: Outline
- Motivation
- Introducing CLASS
- A bit of theory
- Experimental results:
  - by Wu, Bieber & Hiltz (2004)
  - by Shen, Bieber & Hiltz (2004, 2006)
  - by Shen (2005)
- Interesting issues
- Opportunities for collaboration

Slide 22: Course Information — Wu, Bieber & Hiltz (2004), Participatory Mode (individuals)
- NJIT CIS 677: Information System Principles, a graduate-level core course (Masters/Ph.D.)
- Aim: study how IS/IT can be used effectively
- Both on-campus and distance-learning sections; software: Virtual Classroom/WebBoard
- Traditional exam: three hours, in class, 3-4 essay questions, 6 pages of notes
- Used CLASS 5 times between Fall 1999 and Summer 2002
- We compared control groups without CLASS and treatment groups with CLASS
- We also used CLASS with shorter essay questions in CIS 365, an undergraduate course on file structures, in Fall 2002, with similar survey results.

Slide 23: Research Model — Wu, Bieber & Hiltz (2004), Participatory Mode (individuals)
[Path diagram: the independent variable (Exam Process) influences the intervening variables (Quality, Enjoyability), which influence the dependent variables (Perceived Learning, Recommend Future Use, Exam Grades). Reported path coefficients: .308*, .346*, .521**, .653**; the transcript does not indicate which coefficient belongs to which path.]

Slide 24: Enjoyability

Question | SA | A | N | D | SD | Mean | S.D. | #
I enjoyed the flexibility in organizing my resources | 26.2% | 48.9% | 16.7% | 3.6% | 4.6% | 3.88 | 1.00 | 221
I was motivated to do my best work | 23.5% | 42.9% | 28.2% | 3.4% | 2.1% | 3.82 | .92 | 238
I enjoyed the examination process | 17.2% | 42.3% | 22.6% | 10.5% | 7.4% | 3.51 | 1.13 | 239

SA - strongly agree (5 points); A - agree (4); N - neutral (3); D - disagree (2); SD - strongly disagree (1); the mean is out of 5 points; S.D. - standard deviation. Cronbach's alpha = 0.68.
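Cronbach's alpha, reported beneath each survey table, measures the internal consistency of a multi-item scale: alpha = k/(k-1) * (1 - sum of per-item variances / variance of the summed scale), for k items. A self-contained sketch with made-up Likert responses (the study's raw data is not reproduced here):

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(items):
    """Cronbach's alpha for items[r][i] = respondent r's score on item i."""
    k = len(items[0])
    item_vars = [variance(col) for col in zip(*items)]  # per-item variance
    total_var = variance([sum(row) for row in items])   # variance of scale sum
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 1-5 Likert responses from four students on three items
responses = [
    [4, 4, 3],
    [5, 4, 4],
    [3, 3, 2],
    [4, 5, 4],
]
print(round(cronbach_alpha(responses), 2))  # 0.9 for this made-up data
```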

Slide 25: Perceived Learning

Question | SA | A | N | D | SD | Mean | S.D. | #
I learned from making up questions | 17.9% | 42.5% | 21.3% | 13.8% | 4.5% | 3.55 | 1.08 | 240
I learned from grading other students' answers | 17.7% | 48.1% | 19.4% | 9.3% | 5.5% | 3.63 | 1.06 | 237
I learned from reading other people's answers | 15.8% | 45.0% | 22.1% | 11.3% | 5.8% | 3.54 | 1.07 | 240
I demonstrated what I learned in class | 13.6% | 50.2% | 22.6% | 10.9% | 2.7% | 3.61 | .95 | 221
My ability to integrate facts and develop generalizations improved | 21.8% | 49.2% | 25.6% | 2.1% | 1.3% | 3.88 | .83 | 238
I learned to value other points of view | 17.6% | 51.9% | 27.6% | 1.3% | 1.6% | 3.82 | .81 | 239
I mastered the course materials | 7.4% | 51.6% | 31.4% | 6.9% | 2.7% | 3.54 | .84 | 188

Cronbach's alpha = 0.88.

Slide 26: Recommendation: Do Again!

Question | SA | A | N | D | SD | Mean | S.D. | #
Would you recommend in the future that this exam process be used? | 20.7% | 40.1% | 24.5% | 8.9% | 5.8% | 3.60 | 1.10 | 237

Similar results for CIS 365, an undergraduate file structures course using short essay questions (Fall 2002).

Slide 27: Shen, Bieber & Hiltz (2004, 2006) - Participatory Exams (individuals). (Figure only; no transcript.)

Slide 28: Shen, Bieber & Hiltz (2004, 2006) - Participatory Exams (individuals). (Figure only; no transcript.)

Slide 29: Shen (2005) - Collaborative Exams
- Compared traditional (T), participatory (P), and collaborative (C) exams
  - Collaborative exam: 3-5 student teams; teams created and graded the problems, individuals answered them
  - No 2nd-pass grading
- Spring, summer, and fall 2004
- 9 instructors, 22 sections
- 586 students, 485 responses (83%)

Slide 30: (Figure only; no transcript.)

Slide 31: (Figure only; no transcript.)

Slide 32: Shen (2005) - Collaborative Exams. (Figure only; no transcript.)

Slide 33: Shen (2005) - Collaborative Exams
In the collaborative exam (C):
- Students studied more intensely than for the traditional exam (T).
- Students had a higher level of social engagement than in the T or P exams.
- Students formed a learning community, and interaction with others enhanced their understanding.

Slide 34: Shen (2005) - Collaborative Exams. (Figure only; no transcript.)

Slide 35: Shen (2005) - Collaborative Exams
Collaborative exam: higher correlations between social engagement and both student satisfaction and perceived learning.

Slide 36: Shen (2005) - Collaborative Exams. (Figure only; no transcript.)

Slide 37: Outline (section divider; next: Interesting issues)

Slide 38: What Students Liked Best
- Active involvement in the exam process
- Flexibility
- Reduction in tension

Slide 39: Trade-offs
Trade-offs for students (traditional vs. CLASS):
- Timing: concentrated vs. drawn-out (2.5 weeks)
- Access to information: limited vs. the Internet
- Experimental integrity: we couldn't fully justify the process to the students
Trade-offs for professors:
- Fewer solutions to evaluate, but each is different
- Timing: concentrated vs. drawn-out process
- Much more administration

Slide 40: Timing
- CLASS for exams took 2.5 weeks.
- For frequent activities (e.g., quizzes, homeworks), CLASS processes could overlap:
  - Students could be creating problems for one quiz, while solving problems for the prior quiz, while evaluating solutions from the quiz before that.
- Benefits of overlapping CLASS activities:
  - working with materials from several classes at the same time
  - could reinforce class materials
  - could result in synthesis (combined understanding)

Slide 41: Extending Scope
Which activities?
- So far: exams
- What about: quizzes, homeworks, larger projects, in-class projects?
Which problem types?
- So far: short and long essay questions
- What about: multiple choice, short answer, computer programs, semester projects?
- Sub-problems: computer program design & implementation; semester project outline & execution

Slide 42: Extending Scope, cont.
Course level:
- Graduate, undergraduate, secondary school (high school, junior high)
Disciplines:
- IS/IT, business, science, engineering, humanities, medical, all of secondary school

Slide 43: Extending Scope, cont.
Degree of evaluation (assigning grades):
- Currently: solutions
- What about: quality of problems, quality of evaluations/grades?
- All could be disputed
Degree of participation:
- students could evaluate each other
- students could arbitrate disputes

Slide 44: Evaluation Results
- Written critique (positive or negative)
- Grade (optional; recommended to save instructor time)
- Recommendation to accept or reject the "artifact" (problem, solution, evaluation)
- If rejected, optionally:
  - the artifact would have to be redone and re-evaluated, or
  - the evaluator or instructor would substitute an acceptable artifact, and the CLASS process continues
- The evaluation/grade could be disputed

Slide 45: Full Collaboration
- Groups for: problems, solutions, evaluation, dispute arbitration
- Requires group process support:
  - Group roles: leader, scheduler, etc.
  - Process: work on each activity together or separately; internal review
  - Grading of individual group members
  - Process tools: brainstorming, voting, etc.

Slide 46: What Can Go Wrong
- Students are late; students drop the course
- Entries posted in the wrong place
- Inadequate critiques:
  - "Good"
  - "I agree with the other evaluator"
- …and, of course, technical difficulties

Slide 47: CLASS Environment Software
- Guide the process
- Form groups (based on individual characteristics)
- Support group processes (research opportunity)
- Assign problem solvers, evaluators, and dispute arbitrators
- On-line templates to:
  - ensure complete entries
  - teach students how to create, solve, and assess (research opportunity)
- Guide people to post entries in the correct place
- Incorporate group process tools (research opportunity)
- Handle problems as much as possible:
  - remind people who are late
  - reallocate who does what
Based on a workflow management tool…

Slide 48: Anonymity/Privacy Issues
- Should student entries be anonymous?
- Will students reveal their IDs?
- Is it fair to post critiques if they are not anonymous?
- Is it fair to post grades if they are not anonymous?
- Will anonymity work in small classes?

Slide 49: Issue: Perceived Fairness
- Should students evaluate/grade peers?
  - But they must evaluate others in the workplace…
- "It's the instructor's job to evaluate and grade"
  - CLASS is a (constructivist) learning technique
- "Students have no training in evaluation"
  - Evaluation is a skill that must be learned (and taught)
  - How to teach evaluation? (research opportunity)
- "Many evaluators = inconsistent quality"
  - There are safeguards in the CLASS process

Slide 50: Grading Issues
- Disputing high grades: award bonus points if students dispute (and justify with a critique) grades that are too high
- Encouraging honest grading: for successful disputes, deduct points from the evaluators

Slide 51: Grade Inflation
Detailed grading guidelines for sub-criteria:
- great: 20 points
- very good: 18 points
- good: 14 points
- OK: 10 points
- poor: 6 points
A student who does "good" on 5 problems gets a grade of 70. U.S. students will protest vigorously, so evaluators will hesitate to assign "good". Result: pressure for highly skewed grading rubrics.
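The slide's arithmetic: under that rubric, five problems all rated "good" sum to 70 points. As a sketch (the rubric values come from the slide; the code itself is illustrative):

```python
# Rubric point values from the slide above.
RUBRIC = {"great": 20, "very good": 18, "good": 14, "OK": 10, "poor": 6}

# Five problems, each rated "good", reproduce the slide's 70-point grade:
ratings = ["good"] * 5
grade = sum(RUBRIC[r] for r in ratings)
print(grade)  # 70
```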

Slide 52: Other Cross-Cultural Issues
In some cultures:
- Students are so competitive that they would only give failing grades to peers.
- Students would not hurt peers' feelings, and would only give good evaluations.
Some systems only have pass/fail, so numeric grades are mostly irrelevant.

Slide 53: Research Questions for Future Studies
How does CLASS affect:
- student learning?
- students' ability to assess?
- student/instructor satisfaction?
- instructor teaching and preparation?
How is CLASS affected by:
- different class modes?
- different cultures?
Plus questions about the various issues above…

Slide 54: Outline (section divider; next: Opportunities for collaboration)

Slide 55: Next Steps
First steps:
- More experiments assessing actual learning (useful for full grant proposals)
- Develop CLASS software, version 1 (we have a preliminary design)
Then:
- Work on all those interesting issues!
- Write grant proposals…
- Study task-technology fit & technology acceptance, etc.
Looking for instructor & Ph.D. student collaborators!

Slide 56: CLASS: Contributions
- A systematic technique to increase learning:
  - constructivist approach
  - actively engages students in the entire problem life-cycle
  - minimizes overhead for students and instructors
- Experimental evaluation
- Supporting software
Thank you! Questions, please?

