Peer Assessment of Oral Presentations
Kevin Yee, Faculty Center for Teaching & Learning, University of Central Florida


Research Question
For oral presentations, does the introduction of revised peer-evaluation rubrics with increased focus and greater detail improve student awareness of presentation issues, and thus improve their perceived performance?

Knowledge Base
Students learn more from interacting with other students than from listening passively to teachers (Wilbert J. McKeachie, McKeachie's Teaching Tips, New York: Houghton Mifflin, 2002). Also, "When assessment criteria are firmly set, peer-feedback enables students to judge the performance of their peers in a manner comparable to those of the teachers. However, the same is not found to be true with self-assessment" (Mrudula Patri, "The Influence of Peer Feedback on Self- and Peer-Assessment of Oral Skills," Language Testing 19:2, 2002, p. 109). Thus, peer evaluation is the only means of assessment that approaches the utility and practicality of teacher evaluation. The potential benefit of collaborative work to enhance learning is strong enough to consider it one of the elite principles of good practice. But "when poorly managed, collaborative assignments can decrease students' sense of control and increase their anxiety and anger" (Barbara Walvoord and Virginia Anderson, Effective Grading, San Francisco: Jossey-Bass, 1998). Thus, peer-edit rubrics must be properly constructed, or else the exercise risks diminishing learning rather than enhancing it.

Context
- Two groups of graduate students took the same non-credit course for a Teaching Certificate: G1 was five students in Fall 2004, and G2 was twelve students in Spring 2005. Both groups gave weekly oral presentations. G1 was given a less-focused and less-detailed rubric for peer evaluation than G2.
- At the midterm, both groups were asked to fill out non-anonymous self-evaluations of their oral presentation skills.
The questions allowed for open-form, prose answers.

Methods
G1 was given a less focused, more unstructured peer-review rubric for oral presentations throughout the semester. G2 performed the same presentations and was given a more detailed rubric that focused attention on just two categories (voice and performance). In both cases, student presentations were followed by a class-wide discussion of specific techniques, problems, and model behaviors from the performances. For G2 only, the anonymous peer evaluations were also handed back to each student after every session. After six weeks, both groups completed a five-minute, non-anonymous midterm self-evaluation, answering these questions with bullet points: What do you do well in presentations? What needs improvement? What have you learned about your presentation style? No further instructions were given, and responses were intentionally free-form. Results of the self-evaluations were tabulated by group, and the frequency of each type of reply was determined, enabling several patterns to be discerned from the data.

Transferability
- Multiple disciplines can use grading rubrics to evaluate student work whenever a sliding scale is appropriate, including student oral presentations in other fields, formal essays, written lab reports, or math/computer problems where the process is as important as the outcome.
- This study suggests that when detailed rubrics CAN be used as peer-edit sheets, they SHOULD be integrated into the curriculum to focus attention on a few key concepts and to deepen and broaden student learning.

Findings
G2 was focused on issues of voice and performance rather than content and organization. G2, considering time management to be part of "organization," listed this most often as the item needing improvement (6 of 12 respondents). Condensing the categories of answers given by G2 under the topic "Things I Do Well," the majority had to do with voice and performance (32 of 45 responses).
For G1, there was less agreement (only 5 of 13 responses had to do with voice or performance). Members of G2 gave answers similar to each other, and the words they chose mimic the language on the rubric; examples in "What I Learned" include eye contact, monotone, accent, and confidence. G1 chose words that did not mimic their simpler, broader rubric, and seemed to focus on idiosyncrasies rather than on categories common to the rubric. The focused rubric used by G2 therefore appears to concentrate student attention on these particular concepts. Items at the top of G2's rubric were the ones most often listed under "Things I Do Well," and those at the bottom of the rubric were mentioned least: volume, at the top of the list, was mentioned by 5 of 12 respondents, the strongest correlation on the "Things I Do Well" list, and eye contact, at the top of the "performance" category, was mentioned by 4 of 12 respondents. In G1 there was no meaningful correlation between the things students believed they did well and their rubric, and G1 demonstrated little agreement within the group on any of the questions. Except for organization listed as "needing improvement," virtually all other G1 responses were individual answers that did not match other responses in the class, suggesting the class was not focused on any particular set of topics and that students pursued their own interests. Student responses can also suggest ways in which the rubrics should be altered: based on feedback from the midterm evaluation instrument, I will implement a follow-up rubric to highlight the two topics G2 saw as most urgently needing improvement, organization (6 of 12 respondents) and time management (7 of 12 respondents).

Conclusion
- Adding extra details to peer-edit sheets does enhance student learning about the specifics being highlighted, but with some loss of holistic, "big picture" performance. Details on peer-edit sheets should be chosen with care.
- Peer-edit rubrics should not only be as detailed as grading rubrics, but should also be open to revision based on evaluations.

[Rubric: Group One (Fall 2004)]
[Rubric: Group Two (Spring 2005)]
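The tabulation step described in the Methods section (counting the frequency of each type of reply by group) can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual procedure: the category labels and coded responses below are hypothetical, standing in for the hand-coding of free-form bullet answers into rubric categories.

```python
from collections import Counter

# Hypothetical coded responses: each bullet-point answer to "What do you
# do well?" has been assigned by hand to a rubric category. These labels
# and counts are illustrative only, not the study's data.
g2_do_well = [
    "voice:volume", "voice:volume", "voice:volume",
    "performance:eye contact", "performance:eye contact",
    "content:examples", "organization:structure",
]

def tabulate(responses, n_respondents):
    """Count how often each coded category appears, and report the
    fraction of respondents mentioning it (each respondent is assumed
    to mention a given category at most once)."""
    counts = Counter(responses)
    return {cat: (n, n / n_respondents) for cat, n in counts.items()}

# Frequencies for a hypothetical group of 12 respondents,
# printed from most to least common.
table = tabulate(g2_do_well, n_respondents=12)
for cat, (n, frac) in sorted(table.items(), key=lambda kv: -kv[1][0]):
    print(f"{cat:25s} {n:2d}/12 ({frac:.0%})")
```

Sorting the tallied categories from most to least frequent is what makes patterns like "items at the top of the rubric are mentioned most" visible at a glance.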