WIP – Using Information Technology to Author, Administer, and Evaluate Performance-Based Assessments. Mark Urban-Lurain, Guy Albertelli, Gerd Kortemeyer, Division of Science and Mathematics Education, Michigan State University.


WIP – Using Information Technology to Author, Administer, and Evaluate Performance-Based Assessments
Mark Urban-Lurain, Guy Albertelli, Gerd Kortemeyer
Division of Science and Mathematics Education, Michigan State University
FIE 2005, Session T2E, October 20, 2005

Acknowledgements
Based upon work supported in part by:
– U.S. Department of Education, under special project number P342A
– National Science Foundation, through the Information Technology Research (ITR) and the Assessment of Student Achievement (ASA) programs
– Michigan State University
Any opinions, findings, conclusions, or recommendations expressed in this presentation are those of the author(s) and do not necessarily reflect the views of any of the supporting institutions.

Performance-Based Assessments (PBA)
Students construct responses in “authentic” problem-solving contexts.
Can support student learning by:
– Focusing on fundamentals of a discipline
– Requiring deeper understanding
– Exposing student thinking
– Providing formative evaluation

Performance-Based Assessments
More expensive and time-consuming to score than “objective” tests.
Scoring entails human judgment:
– Detailed scoring rubrics required
– Training for raters
– Inter-rater reliability

Previous Work
CSE 101:
– Non-major CS0 course
– 2,000 students / semester
– Goal: Fluency with Information Technology (FITness)
– Modified-mastery PBA: Bridge Tasks (BTs)
Database-driven system for creating, delivering, and evaluating BTs
14,000 BTs / semester

Creating Assessments: Bridge Task (BT) Database
Each Bridge Task (BT) has M dimensions that define the skills and concepts being evaluated. Each dimension contains some number n of instances, each a text description of a task for that dimension. A bridge task is assembled by randomly selecting one instance from each of its dimensions.
[Slide diagram: Dim 1 through Dim M, each holding Instance i through Instance i+n]
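The random-assembly step above can be sketched in a few lines of Python. This is an illustrative mock-up, not the actual database-driven implementation; the dimension names and task texts are invented for the example.

```python
import random

# Hypothetical BT definition: M dimensions, each holding n task-text instances.
# Dimension names and task texts are invented for illustration.
bridge_task_db = {
    "dim_1": ["task text 1a", "task text 1b", "task text 1c"],
    "dim_2": ["task text 2a", "task text 2b"],
    "dim_3": ["task text 3a", "task text 3b", "task text 3c", "task text 3d"],
}

def assemble_bridge_task(db, rng=random):
    """Build one bridge task: one randomly selected instance per dimension."""
    return {dim: rng.choice(instances) for dim, instances in db.items()}

task = assemble_bridge_task(bridge_task_db)
```

Because each student's task is an independent random draw per dimension, two students sitting side by side in the testing lab generally receive different but equivalent tasks.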

Evaluation Criteria: Bridge Task (BT) Database
Each instance carries its own set of evaluation criteria. The student's work is evaluated against the criteria belonging to the specific instances selected for their bridge task, yielding an overall PASS or FAIL.
[Slide diagram: the selected instance in each dimension maps to its list of criteria; Student Evaluation leads to PASS or FAIL]

Scoring Rubrics
Software:
– Retrieves each student's files
– Presents the grader with the scoring rubrics for the particular BT
– Grader evaluates each criterion as pass/fail and adds open-ended formative feedback for the student
Backend computes the outcome from the pass/fail scores, as defined by the instructor for each BT.
Students view their scored BT, with the detailed rubrics and comments, on the Web.
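The backend's outcome computation might look like the following sketch. The all-required-criteria-must-pass rule shown here is one plausible instructor-defined policy, assumed for illustration; the criterion names are likewise invented.

```python
def bt_outcome(criterion_scores, required):
    """Combine the grader's per-criterion pass/fail marks into a BT outcome.

    criterion_scores: {criterion_name: True (pass) or False (fail)}
    required: criterion names the instructor designated as must-pass
    """
    passed_all = all(criterion_scores.get(c, False) for c in required)
    return "PASS" if passed_all else "FAIL"

# Example grading session (criterion names invented for illustration):
scores = {"formats cells": True, "uses formula": True, "labels chart": False}
```

Keeping the policy in data (the `required` list) rather than in code is what lets the instructor redefine the outcome rule per BT without touching the grading software.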

Porting to an Open Source CMS
LON-CAPA:
– Open source course management system developed at MSU
– Content sharing and reuse network
– Randomized homework, quizzes, and numeric problems

Development Effort Completed
Mastery-model BT assessments:
– Multiple pass/fail criteria determine whether the student passes or fails the BT
– Students repeat a given level of the BT until they pass
BT grading rubrics:
– Scoring rubrics for each part of each BT
– Grader evaluates each criterion as pass/fail
– Provides formative feedback to the student
Support for proctored authentication:
– Online resources are typically completed without proctors
– BTs are administered in scheduled labs with a proctor
– BTs can be scheduled in particular labs at particular dates and times
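The modified-mastery progression (repeat a given level until pass, then advance) reduces to a simple state update; the level numbering below is illustrative.

```python
def next_attempt(level, passed, max_level):
    """Modified-mastery progression: advance to the next BT level on a pass;
    on a fail, the student retakes the same level (with a freshly
    randomized task assembled from the BT database)."""
    if passed and level < max_level:
        return level + 1
    return level
```

Since a failed attempt repeats the level rather than ending the assessment, the grade reflects how far a student has progressed, not how many tries a level took.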

Current Status
– Available in the current release
– Testing with one section of CSE 101
– Authoring and scheduling require editing/uploading files

Spring 2006
– Improve the scheduling interface
– Convert the CSE 101 course
– Teacher Education: incorporate BTs as part of the technology requirement for pre-service teachers

Future Work
– Other CSE courses adopt LON-CAPA and implement PBAs
– Work with faculty to implement and evaluate the impact of PBAs on student problem-solving in other disciplines

Questions?
Mark Urban-Lurain
LON-CAPA