Shelley A. Chapman, PhD, Texas A & M University, February 2013



 “Teaching Effectiveness”: What it is
 Uniqueness of IDEA
 Conditions for the Good Use of IDEA
 3-Phase Process for Faculty Evaluation
 Using Reports to Improve Teaching

Teaching Effectiveness
Most surveys ask: How well do the instructor’s methods resemble those of a “model” teacher?
IDEA asks: How well do students rate their progress on the types of learning the instructor targeted?

1. Focus on Student Learning
2. Focus on Instructor’s Purpose
3. Adjustments for Extraneous Influences
4. Validity and Reliability
5. Comparison Data
6. Flexibility

The Instrument
o Focuses on learning
o Provides suggested action steps

The Faculty
o Trust the process
o Value student feedback
o Are motivated to make improvements

Campus Culture
o Teaching excellence is a high priority
o Resources to improve are provided
o Student ratings are given appropriate weight

The Evaluation Process
o 30-50% of the evaluation of teaching
o 6-8 classes, more if small (<10)
o Not over-interpreted (3-5 performance categories)

Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.

1. How did students rate their learning experience?
2. What contextual factors impacted those ratings?
3. How do my scores compare to IDEA, discipline, and institution?
4. What might I do to facilitate better learning for my students next time?

What the Report Can Provide (the Data → Information → Knowledge → Wisdom hierarchy)
o Data: calculation of scores
o Information: context, variables, and comparisons
o Knowledge: suggested action steps

As Part of a Faculty Evaluation Process

Balanced Plan for Summative Evaluation
o Student Ratings
o External Perspective
o Artifacts

Artifacts
o Syllabi
o Graphic organizers
o Assignments and project descriptions
o Rubrics
o Written teaching philosophy/reflections
o Samples of student work
o CATs and results

External Perspective
o Classroom observation
o Classroom visitation
o Invited presentations
o Alumni surveys
o Focus groups of graduating students

Observation record columns: Time | What Happened | What Was Said

Time: 8:05, 8:10, 8:15
What Happened: Instructor shut door. Students are shuffling papers, opening books. Student comes in late. Several students raise hands. Female student in the first row is called on.
What Was Said: Instructor (I): “OK, class. Let’s begin. Make sure you turned in your homework as you came in. Today we will begin our discussion of the brain. Turn to chapter 5 in your textbooks. Is your brain more like a computer or a jungle? Who would like to respond first?” Student (S): “My brain is a jungle! I am so unorganized!” (class laughs)…

[Seating chart: instructor at the front; students marked M (male) or F (female) by seat.]

Student Ratings
o Administer appropriately
o Collect 6-8 reports (more if class size is <10)
o 30-50% of overall evaluation

Student Ratings
o Student comments: formative
o Be mindful of the standard error of measurement (±0.3)
o Use 3-5 performance categories
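The practical effect of a ±0.3 standard error of measurement can be illustrated with a short sketch. This is our own illustration, not IDEA software; the function name is hypothetical, and only the 0.3 value comes from the slide.

```python
# Illustrative sketch, not IDEA's software: with a standard error of
# measurement of +/-0.3, two course averages whose gap falls inside the
# combined error bands should not be treated as meaningfully different.
SEM = 0.3

def meaningfully_different(score_a, score_b, sem=SEM):
    """True only when the gap between two averages exceeds both error bands."""
    return abs(score_a - score_b) > 2 * sem

print(meaningfully_different(4.1, 4.3))  # a 0.2 gap is within measurement error
print(meaningfully_different(3.5, 4.2))  # a 0.7 gap is not
```

This is why the slides recommend only 3-5 performance categories: finer distinctions would be smaller than the measurement error itself.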

Set Expectations → Collect Data → Use Data

Set Expectations → Collect Data → Use Data
I. Set Expectations: What does this entail regarding IDEA?

Criterion (Report Page 1)
o Use averages on the 5-point scale
o Recognize that some objectives are more difficult to achieve
o “Authenticate” objectives

Use Converted Averages
o IDEA
o Discipline
o Institution

Converted-average comparison categories:
o Much Lower: lowest 10%
o Lower: next 20%
o Similar: middle 40% (the “Gray Band”)
o Higher: next 20%
o Much Higher: highest 10%
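The five comparison categories partition the percentile scale, which can be sketched as a small lookup function. This helper is ours for illustration, not part of the IDEA system.

```python
# Hypothetical helper (names are ours, not IDEA's): map a converted-score
# percentile rank into the five comparison categories from the slide.
def comparison_band(percentile):
    if percentile < 10:
        return "Much Lower"   # lowest 10%
    if percentile < 30:
        return "Lower"        # next 20%
    if percentile < 70:
        return "Similar"      # middle 40%, the "gray band"
    if percentile < 90:
        return "Higher"       # next 20%
    return "Much Higher"      # highest 10%

print(comparison_band(50))  # Similar
print(comparison_band(92))  # Much Higher
```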

Five performance categories: Below Acceptable Standards | Marginal, Needs Improvement | Meets Expectations | Exceeds Expectations | Outstanding
Three performance categories: Does Not Meet Expectations | Meets Expectations | Exceeds Expectations

Criterion Average Rating | Effectiveness Category | Normative T-Score
Below 3.0 | Below acceptable standards | Below [value missing]
[range missing] | Marginal, improvement needed | [missing]
[range missing] | Meets expectations | [missing]
[range missing] | Exceeds expectations | [missing]
[value] or higher | Outstanding | 63 or higher

Set Expectations → Collect Data → Use Data
II. Collect Data: What do you look for regarding IDEA?

Prepare Students
o Create value for student feedback
o Talk about it
o Put it in the syllabus
o Monitor and communicate through multiple modalities: Twitter, Facebook, other

IDEA Center Learning Objective → Course Learning Outcomes
Objective 3: Learning to apply course material (to improve thinking, problem solving, and decisions)
o Students will be able to apply the methods, processes, and principles of earth science to understanding natural phenomena
o Students will think more critically about the earth and environment
Objective 8: Developing skill in expressing myself orally or in writing
o Students will be able to present scientific results in written and oral forms

Pages 1 and 2: What were students’ perceptions of the course and their learning?

Were the appropriate objectives selected? How many? Do they match the course? How might you “authenticate” the objectives selected?

Select 3-5 objectives as “Essential” or “Important.” For each, ask:
o Is it a significant part of the course?
o Do you do something specific to help students accomplish the objective?
o Does the student’s progress on the objective influence his or her grade?
Be true to your course.

What were the students’ perceptions of their course and their learning?

Report page 1, section A: Progress on Relevant Objectives, shown as Your Average (5-point scale), Raw and Adjusted. Footnote 1: Four objectives were selected as relevant (Important or Essential; see page 2). If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.

Progress On Relevant Objectives

Report Page 1: Your Average Score (5-point scale), Raw and Adjusted
A. Progress on Relevant Objectives (four objectives were selected as relevant: Important or Essential; see page 2)
Overall Ratings:
B. Excellent Teacher
C. Excellent Course
D. Average of B & C
Summary Evaluation: the average of A & D, so A contributes 50% while B and C contribute 25% each.
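The weighting follows directly from the slide's two averages: D = (B + C) / 2 and Summary = (A + D) / 2, which unfolds to 0.50·A + 0.25·B + 0.25·C. A minimal sketch (the function name is ours, not IDEA's):

```python
# Sketch of the arithmetic implied by the report: D is the average of the
# two overall ratings (B and C), and the Summary Evaluation is the average
# of A and D. Unfolded, the weights are 50% A, 25% B, 25% C.
def summary_evaluation(progress_a, teacher_b, course_c):
    d = (teacher_b + course_c) / 2   # D: average of B and C
    return (progress_a + d) / 2      # Summary: average of A and D

# Equivalent closed form: 0.50 * A + 0.25 * B + 0.25 * C
print(round(summary_evaluation(4.0, 4.4, 4.0), 2))  # 4.1
```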

[Table, Technical Report 12, page 40: average progress ratings on “Gaining Factual Knowledge,” cross-tabulated by Student Motivation (Item 39) and Work Habits (Item 43), each at five levels: High, High Avg., Average, Low Avg., Low. Cell values are omitted here; the highlighted example cell shows an average of 4.01.]

Purpose → Raw or Adjusted?
o How much did students learn? → Raw
o What were the instructor’s contributions to learning? → Adjusted
o How do faculty compare? → Adjusted

When to Use Adjusted Scores
1. Do raw scores meet or exceed expectations?* If yes, use raw scores.
2. If no: are adjusted scores lower or higher than raw scores? If lower, use raw scores; if higher, use adjusted scores.
*Expectations defined by your unit.
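The slide's decision flow can be sketched as a two-branch rule. This is our own illustration of the flowchart, not IDEA code, and the function and argument names are hypothetical.

```python
# Sketch of the slide's decision flow: raw scores that already meet
# expectations stand on their own; otherwise, report adjusted scores only
# when the adjustments raise the result above the raw scores.
def scores_to_report(raw_meets_expectations, adjusted_higher_than_raw):
    if raw_meets_expectations:
        return "raw"
    return "adjusted" if adjusted_higher_than_raw else "raw"

print(scores_to_report(True, False))   # raw
print(scores_to_report(False, True))   # adjusted
```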

Set Expectations → Collect Data → Use Data
III. Use Data: Which data will you use and how?

 Keep track of reports
 Look for longitudinal trends
 Use for promotion and tenure
Created by Pam Milloy, Grand View University. Available from The IDEA Center website.

Summative (pp. 1-2)
 Criterion- or norm-referenced
 Adjusted or raw
 Categories of performance
 30-50% of teaching evaluation
 6-8 classes (more if small)
Formative (p. 3)
 Identify areas to improve
 Access applicable resources from the IDEA website
 Read and have conversations
 Implement new ideas

Improvement cycle:
1. Collect feedback (online, paper)
2. Interpret results (what the reports say and what they mean)
3. Read and learn (IDEA resources that are keyed to reports)
4. Reflect and discuss (talk with colleagues)
5. Improve (try new ideas)

Relationship of Learning Objectives to Teaching Methods