
UNDERSTANDING & USING YOUR ONLINE IDEA RESULTS
Developed by Mary Beth Furst, Associate Professor, BUCO Division, and Amy Chase Martin, Director of Faculty Development and Instructional Media

Individual Development and Educational Assessment

STUDENT LEARNING MODEL
Specific teaching behaviors influence certain types of student progress under certain circumstances. (Model diagram: Specific Teaching Behaviors → Student Progress, under Certain Circumstances.)

HOW IS THE INFORMATION CAPTURED?
Faculty Information Form (FIF): rates the relevance of the 13 learning objectives and describes course conditions.
Student Survey, Diagnostic Form: overall teaching effectiveness and course improvement strategies.
Student Survey, Short Form: overall teaching effectiveness.

13 Learning Objectives, grouped into 6 categories. Each objective receives an importance rating, determined by the division: Essential (E), Important (I), or Minor/No importance (M).

Teaching methods/behaviors: used for diagnostic purposes only; instructors are not required to employ all methods; not used to make a summary evaluation of teaching effectiveness; omitted from the Short Form.
Student progress on learning objectives: identical to the FIF; also appears on the Short Form.
Course characteristics: useful for identifying characteristics of specific disciplines.

SUMMARY OF IDEA MEASURES
Specific Teaching Behaviors: teaching methods (Diagnostic Form).
Student Progress: progress on the 13 learning objectives (Diagnostic Form items); ratings of learning objectives on the Faculty Information Form (Minor weighted 0, Important weighted 1, Essential weighted 2).
Certain Circumstances: student motivation (DF items 36, 38, 39); student effort (DF item 37); course difficulty (DF items); student work habits (DF item 43); class size (FIF, number enrolled).
Summary measures of effectiveness: DF items 40-42.

Summative Tab

Formative Tab

Quantitative Tab

Qualitative Tab
Comments: What did you like best about this course? What did you like least? What would you change?

Segment Comparison Tab (new)
Compare by: No Segment, Howard Community College, Division, All Sections in Course

RAW AND ADJUSTED SCORES

Raw score: the actual rating. Adjusted score: the raw score adjusted for 'extraneous factors'.
Major factors: student motivation to take the course and student work habits.
Minor factors: student effort, difficulty of subject matter, and class size.
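To make the adjustment concrete, here is a minimal sketch of the kind of correction described above. The factor weights are illustrative assumptions only; IDEA derives its actual adjustment from regression models fit to its national database.

```python
# Illustrative sketch of a raw-to-adjusted score correction. The weights
# below are ASSUMED for demonstration; IDEA's real adjustment comes from
# regression models fit to its national database.

def adjusted_score(raw, motivation, work_habits, effort, difficulty, class_size):
    """Factor values are deviations of this class from the database mean."""
    # Major factors: student motivation and student work habits
    adjustment = 0.30 * motivation + 0.20 * work_habits
    # Minor factors: student effort, subject difficulty, class size
    adjustment += 0.10 * effort + 0.05 * difficulty + 0.05 * class_size
    # Favorable circumstances pull the adjusted score below the raw score;
    # unfavorable circumstances raise it.
    return raw - adjustment

# A class with above-average motivation (+0.5) sees its 4.2 raw score
# adjusted down to 4.05.
print(adjusted_score(4.2, motivation=0.5, work_habits=0.0,
                     effort=0.0, difficulty=0.0, class_size=0.0))
```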

QUESTIONS ADDRESSED BY THE IDEA REPORT
Q1. Overall, how effectively was this class taught?
Q2. How do your ratings compare with the ratings of other teachers?
Q3. Were you more successful in facilitating progress on some class objectives than on others?
Q4. How can instruction be made more effective?
Q5. Do the salient characteristics of this class and its students have implications for instruction?

QUESTION 1 Overall, how effectively was this class taught?

PRO – Progress on Relevant Objectives
The overall index of teaching effectiveness and the single best estimate of it. Two additional overall measures of teaching effectiveness are rated from 1 (definitely false) to 5 (definitely true): "Overall, I rate this instructor as an excellent teacher" and "Overall, I rate this course as excellent." The average of these two items is regarded as equal in value to PRO; hence the summary evaluation is the average of those two items together with PRO.

OBTAINING THE AVERAGE "PRO" SCORE
Essential objectives are weighted double, Important objectives are weighted once, and Minor objectives are ignored. With two Essential objectives rated 4.1 and 4.0 plus two Important objectives, the total weight is 2 + 2 + 1 + 1 = 6:

PRO = {(4.1 × 2) + (4.0 × 2) + (the two Important ratings)} / 6
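A minimal sketch of this computation, assuming the weighting above. The two Important-objective ratings (3.8 and 3.9) and the overall-item averages are hypothetical stand-ins for values not shown on the slide.

```python
# Weighted-average PRO as described above: Essential objectives count
# double, Important objectives count once, Minor objectives are ignored.

WEIGHTS = {"Essential": 2, "Important": 1, "Minor": 0}

def pro_score(objectives):
    """objectives: list of (importance, average progress rating) pairs."""
    weighted_sum = sum(WEIGHTS[imp] * rating for imp, rating in objectives)
    total_weight = sum(WEIGHTS[imp] for imp, _ in objectives)
    return weighted_sum / total_weight

# Two Essential objectives from the slide (4.1 and 4.0) plus two Important
# objectives with HYPOTHETICAL ratings 3.8 and 3.9, for a total weight of 6.
objectives = [("Essential", 4.1), ("Essential", 4.0),
              ("Important", 3.8), ("Important", 3.9)]
pro = pro_score(objectives)  # (8.2 + 8.0 + 3.8 + 3.9) / 6 ≈ 3.98

# Summary evaluation: PRO averaged with the mean of the two overall items
# ("excellent teacher", "excellent course"); item averages are hypothetical.
overall = (4.3 + 4.1) / 2
summary_evaluation = (pro + overall) / 2
print(round(pro, 2), round(summary_evaluation, 2))  # 3.98 4.09
```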

QUESTION 2 How do your ratings compare with the ratings of other teachers?

CONVERTED AVERAGES
Converted averages are preferred when making comparisons among faculty members or classes. They take into account the fact that average progress ratings are much higher for some objectives than for others, and they assure faculty members that they will not be penalized for selecting objectives that are especially difficult. Compare with: all classes in the IDEA database, all classes in your institution, or all classes in your course.
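IDEA's published reports express converted averages as T-scores (mean 50, standard deviation 10) relative to the comparison pool; here is a sketch assuming that convention, with hypothetical pool statistics.

```python
# Converted average as a T-score: the raw average is re-expressed relative
# to the distribution of averages for the SAME objective in the comparison
# pool (IDEA database, institution, or discipline), so objectives with
# characteristically low progress ratings are not penalized.
# Assumes IDEA's published T-score convention (mean 50, SD 10).

def converted_average(raw_avg, pool_mean, pool_sd):
    return 50 + 10 * (raw_avg - pool_mean) / pool_sd

# An objective on which ratings run low everywhere: a raw 3.7 against a
# hypothetical pool mean of 3.5 (SD 0.4) converts to 55, i.e. above
# average, even though 3.7 looks modest on the 5-point scale.
print(converted_average(3.7, pool_mean=3.5, pool_sd=0.4))  # 55.0
```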

QUESTION 3 Were you more successful in facilitating progress on some class objectives than on others?

BI-MODAL RATINGS
Knowing the percent of students whose ratings fall in the two highest and two lowest categories helps identify classes where student outcomes are "bi-modal": divided fairly evenly between students who profited greatly and those whose sense of progress was disappointing.
Reason 1: this often occurs when a substantial portion of the class lacks the background needed to profit from the course.
Reason 2: it may reflect differences in students' preferred learning styles.
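A short sketch of that diagnostic: given counts of responses in each rating category, report the share in the two lowest and two highest categories. The counts used here are hypothetical.

```python
# Percent of ratings in the two highest and two lowest categories, the
# statistic used above to spot "bi-modal" classes. response_counts is
# keyed by rating category 1..5.

def extreme_percents(response_counts):
    n = sum(response_counts.values())
    low = 100 * (response_counts.get(1, 0) + response_counts.get(2, 0)) / n
    high = 100 * (response_counts.get(4, 0) + response_counts.get(5, 0)) / n
    return low, high

# A class split between students who profited greatly and students who did
# not: 40% in the bottom two categories, 45% in the top two, few in between.
low, high = extreme_percents({1: 5, 2: 3, 3: 3, 4: 4, 5: 5})
print(low, high)  # 40.0 45.0
```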

QUESTION 4 How can instruction be made more effective?

INSTRUCTIONAL IMPROVEMENT
Teaching methods are grouped into five teaching styles, each rated on a 5-point scale (1 = hardly ever … 5 = almost always); a grouping sketch follows the list.
A. Stimulating student interest
B. Fostering student collaboration
C. Establishing rapport
D. Encouraging student involvement
E. Structuring classroom experiences
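As a sketch of how the grouping works, the snippet below averages method-item ratings within each style. The item groupings are placeholders; the actual mapping of Diagnostic Form method items to styles is defined by IDEA.

```python
# Average method ratings (1 = hardly ever ... 5 = almost always) within
# each of the five styles. The item groupings are PLACEHOLDERS; IDEA
# defines the actual mapping of Diagnostic Form method items to styles.

STYLE_ITEMS = {
    "A. Stimulating student interest": [4, 8, 13],
    "B. Fostering student collaboration": [5, 16, 18],
    "C. Establishing rapport": [1, 2, 7],
    "D. Encouraging student involvement": [9, 11, 14],
    "E. Structuring classroom experiences": [3, 6, 10],
}

def style_averages(item_means):
    """item_means: dict mapping method item number -> average rating."""
    return {style: sum(item_means[i] for i in items) / len(items)
            for style, items in STYLE_ITEMS.items()}

# Hypothetical item means covering the placeholder items:
item_means = {i: 3.5 + 0.05 * i for i in range(1, 19)}
print(style_averages(item_means))
```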

QUESTION 5 Do the salient characteristics of this class and its students have implications for instruction?

COURSE CHARACTERISTICS
Students described the class by comparing it to other classes they have taken in terms of: (1) amount of reading, (2) amount of work in non-reading assignments, and (3) difficulty. Average ratings are compared with "all classes" in the IDEA database; if sufficient data were available, comparisons are also made with classes in the broad discipline group in which this class was categorized and with all other classes at your institution. Because relatively large disciplinary differences have been found on these three characteristics, the disciplinary comparison may be especially helpful.

USING STATISTICAL DETAIL
The statistical detail reports the distribution of responses, the average rating, and the standard deviation of ratings. Attention should be concentrated on "Important" or "Essential" objectives and on methods that are closely related to progress ratings on those objectives.

USING STATISTICAL DETAIL
Standard deviations of about 0.7 are typical. When these values exceed 1.2, the class exhibits unusual diversity. In such cases, the distribution of responses should be examined closely, primarily to detect tendencies toward a bimodal distribution (one in which class members are about equally divided between the "high" and "low" ends of the scale, with few in between). Bimodal distributions suggest that the class contains two types of students so distinctive that what "works" for one group will not work for the other.
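A sketch of those rules of thumb applied to a set of individual ratings. The 3:1 extremes-to-middle ratio used to call a split "possible bimodal" is an assumed heuristic for illustration, not an IDEA threshold.

```python
# Flagging unusual rating diversity as described above: SDs near 0.7 are
# typical; above 1.2 the distribution deserves a close look for a bimodal
# shape (many 1-2s and many 4-5s, few 3s).

from statistics import pstdev

def diversity_flag(ratings):
    sd = pstdev(ratings)
    if sd <= 1.2:
        return sd, "typical diversity"
    middle = sum(1 for r in ratings if r == 3)
    extremes = sum(1 for r in ratings if r in (1, 2, 4, 5))
    # ASSUMED heuristic: call it bimodal when extremes heavily outnumber 3s.
    shape = "possible bimodal split" if extremes > 3 * middle else "high diversity"
    return sd, shape

ratings = [1, 1, 2, 2, 4, 5, 5, 5, 3, 2, 5, 1]
print(diversity_flag(ratings))  # SD ≈ 1.63, well above 1.2 -> flagged
```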

CONCLUSION
Reliability of results is largely affected by class size. Validity: results consistently show a positive correlation between students' ratings of progress and faculty ratings of objective importance (among other tests). Results are relevant for both summative and formative evaluation. IDEA recommends a comprehensive evaluation process in which student ratings constitute no more than 30-50% of the final judgment.

WANT ADDITIONAL SUPPORT?
Watch for professional development offerings. Contact to request individual support.