Interpreting Student Evaluations Heather McGovern, Fall 2011.


Quick Advice
- Attend to trends; disregard outliers.
- Use whichever score is higher, adjusted or raw.
- Resist interpreting numbers with too much precision.
- Attend to the context of the institution (level of class, race of the faculty member, etc.) and of the person's teaching.
- Remember that students cannot provide any valid information on most of our aspects of excellence in teaching with our student rating instrument (or at all).

Myth: Being “similar” is bad
- It is “similar.” The majority of faculty nationwide will fit into this band, given that the scores are normed.
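For a concrete sense of what “normed” means here, the sketch below converts a hypothetical raw course average into a standardized score against a comparison group. The comparison mean and standard deviation are made-up values for illustration, not IDEA's actual norms.

```python
def to_converted_score(raw_average, norm_mean=4.0, norm_sd=0.5):
    """Standardize a raw course average against a comparison group,
    reported on a T-score scale (mean 50, SD 10).

    norm_mean and norm_sd are hypothetical values, not IDEA's norms.
    """
    return 50 + 10 * (raw_average - norm_mean) / norm_sd

# A class slightly above the comparison-group average lands near 52,
# i.e., close to the middle of the distribution rather than "higher."
print(to_converted_score(4.1))
```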

FAQ: What does it mean when someone is adjusted down?
- It means that the five factors IDEA uses indicate that students in the class were predisposed to give higher ratings. This may be because they were unusually motivated to take the class, because they perceived themselves as unusually hard working, because the class was small, or because of a combination of these and other factors.
- What it does not mean: that the teacher did something wrong.
- Basic guidance from IDEA: use the higher score when evaluating an individual faculty member.
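IDEA's actual adjustment model is built from regression on its national database and is not reproduced on this slide; the sketch below only illustrates the general logic: a model predicts how favorably this particular mix of students would tend to rate any instructor, and the raw average is adjusted by that predicted advantage. All factor names, weights, and numbers here are invented for illustration.

```python
# Hypothetical illustration of the adjustment logic; the factors, weights,
# and numbers are invented and are NOT IDEA's actual model.
def adjusted_score(raw_average, class_factors, weights, national_average=4.0):
    """Subtract the predicted 'class advantage' from the raw average.

    predicted estimates what a typical instructor would have received
    from a class with these characteristics.
    """
    predicted = national_average + sum(weights[name] * value
                                       for name, value in class_factors.items())
    return raw_average - (predicted - national_average)

# A small class of unusually motivated, hard-working students is predicted
# to rate generously, so its raw average of 4.5 is adjusted downward.
factors = {"student_motivation": 0.6, "work_habits": 0.3, "small_class": 1.0}
weights = {"student_motivation": 0.25, "work_habits": 0.10, "small_class": 0.05}
print(round(adjusted_score(4.5, factors, weights), 2))  # 4.27
```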

FAQ: Why do some people have to use the small class form?
- IDEA's research indicates that fewer than 15 student responses lead to unreliable data. The union and administration agreed to move to the small class form for classes under 15 in order to avoid giving faculty what is essentially “junk” statistical data.
- IDEA reports the following median reliabilities:
  10 raters: .69
  15 raters: .83
  20 raters: .83
  30 raters: .88
  40 raters: .91
- Reliability ratings below .70 are highly suspect.
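The pattern in the table (more raters, more reliable class averages) is a standard measurement result. As an illustration only, the Spearman-Brown prophecy formula below reproduces the general shape of that curve from an assumed single-rater reliability; the assumed value is made up, so the outputs only roughly track IDEA's published medians.

```python
def average_rating_reliability(n_raters, single_rater_reliability=0.18):
    """Spearman-Brown prophecy formula for the reliability of a class average.

    single_rater_reliability is an assumed, illustrative value,
    not a parameter published by IDEA.
    """
    r = single_rater_reliability
    return (n_raters * r) / (1 + (n_raters - 1) * r)

for n in (10, 15, 20, 30, 40):
    print(f"{n:>2} raters: estimated reliability {average_rating_reliability(n):.2f}")
```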

FAQ: Why does page 3 not highlight an area in which faculty performed well?
- Because IDEA's research has not found a correlation between that item and the objectives selected.
- Bottom line: good teachers don't have to use all the pedagogical techniques all the time (or ever). Treat IDEA's guidance as informed guidance, not as a mandate.

Myth: Low scores are caused by a wrong CIP code
- Probably not, unless you are really talking about your disciplinary comparison scores; who in this room even knows where to find those on their IDEA report?
- That said, a change in CIP code can provide better disciplinary comparison data, which you can then consider and point others to.