Using IDEA for Assessment, Program Review, and Accreditation Texas A & M University November 8, 2012 Shelley A. Chapman, PhD.


Plan for this Session
- Program Evaluation & Assessment of Student Learning
- Group Summary Reports
- Aggregate Data File
- Benchmarking Reports
- Accreditation Guides

What makes IDEA unique?
1. Focus on Student Learning
2. Focus on Instructor's Purpose
3. Adjustments for Extraneous Influences
4. Validity and Reliability
5. Comparison Data
6. Flexibility

Underlying Philosophy of IDEA Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.

Faculty Information Form

Diagnostic Report Overview
- Page 1 – Big Picture: How did I do?
- Page 2 – Learning Details: What did students learn?
- Page 3 – Diagnostic: What can I do differently?
- Page 4 – Statistical Detail: Any additional insights?

The Big Picture
Your Average (5-point scale): Raw | Adj.
A. Progress on Relevant Objectives
Four objectives were selected as relevant (Important or Essential; see page 2).
If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.

Progress On Relevant Objectives

Summary Evaluation: Five-Point Scale (Report Page 1)
Your Average Score (5-point scale): Raw | Adj.
A. Progress on Relevant Objectives (four objectives were selected as relevant, Important or Essential; see page 2)
Overall Ratings:
B. Excellent Teacher
C. Excellent Course
D. Average of B & C
Summary Evaluation = Average of A & D (so A carries 50% of the weight; B and C carry 25% each)
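The arithmetic on this slide is simple enough to sketch. The ratings below are made up; only the formula (D as the average of B and C, and the Summary Evaluation as the average of A and D) comes from the slide:

```python
# Sketch of the page-1 Summary Evaluation weighting, using hypothetical
# ratings on the 5-point scale (the 4.2 / 4.0 / 3.8 values are invented).
def summary_evaluation(progress_a, teacher_b, course_c):
    """Summary Evaluation = average of A and D, where D = average of B and C."""
    d = (teacher_b + course_c) / 2  # D: average of B (teacher) and C (course)
    return (progress_a + d) / 2     # effectively 50% A, 25% B, 25% C

print(round(summary_evaluation(4.2, 4.0, 3.8), 2))  # → 4.05
```

Because D is itself an average, the Progress on Relevant Objectives score ends up counting twice as much as either overall rating.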

Using Evidence to Improve Student Learning

Individual Reports to Group Reports

The Group Summary Report How did we do? How might we improve?

Defining Group Summary Reports (GSRs)
- Institutional
- Departmental
- Service/Introductory Courses
- Major Field Courses
- General Education Program

GSRs Help Address Questions
- Longitudinal
- Contextual
- Curricular
- Pedagogical
- Student Learning-focused

Adding Questions
Up to 20 questions can be added:
- Institutional
- Departmental
- Course-based
- All of the above

Local Code
Use this section of the Faculty Information Form (FIF) to code types of data.

Defining Group Summary Reports: Local Code
8 possible fields. Example from Benedictine University (column one – Delivery Format):
1 = Self-paced
2 = Lecture
3 = Studio
4 = Lab
5 = Seminar
6 = Online

Example Using Local Code
Assign local codes:
1 = Day, Tenured
2 = Evening, Tenured
3 = Day, Tenure Track
4 = Evening, Tenure Track
5 = Day, Adjunct
6 = Evening, Adjunct
Request reports:
- All Day Classes: Local Code = 1, 3, & 5
- All Evening Classes: Local Code = 2, 4, & 6
- Courses Taught by Adjuncts: Local Code = 5 & 6
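The selection logic behind those report requests can be sketched in a few lines. This is a hypothetical illustration only: the course names and records are invented, and the one-digit codes follow the coding scheme from the slide:

```python
# Hypothetical sketch: each class carries the one-digit local code entered
# on the FIF; a requested report is just a filter over those codes.
classes = [
    {"course": "ENGL 101", "local_code": 1},  # Day, Tenured
    {"course": "HIST 210", "local_code": 4},  # Evening, Tenure Track
    {"course": "MATH 140", "local_code": 6},  # Evening, Adjunct
]

day_codes = {1, 3, 5}      # all day classes, per the slide's scheme
adjunct_codes = {5, 6}     # courses taught by adjuncts

day_classes = [c["course"] for c in classes if c["local_code"] in day_codes]
adjunct_classes = [c["course"] for c in classes if c["local_code"] in adjunct_codes]

print(day_classes)      # → ['ENGL 101']
print(adjunct_classes)  # → ['MATH 140']
```

One digit per field keeps the scheme compact, but it also means each of the 8 fields can distinguish at most ten categories.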

Description of Courses Included in this Report (Page 1 of GSR)
Number of Classes Included: Diagnostic Form 42; Short Form 27; Total 69
Number of Excluded Classes: 0
Response Rate: Classes below 65% response rate: 2; Average response rate: 85%
Class Size: Average class size: 20

Assessment of Learning
- What are our faculty emphasizing?
- How do students rate their learning?
- How do our courses compare with others?
- How do our students compare with others (self-rated characteristics)?
- What efforts can we make for improvement? (How can we "close the loop"?)

Texas A & M University: Student Learning Outcomes and Possible IDEA Learning Objectives
- Master the depth of knowledge required for a degree: 1, 2, 3, 4, 7
- Demonstrate critical thinking: 11
- Communicate effectively: 8
- Work collaboratively: 5
- Practice personal and social responsibility: 10, Extra Question
- Demonstrate social, cultural, and global competence: 7, Extra Question
- Prepare to engage in lifelong learning: 9, 12

Are we targeting TAMU SLOs in the Core Curriculum?
[Matrix slide: TAMU Core Curriculum courses (ENGL 203, MATH 241, BIOL 101, RELS 211, ARTS 103, ANTH 201) crossed against IDEA Learning Objectives, with an X marking each objective a course targets; the column positions of the X marks were not captured in this transcript.]

What are We Emphasizing? (Page 2)
Percent of Classes Selecting Objective as Important or Essential (This Group | Institution | IDEA System):
Objective 1: 16% | 70% | 78%
Objective 2: 13% | 59% | 75%
Objective 3: 41% | 58% | 75%
Objective 4: 32% | 35% | 55%
Objective 5: 23% | 19% | 32%
Objective 6: 32% | 14% | 25%
Objective 7: 22% | 27% | (value not captured)
Objective 8: 78% | 43% | 47%
Objective 9: 19% | 23% | 41%
Objective 10: 7% | 11% | 23%
Objective 11: 28% | 42% | 49%
Objective 12: 20% | 23% | 41%
Average # of Objectives Selected: (values not captured)

What are We Emphasizing? (Page 9, Section B)
Percent indicating amount required (Number Rating | None or Little | Some | Much):
Writing: 66 | 2% | 17% | 82%
Oral Communication: 66 | 6% | 42% | 52%
Computer Application: 66 | 50% | 44% | 6%
Group Work: 66 | 27% | 59% | 14%
Mathematics/Quantitative Work: 65 | 97% | 3% | 0%
Critical Thinking: 66 | 30% | (value not captured) | 40%
Creative/Artistic/Design: 66 | 61% | 33% | 6%

Does students' reported learning meet our expectations? (Pages 5 and 6)
Objective 1: Gaining factual knowledge (terminology, classifications, methods, trends)
Raw Average | Adj. Average | # of Classes, for This Report, the Institution (4.2; 3,963 classes), and the IDEA System (…,991 classes); the remaining values were not captured in this transcript.

How do students rate their learning? Page 3 Part 1: Distribution of Converted Scores Compared to the IDEA Database

Overall Progress Ratings (Courses) Page 3 Percent of Classes at or Above the IDEA database Average

Overall Progress Ratings (Courses) Part 3: Percent of Classes at or Above This Institution’s Average Page 4

Which teaching methods might we use to improve learning? (Page 7)
Teaching Methods and Styles – Stimulating Student Interest (# Classes | Avg. | s.d.):
Item 15: Inspired students to set and achieve goals which really challenged them (values not captured)

Relationship of Learning Objectives to Teaching Methods

How do students view course work demands? (Page 8B)
Student Ratings of Course Characteristics, Diagnostic Form (Average | % Classes Below 3.0 | % Classes 4.0 or Above):
33. Amount of Reading — This Report: 3.4 | 21% | 24%; Institution: 3.3 | 31% | 19%; IDEA System: 3.2 | 33% | 15%
34. Amount of work in other (non-reading) assignments — This Report: 3.3 | 24% | 10%; Institution: 3.4 | 23% | 20%; IDEA System: 3.4 | 21% | 18%
35. Difficulty of subject matter — This Report: 3.2 | 19% | 0%; Institution: 3.5 | 13% | 19%; IDEA System: 3.4 | 20% | 18%

Aggregate Data File
Allows you to:
- Use an Excel spreadsheet
- Use SAS or SPSS
- Ask other types of questions
- Display data in different ways
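The slide names Excel, SAS, and SPSS, but the same "other types of questions" can be asked of the aggregate file in any scripting language. A minimal sketch, with an invented column name (`obj8_weight`) and invented rows standing in for the real export:

```python
import csv
import io

# Hypothetical sketch: compute the percent of classes whose instructor rated
# Objective 8 as Important or Essential. Column names and data are invented;
# a real aggregate data file would be read from disk instead of a string.
raw = io.StringIO(
    "class_id,obj8_weight\n"
    "1,Essential\n"
    "2,Important\n"
    "3,Minor\n"
    "4,Essential\n"
)
rows = list(csv.DictReader(raw))
selected = sum(r["obj8_weight"] in ("Important", "Essential") for r in rows)
print(f"{100 * selected / len(rows):.0f}% of classes emphasize Objective 8")
```

The same pattern (filter rows, count, divide) answers most of the "What are we emphasizing?" questions shown on the earlier slides.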

Instructors’ Reports on Course Emphases: Selected Pairings-Writing and Oral Communication

Instructors’ Reports on Course Emphases: Selected Pairings-Critical Thinking & Writing

Benchmarking Institutional and Discipline Reports

Benchmarking Reports
Comparison to:
- 6-10 peers
- Same Carnegie Classification
- IDEA database

Benchmarking Reports
- The student, rather than the class, is the unit of analysis
- Percentage of positive ratings is given rather than averages
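The two bullets describe a real methodological difference, which a small sketch makes concrete. The ratings below are invented; "positive" is assumed here to mean a rating of 4 or 5 on the 5-point scale:

```python
# Hypothetical sketch of student-level vs. class-level analysis.
# Two made-up classes of individual student ratings on the 5-point scale:
classes = [
    [5, 4, 2],                      # small class
    [3, 3, 4, 5, 5, 4, 4, 5, 4],   # larger class
]

# Student as the unit of analysis: pool every rating, report percent positive.
students = [r for cls in classes for r in cls]
pct_positive = 100 * sum(r >= 4 for r in students) / len(students)

# Class as the unit of analysis: average the class means.
class_mean_of_means = sum(sum(c) / len(c) for c in classes) / len(classes)

print(f"{pct_positive:.0f}% positive (student as unit of analysis)")
print(f"{class_mean_of_means:.2f} average (class as unit of analysis)")
```

Note that the two approaches weight classes differently: pooling students lets large classes dominate, while averaging class means counts every class equally regardless of size.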

Report Summary

Comparison Groups
- Your University
- Peer*
- Carnegie
- National
* Peer group is based on 6-10 institutions identified by your institution

Students’ Perceptions: Gen Ed

Background for Specialization

Instructional Objectives Selected by Instructors
- Instructors' intentions/focus
- Students' self-reported progress on learning

IDEA Objective 3 Learning to apply course material (to improve thinking, problem solving, and decisions)

IDEA Objective 8 Developing skill in expressing oneself orally or in writing

Teaching Methods and Styles Reported by Students
- Fostering Student Collaboration
- Encouraging Student Involvement

Using Aggregate Data for Assessment
- TAMU Student Learning Outcomes
- Sub-Group Summary Reports
- Institutional Group Summary Report (include extra questions)
- Benchmarking: One-Year or 3-5 Year Trend Report
- Benchmarking: Discipline Report
- Core Curriculum, Courses in the Major, and Graduate-Level Course Learning Outcomes

Accreditation Guides SACS

NCATE Guide

CACREP Guide

Questions?