Assessment Overview: Drake CPHS

Presentation transcript:

Assessment Overview Drake CPHS

Overview
– Overview of IDEA Data
– Assessing college-wide teaching goal
– Advising Results
– Q&A

Drake Background
IDEA System
– Heavily researched course evaluation system
– Used by ~7 SOP/COP; 320 institutions
– Measures progress against faculty objectives
IDEA at Drake
– In use since 2004
– Paper and on-line (35-45 courses/semester)

IDEA Background
– Students' feedback on their own learning progress, effort, and motivation
– Student perceptions of the instructor's use of teaching methods and strategies
Rather than teaching style or personality, IDEA focuses on student learning and the methods used to facilitate it.

Progress on Relevant Objectives vs. IDEA National Database (Converted Scores, five academic years)

                               AY 1     AY 2     AY 3     AY 4     AY 5
Much Higher (10% of courses)    8.2%     5.5%     1.6%     1.4%     1.1%
Higher (20% of courses)        34.3%    34.2%    26.6%    18.3%    24.2%
Similar (40% of courses)       43.8%    42.5%    43.8%    57.8%    50.6%
Lower (20% of courses)          5.5%    12.3%    15.6%    15.5%    15.4%
Much Lower (10% of courses)     8.2%     5.5%    12.5%     7.0%     8.8%
Goal Progress                  86.3% ✓  82.2% ✓  72.0%    77.5%    75.9%
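The Goal Progress figures are simply the share of courses rated Similar to or better than the IDEA national database, i.e. the sum of the Much Higher, Higher, and Similar categories. A minimal check in Python (the AY column labels here are placeholders, since the original year headers are not preserved in the transcript):

```python
# Converted-score distribution of CPHS courses vs. the IDEA national
# database. Rows are the IDEA categories; columns are five academic years.
distribution = {
    "Much Higher": [8.2, 5.5, 1.6, 1.4, 1.1],
    "Higher":      [34.3, 34.2, 26.6, 18.3, 24.2],
    "Similar":     [43.8, 42.5, 43.8, 57.8, 50.6],
    "Lower":       [5.5, 12.3, 15.6, 15.5, 15.4],
    "Much Lower":  [8.2, 5.5, 12.5, 7.0, 8.8],
}

def goal_progress(year_index):
    """Percent of courses rated Similar or better in the given year column."""
    return round(sum(distribution[row][year_index]
                     for row in ("Much Higher", "Higher", "Similar")), 1)

print([goal_progress(i) for i in range(5)])  # → [86.3, 82.2, 72.0, 77.5, 75.9]
```

This reproduces the Goal Progress row exactly, confirming how that summary figure is derived from the category percentages.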

Actions
What actions have been taken? Faculty development sessions and meetings on:
– Soul-searching on what I'm trying to achieve
– How to choose objectives (the right ones, the right number)
– Whether teaching methods should be adjusted
– Linking content and methods to the objectives
– More use of IDEA resources (POD notes, etc.)
– Tying actions to faculty annual goals
– Initiating FIFs early in the semester
– Good discussions and a culture of assessment

Primary Instructional Approach (11-12)

Instructor Related Course Requirements (Some or Much required)

Percent of CPHS classes selecting objective as either Essential or Important (FIF)

Student ratings of progress on objectives chosen as Essential or Important
Scale: 1=no progress, 2=slight progress, 3=moderate progress, 4=substantial progress, 5=exceptional progress

Amount and Difficulty of Course Work: Student Ratings
Scale: 1=much less than most courses, 2=less than most, 3=about average, 4=more than most, 5=much more. Values are similar if within 0.3.

Amount of reading
– CPHS, % of classes below 3.0 (five AYs): n/a, 41%, 46%, 42%, 46%
– CPHS, % of classes 4.0 or above (five AYs): 8%, 16%, 22%, 23%, 19%
– IDEA: average 3.2; 33% of classes below 3.0; 15% at 4.0 or above

Amount of work in other (non-reading) assignments
– CPHS, % of classes below 3.0 (five AYs): n/a, 32%, 39%, 28%, 42%
– CPHS, % of classes 4.0 or above (five AYs): 28%, 20%, 8%, 13%, 11%
– IDEA: average 3.4; 21% of classes below 3.0; 18% at 4.0 or above

Difficulty of subject matter
– CPHS, % of classes below 3.0 (five AYs): n/a, 33%, 35%, 29%, 35%
– CPHS, % of classes 4.0 or above (five AYs): 33%, 31%, 26%, 28%, 25%
– IDEA: average 3.4; 20% of classes below 3.0; 18% at 4.0 or above
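The comparison rule stated on this slide, that CPHS and IDEA values count as similar when they differ by no more than 0.3 on the 5-point scale, can be sketched as a small helper. The IDEA averages below come from the slide; the CPHS average passed in the example is a hypothetical placeholder, since the CPHS averages did not survive the transcript:

```python
# IDEA national averages from the slide (5-point scale).
IDEA_AVERAGES = {
    "amount of reading": 3.2,
    "amount of non-reading work": 3.4,
    "difficulty of subject matter": 3.4,
}

def is_similar(cphs_avg: float, idea_avg: float, tolerance: float = 0.3) -> bool:
    """Apply the slide's rule: similar when the averages differ by at most 0.3."""
    return abs(cphs_avg - idea_avg) <= tolerance

# Hypothetical CPHS reading average of 3.0, for illustration only:
print(is_similar(3.0, IDEA_AVERAGES["amount of reading"]))  # → True
```

A difference of 0.2 falls inside the tolerance, so a hypothetical 3.0 would be reported as "similar" to the IDEA norm of 3.2, while anything beyond 0.3 would not.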

2012 Trends
– Trend toward more on-line evaluations
– Average response rate was down to 62% (57% for online)
– Average class size was 77
– Average number of objectives chosen by faculty: 3.7

Advising: Interactions with Advisor over the last AY
Interaction types surveyed: meetings, phone calls, other, total
– HSCI (n=72) *
– Pre-Pharm (n=159) *
– Pharmacy (n=249) *
– Total
* Significantly different between all majors at 0.05

Advising: Lowest agreement (% agreeing: HSCI / Pre-Pharm / Pharmacy / Total)
– "knows where to refer me for campus-wide support services (writing center, supplemental instruction, counseling center, disability resource center, academic achievement office)": 63% / 70% / 55% / 61%
– "has discussed professional involvement with me"
– "knows where to refer me for College support services (Student Services, Admissions, Professional and Career Development, etc.)"
– "has helped me assess my career goals"
– "is knowledgeable about the campus general education program (Drake Curriculum)"

Advising: Highest agreement (% agreeing: HSCI / Pre-Pharm / Pharmacy / Total)
– "treats me as an individual": 85% / 96% / 94% / 93%
– "is available for appointments"
– "encourages me to make my own decisions"
– "is willing to spend sufficient time with me to deal with my questions and concerns"
– "knows where to refer me for information on college policies and programs"

General Advising Effectiveness (% agreeing: HSCI / Pre-Pharm / Pharmacy / Total)
– "I am pleased with the overall nature of my meetings with my faculty advisor": 69% / 88% / 80% / 81%
– "Overall, I have a good faculty advisor": 69% / 90% / 82% / 83%

Future/ongoing areas
– Identifying embedded assessments
– Re-evaluate our educational outcomes?
– Populate new AAMS system (AACP's Assessment and Accreditation Management System)
– Re-write (condense) our Assessment Plan
– HSCI Program Review
– PharmD Accreditation

Questions/Discussion