Tips and guidelines.  Too much time  First, do no harm: many things do appear to be going well. Let’s not screw them up.  Why help others evaluate.

Slides:



Advertisements
Similar presentations
The Commissions Expectations for the Assessment of Student Learning and Institutional Effectiveness Beth Paul Interim Provost and Vice President for Academic.
Advertisements

Meeting MSCHE Assessment Expectations
Graduation and Employment: Program Evaluation Using Dr. Michele F. Ernst Chief Academic Officer Globe Education Network.
SLO Assessment Departmental Tools and Examples from the Field Sacramento City College Fall Flex Workshop 8/21/08 8/21/08Presenters: Alan Keys Faculty Research.
GENERAL EDUCATION ASSESSMENT Nathan Lindsay January 22-23,
What “Counts” as Evidence of Student Learning in Program Assessment?
Institutional Effectiveness (ie) and Assessment
Academic Program and Unit Review at UIS Office of the Provost Fall 2014.
What Behaviors Indicate a Student is Meeting Course Goals and Objectives? “Indicators”
Source Code: Assessing Cited References to Measure Student Information Literacy Skills Dale Vidmar Information Literacy and Instruction Librarian Southern.
STUDENT LEARNING OUTCOMES ASSESSMENT. Cycle of Assessment Course Goals/ Intended Outcomes Means Of Assessment And Criteria For Success Summary of Data.
Assessment Report School of TAHSS Department: History Chair: Dr. Owen Ireland Assessment Coordinator: Date of Presentation: 10/1/2013.
OUTCOMES ASSESSMENT Developing and Implementing an Effective Plan.
An Assessment Primer Fall 2007 Click here to begin.
Assessment of Student Affairs Initiatives for First-Year Students National Conference on First-Year Assessment October 12-14, 2008 San Antonio, Texas Jennifer.
PPA Advisory Board Meeting, May 12, 2006 Assessment Summary.
ASSESSMENT SYSTEMS FOR TSPC ACCREDITATION Assessment and Work Sample Conference January 13, 2012 Hilda Rosselli, Western Oregon University.
Effective Grading and Assessment:. Strategies to Enhance Student Learning.
 The Middle States Commission on Higher Education is a voluntary, non-governmental, membership association that is dedicated to quality assurance and.
FLCC knows a lot about assessment – J will send examples
Program Assessment Workshop Kathleen Harring. What is Assessment? Assessment is the systematic gathering and analysis of information to inform and improve.
1 C-99 Assessing Student Learning in Graduate Degree Programs C-99 Assessing Student Learning in Graduate Degree Programs Bob Smallwood, University of.
Welcome… The attendee will understand assessment basics with a focus on creating learning activities and identifying assessment expectations. Apply the.
Graduate Program Review Where We Are, Where We Are Headed and Why Duane K. Larick, Associate Graduate Dean Presentation to Directors of Graduate Programs.
Accrediting Commission for Community and Junior Colleges of the Western Association of Schools and Colleges.
Sheila Roberts Department of Geology Bowling Green State University.
BY Karen Liu, Ph. D. Indiana State University August 18,
Helping Your Department Advance and Implement Effective Assessment Plans Presented by: Karen Froslid Jones Director, Institutional Research.
Essential Elements of a Workable Assessment Plan Pat Tinsley McGill, Ph.D. Professor, Strategic Management College of Business Faculty Lead, Assessment.
Everything you wanted to know about Assessment… Dr. Joanne Coté-Bonanno Barbara Ritola September 2009 but were afraid to ask!
Eportfolio: Tool for Student Career Development and Institutional Assessment Sally L. Fortenberry, Ph.D., and Karol Blaylock, Ph.D. Eportfolio: Tool for.
Susan Agre-Kippenhan, Portland State University Professor, Art Department Evaluating the Effectiveness of Service Learning.
Source Code: Simple Tool to Help Assess and Improve Student Research Writing Dale Vidmar Information Literacy and Instruction Librarian Southern Oregon.
ASSESSMENT SYED A RIZVI INTERIM ASSOCIATE PROVOST FOR INSTITUTIONAL EFFECTIVENESS.
Department Mission Statement and Program Learning Outcomes.
Using Electronic Portfolios to Assess Learning at IUPUI. Trudy Banta, et. al. Indiana University-Purdue University Indianapolis 2007.
Comp 20 - Training & Instructional Design Unit 6 - Assessment This material was developed by Columbia University, funded by the Department of Health and.
Learning Outcomes Made Easy Using the Best Tools Jeffrey D. Keith, Ph.D. J. Kelly Flanagan, Ph.D. Russell T. Osguthorpe, Ph.D. Danny R. Olsen, Ph.D. Tom.
ASSESSMENT OF CORE SKILLS/ GENERAL EDUCATION OUTCOMES Angelina Hill, PhD Associate Director, Office of Academic Assessment.
 Integrate the Bacc Core category learning outcomes into the course.  Clarify for students how they will achieve and demonstrate the learning outcomes.
Full-Time Faculty In-Service: Program and Student Learning Outcomes Fall 2005.
Assessment 101: A Review of the Basics Jill Allison Kern, PhD Director of Assessment Christopher Newport University January 2013.
EDU 385 CLASSROOM ASSESSMENT Week 1 Introduction and Syllabus.
Evidence and Standard Two. The Big Picture The Big Picture Standard Two is about: Curriculum Curriculum Planned, overseen curriculum – with clear outcomes.
Periodic Program Review Guiding Programs in Today’s Assessment Climate LaMont Rouse Executive Director of Assessment, Accreditation & Compliance.
“Outcomification”: Development and Use of Student Learning Outcomes Noelle C. Griffin, PhD Director, Assessment and Data Analysis Loyola Marymount University.
Model for Sustaining Departmental Student Outcomes Assessment Russ E. Mullen, Mary H. Wiedenhoeft, Thomas A. Polito, Sherry L. Pogranichniy, and Michelle.
Creating a Culture of Accountability and Assessment at an Urban Community College Diane B. Call, President Paul Marchese, VP Academic Affairs, Liza Larios,
Program Assessment: Choosing Assessments Specify intended outcomes Measure whether students are meeting those outcomes Improve your program based on results.
Introduction to Academic Assessment John Duffield Office of Academic Assessment Georgia State University September 2013.
Student Learning Outcomes and SACSCOC 1.  Classroom assessment ◦ Grades ◦ Student evaluation of class/course  Course assessment –????  Academic program.
Assessing Student Learning Workshop for Department Chairs & Program Directors Workshop for Department Chairs & Program Directors January 9, 2007.
The Gold Standard… Faculty are Key.  Annual Assessment based on  Address each SLO  Be specific, measurable, student- focused  Align the new.
Accreditation Update and Institutional Student Learning Outcomes Deborah Moeckel, SUNY Assistant Provost SCoA Drive in Workshops Fall 2015
Stetson University welcomes: NCATE Board of Examiners.
UK Office of Assessment. The LEARNING Initiative Dual Track Implementation Strategy Completion Dates Not actively engaged in program level assessment.
Developing and Linking Objectives at the Program and Course Levels Presentation to Fulton-Montgomery Community College October 5, 2006.
AAC&U Members on Trends in Learning Outcomes Assessment Key findings from a survey among 325 chief academic officers or designated representatives at AAC&U.
Presentation on Outcomes Assessment Presentation on Outcomes Assessment toCCED Mohawk Valley Community College October 11, 2004.
Texas Higher Education Coordinating Board Dr. Christopher L. Markwood Texas A&M University-Corpus Christi January 23, 2014.
Overview of Types of Measures Margaret Kasimatis, PhD VP for Academic Planning & Effectiveness.
+ Montgomery College Program Assessment Orientation Spring 2013.
1 Embracing Math Standards: Our Journey and Beyond 2008.
Assessment and Evaluation of CAREER Educational Components Center for Teaching Advancement and Assessment Research.
The Assessment Process: A Continuous Cycle
Consider Your Audience
GENERAL EDUCATION COURSE-BASED ASSESSMENT – IT’S EASIER THAN YOU THINK! C. Griffin.
Effective Outcomes Assessment
Advanced Program Learning Assessment
Assessing Student Learning
Presentation transcript:

Tips and guidelines

- Too much time
- First, do no harm: many things do appear to be going well. Let's not screw them up.
- Why help others evaluate us?
  - Actually, the premise is not that faculty are doing a bad job, but that they are looking for ways to do a better job.
- We already make "data-driven" decisions
- Why document for others?

- Common sense
- Authority / seniority / expert status
- Logic
- Structured, empirical observation, a.k.a. "the scientific method"

Middle States refers to a "culture of evidence" in which the fourth of these has pride of place.

MISSION: General learning goals
OBJECTIVES: Specific competencies, skills, knowledge that support the learning goal
PROCESSES: Sites and methods for achieving objectives
ASSESSMENT: Do processes, as designed and implemented, lead to realization of objectives? Which processes work best?

- Ineffable outcomes
  - Hard to measure or hard to defend?
  - Take a long, cold look at agency
- Diversity of student learning goals
  - Faculty / discipline-centered or student-centered?
- Student inputs versus student outcomes
  - "Exposure" / opportunities / curricular requirements as ends in themselves versus processes leading to specific learning outcomes

- Goal statements should be comprehensive, organized, and integrated.
- But pick one or two goals / assessment loops to implement initially:
  - Something that has cross-cutting implications for your department
  - Something you're already doing
  - Something you already believe is an area for improvement

- Senior capstone / thesis requirement
  - Rubric-based assessment linked to departmental goals
  - Be sure to design in "closing the loop"
- Core course(s) for the major
  - Goal statements here are usually clear, agreed upon, and often more competency-based (as opposed to content-based).
  - You may be able to compare multiple sections of the same course (assess the impact of experiments with differing pedagogy, etc.)

- Focus on a single competency
  - It should be core to your departmental learning goals
  - Choose one that is eminently assessable, even (gasp!) perhaps with standardized tests:
    - Quantitative reasoning
    - Oral communication
- Curriculum matrix
  - A grid showing how all departmental courses support all learning goals can provide a useful map for later efforts at assessment (see the hypothetical example below).
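
For illustration only, a curriculum matrix for a small major might look like the grid below. The course numbers and goal columns are invented; a common convention marks where each goal is introduced (I), reinforced (R), and mastered (M).

Course               Quantitative reasoning   Oral communication   Research skills
DEPT 101             I                        I                    --
DEPT 210             R                        I                    I
DEPT 320             --                       R                    R
DEPT 400 (capstone)  M                        M                    M

Rows with many gaps, or goals that no course carries past "introduced," are exactly the places where assessment effort (and curricular revision) is likely to pay off.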

- Grades are often not explicitly linked to learning goals for the assignment or course
- A grade often does not communicate strengths and weaknesses to the student
  - Written comments are more "rubric-like" than the grade.
- Standards vary across departments and across faculty within departments
- Grades are about individual student performance; assessment is about departmental performance

- Cross-rater consistency is key (see the sketch after this list)
  - Training and category definitions need to be explicit, often with explicit examples of student work
  - Consistency can be enhanced by using persons other than faculty to apply the rubrics to student work
- If copies of student work are archived, rubric-based assessment can be post hoc
- Thoughtful sampling of student work can make the task manageable
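
One common way to check cross-rater consistency is an inter-rater reliability statistic such as Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch in Python, assuming two raters have scored the same sample of papers on a four-point rubric (all scores below are invented):

```python
# Percent agreement and Cohen's kappa for two raters applying
# the same 4-point rubric to a sample of ten student papers.
from collections import Counter

rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]  # hypothetical rubric scores
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

n = len(rater_a)
categories = sorted(set(rater_a) | set(rater_b))

# Observed agreement: fraction of papers given the same score by both raters
p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement: product of the raters' marginal score frequencies
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
p_chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"Percent agreement: {p_observed:.0%}")   # 80%
print(f"Cohen's kappa:     {kappa:.2f}")        # 0.71
```

A low kappa is usually a signal that the rubric's category definitions need tightening, or that raters need calibration against shared examples of student work, per the bullets above.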

- At a minimum, archive any produced materials
- Some reporting to the Provost documenting cycle(s) of closing the loop: a "one-pager"
- Of course, more comprehensive reports are useful in many ways:
  - Grant support (which is plentiful in this arena)
  - Accountability to external agencies (MSCHE)
  - Avoiding re-inventing the wheel in future generations

You may find these useful

- Useful
- Cost-effective / sustainable
- Reasonably accurate and truthful
  - Direct versus indirect measures (e.g., a rubric score on student work is direct; a senior survey is indirect)
  - Multiple methods
- Planned, goal-oriented
- Systemic, coordinated across levels
  - How do course goals relate to dept goals?
  - How do dept goals relate to institutional goals?

MISSION: "Situate a good research question within the existing literature"
OBJECTIVES: Effective use of research databases; identifies gaps in the extant literature; cites appropriately; etc.
PROCESSES: Research methods course; lit. review of research papers in the dept.; library research practices
ASSESSMENT: Rubric-based assessment of the literature review portion of the research methods course and across written papers in the dept.; score on a library-organized test of research practices

Goal: Psychology majors will have a clear understanding of the logic of scientific inquiry and of psychological research methods.

Assertion: Bryn Mawr graduates pursue graduate education at higher rates than graduates of peer institutions.

Multiple data sources, all individually flawed, but collectively convincing since they all point in the same direction:
- PhD rates (consistent with the assertion, but don't include non-PhD degrees)
- Alumni survey data (consistent with the assertion, but self-reported and subject to potentially high non-response bias)
- Career services one-year-out survey (consistent with the assertion, but doesn't capture data beyond one year out)
- Senior survey data (consistent with the assertion, but self-reported and reflecting "planned" grad school, not actual attendance at the time of the survey)
- Student Clearinghouse data (consistent with the assertion, but lack good peer-institution data)

Some web links:
- Middle States expectations for assessment (PDF)
- Bryn Mawr IR website
  - Past self-studies, links to other schools, external reports
  - General resources