ASSESSMENT AND EVALUATION PRACTICES 101
New York State College Health Association
Syracuse, New York
October 21, 2010

PRESENTER
Patricia Francis
Associate Provost for Institutional Assessment and Effectiveness
SUNY Oneonta

TOPICS FOR TODAY
- Assessment and Evaluation in Higher Education: The National Landscape
- Middle States' Expectations for Institutional and Program Effectiveness
- The Assessment Process: Using It to Understand and Improve Programs and Services
- Demonstrating Program Effectiveness: Collecting, Analyzing, and Interpreting Data

ASSESSMENT AND EVALUATION IN HIGHER EDUCATION: The National Landscape

HISTORICAL CONTEXT: THE "SPELLINGS COMMISSION"
- Formation of the Commission on the Future of Higher Education in September 2005 by Secretary of Education Margaret Spellings
  - In acknowledgment of the federal government as a "big investor" in higher education
  - To provide an opportunity for national discussion on how to assure our system of higher education remains the "best in the world"
- Release of the Commission's report in September 2006
  - Calling for improvements in "accessibility, affordability, and accountability"

THE COMMISSION'S FAR-REACHING IMPACT
- Implications for regional accrediting agencies (including Middle States)
  - "Accreditation – the system we use to put a stamp of approval on higher education quality – is largely focused on inputs." (from the Commission's report)
- Has resulted in great pressure on accrediting agencies – and their institutions – to demonstrate their impact on students, not only in terms of learning outcomes but also overall effectiveness

MIDDLE STATES' EXPECTATIONS FOR US: Standard 7 (Institutional Assessment)

INSTITUTIONAL EFFECTIVENESS: MSCHE CRITERIA
1. Documented, organized, and sustained assessment process to evaluate and improve the total range of programs and services and achievement of institutional mission, goals, and plans
2. Written institutional (strategic) and unit plans that reflect consideration of assessment results
3. Evidence that assessment results are shared and used in institutional planning, resource allocation, and renewal

WHAT IS GOOD ASSESSMENT? (SUSKIE, 2005)
- It's useful
- It's cost-effective
  - Focus on a few goals/outcomes and stagger assessments
  - Use a sampling approach
- It's reasonably truthful and accurate (NOT dissertation-quality research)
  - Use a variety of tools
  - Make sure measures map to outcomes
- It is "organized, systemized, and sustained"

THE ASSESSMENT PROCESS: Using It to Understand and Improve Programs and Services

ASSESSMENT PLANNING'S FOUR STEPS
1. Setting objectives: "What you say you do"
2. Activity mapping: "How you do what you say you do"
3. Assessment: "How you know you are doing what you say you do"
4. "Closing the loop": "What you do next based on results"
Assessment without #4 = waste of time!

DEVELOPING PROGRAM OBJECTIVES: WHAT SHOULD THEY REFLECT?
- Documentation of all services and programs offered
- Consideration of all constituents served
- Tracking of use of services (and by whom)
- Unit effectiveness performance indicators
- Student satisfaction with services/programs
- Direct impact of services/programs on students (as appropriate)
#1 Rule: Use existing data sources as much as possible!

BUT OTHER THINGS TO KEEP IN MIND
- Do you have a program/activity in place to realize the objective?
- Can the desired change be measured (especially if the focus is student learning or development)?
- Ultimately, objectives will have to be translated into outcomes
  - How will you know you were successful?
- Do external standards apply (i.e., in the case of external accreditation/certification)?

SAMPLE PROGRAM OBJECTIVES: SFSU
- To provide accessible and cost-effective quality medical care for all registered students of SFSU
- To work with students to facilitate retention and graduation, reduce systematic health disparities related to human and cultural diversity, and enhance lifelong health and wellness
- To increase students' knowledge of wellness and preventive health so that they can develop and/or maintain a healthy lifestyle
- To educate students about healthy eating: how it serves as a foundation for good health and reduces their chances of chronic disease
- To help position students for post-college wellness by increasing their understanding of health care and how it functions in modern American society

FROM ASSESSMENT TO EVALUATION: Outcome Statements

WRITING GOOD OUTCOMES: ESSENTIAL COMPONENTS
- Who/what is the target?
- What aspect of your program or services are you intending to change?
- What domain of student development is your focus?
- What activity is expected to lead to change?
- What change is expected?

SAMPLE OUTCOME: SFSU
Objective: To increase students' knowledge of wellness and preventive health so that they can develop and/or maintain a healthy lifestyle
Outcome: Students participating in the SHS "Hot Safer Sex" workshop series will increase basic awareness and comprehension of sexually transmitted infections such as Chlamydia, HIV, GC, and HPV.

BASIC RULES FOR WRITING OUTCOME STATEMENTS
- Focus on students (not programs)
  - Good example: "Students will articulate knowledge and understanding of the health consequences of tobacco use."
  - Poor example: "Student Health Services will educate students regarding the health consequences of tobacco use."
- Focus on outcomes (not process)
  - Good example: "Students will demonstrate clear and accurate knowledge regarding the consequences of alcohol abuse with respect to sexual behavior, the ability to operate a motor vehicle, and health across the lifespan."
  - Poor example: "Students will attend three events that focus on the consequences of alcohol abuse."

STUDENT LEARNING OUTCOME EXAMPLES: ILLINOIS INSTITUTE OF TECHNOLOGY
- As a result of services provided in the Student Health Center, female students will become knowledgeable and assume responsibility for their sexual and reproductive health, including recognition, prevention, and treatment of sexually transmitted diseases, prevention of unintended pregnancies, and awareness of health screenings appropriate at various stages of their reproductive lives.
- As a result of participation in the Student Health Insurance Plan, students will have access to high quality healthcare and become knowledgeable regarding successful navigation of the US healthcare system within the confines of the insurance industry bureaucracy.
- As a result of the implementation of the Automated External Defibrillator Program, students will become aware of the early warning signs and symptoms of both heart attack and stroke and the importance of accessing rapid emergency medical services, and will be capable of instituting defibrillation when appropriate.

DEMONSTRATING PROGRAM EFFECTIVENESS: Collecting, Analyzing, and Interpreting Data

SELECTING GOOD MEASURES: BASIC PRINCIPLES
- Measurement techniques must be rigorous, since unreliable data are of minimal value
  - But again, not dissertation-research quality!
- Best to use a variety of quantitative and qualitative measures
  - Quantitative measures are easier, but often not as rich
  - Qualitative measures are often more informative, but require a check on scoring (e.g., rubrics)
- Indirect measures have value, but should never be the sole indicator of effectiveness
- Rely as much as possible on existing data sources!

GENERAL TYPES OF MEASURES (MAKI, 2004)
- Direct measures: students actually demonstrate learning so that evaluators can match results to expectations
  - Standardized tests
  - Authentic, performance-based measures: embedded into students' actual educational context
- Indirect measures: students' perceptions of learning
  - Should never be used as the sole indicator

QUANTITATIVE OR QUALITATIVE?
- Depends on what you're trying to do
- Quantitative methods are best if you're interested in "How many? How much? What percentage? How frequently? How much on average?"
- Qualitative methods are best if you're interested in "What were strengths, weaknesses? What do the data mean? What worked, what didn't? How was the project useful?"

AUTHENTIC, PERFORMANCE-BASED ASSESSMENT: THE VALUE OF RUBRICS
- Rubrics provide a reliable means of rating student performance in a more qualitative way
- Steps in developing rubrics:
  - Use the entire staff to help develop them, and be as specific as possible in differentiating between performance levels
  - Use existing rubrics as a guide
  - Pilot test to assure scoring is reliable

RUBRIC EXAMPLE: STUDENT HEALTH SERVICE (BOWLING GREEN STATE UNIVERSITY)
Rubric dimensions (columns): Compliance, Administration, Duration of Therapy, Side Effects, Drug Interactions. Performance levels (rows): Knowledge, Some Knowledge, Lack of Knowledge.

Knowledge
- Compliance: Can correctly express how often medication should be taken
- Administration: Can correctly state whether antibiotic should be taken with or without food or water
- Duration of Therapy: Expresses that full course of therapy should be completed
- Side Effects: Is aware of all potential side effects of therapy
- Drug Interactions: Is aware that taking other medications, vitamins and supplements may interfere with antibiotic

Some Knowledge
- Side Effects: Partially identifies potential side effects
- Drug Interactions: Is not taking other medications, vitamins or supplements

Lack of Knowledge
- Compliance: Is unaware of prescribed dosing schedule
- Administration: Is unaware of the need to take medication with or without food or water
- Duration of Therapy: States that treatment can be stopped prior to completing full course
- Side Effects: Is unaware of the potential for side effects
- Drug Interactions: Does not believe that interactions are possible or is unaware of possible interactions
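The pilot-test step mentioned two slides back can be as simple as having two staff members score the same small set of student responses against a rubric like the one above and checking how often they agree. The following is a minimal Python sketch of that check; the student ratings, level coding (2 = Knowledge, 1 = Some Knowledge, 0 = Lack of Knowledge), and the 80% rule of thumb are illustrative assumptions, not part of the presentation.

```python
# Minimal sketch of a rubric pilot test: percent exact agreement between two raters.
# Ratings are hypothetical; levels coded 2 = Knowledge, 1 = Some Knowledge, 0 = Lack of Knowledge.

rater_a = {"student_01": 2, "student_02": 1, "student_03": 0, "student_04": 2, "student_05": 1}
rater_b = {"student_01": 2, "student_02": 2, "student_03": 0, "student_04": 2, "student_05": 1}

def percent_agreement(ratings_a, ratings_b):
    """Share of students who received the same rubric level from both raters."""
    shared = set(ratings_a) & set(ratings_b)
    matches = sum(ratings_a[s] == ratings_b[s] for s in shared)
    return matches / len(shared)

agreement = percent_agreement(rater_a, rater_b)
print(f"Exact agreement: {agreement:.0%}")  # 80% with the sample ratings above

# One common rule of thumb: refine the rubric wording and re-pilot if agreement
# falls much below 80%; the exact threshold is a local decision.
```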

MAJOR DATA COLLECTION METHODS
- Questionnaires, surveys, checklists
- Interviews
- Documentation review
- Benchmarks
- Observation
- Focus groups
- Case studies

ANALYZING AND INTERPRETING INFORMATION
Quantitative information (e.g., scores, ratings/rankings):
- Tabulate the information
- Calculate frequencies, means or averages, and percentages, and put them into a table, which will help visualization
- If possible, and of interest, compare scores between sub-groups (e.g., males vs. females)
- For each table, write a summary paragraph about what the numbers indicate
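As an illustration of these tabulation steps, here is a minimal sketch using a small, hypothetical satisfaction-survey dataset; the data, column names, and use of the pandas library are assumptions for the example, not part of the original presentation.

```python
# Minimal sketch: tabulating hypothetical survey ratings with pandas.
import pandas as pd

# Hypothetical data: one row per respondent, rating on a 1-5 scale.
df = pd.DataFrame({
    "gender": ["F", "M", "F", "F", "M", "M", "F", "M"],
    "satisfaction": [4, 3, 5, 4, 2, 4, 5, 3],
})

# Frequencies: how many respondents gave each rating.
print(df["satisfaction"].value_counts().sort_index())

# Overall mean and the percentage rating 4 or higher.
print("Mean rating:", df["satisfaction"].mean())
print("Percent rating 4+:", (df["satisfaction"] >= 4).mean() * 100)

# Sub-group comparison (e.g., males vs. females), ready to drop into a summary table.
print(df.groupby("gender")["satisfaction"].agg(["count", "mean"]))
```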

ANALYZING AND INTERPRETING QUANTITATIVE INFORMATION: EXAMPLE

ANALYZING AND INTERPRETING INFORMATION (CONT.)
Qualitative information (e.g., as collected through interviews, written essays, focus groups):
- Read through all the information and begin to organize it into "themes" as they emerge
- Attempt to identify patterns, or associations/causal relationships, in the themes
- Examine themes as they may vary between sub-groups
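Once responses have been coded into themes by hand, even a simple tally can surface the overall patterns and sub-group differences described above. A minimal sketch, assuming hypothetical theme codes assigned to focus-group comments (the groups and themes are invented for illustration):

```python
# Minimal sketch: tallying hand-assigned theme codes from qualitative responses.
from collections import Counter

# Hypothetical coded comments: (sub_group, theme) pairs assigned by a reviewer.
coded_comments = [
    ("first-year", "wait times"),
    ("first-year", "staff courtesy"),
    ("first-year", "wait times"),
    ("upper-class", "hours of operation"),
    ("upper-class", "wait times"),
    ("upper-class", "staff courtesy"),
]

# How often does each theme appear overall?
print(Counter(theme for _, theme in coded_comments))

# Do themes differ by sub-group?
for group in sorted({g for g, _ in coded_comments}):
    group_themes = Counter(theme for g, theme in coded_comments if g == group)
    print(group, group_themes)
```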

FROM MEASURES TO CRITERIA
- Assessment criteria reflect your expectations about student performance (i.e., how you know you were successful)
- Set criteria at reasonable but challenging levels
- Use benchmarks whenever possible, but sometimes you just have to guess!
- Criteria often take these forms:
  - "90% of students will …"
  - "80% of students will score at least 70% on …"

EVALUATING YOUR ASSESSMENT PLAN: SAMPLE ACTION PLAN
Objective: To increase students' knowledge of wellness and preventive health so that they can develop and/or maintain a healthy lifestyle
Outcome: Students participating in the SHS "Hot Safer Sex" workshop series will increase basic awareness and comprehension of sexually transmitted infections such as Chlamydia, HIV, GC, and HPV.
Assessment Measure: 25-item pre-test/post-test that evaluates knowledge of these topics
Expected Outcome: 70% of students will demonstrate improvement on post-test scores.
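Checking this action plan's expected outcome comes down to counting how many participants improved from pre-test to post-test and comparing that share to the 70% criterion. A minimal sketch with made-up scores (the 25-item test comes from the plan above; the student IDs and scores are hypothetical):

```python
# Minimal sketch: did at least 70% of workshop participants improve from pre- to post-test?

# Hypothetical scores (number correct out of 25) for the same students, pre and post.
pre_scores  = {"s1": 14, "s2": 18, "s3": 11, "s4": 20, "s5": 16}
post_scores = {"s1": 19, "s2": 17, "s3": 18, "s4": 23, "s5": 21}

improved = sum(post_scores[s] > pre_scores[s] for s in pre_scores)
percent_improved = improved / len(pre_scores) * 100

print(f"{percent_improved:.0f}% of participants improved.")
print("Criterion met" if percent_improved >= 70 else "Criterion not met")
```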

PARTING ADVICE
1. Relax – this is not brain surgery!
2. No assessment or evaluation is ever perfect – if you wait for that, you won't do anything.
3. Take a participatory approach, being sure to involve all staff members as much as possible – the more buy-in, the better.
4. Don't try to do everything at once, and don't try to assess all students.
5. Document, document, document!
6. CLOSE THE LOOP!

YOUR TURN!