Alliance for Graduate Education and Professoriate (AGEP) Evaluating Retention Strategies: Some Thoughts on the Homework. Patricia Campbell, Campbell-Kibler Associates, Inc.


Alliance for Graduate Education and Professoriate (AGEP) Evaluating Retention Strategies: Some Thoughts on the Homework Patricia Campbell, Campbell-Kibler Associates, Inc. February 2007

Wanda's Evaluation Question
Are your evaluation results:
- Credible: Do you trust them?
- Relevant: Do they give you information you can use, either to improve the strategies or to make judgments about the strategies' worth?
- Timely: Do you get the results fast enough to be able to use them?

Pat's Evaluation Assumptions
- The core evaluation question is "What works for whom in what context?"
- "Black hole" evaluations (data goes in; results never come back out) are bad.
- If you aren't going to use the data, don't ask for it.
- A bad measure of the right thing is better than a good measure of the wrong thing.
- Acknowledging WIIFM ("What's in it for me?") increases response rates.
- Process is a tool to help understand outcomes.
- Outcomes are at the core of accountability.

Your Evaluations of Retention Strategies
Two-thirds of the respondents do some evaluation of their retention strategies, most frequently:
- giving students surveys after each activity
- getting verbal feedback from students, formally and informally
- giving students annual surveys
- conducting student exit surveys.
One quarter track students' progress to degree as part of their evaluations. Ten percent track student participation in retention activities. One project is linking student participation in activities to student progression through their academic programs (a sketch of such a linkage follows below).
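As a minimal sketch of what linking participation to progression might look like, assume two hypothetical files: participation.csv (one row per student per retention activity attended) and progression.csv (one row per student with a milestone indicator such as passed_quals). The file and column names here are illustrations, not anything prescribed by AGEP:

```python
# Minimal sketch: linking activity participation to a progression milestone.
# File names and column names are hypothetical illustrations.
import pandas as pd

participation = pd.read_csv("participation.csv")  # columns: student_id, activity
progression = pd.read_csv("progression.csv")      # columns: student_id, passed_quals (0/1)

# Count retention activities attended per student
counts = (participation.groupby("student_id").size()
          .rename("n_activities").reset_index())

# Left-join so students with no recorded participation are kept (count = 0)
merged = progression.merge(counts, on="student_id", how="left")
merged["n_activities"] = merged["n_activities"].fillna(0)

# Compare milestone rates for participants vs. non-participants
merged["participated"] = merged["n_activities"] > 0
print(merged.groupby("participated")["passed_quals"].mean())
```

A comparison like this is descriptive, not causal: students who choose to participate may differ from those who do not, which is one reason the core question is "What works for whom in what context?"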

Your Evaluations of Retention Strategies: More Comprehensive Efforts
"We evaluate all AGEP activities. The evaluation has taken a variety of forms, including tracking students, surveys, focus groups, and collection of anecdotal information from students."
"Students receive an evaluation form at the end of each activity. Evaluation results are compiled and used to plan subsequent activities. One campus holds an evaluation brunch each April to assess all of the activities; an external consultant facilitates. Program-wide results are being assessed, and participation is being linked to students' progression through their academic programs."
Other projects study specific components related to retention, such as graduate advising and special summer programs.

Some Thoughts on Measurement
- Don't reinvent the wheel; where possible, use existing measures.
- Share measures with other projects. Common questions can be useful.
- Look for benchmark measures that are predictors of your longer-term goals.
- Remember that self-developed measures need some checking for validity and reliability (a quick reliability check is sketched below).
- A sample with a high response rate is better than a population with a low one.
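On checking self-developed measures: one common quick check of reliability (internal consistency) for a multi-item scale is Cronbach's alpha, alpha = (k/(k-1)) * (1 - sum of item variances / variance of total score). A minimal sketch with hypothetical Likert-scale responses (rows are respondents, columns are items on one scale):

```python
# Minimal sketch: Cronbach's alpha as a reliability check for a
# self-developed multi-item survey scale. The data below is hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Example: five respondents answering a three-item scale (1-5 Likert)
responses = np.array([
    [4, 5, 4],
    [3, 3, 2],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 4],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Values around 0.7 or higher are conventionally taken as acceptable internal consistency. Reliability is only half the check: validity still needs separate evidence, such as expert review or comparison against an established measure.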

For Those Who Are Doing It Yourselves: Some Web-based Resources
- OERL, the Online Evaluation Resource Library
- User Friendly Guide to Program Evaluation
- AGEP Collecting, Analyzing and Displaying Data
- American Evaluation Association
- Center for the Advancement of Engineering Education