· IUPUI · Conceptualizing and Understanding Studies of Student Persistence University Planning, Institutional Research, & Accountability April 19, 2007

· IUPUI · Overview  Framing the persistence problem  Understanding results of retention studies  Providing perspective on concepts using IUPUI example

· IUPUI · Framing the Problem  How should we define and measure persistence?  Graduation rates  An entering cohort approach  Probability of graduating within 150% of program length  How do these rates vary by student characteristics?  Time to graduation  A graduating cohort approach  Number of years (months) from matriculation to graduation  How does this time vary by student characteristics?
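The entering-cohort calculation described above can be sketched in a few lines; the cohort data and variable names below are invented for illustration.

```python
# Hypothetical sketch of a graduation rate computed on an entering cohort:
# "150% of program length" for a 4-year program is the familiar 6-year window.
PROGRAM_YEARS = 4
WINDOW_YEARS = PROGRAM_YEARS * 1.5  # 150% of program length -> 6 years

# (matriculation year, graduation year or None if no degree earned)
cohort = [(2007, 2011), (2007, 2012), (2007, 2014), (2007, None), (2007, 2011)]

graduated_in_window = sum(
    1 for entered, finished in cohort
    if finished is not None and finished - entered <= WINDOW_YEARS
)
rate = graduated_in_window / len(cohort)
print(f"150% graduation rate: {rate:.0%}")  # -> 60% for this toy cohort
```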

· IUPUI · Framing the Problem  How should we define and measure persistence?  Retention/departure measured at a single interval  Between two academic years  Between two semesters  Within a single semester These three approaches assume time-invariant predictors: The effects of characteristics on retention/departure (or even the characteristics themselves) do not change over time

· IUPUI · Framing the Problem  How should we define and measure persistence?  Retention/departure measured at multiple intervals  Can capture timing of departure  Assume time-variant predictors of retention/departure  Account for changes to the student body due to self selection However… Methods for examining persistence under this framework can be very complex and are relatively new to many in IR

· IUPUI · Framing the Problem  How should we define and measure persistence?  Type of departure most often studied  Return vs. Do not return (in most general sense)  Other possible characterizations  Continuous Enrollment vs. Stop-out vs. Permanent absence  Transfer vs. Dropout (from higher education)  Voluntary withdrawal vs. Academic expulsion

· IUPUI · Understanding Retention Results  Most common approach to study of persistence  Retention/departure measured at a single interval  Interval: Academic year  Dichotomous outcome: Return vs. Do not return  e.g., Second year retention among first-time students  Methods for dichotomous outcomes  More common: Logit (a.k.a. logistic regression)  Less common: Probit
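As a minimal illustration of why logit (rather than a linear probability model) suits a dichotomous outcome like return vs. do not return, the sketch below uses invented coefficients; it is not fitted to any real data.

```python
import math

# Invented intercept and slope for a single predictor (e.g., first-semester GPA)
b0, b1 = -1.0, 0.9

def linear_prob(gpa):
    # Linear probability model: can produce "probabilities" outside [0, 1]
    return b0 + b1 * gpa

def logit_prob(gpa):
    # Logistic model: predictions always fall strictly between 0 and 1
    return 1 / (1 + math.exp(-(b0 + b1 * gpa)))

for gpa in (0.0, 2.0, 4.0):
    print(f"GPA {gpa}: linear={linear_prob(gpa):+.2f}, logit={logit_prob(gpa):.2f}")
```

Probit swaps the logistic function for the normal CDF; in practice the two give very similar predicted probabilities.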

· IUPUI · Understanding Retention Results  Three common formats for presenting results  Used least often: Predicted probabilities  Used more often: Changes in probability (Delta-p)  Used most often: Odds ratios  All formats are related (and as such, are easily confused)  So what’s the difference?

· IUPUI · Understanding Retention Results  Predicted probabilities  Two common approaches:  Ceteris paribus (i.e., all else being equal)  Isolates the “effect” of a particular characteristic (e.g., gender)  Assumes that students are average on all other characteristics  All else being equal, Females = 0.85, Males = 0.75  Hypothetical (within reason!) student  Allow multiple characteristics to vary  Nonresident male with $2000 unmet need = 0.35  Resident female with $0 unmet need = 0.90

· IUPUI · Understanding Retention Results  Delta-p (i.e., change in probability)  Based on ceteris paribus approach  The female “effect” = female prob. – male prob.  0.85 – 0.75 = 0.10  Beware the misinterpretation of the delta-p!  Correct: A ten percentage point difference in prob.  Incorrect: A ten percent difference in prob.  What is the percent diff? (0.85 – 0.75)/0.75 = 13%
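The percentage-point vs. percent distinction can be checked directly with the slide's own numbers:

```python
p_female, p_male = 0.85, 0.75

delta_p = p_female - p_male   # 0.10: a 10 *percentage point* difference
pct_diff = delta_p / p_male   # 0.133...: a 13 *percent* difference

print(f"delta-p = {delta_p:.2f} (percentage points)")
print(f"percent difference = {pct_diff:.0%}")
```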

· IUPUI · Understanding Retention Results  Odds  P/(1 - P) = Odds  Females: 0.85/(1 - 0.85) = 5.7  Males: 0.75/(1 - 0.75) = 3.0  Odds ratio (literally the ratio of two odds)  Odds ratio for females versus males: 5.7/3.0 = 1.89  Odds ratio for males versus females: 3.0/5.7 = 0.53

· IUPUI · Understanding Retention Results  Interpretation of odds ratios  OR ~ 1 = No difference in odds  OR > 1 = Greater odds (females have greater odds than males)  OR < 1 = Lower odds (males have lower odds than females)  OR can be expressed in terms of percentages  OR 1.89 = 89% greater odds  OR 2.89 = 189% greater odds  OR 0.53 = 47% lower odds
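The odds and odds-ratio arithmetic from the last two slides, reproduced as a checkable snippet:

```python
def odds(p):
    # Odds = P / (1 - P)
    return p / (1 - p)

odds_female = odds(0.85)             # ~5.67
odds_male = odds(0.75)               # 3.0
or_f_vs_m = odds_female / odds_male  # ~1.89 -> 89% greater odds
or_m_vs_f = odds_male / odds_female  # ~0.53 -> 47% lower odds

print(round(or_f_vs_m, 2), round(or_m_vs_f, 2))  # 1.89 0.53
```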

· IUPUI · Understanding Retention Results  Beware the misinterpretation of odds ratios!  Compared to males:  Correct: Females have 89% greater odds...  Incorrect: Females have an 89% greater probability…  Incorrect: Females have an 89% greater likelihood…

· IUPUI · Understanding Retention Results  Advantage of Delta-p  Discrete change in probability is more intuitive Remember: Delta-p is not equal to % change!  Limitation of Delta-p  Delta-p is assessed for the “average” student  “Average” student ~ overall probability  Logistic “probability” curve is not linear  Size of delta-p depends on overall probability  Practical significance not contextualized via overall probability

· IUPUI · Understanding Retention Results  Limitation of Delta-p (continued)  Logistic Curve

· IUPUI · Understanding Retention Results  Limitation of Delta-p (continued)  If overall probability were ~ 0.50

· IUPUI · Understanding Retention Results  Limitation of Delta-p (continued)  If overall probability were ~ 0.80

· IUPUI · Understanding Retention Results  Advantage of Odds Ratio  Is not tied to location within distribution [Table: overall, female, and male probabilities with the corresponding female and male odds; female-vs-male odds ratio = 2.01]

· IUPUI · Understanding Retention Results  Limitations of Odds Ratio  What’s an odds ratio again? (Not intuitive)  Is not tied to location within distribution!  Female odds are 3 times greater than odds for males!  Sounds like a big deal. Is it? It depends…  Overall prob 0.50, Delta-p = 0.27 Wow!  Overall prob 0.98, Delta-p = 0.03 Hmph!
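The "it depends" point can be made concrete: applying a fixed odds ratio of 3 at different baseline probabilities yields very different delta-p values. The male baselines below are invented to land near the slide's two scenarios.

```python
def apply_or(p_base, odds_ratio):
    # Convert baseline probability to odds, scale by the odds ratio,
    # then convert back to a probability.
    o = p_base / (1 - p_base) * odds_ratio
    return o / (1 + o)

for p_male in (0.37, 0.97):
    p_female = apply_or(p_male, 3.0)
    print(f"male={p_male:.2f} female={p_female:.2f} delta-p={p_female - p_male:.2f}")
```

The same odds ratio of 3 is roughly a 27-point probability gap near the middle of the logistic curve but only about a 2-point gap near its ceiling.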

· IUPUI · Understanding Retention Results  Predicted Probabilities: Why I like ‘em…  Most intuitive approach to presenting results  Can be calculated ceteris paribus or for a hypothetical student  Can easily derive Delta-p from probabilities  Final Precaution  Any of these formats for presenting results is only as good (i.e., accurate or plausible) as the statistical model from which it is derived

· IUPUI · Understanding Retention Results  Questions to ask yourself (or others!)  How are the results reported?  Predicted prob., delta-p, or odds ratios  If reported as odds ratios…  Are odds ratios being correctly interpreted?  i.e., reported as % change in odds?  If reported as delta-ps…  Are delta-ps being correctly interpreted?  i.e., reported as percentage point change in probability?  To assess practical sig. of delta-p, is overall probability provided?

· IUPUI · Perspective: An IUPUI Example  A different look at IUPUI’s one-year retention rate  Considers the one-year retention rate as a set of sequential decisions  Retention between fall and spring semesters  Retention between spring and second academic year  Two outcomes, two models  Different from a single model spanning the beginning of the first year to the second year  Assumes reasons for retention/departure differ over time  Uses time-varying predictors to capture differential “effects”  Sample: IUPUI full-time beginners (2004 and 2005 cohorts)
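The sequential-decision structure implies two different estimation samples, which is where self-selection enters; the records below are invented.

```python
# Model 1 (fall -> spring) uses the full entering cohort; model 2
# (spring -> second year) uses only students who returned in spring.
cohort = [
    {"id": 1, "returned_spring": True,  "returned_year2": True},
    {"id": 2, "returned_spring": True,  "returned_year2": False},
    {"id": 3, "returned_spring": False, "returned_year2": False},
    {"id": 4, "returned_spring": True,  "returned_year2": True},
]

model1_sample = cohort                                       # everyone
model2_sample = [s for s in cohort if s["returned_spring"]]  # self-selected

print(len(model1_sample), len(model2_sample))  # 4 3
```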

· IUPUI · Perspective: An IUPUI Example  Full-time Beginner Cohort [Diagram]  Returned for spring: 86% (did not return: 14%)  Returned for fall of second year: 74% (did not return: 26%)

· IUPUI · Perspective: An IUPUI Example  Predictors of retention (time invariant)  Age (20+ vs. less than 20)  Gender (Female vs. male)  Race (Hispanic, African American vs. other race)  State residency (Non-resident vs. resident)  Campus residence (Live on campus vs. live off campus)

· IUPUI · Perspective: An IUPUI Example  Predictors of Retention (time variant)  Credit load earned (Full-time vs. less than full-time)  Semester GPA  Completed FAFSA  Second semester = Current year  Second year = Reapplied for subsequent year  Unmet need (i.e., need – total aid)  Net aid (i.e., total aid above need)

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Semester Retention (Remember: “All else being equal”)  Race  Hispanic prob. = 0.91  Other race prob. = 0.85 (not including African Americans)  Fall credit load earned  Full-time prob. = 0.89  Part-time prob. = 0.80  FAFSA for current year  Completed prob. = 0.87  Did not complete prob. = 0.76

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Semester Retention (Remember: “All else being equal”)  Fall Semester GPA [Chart: predicted probability by fall semester GPA]

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Semester Retention (Remember: “All else being equal” except FAFSA and Net Aid)  Unmet Need [Chart: predicted probability by unmet need]

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Year Retention (Remember: “All else being equal”)  Age  20+ prob. = 0.66  < 20 prob. = 0.75  Campus residence  On campus prob. = 0.69  Off campus prob. = 0.75

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Year Retention (Remember: “All else being equal”)  Spring credit load earned  Full-time prob. = 0.78  Part-time prob. = 0.67  FAFSA for subsequent year  Did not reapply prob. = 0.51  Reapplied prob. = 0.77  Newly applied prob. = 0.92

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Year Retention (Remember: “All else being equal”)  Spring Semester GPA [Chart: predicted probability by spring semester GPA]

· IUPUI · Perspective: An IUPUI Example  Significant Predictors of Second Year Retention (Remember: “All else being equal” except FAFSA)  Subsequent year unmet need and net aid [Chart: predicted probability by subsequent-year unmet need and net aid]

· IUPUI · Perspective: An IUPUI Example  Summary  Time invariant predictors get “turned on” at different times  Second semester: Race  Second year: Age, Campus residence  Time variant predictors have differential “effects”  Unmet need isn’t as strong a predictor of second year retention  May be due to self selection after first semester  May be due to a failure to reapply “effect”

· IUPUI · Conceptualizing and Understanding Studies of Student Persistence University Planning, Institutional Research, & Accountability April 19, 2007

· IUPUI · Other Pertinent Issues  Financial Aid Effects and False Attribution  Type and amount of aid awarded is tied to criteria (student characteristics) that also predict retention  Example 1  Lower income → lower prob. of persisting  Lower income → more need-based aid  More need-based aid → lower prob. of persisting  Example 2  Higher SAT → higher prob. of persisting  Higher SAT → more merit aid  More merit aid → higher prob. of persisting
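A small simulation (all numbers invented) illustrates the false-attribution mechanism: aid here has no causal effect on persistence, yet because aid is assigned from the same income measure that drives persistence, the raw aid-persistence correlation comes out negative.

```python
import random

random.seed(1)

def pearson(xs, ys):
    # Plain Pearson correlation, written out to keep the sketch stdlib-only
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

incomes = [random.uniform(20_000, 120_000) for _ in range(2_000)]
# Need-based aid: assigned purely from income (more need -> more aid)
aid = [max(0.0, 60_000 - inc) / 10 for inc in incomes]
# Persistence: driven by income alone; aid plays no causal role
persist = [1.0 if random.random() < 0.4 + inc / 300_000 else 0.0
           for inc in incomes]

print(f"corr(aid, persistence) = {pearson(aid, persist):.2f}")  # negative
```

Controlling for the selection criterion (income) in a regression would remove this spurious negative "effect" of aid.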

· IUPUI · Other Pertinent Issues  Financial Aid Effects and False Attribution  Research results have been inconsistent as a result  Must do more to separate the effects of selection criteria from the effects of dollar amounts  IR and other higher education researchers are just starting to address this issue