Bill DeBaun Program Analyst, NCAN

Presentation on theme: "Bill DeBaun Program Analyst, NCAN"— Presentation transcript:

1 Bill DeBaun Program Analyst, NCAN
College Access Data: Developing Common Measures and NCAN's Benchmarking Study. Bill DeBaun, Program Analyst, NCAN. NCAN has over 350 college access and success programs from all across the country. The membership is diverse and includes 501(c)(3)s, state agencies, GEAR UP programs, foundations, school districts, and IHEs. Member benefits include tools and information for advisors, executive directors, and others involved in the work of college access and success: strategies for building capacity (e.g., an e-learning platform for professional development), member discounts for services like the National Student Clearinghouse, webinars, newsletters, and weekly updates on techniques and news relevant to the college access field. And of course our conference is a top national professional development resource for the field. I'm here today to talk about data, its uses in the college access field, some specific data-related projects that NCAN has undertaken, and why these are valuable for the field.

2 What is data? If the question here was "Who is Data?" the answer would be, "Data is an android who appeared in the hit sci-fi television series Star Trek: The Next Generation." If you aren't a Trekkie, fear not, because the question we want to ask is "What is data?" Data, broadly, of course are bits of information or facts that can be coordinated to better analyze, understand, or plan some other phenomena that we see in our world. That's fairly broad. Now, I don't mean to belabor the point, but I think that getting into a mindset of how data can be useful and what kinds of questions they can answer will be helpful when we're looking at specific applications during this session. Before we start, let's brainstorm about data. Just call out some data points that you use frequently; go ahead, call them out. Okay, great. Now call out some questions that either you can answer with the data your program has or that you would like to be able to answer but can't currently. Perfect. So you gave some great examples of how data can be used. Let's keep going and narrow down how data, which we've already established are varied and broad, can be put in some contexts that you are probably more familiar with in the college access field.

3 What is data? Quantitative vs. Qualitative
3/27/15 – My meeting with Joe went well today. He is highly motivated and ready to take some more rigorous classes in order to better prepare himself for college. He is concerned about paying for college. So let's start off with some pretty common categorizations. You have quantitative data and qualitative data. Quantitative data, no surprise, can be quantified, or counted. These are what people tend to think of as "data": percentages, averages, year-over-year performance, counts, medians, etc. Qualitative data, as many of you probably know, are also key in college access. Sometimes a student's GPA or ACT score provide insights that are equally as valuable as knowing a student's temperament or their family situation. So in the examples above we have some comparisons of a given program's benchmarks against NCAN's benchmarks, and then we have some case file notes on a student named Joe. Both of these data points tell us something from which we can draw insights and next steps.

4 What is data? Inputs vs. Outputs vs. Outcomes
Inputs: budget, number of staff. Outputs: number of advisory sessions, scholarship dollars awarded, FAFSAs completed. Outcomes: graduation rate, enrollment rate, college completion rate, X college graduates. If you've ever looked at or developed a logic model, the lingo on this slide should be fairly familiar to you. If you're not familiar with logic models, they're a tool that programs can use to measure their effectiveness. Data can also be categorized by where they fall in the way a program is organized. For example, you can collect input data, such as budget numbers, the number of staff you have, or the number of computers per student. Those inputs then get turned into the activities undertaken by the program. Outputs, then, are the result of these activities, and we've listed some of these here. All of these outputs are also quantifiable. Finally, outcomes are the long-term or end-result change of the inputs, activities, and outputs: what is the benefit of the program's efforts? All of these are also data points.

5 What is data? Formative vs. Summative
Which students are below a 2.0 GPA this quarter? How many FAFSAs were completed last week? Last month? What were our enrollment and FAFSA completion rates this year? How did they compare to national data? To last year's data? We can also categorize data by the point in time at which we use them. Formative data are often collected at set periods during, for example, an academic year; these data shape our knowledge in real time and offer us the ability for course correction or interventions. In the examples above, a program could work with the students who are found to have low GPAs, or efforts can be intensified if FAFSA completions are behind schedule. Summative data are collected or analyzed at the end of a given period, and they look back. These are where your annual reports, year-over-year longitudinal data, etc. come in.

6 What is data? Disaggregation
Data can also be examined through its subparts. Disaggregation, that is, breaking down or analyzing data according to some variable or combination of variables, is critical for examining gaps in performance between student groups. If we only had the chart on the left for program X, we would think it's doing fairly well. And it is. An 81% enrollment rate is pretty good, although we don't know any other context about the program model, size, etc. But when we then disaggregate this data by student race and gender, we see that outcomes are better for some students than others. Disaggregating this data makes it more actionable: programs can now take a look at the services being provided to black and Hispanic students and try to tweak them to improve performance.
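As a toy illustration of disaggregation, the snippet below computes an overall enrollment rate and then breaks it down by one or more student characteristics. All of the records, field names, and resulting rates here are fabricated for the example and are not NCAN data.

```python
from collections import defaultdict

# Hypothetical student records; the fields and values are illustrative only.
students = [
    {"race": "White", "gender": "F", "enrolled": True},
    {"race": "White", "gender": "M", "enrolled": True},
    {"race": "Black", "gender": "F", "enrolled": True},
    {"race": "Black", "gender": "M", "enrolled": False},
    {"race": "Hispanic", "gender": "F", "enrolled": False},
    {"race": "Hispanic", "gender": "M", "enrolled": True},
]

def enrollment_rates(records, by):
    """Disaggregate the enrollment rate by the given student characteristics."""
    totals = defaultdict(int)    # students per subgroup
    enrolled = defaultdict(int)  # enrolled students per subgroup
    for r in records:
        key = tuple(r[field] for field in by)
        totals[key] += 1
        if r["enrolled"]:
            enrolled[key] += 1
    return {key: enrolled[key] / totals[key] for key in totals}

# The aggregate rate hides the gaps that the disaggregated rates reveal.
overall = sum(r["enrolled"] for r in students) / len(students)
by_race = enrollment_rates(students, by=["race"])
by_race_gender = enrollment_rates(students, by=["race", "gender"])
```

The same pattern extends to any combination of variables (race by gender, first-generation status by Pell receipt, and so on), which is exactly why the Common Measures prescribe collecting those student characteristics up front.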

7 Data Challenges for NCAN Members?
Photo via Creative Commons license from:

8 NCAN’s Common Measures
Data are everywhere. College access is no different. Lingua franca for a diverse set of members. Research-backed and member-developed. All of this brings us to NCAN's Common Measures. We know that data are ubiquitous in every other field and facet of daily life. College access and success are no different. The NCAN membership is very diverse, however, and having a common language about which data even very different programs should be considering was a request we were increasingly getting from our members. Some programs were very outcomes-focused while others were very inputs-focused. So in 2012, NCAN embarked on a process to develop this common language. Multiple rounds of input were solicited from members, who discussed which data they were most often using and which they found most useful to track in getting students to and through college. Simultaneously, NCAN conducted an intensive literature review to see what research said about the metrics our members were discussing. Metrics backed by RCTs or other rigorous methodologies were given emphasis. Ultimately, we wound up with the Common Measures, a set of access and success indicators that represents both what our members are tracking and what research says has an impact on student enrollment and completion.

9 NCAN’s Common Measures
Are on your table and serve as a free gift. Thanks for coming!
"Access" vs. "Success"
"Essential" vs. "If Available"
Categorical
The end result of those efforts in 2012 is on your tables there. Thanks for attending; here's a memento to remember me by. I hope you'll never forget this presentation. The Common Measures break down along a few different lines. First, there are Access and Success metrics. The Access metrics all serve as milestones along the path to college enrollment: academic, financial aid, admissions, and testing indicators. The Success metrics all revolve around what moves a student toward college completion: pre-enrollment, enrollment, persistence, academic, and financial aid indicators. The metrics are also broken down into "Essential" and "If Available" indicators. If you're perceptive about context clues, you've probably already picked up on the intention that, all else being equal, programs prioritize the essential indicators. There are 14 essential indicators: 7 access and 7 success. Finally, I mentioned earlier how important disaggregation is, and so the Common Measures are prescriptive about the characteristics programs should be tracking about their students so that their data can be broken down by these characteristics later.

10 And Now for Some Carpentry
We're going to change gears and talk about benchmarking now. You see, benchmarking… carpentry… okay, let's not dwell on my inability to make a joke in a conference presentation. The other data-related endeavor that NCAN has taken on is our Benchmarking Project, now in its second year. We're really excited about the potential this project holds, not just for our members but also for the field.

11 Benchmarking: Putting Data in Context
Comparing organizational performance to industry-wide performance
Other examples: PISA, airport arrivals/departures, auto safety
Partnered with NSC
NCAN's Benchmarking Project, fundamentally, comes from a desire to put performance data in context. Many NCAN members can tell you their students' enrollment and completion rates, but it may be difficult to tell how that performance stacks up without some kind of reference point for the standard level of performance in the field. The process of benchmarking is common in many fields. Another notable example in education is the PISA international assessment, which lets countries compare their performance against each other and an international average. Automakers do the same in terms of automobile safety. Not only do we want our individual members to have a reference against which to compare their performance to other NCAN members' students, we'd also like to be able to compare NCAN member-served students' performance, broadly, against how similar students fare nationwide. To get to this point, we teamed up with the National Student Clearinghouse and the National Student Clearinghouse Research Center.

12 NCAN’s Benchmarking Report
Sample              2014 Report (Year 1)   2015 Report (Year 2)
Programs Involved   24                     42
Student Records     155,620                130,538
2014 was the first year of the Benchmarking Project. We asked programs to voluntarily submit data on students from the 2007, 2008, 2009, and 2013 cohort years so that we could procure 4-, 5-, and 6-year graduation rates as well as enrollment rates. The response we got was fairly healthy. Programs submitted their data through the NSC's StudentTracker service, and then the NSCRC calculated the enrollment and graduation rates for us. You can see in the chart here that this year, the second year, the response was even better in terms of number of programs. A program that participated in year 1 declined to do so in year 2, which is why we see the overall number of student records drop, but the expansion to 42 programs is one that we are hoping to continue to build on. Worth noting, as you see the programs involved and the student records here, is that we make no claims that this is a representative sample, either of NCAN member-served students, first-generation students nationally, or even low-income students nationally. Programs that participated in either round did so voluntarily. The hope, as the project progresses, is that we will have more and more programs sign on and we'll be able to approach a more representative sample. At this point what we can say is that this represents some NCAN member programs, which is a good start and gives programs some reference point to compare against, but it does merit a caveat. The other caveat is that the Benchmarking Project is not a longitudinal study. There's no guarantee that we have the same students or the same programs from year to year, though there is some overlap. The extent to which that overlap exists, and the extent to which we can make this truly longitudinal, are both things we're looking at as we get further down the road.
Every year, we're hoping to build on the project to make it more robust and improve the insights we can pull from it.

13 NCAN’s Benchmarking Report
Indicators collected:
2014 Report (Year 1): Pell recipient; scholarship recipient; EFC >= $5,000
2015 Report (Year 2): the above, plus EFC; interventions received; selectivity of program; race/ethnicity; gender; first-generation status
Speaking of improving insights, here we have the indicators that were collected in year 1 and year 2 of the Benchmarking Report. In year 1, we collected only a handful of variables: whether or not students were Pell recipients, whether they received a scholarship from the organization that was submitting their data, and whether their family's EFC was over or under $5,000. What's missing here in year 1? We didn't follow our own advice and include variables on which we could meaningfully disaggregate to look for differential performance among student groups. So in year 2 we included students' gender, race/ethnicity, and Hispanic heritage, as well as their first-generation status. We also asked programs to identify which of 8 types of interventions students participated in. We have a report coming out on Wednesday from the second round of data, and we were able to provide a lot more detail at deeper levels of analysis than we were in the first round. We hope to continue to refine the data that we ask for as we figure out both what members are interested in and, more importantly, what they are collecting and able to report.

14 NCAN’s Benchmarking Report
2014 Report Results - Enrollment
High-Income, Low-Minority, Suburban High Schools: 77%
High-Income, Low-Minority, Urban High Schools: 76%
*NCAN Class of 2007*: 71%
*NCAN Class of 2009*: 70%
*NCAN Class of 2008*
High-Income, Low-Minority, Rural High Schools: 69%
High-Income, High-Minority, Suburban High Schools
High-Income, High-Minority, Urban High Schools: 68%
High-Income, High-Minority, Rural High Schools: 66%
*NCAN Class of 2013*^: 65%
^ NCAN 2013 Cohort is a partial year of enrollment. Compared to NSCRC Class of 2014 data.
The NCAN cohort years are higher than all 6 categories of low-income schools, even the partial class of 2013. The full-year cohorts (2007, 2008, and 2009) are all higher than 4 classes of high-income high schools. We're encouraged by this finding because it means that ZIP code is not destiny for these students. With the proper supports offered to them by NCAN member programs, our students can succeed at rates that approach, meet, or exceed those of their higher-income peers. Certainly this is something a lot of you working with GEAR UP programs can appreciate!

15 NCAN’s Benchmarking Report
2015 Report Results - Enrollment
High-Income, Low-Minority, Suburban High Schools: 77%
High-Income, Low-Minority, Urban High Schools: 76%
*NCAN Class of 2008*: 74%
*NCAN Class of 2009*: 71%
*NCAN Class of 2010*: 69%
High-Income, Low-Minority, Rural High Schools
High-Income, High-Minority, Suburban High Schools
High-Income, High-Minority, Urban High Schools: 68%
High-Income, High-Minority, Rural High Schools: 66%
*NCAN Class of 2014*^: 64%
^ NCAN 2014 Cohort is a partial year of enrollment. Also compared to NSCRC 2014 data.

16 NCAN’s Benchmarking Report
Results - Completion
NCAN is approaching the national figure released by the NSCRC. (Keep in mind that this is not a representative sample, but it is the most timely comparison out there, given that the federal data collection for postsecondary statistics leaves a lot to be desired.) This is for students enrolling within one year of high school graduation. NCAN's 2007 and 2008 cohorts actually have higher six-year graduation rates than the NSC cohorts of all students who first enrolled in those years. But we use the subset of students in the NSC cohorts who are age 20 or younger at first enrollment because we feel that is a more appropriate comparison, given what we know about the programs that have submitted data to us for benchmarking.

17 What's Next?
For CMs:
- Toolkit for better using these
- Handbook that talks about the research
- Information on evaluation and becoming data-driven
- Continuously reevaluating the research landscape and trying to add to and strengthen these measures
- Continuing to work with and push members to adopt these into their programs
For Benchmarking:
- More participating programs
- Refining the variables being collected
- More rigorous analysis with statistical modeling (probit, logit)
- Additional reports on which students receive which services (completers, non-completers, never enrolled)
- Looking for patterns and "the secret sauce": which combinations of services are most strongly tied to students enrolling and completing
NCCEP's evaluation is a great model of this, and much more robust; they should be proud of the insights coming out of it. Kudos to Dr. Chrissy Tillery and her research team.
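To make the statistical-modeling bullet concrete, here is a minimal logit sketch in plain Python, fit by gradient ascent on the log-likelihood. The data are fabricated for illustration (a single binary service indicator and a completion flag); nothing here reflects NCAN's actual data or methodology.

```python
import math

# Fabricated (received_service, completed) pairs: completion is more common
# among served students in this toy dataset.
data = [(1, 1)] * 60 + [(1, 0)] * 40 + [(0, 1)] * 30 + [(0, 0)] * 70

def fit_logit(data, lr=0.1, steps=2000):
    """Fit intercept b0 and slope b1 of P(y=1|x) = 1/(1+exp(-(b0+b1*x)))
    by gradient ascent on the average log-likelihood."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)        # gradient w.r.t. intercept
            g1 += (y - p) * x    # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logit(data)
# A positive b1 means the service is associated with higher completion odds;
# exp(b1) is the estimated odds ratio for receiving the service.
```

In practice one would use a statistics package rather than hand-rolled gradient ascent, and a probit model differs only in replacing the logistic link with the normal CDF; the sketch is just to show what the bullet's "statistical modeling" refers to.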

18 Parting Thought: “Data! Data! Data! I can’t make bricks without clay!”

19 7 Important Building Blocks for Robust Evaluation Designs

20 How Does Evaluation Help?
- Highlights effective components
- Recognizes achievements
- Replicates successes
- Assesses and prioritizes needs
- Targets improvements
- Advocates for programs/interventions

21 Seven Building Blocks for Robust Evaluation
- Theory
- Literature Review
- Conceptual Model
- Logic Model
- Standardized Data Collection
- Appropriate Statistical Analyses
- Fidelity Plan

22 Seven Building Blocks for Robust Evaluation
Theory
- Proven path
- Ensures no harm
- Heightens professional approach

23 Seven Building Blocks for Robust Evaluation
Literature
- Evidence-based interventions
- Appropriate measurement tools

24 Seven Building Blocks for Robust Evaluation
Conceptual Model
A picture of the theory as applied

25 Conceptual Model Using Life Course Theory
[Diagram] Life course stages: Prenatal → Infancy → Childhood → Adolescence → Adulthood → Old Age, with milestones (medical, school, family, intrapersonal development) at each stage, including starting school and the transition from high school to adulthood.
Risk factors: socioeconomic disparities, negative perceived stress, weakened resilience, poor school/job performance.
Protective factors: adequate income, education on reframing stressors, parent groups, self-care, preparation for school, healthy family and social relationships, preparation for medical self-care, positive beliefs about health behavior, education on stressors, school support, school/job support, healthy relationships.
Supports: parent support (emergency funds, education, referrals, counseling, support group, advocacy) and child support (tutoring, peer support).

26 Seven Building Blocks for Robust Evaluation
Logic Model A visual representation of how a program will work (process) and anticipated results (outcomes)

27 Basic Logic Model
Social Problem: The primary social problem addressed by the organization.
Mission: The overall goal of the organization to address the social problem.
For Whom: The people inherent in the social problem (direct and indirect).
Assumptions: Information accepted as true, based on observation, literature, or experience.
Inputs: A program's resources and constraints (e.g., money, staff, laws, regulations).
Activities: What a program does with its inputs to fulfill its mission (e.g., program services).
Outputs: Products of a program's activities (e.g., number of services provided or clients served).
Outcomes: Changes in participants' knowledge, attitudes, or skills, both intermediate and long range (e.g., a participant gains knowledge about x).
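The columns of the basic logic model above can be sketched as a simple data structure, which some programs find useful for keeping their model machine-readable. The field names mirror the slide's columns; the sample program filled in below is entirely hypothetical.

```python
from dataclasses import dataclass
from typing import List

# A minimal sketch of the logic-model columns as a record type;
# the example values are illustrative, not a real NCAN member program.
@dataclass
class LogicModel:
    social_problem: str
    mission: str
    for_whom: List[str]
    assumptions: List[str]
    inputs: List[str]
    activities: List[str]
    outputs: List[str]
    outcomes: List[str]

model = LogicModel(
    social_problem="Low college enrollment among low-income students",
    mission="Help underserved students enroll in and complete college",
    for_whom=["High school seniors", "Their families"],
    assumptions=["Advising increases FAFSA completion (literature)"],
    inputs=["Budget", "Advisors", "Laptops"],
    activities=["One-on-one advising", "FAFSA workshops"],
    outputs=["Number of advising sessions", "FAFSAs completed"],
    outcomes=["Enrollment rate", "Completion rate"],
)
```

Writing the model down this way makes the input → activity → output → outcome chain explicit, which is the same chain the "What is data?" slide earlier used to categorize data points.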

28 Seven Building Blocks for Robust Evaluation
Standardized Data Collection
- Plan for data collection (before and after the activity, to measure change toward goals)
- Measurement tools

29 Seven Building Blocks for Robust Evaluation
Appropriate Statistical Analyses
- Data that describe what you did
- Data that measure change in outcomes

30 Seven Building Blocks for Robust Evaluation
Fidelity Plan
A plan to ensure the evaluation was implemented the way it was designed

31 Bill DeBaun, Program Analyst, NCAN (debaunb@collegeaccess.org)
Questions? Bill DeBaun, Program Analyst, NCAN
For contracted evaluation services contact: Jon Meyer, PhD

