Presentation transcript:

Rationale In examining completion outcomes, student-level characteristics are key. Disaggregating data by student group is important both for targeting interventions and for evaluating outcomes. In addition to asking “How many graduated?” we should ask “Who graduated?”

Completion With a Purpose Predictive Analytics: College Readiness Tool, Academic Progress Tool, Financial Risk Tool. Diagnostic Analytics: Dropout Data Tool, Workforce Data Tool.

Objectives Compare the percentages of students who graduated or are expected to graduate within 150% of program length, by student group. Track the completion rates of student groups before and after the innovation project. Identify accomplishments in educating student groups that face significant challenges. Identify student groups that are doing worse than their peers. Examine completion rates against targets. Identify questions for further inquiry.

Tool Design Part I Data entry and visual display of graduation rates Part II Reflection questions for team discussions

Definitions Student population: defined by type of degree sought (2-year; 4-year) across programs of study and/or by a named program of study (e.g., business, education, nursing). Program length: the amount of time necessary for a student to complete all requirements for a degree or certificate, according to the institution’s catalog. Graduated within 150% of program length: completers are counted through August 31 of the summer following the sixth year of a 4-year program or the third year of a 2-year program.
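As a sketch of the 150% window defined above (the function name and cohort-start convention are illustrative, not part of the tool), the cutoff date through which completers are counted can be computed like this:

```python
from datetime import date

def completion_cutoff(cohort_start_year: int, program_years: int) -> date:
    """Last date a completer is counted: August 31 of the summer
    following 150% of program length (year 3 of a 2-year program,
    year 6 of a 4-year program)."""
    window_years = int(program_years * 1.5)  # 150% of program length
    return date(cohort_start_year + window_years, 8, 31)

# A fall-2018 cohort in a 2-year program is tracked through Aug 31, 2021.
print(completion_cutoff(2018, 2))  # 2021-08-31
```

Note that a fall-2015 cohort in a 4-year program and a fall-2018 cohort in a 2-year program share the same August 31, 2021 cutoff.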

Calculation Method The percentage of students who are expected to graduate within 150% of program length in a given year is calculated based on students' enrollment date, characteristics, and credit accumulation. Define the student population. Align with the goals of the innovation project. Consider all factors that define the population (e.g., type of program, type of degree, enrollment status). Consider characteristics that may be called out as student groups. (Note: student groups may be defined by demographic characteristics, academic readiness characteristics, or other factors such as participation in Summer Bridge or Experiential Education programs.)
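Defining the population and its student groups amounts to filtering cohort records on those factors. A minimal sketch, assuming hypothetical record fields (the field names and sample data are invented for illustration):

```python
# Hypothetical cohort records; field names are illustrative only.
cohort = [
    {"id": 1, "degree": "2-year", "program": "nursing", "age": 28, "first_gen": True},
    {"id": 2, "degree": "2-year", "program": "nursing", "age": 19, "first_gen": False},
    {"id": 3, "degree": "4-year", "program": "business", "age": 22, "first_gen": True},
]

# Student population: all 2-year degree seekers in the nursing program.
population = [s for s in cohort if s["degree"] == "2-year" and s["program"] == "nursing"]

# Student group within the population: adult learners (older than 24).
adult_learners = [s for s in population if s["age"] > 24]

print(len(population), len(adult_learners))  # 2 1
```

The same pattern extends to any group definition named above (e.g., first-generation status or Summer Bridge participation) by changing the filter condition.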

Calculation Method (Continued) Identify the number of students in a group (e.g., students older than 24 years). Apply exclusions. Calculate the percentage of students who graduated or are expected to graduate. The number of students expected to graduate within 150% of program length may be estimated from total credit accumulation to date, or with a more sensitive approach that takes into account the rate of credit accumulation and early warning signs.
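The steps above (count the group, apply exclusions, compute the percentage) can be sketched as follows; the function and its arguments are hypothetical, not the tool's actual interface:

```python
def completion_rate(group, graduated_ids, excluded_ids):
    """Percent of a student group who graduated within 150% of program
    length, after removing excluded students from the denominator."""
    eligible = [s for s in group if s["id"] not in excluded_ids]
    if not eligible:
        return 0.0
    completers = [s for s in eligible if s["id"] in graduated_ids]
    return 100.0 * len(completers) / len(eligible)

# Hypothetical group of five students: one excluded, two graduated,
# so 2 of 4 eligible students completed.
group = [{"id": i} for i in range(1, 6)]
print(completion_rate(group, graduated_ids={2, 4}, excluded_ids={5}))  # 50.0
```

An "expected to graduate" variant would replace `graduated_ids` with a predicted set built from credit accumulation to date, per the estimation approaches described above.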

Visual Display of Data Each chart depicts graduation rates for one student group (e.g., first-generation students). The left side of each chart shows the graduation year. The middle part of each chart shows the time of assessment (e.g., baseline, Year 1). The right side of each chart shows the percentages of students, by group, who graduated or are expected to graduate within 150% of program length.

Using the Tool in Conjunction With Other Metrics and Tools Completion rates alone are insufficient to determine success. Additional indicators, which involve cost and staff time, can inform evaluation and action. The Academic Progress tool offers metrics for monitoring progress towards career readiness. The Workforce Outcomes tool offers metrics for assessing improved working conditions, salary, and well-being. Examples of additional indicators: Number of students passing licensure or certification exams related to the degree (administrative records). Post-degree plans (student survey). Scores on academic tests designed to measure general education skills or field-related tests (administrative records). Number of students who completed job training, an apprenticeship, or an internship (student survey).

The tool includes: Reflection questions for each year. Space to document notes from team meetings to track data analysis over time. Recommended process: Allow adequate time to drill down into the data to gain a deeper understanding of issues. Record the questions and the data that served as the impetus for those questions. Include all members of the Innovation Team and other key staff. Take notes that capture multiple perspectives.

Type your questions in the chat box. Raise your hand to ask questions or provide suggestions.

Bibliography The Hamilton Project. (2013). Using data to improve the performance of workforce training. Washington, DC: Author. Executive Office of the President of the United States. (2017). Using federal data to measure and improve the performance of U.S. institutions of higher education. Washington, DC: Author. Jones, T. (2014). Performance funding at MSIs: Considerations and possible measures for public minority-serving institutions. Atlanta, GA: Southern Education Foundation. Markle, R., Brenneman, M., Jackson, T., Burrus, J., & Robbins, S. (2013). Synthesizing frameworks of higher education student learning outcomes (Research Report ETS RR-13-22). ETS Research Report Series. Self, S., Machuca, A., & Lockwood, R. (2014). Differences in the performance on the certified public accountant exam of graduates from minority and non-minority institutions. International Journal of Education Research, 9(1), 43-56. Seifert, T. A., Gillig, B., Hanson, J. M., Pascarella, E. T., & Blaich, C. F. (2014). The conditional nature of high impact/good practices on student learning outcomes. Journal of Higher Education, 85(4), 531-564.