
Maritime Provinces Higher Education Commission
Forum on Data Collection and Research: Measures of Student Progress and Outcomes
March 26, 2013

About Us – MPHEC Mandate (2005 Act):
- Give first consideration to improving and maintaining the best possible service to students
- Establish public reporting requirements and produce public reports
- Take measures to ensure continuous improvement in the quality of academic programs and of teaching
- Promote smooth transitions between learning and work
- Take measures intended to ensure programs are of optimum length and best quality
- Take measures intended to ensure teaching quality

MPHEC Structure and Stakeholders: the Commission is made up of Commission Members supported by Commission Staff. Its stakeholders include the governments of New Brunswick, Nova Scotia and Prince Edward Island, the universities, government and non-government leaders, and students and the public at large.

Operating Principles:
- Institutional Autonomy
- Public Information Required
- Commission Mandate

Operating Principles: best processes and outcomes emerge from dialogue and collaboration.

The Commission's Top Two Priorities:
1. MPHEC monitors student progress through their education and graduates' outcomes.
2. Universities assess their programs and services; MPHEC validates institutional QA frameworks and assesses programs (all programs are approved on the condition that the institution review them).

Data and Information: The Commission's Approach
Research Agenda: identifying the characteristics of programs and educational streams/pathways, and other factors, that affect student progression, credential completion, and graduate outcomes. The framework considers demand, success, educational pathways, student background, and outcomes, as well as system effectiveness, efficiency, and value.

MPHEC Research Agenda and Monitoring of Basic Statistics/Measures. Outputs include:
- Basic statistical tables
- Tuition, ancillary and residence fee tables
- Measures of Student Progress and Outcomes
- Annual digests of trends in Maritime higher education
- Support for provincial KPIs
- Support of funding administration and analysis
- Institutions provided with their own statistics for comparison

Measures of Student Progress and Outcomes: Intended Purpose and Value
The Measures support the Commission's mandate and mission and are intended to assist stakeholders in their work:
- Institutions: for example, to support efforts to deliver a quality education
- Governments: to assist in monitoring the effectiveness of policies
- Students and their families: to understand the nature of educational pathways in the Maritime university system and the outcomes of its graduates
The Measures are sensible markers of student progress and outcomes: standardized definitions built from standardized information that extract value from data provided by institutions and from graduates' responses, and building blocks for in-depth analysis.

What do we intend to achieve today?
- Participants understand the Measures (purpose, value and scope) and provide feedback through roundtable discussion
- Participants understand the GO Survey Program and provide detailed feedback and suggestions on methodology, response rate improvement, quality of contact information, and reporting
- Participants discuss the MPHEC's traditional reporting practices and explore options for future reporting
Result: your feedback today will help ensure the value and relevance of the measures, that the measures are aligned with needs, and effective reporting.

Measures of Student Progress and Outcomes: launched in 2008 and developed with a Working Group. Data sources: student administrative data (PSIS), graduate survey data, and academic program information.

The Measures cover:
- Demand for Maritime university education
- Student progress
- Characterizing educational pathways in Maritime universities
- Distance to institution from address of application
- Student/graduate outcomes

Measures of Student Progress and Outcomes: What is the level of demand for university education? (Participation rates; published February 2012 and February 2009.) Nine different definitions, combining overall, home-province and Maritime-wide populations with full-time and full- and part-time enrolment, together with the direct entry rate of high school students and a national comparison, provide a comprehensive understanding (PSIS + Census).

Understanding Participation Rates as a Measure of Demand. Each rate relates an enrolment count to a reference population, for example:
- Total full-time enrolment relative to the reference population
- Maritimers enrolled full-time in the region relative to the Maritime population
- Maritimers enrolled full-time within their home province relative to the home-province population
- Maritimers aged 18-19 enrolled full-time in the region relative to the population aged 18-19
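A minimal sketch of how two of these ratios can be written; only the 18-19 age band is specified on the slide, so the other reference populations shown here are generic placeholders, not the MPHEC's exact definitions:

```latex
% Illustrative participation-rate ratios (a sketch; reference populations
% other than the 18-19 age band are assumed).
\[
\text{overall participation rate} =
  \frac{\text{total full-time enrolment}}{\text{reference population}}
\]
\[
\text{participation rate (ages 18-19)} =
  \frac{\text{Maritimers aged 18-19 enrolled full-time in the region}}
       {\text{population aged 18-19}}
\]
```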

Measures of Student Progress and Outcomes: How is student progress measured in Maritime universities? (Published January 2010.) Based on cohort tracking using longitudinal PSIS data, within the institution of first entry:
- What proportion of students re-enroll for a second year of study? (Persistence rate)
- What proportion graduate within 6 years? (Graduation rate)
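The slides do not show how the cohort tracking is implemented. The following is a minimal sketch assuming a hypothetical longitudinal table with one row per student per academic year; the column names, toy data, and simplified year-based logic are illustrative assumptions, not the actual PSIS schema or the MPHEC's methodology:

```python
import pandas as pd

# Hypothetical longitudinal enrolment records (NOT the real PSIS layout):
# one row per student per academic year, with the institution attended and a
# flag set in the year a credential was awarded.
records = pd.DataFrame({
    "student_id":    [1, 1, 2, 3, 3, 3],
    "institution":   ["U1", "U1", "U1", "U2", "U2", "U2"],
    "academic_year": [2005, 2006, 2005, 2005, 2006, 2010],
    "graduated":     [False, False, False, False, False, True],
})

ENTRY_YEAR = 2005  # cohort of first-time entrants in this year

# Each student's institution of first entry and year of entry.
first = (records.sort_values("academic_year")
                .groupby("student_id").first()
                .rename(columns={"institution": "first_institution",
                                 "academic_year": "entry_year"}))
cohort = first[first["entry_year"] == ENTRY_YEAR]

# Persistence rate: share of the cohort re-enrolled at the SAME university
# in the year after entry.
second_year = records[records["academic_year"] == ENTRY_YEAR + 1]
same_uni = (second_year.merge(cohort, left_on="student_id", right_index=True)
                       .query("institution == first_institution"))
persistence_rate = cohort.index.isin(same_uni["student_id"]).mean()

# Graduation rate: share of the cohort graduating from the institution of
# first entry within 6 years of entry.
grads = records[records["graduated"] &
                (records["academic_year"] <= ENTRY_YEAR + 6)]
grads_same_uni = (grads.merge(cohort, left_on="student_id", right_index=True)
                       .query("institution == first_institution"))
graduation_rate = cohort.index.isin(grads_same_uni["student_id"]).mean()

print(f"Persistence rate: {persistence_rate:.0%}")
print(f"Graduation rate (within 6 years): {graduation_rate:.0%}")
```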

Measuring Student Progress (published January 2010): 79% of students re-enroll at the same university for a second year of study.

Measuring Student Progress (published January 2010): …% of students graduate at the university of first entry within 6 years.

Measures of Student Progress and Outcomes: How is student progress measured in Maritime universities? Based on cohort tracking using linked longitudinal PSIS data, at the system level:
- What proportion of students re-enroll for a second year of study at any Maritime university? (Persistence rate)
- What proportion graduate within 6 years from any Maritime university? (Graduation rate)
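Relative to the within-institution sketch above, and continuing its hypothetical `records`, `cohort`, `second_year`, and `grads` frames, the system-level rates simply drop the requirement that the later record be at the institution of first entry (the data are assumed to be already linked across institutions):

```python
# System-level persistence: the cohort member re-enrols at ANY Maritime
# university in the year after entry.
persistence_rate_system = cohort.index.isin(second_year["student_id"]).mean()

# System-level graduation: a credential from ANY Maritime university within
# 6 years of entry.
graduation_rate_system = cohort.index.isin(grads["student_id"]).mean()
```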

The next seven slides showcase measures under development; the statistics shown are preliminary and intended as examples only.

Measuring Student Progress (preliminary system-level information): 83% of students re-enroll at any Maritime university for a second year of study.

Measuring Student Progress (preliminary system-level information): 64% of students graduate from any Maritime university within 6 years.

Characterizing Educational Pathways in Maritime Universities, based on cohort tracking using linked longitudinal PSIS data. What we know: 39% of students complete a 4-year bachelor's degree at their university of first entry within 4 years. Measures under development: how are different pathways related to persistence and graduation at the university of first entry and at the system level?

Characterizing Educational Pathways: measures under development (preliminary examples, 4-year degrees):
- Time-to-degree: the average time to complete a bachelor's degree is 4.2 years (expected completion time: 3.67 years)
- Terms completed: students complete 8.6 terms on average (expected time enrolled: 8.0 terms)
- Stop-outs (1-2 terms off): 13% of students take a break from studying
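As an illustration only, the sketch below computes these three quantities from a hypothetical term-level table; the schema, the toy data, the 365.25-day year, and the 1-2 term gap rule for flagging stop-outs are assumptions, not the MPHEC's definitions:

```python
import pandas as pd

# Hypothetical data for two graduates of 4-year programs (illustrative only):
# one enrolment row per student per term, plus entry and graduation dates.
terms = pd.DataFrame({
    "student_id": [1, 1, 1, 1, 2, 2, 2],
    "term_index": [0, 1, 2, 3, 0, 1, 4],  # sequential term number since entry
})
grads = pd.DataFrame({
    "student_id": [1, 2],
    "entry_date": pd.to_datetime(["2005-09-01", "2005-09-01"]),
    "grad_date":  pd.to_datetime(["2009-05-01", "2010-12-15"]),
})

# Average time-to-degree in years, from first entry to credential award.
years = (grads["grad_date"] - grads["entry_date"]).dt.days / 365.25
avg_time_to_degree = years.mean()

# Average number of terms completed per graduate.
avg_terms = terms.groupby("student_id")["term_index"].count().mean()

# Stop-outs: students whose sequence of term indices has a gap of 1-2 terms.
def stopped_out(term_indices: pd.Series) -> bool:
    gaps = term_indices.sort_values().diff().dropna() - 1
    return bool(((gaps >= 1) & (gaps <= 2)).any())

stop_out_share = terms.groupby("student_id")["term_index"].apply(stopped_out).mean()

print(avg_time_to_degree, avg_terms, stop_out_share)
```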

Characterizing Educational Pathways: measures under development (preliminary examples):
- Switching programs/disciplines within an institution: 18% of students switch programs after their first year
- Switching institutions (system-wide): 4% of students switch university after their first year

Characterizing Educational Pathways: measures under development (preliminary examples):
- Credits completed relative to credits required: 41% of students earned more credits than required to graduate
- Course load (a 60% course load is considered full-time): 55% of full-time bachelor students take a full course load (100% or more)
- Course success/failure: 35% of students fail at least one course
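A hedged sketch of how course-load and failure measures could be derived from course-level records; the table layout, the 15-credit full load, and the toy data are assumptions for illustration:

```python
import pandas as pd

# Hypothetical course attempts for one academic year (illustrative columns;
# not the actual PSIS course layout): one row per student per course.
attempts = pd.DataFrame({
    "student_id":   [1, 1, 1, 1, 1, 2, 2, 2, 3, 3, 3, 3, 3],
    "credit_hours": [3] * 13,
    "passed":       [True] * 4 + [False] + [True] * 8,
})

FULL_LOAD = 15             # credit hours defining a 100% course load (assumed)
FULL_TIME_THRESHOLD = 0.6  # 60% of a full load counts as full-time

per_student = attempts.groupby("student_id").agg(
    credit_hours_attempted=("credit_hours", "sum"),
    failed_any=("passed", lambda s: (~s).any()),
)
per_student["load"] = per_student["credit_hours_attempted"] / FULL_LOAD

# Share of full-time students carrying a full (100% or more) course load.
full_time = per_student[per_student["load"] >= FULL_TIME_THRESHOLD]
share_full_load = (full_time["load"] >= 1.0).mean()

# Share of students failing at least one course during the year.
share_failed_any = per_student["failed_any"].mean()

# Over a whole program, the analogous aggregation of credits earned (passed
# courses only) would be compared with each program's credit requirement to
# flag students who earned more credits than required to graduate.
```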

Distance to University: measures under development (preliminary examples). Distance to the institution is measured from the address given at application; persistence after 1 year:
- Persistence rate of Maritimers living within 40 km of the university attended: 80%
- Persistence rate of Maritimers living more than 80 km from the university attended: 85%
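Producing this measure requires geocoding the application address and banding the distance to the institution attended, then grouping the cohort by band. A minimal sketch, assuming coordinates are already available; the 40-80 km middle band and the example coordinates are assumptions:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def distance_band(km: float) -> str:
    # The slide reports rates for "within 40 km" and "more than 80 km";
    # the 40-80 km middle band is assumed here for completeness.
    if km <= 40:
        return "within 40 km"
    if km <= 80:
        return "40-80 km"
    return "more than 80 km"

# Example: an applicant address in Moncton, attending a university in Halifax.
d = haversine_km(46.09, -64.77, 44.64, -63.58)
print(distance_band(d))  # more than 80 km
```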

Graduate Outcome Measures: examples (statistics for the Class of 2007 from the 2009 graduate survey).
Financing education:
- Percent who borrowed (any source), first degree: 70%
- Average amount borrowed, first degree: $30,767
- Percent debt-free two years after graduation (borrowers and non-borrowers): 44%
Employment:
- Employment rate: 86%
- Percent with jobs requiring a university education or in management: 59%
- Average earnings (mean): $37,669

Graduate Outcome Measures (continued).
Mobility:
- Percent of Maritimers remaining in the region two years after graduation: 80%
- Percent of non-Maritimers remaining in the region two years after graduation: 19%
Further education:
- Percent who pursued further education within two years of completing their first degree: 59%

Graduate Outcome Measures (continued).
Graduate perceptions:
- Education worth/well worth the investment of time: 83%
- Education worth/well worth the financial investment: 69%
- Percent who would choose the same field of study: 74%
- Percent satisfied/very satisfied with the quality of teaching in most classes: 95%

Thank you

Roundtable Discussion I: Measures of Student Progress and Outcomes: Scope, Value and Relevance
1. Is the focus on student progress and outcomes the right one to support the Commission's mandate? Explain.
2. Do the measures adequately describe student progress? Student/graduate outcomes? Are there other measures that should be included?
3. What are the key information requirements in your work? What sources of information do you draw on (e.g., do you run or participate in surveys or data collection)?
4. Do or can these measures support you in your work? How, or why not? What is missing?
5. What is the relative relevance of these measures within the landscape of other sources of measures and information you may use or consult (e.g., ACUDS, Maclean's rankings)? How can their relevance be increased?

Roundtable Discussion II: Measures of Student Progress and Outcomes: Focus on the Graduate Outcomes Survey Program
Roadmap:
1. Is the roadmap clear? Is there anything missing, or that could be improved (i.e., elements needing greater clarification)?
2. Does the roadmap help you understand how the Graduate Outcomes Survey Program will support the Program's research objectives, and how the Program's various components are interconnected?

Roundtable Discussion II (continued)
Key challenges:
1. What could institutions do to encourage graduates to take part in the GO Surveys?
Quality of contact information:
2. How up-to-date, accurate, and comprehensive are institutions' graduate contact databases (i.e., email, phone, address)?
3. What are the challenges in managing the contact database and keeping it up to date, and what are some ways institutions could improve contact information?
Reporting/dissemination of findings:
4. What reporting format would be most useful and relevant for how you use the Graduate Outcomes Survey findings? For example, does the format proposed in the conceptual roadmap work for you, or not?
5. What reporting/dissemination options would you like to see explored?

Reporting: in addition to public outputs, institutions are provided with:
- MPHEC FTEs, WFTEs, enrolments and credentials at the student level (provided during the PSIS submission process)
- Graduate Survey custom data (under data-sharing agreements)
- Custom statistics to accompany reports

Roundtable Discussion III: Reporting
A. Is the traditional reporting practice useful? Does your institution make use of the custom information provided?
B. Is there interest among the institutions in sharing measures and other information with each other at the institutional level (but not publicly reported) under a data-sharing agreement? If yes, would it be preferable to share anonymized information or to identify institutions? What might be some pros and cons of such an initiative?
C. How do you rate the usefulness of the measures and information reported at the different levels (e.g., discipline, institution and system)? What other level of reporting would be informative?