Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program Performance Reporting Module 2: Participant and Comparison Cohorts.

Presentation transcript:

Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program Performance Reporting
Module 2: Participant and Comparison Cohorts
November 2011

Presenters
- Sharon Leu, Workforce Analyst, Division of Strategic Investments, National Office
- Hannah Sin, Federal Project Officer, Regional Office, San Francisco
- Kristen Milstead, Workforce Analyst, Division of Strategic Investments, National Office

Quick Links
The main TAACCCT grant program page is available at www.doleta.gov/taaccct:
- One-Pagers
- SGA and Amendments

Performance Training Overview
Grantees and FPOs will be provided a series of performance trainings consisting of two modules:
- Module 1: General Reporting Requirements & Quarterly Progress Report
- Module 2: Participant and Comparison Cohorts
Upcoming performance training for 2012: Annual Performance Reporting

OMB Reporting Package
The reporting instructions and forms are still under review by the Office of Management and Budget (OMB). To preview the materials, send a request to the TAACCCT mailbox at taaccct@dol.gov. Don't forget to copy your FPO!

Contents of This Module
- Introduction to the Participant and Comparison Cohorts
- Building Your Participant Cohorts
- Selecting Your Comparison Cohorts
- The Comparison Cohort Pilot Study

Participant and Comparison Cohorts OVERVIEW

What You Will Learn in This Chapter
- The overall purpose of a comparison cohort and why DOL is requiring participant and comparison cohorts
- Why some comparison cohorts are better than others
- Features of an ideal comparison cohort
- Potential problems with creating the ideal comparison cohort

Participant Universe: Who Is in the Cohorts?
[Diagram: The participant cohort is drawn from the participant universe (tracked in Table 1 of the APR); the comparison cohort is drawn from the non-TAACCCT-funded universe; both cohorts are tracked in Table 2 of the APR and are matched to each other.]

Purpose of Comparison Cohorts
- Allows for a comparison between the group that received something extra and the group that did not: did the "something extra" make a difference?
- Improves "internal validity": how valid an inference is about what caused something to happen.
- Without a comparison group, it is easy to make reasoning errors about what "caused" something to happen.

Why is DOL Requiring Participant and Comparison Cohorts?
- Meets the goal of "continuous improvement"
- Encourages and enables grantee self-evaluation
- DOL hopes to learn from the cohort information: did the TAACCCT program designs, updates, and changes that were proposed and implemented have a positive effect on students who went through the new or revised programs, as compared with students who did not?
  - Education retention and completion
  - Job placement, retention, and earnings

Where Cohort Information is Reported
Cohort data and information will be reported in Annual Performance Report (APR) Table 2 (DRAFT).

Why a High-Quality Comparison Cohort Matters
- The more attention paid to comparison cohort selection, the greater your ability to rule out the possibility that things other than the TAACCCT program are causing whatever outcome effects are observed.
- Poor cohort selection can lead to inaccurate conclusions when:
  - "Apples to oranges" comparisons are made.
  - Crossover of students between the two groups occurs.

Features of a High-Quality Comparison Cohort (Where Matching is Required)
Matching: when random assignment into both groups is not possible, the comparison cohort is matched on key attributes to make it as similar to the participant cohort as possible.
Important requirements:
- Match on the characteristics or attributes that are most likely to affect the outcomes (program of study, length of program, demographics).
- Cohort groups should each have a sufficient number of students.
- Students in both groups should have the same length of time to achieve outcomes.

Potential Problems With Selecting a High-Quality Comparison Cohort
- Inability to match students in the two groups on a key difference or requirement
- Low numbers of students in your cohorts
- No matching program of study that is not funded with grant funds
- Using a recent cohort, or participant and comparison cohorts that start at different times
The rest of this module is designed to help you think through your cohort selections and how to correct for or avoid these potential problems.

Summary
- The purpose of a comparison cohort is to determine whether something has an effect on the people who experienced it.
- DOL wishes to learn whether there are outcome differences between students enrolled in TAACCCT-funded programs and students who are not enrolled in these programs.
- Cohort data will be reported annually in Table 2 of the APR.
- Careful cohort selection is crucial to arriving at valid conclusions about the effect of TAACCCT. Students in the two cohorts must be matched on key characteristics, the cohorts must contain a sufficient number of students, and both must have the same length of time to achieve the outcomes.

Building Your Participant Cohorts

What You Will Learn in This Chapter
- How "program of study" affects the number of participant cohorts
- How and why to combine programs of study to create one cohort
- Requirements for combining programs of study
- Issues to consider in building your participant cohort

Participant Cohorts: The Basics
- Each program of study that is funded by the grant should have its own participant cohort, on which data are reported in Table 2.
- The purpose of the participant cohort is to track what happens to one group of students in a particular field of study throughout the remaining grant period of performance, as compared with a comparison cohort in the same field that is not affected by grant funds.

Reporting for One Program of Study
[Diagram: The participant cohort is drawn from the participant universe (tracked in Table 1 of the APR), the comparison cohort is drawn from the non-TAACCCT-funded universe, and the two cohorts (both tracked in Table 2 of the APR) are matched to each other.]

Reporting for Two Programs of Study
[Diagram: For each program of study, a participant cohort is drawn from the participant universe (tracked in Table 1 of the APR) and matched to its own comparison cohort drawn from the non-TAACCCT-funded universe; each cohort is tracked in its own section of Table 2 of the APR.]

Program of Study
- A program of study is broadly defined as an educational program in which a degree or certificate is granted.
- Grouping of some programs with similar educational material or occupational outlook may be allowed.
- Example: Grantee B is planning to build or expand a program in Solar Photovoltaic Installation and a program in Wind Turbine Service Technician Training. Under some conditions, participants from each could be combined into one "renewable energy" program of study participant cohort.

Combining Programs of Study
Why it might be beneficial to combine programs:
- One or more individual programs of study do not have their own valid comparison cohort, but you have a usable comparison cohort for the broader category under which they can be combined.
- Enrollments in one or more individual programs are very small.

Requirements for Combining Programs
- Must be grant-funded.
- Must have similar educational material or occupational outlook.
- Must begin training at the same time, and the participants for the cohort must be drawn from each at the same time.
- Must have a legitimate reason for combining, based on the development of a valid comparison cohort or on the number of students enrolled in a program.

Acceptable Programs of Study: Guidelines
- Strongest participant cohort: no combining (one program of study).
- Strong participant cohort: different programs of study in the same industry or discipline with the same credential type/level. May only be used if a stronger participant cohort cannot be selected.
- Acceptable participant cohort, but not ideal: different programs of study in different industries or disciplines with the same credential duration.

Selecting Participant Cohorts
- A participant cohort is a group of students who start the program of study at the same time.
- A participant cohort will likely be a subset of everyone who enrolls in a program of study.
- The participant cohort is selected once, and students remain in the cohort once they have been selected (even if they are no longer in a course or program of study).
- Each participant in the cohort will be tracked for reporting purposes through the end of the grant period.
- For best results in terms of reporting, select a participant cohort with a start date as early in the grant as possible, but after capacity building is completed in Year 1 (e.g., the end of Year 1 or start of Year 2). A minimal selection sketch follows this slide.
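The sketch below illustrates the selection rule described above: keep only students in a given grant-funded program of study who start on the cohort start date. It is a minimal example, not part of the DOL guidance; the record fields (student_id, program, start_date) and the sample data are hypothetical stand-ins for your institution's own student records.

```python
# Minimal sketch: selecting a participant cohort from enrollment records.
# Field names and sample data are illustrative assumptions.
from datetime import date

enrollments = [
    {"student_id": "S001", "program": "Solar PV Installation", "start_date": date(2012, 8, 26)},
    {"student_id": "S002", "program": "Solar PV Installation", "start_date": date(2013, 1, 14)},
    {"student_id": "S003", "program": "Wind Turbine Service", "start_date": date(2012, 8, 26)},
]

def select_participant_cohort(records, program, cohort_start):
    """Keep students in the given program of study who started on the cohort start date.

    Once selected, these students stay in the cohort for the rest of the
    grant period, even if they later leave the program.
    """
    return [r for r in records if r["program"] == program and r["start_date"] == cohort_start]

cohort = select_participant_cohort(enrollments, "Solar PV Installation", date(2012, 8, 26))
print([r["student_id"] for r in cohort])  # -> ['S001']
```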

Summary
- Each program of study should have its own participant cohort.
- Programs of study may be combined (resulting in only one participant cohort) if occupational outlook and/or educational requirements are similar.
- Combining programs may be a way to meet the matching requirements for a comparison cohort or to avoid some of the problems of having an invalid comparison cohort.
- Not all participants in a program may be in your participant cohort, but all cohort members must start training at the same time and should start training as soon as possible in the grant period.

Selecting Your Comparison Cohorts

What You Will Learn in This Chapter
- The requirements for selecting your comparison cohort
- The features for selection that are not required, but are strongly encouraged
- Potential issues with selecting a "recent cohort" as your comparison cohort
- Creative suggestions for meeting the requirement to have a comparison cohort

Comparison Cohorts: The Basics
- The purpose of the comparison cohort(s) is to provide the data used to measure the effectiveness of TAACCCT grant awards on the participant cohort(s). Thus, individuals in the comparison cohort(s) must be enrolled in a program that is not funded by TAACCCT.
- The number of participant cohorts and the number of comparison cohorts should be the same.
- As with the participants in the participant cohort, each participant in the comparison cohort will be tracked throughout the period of performance or the equivalent.

Reporting for One Program of Study
[Diagram: The participant cohort is drawn from the participant universe (tracked in Table 1 of the APR), the comparison cohort is drawn from the non-TAACCCT-funded universe, and the two cohorts (both tracked in Table 2 of the APR) are matched to each other.]

Reporting for Two Programs of Study
[Diagram: For each program of study, the participant cohort is drawn from the participant universe (tracked in Table 1 of the APR) and matched to its own comparison cohort drawn from the non-TAACCCT-funded universe; each cohort is tracked in its own section of Table 2 of the APR.]

Selecting Comparison Cohorts: Requirements
Requirements for all comparison cohorts (a selection sketch follows this slide):
- The number of students in the comparison cohort must be the same as the number of students in the participant cohort.
- The comparison cohort students must be matched to the participant cohort students on the basis of the program of study (or combined program of study, as previously described).
- The comparison students must be similar to the participants with respect to age and gender, at a minimum.
- The length of the training program must be the same for all students in both the comparison and the participant cohorts.
- Once a student is selected for the comparison cohort and has started courses, that student is always in the comparison cohort.
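The sketch below shows one way to draw a comparison cohort that respects the requirements above: same program of study, same cohort size, and similar age and gender. The greedy nearest-age matching is an illustrative approach only, not a DOL-prescribed method, and all field names and sample data are assumptions.

```python
# Minimal sketch: drawing a matched comparison cohort from non-TAACCCT students.
# Greedy nearest-age matching on same program and gender; illustrative only.

def draw_comparison_cohort(participants, candidates, program):
    """Pick one non-TAACCCT candidate per participant: same program of study,
    same gender, and the closest available age. Each candidate is used once."""
    pool = [c for c in candidates if c["program"] == program]
    chosen = []
    for p in participants:
        same_gender = [c for c in pool if c["gender"] == p["gender"]]
        if not same_gender:
            continue  # flag for review: no candidate available on this attribute
        best = min(same_gender, key=lambda c: abs(c["age"] - p["age"]))
        chosen.append(best)
        pool.remove(best)
    return chosen

participants = [{"age": 27, "gender": "M", "program": "Welding"},
                {"age": 41, "gender": "F", "program": "Welding"}]
candidates = [{"age": 30, "gender": "M", "program": "Welding"},
              {"age": 39, "gender": "F", "program": "Welding"},
              {"age": 22, "gender": "M", "program": "Nursing"}]
print(draw_comparison_cohort(participants, candidates, "Welding"))
```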

Age & Gender Matching Requirement
- A program's comparison cohort will be deemed acceptable only if its average age and percent male match those of the participant cohort to which it is compared.
- Where exact matches are not possible (given actual enrollments), the margin of discrepancy between the participant and comparison cohorts should be as small as possible.
- Significant differences between the cohorts may require further inquiry by the FPO, or possible cohort rejection if a stronger comparison cohort can be selected.
- The matching similarity of your cohorts will be a required area for narrative explanation in the Annual Performance Report (APR). A simple way to check the match is sketched below.
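As a quick self-check before writing the APR narrative, you can compute average age and percent male for both cohorts and look at the gaps. This is a minimal sketch with hypothetical fields and data; it does not reflect any official threshold for what counts as an acceptable match.

```python
# Minimal sketch: comparing average age and percent male across cohorts.
# Fields and sample data are illustrative assumptions, not DOL rules.

def cohort_profile(students):
    """Return (average age, percent male) for a list of student records."""
    n = len(students)
    avg_age = sum(s["age"] for s in students) / n
    pct_male = 100 * sum(1 for s in students if s["gender"] == "M") / n
    return avg_age, pct_male

participants = [{"age": 27, "gender": "M"}, {"age": 34, "gender": "F"}, {"age": 41, "gender": "M"}]
comparison   = [{"age": 29, "gender": "M"}, {"age": 45, "gender": "F"}, {"age": 38, "gender": "M"}]

p_age, p_male = cohort_profile(participants)
c_age, c_male = cohort_profile(comparison)
print(f"Age gap: {abs(p_age - c_age):.1f} years; percent-male gap: {abs(p_male - c_male):.1f} points")
```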

Selecting Comparison Cohorts: Not Required, But Encouraged
- Other demographics: if possible, the comparison cohort should be matched to the participant cohort on other demographics, such as race, ethnicity, incumbent worker status, veteran status, etc.
- Timing: it is recommended that the comparison cohort start education or training at or around the same time as the participant cohort, to avoid a time lag in reporting outcomes between the two cohorts.

Using Recent Students for Your Comparison Cohort
There are two main requirements for using comparison cohorts with earlier start dates than your participant cohorts.
Requirement #1: The duration of time in which outcomes can be considered achieved should be the same as the duration between the start date of the participant cohort and the period of performance end date.
Example: Grantee C wishes to use a comparison cohort of students who enrolled in the semester just prior to the one in which TAACCCT students were enrolled (or two reporting quarters earlier). The TAACCCT students in the participant cohort started on August 26, 2012. There are 765 days between August 26, 2012 and the end date of the grant on September 30, 2014, occurring over nine reporting quarters. Therefore, even though the comparison cohort students began their education earlier, their outcomes should only be reported for nine quarters. The duration math is sketched below.
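The short sketch below reproduces the duration math from the Grantee C example: the number of days and the number of reporting quarters between the participant cohort start date and the grant end date. The dates come from the example above; the quarter-counting convention (calendar quarters, counted inclusively) is an assumption about how the nine quarters were tallied.

```python
# Minimal sketch: days and reporting quarters from cohort start to grant end.
from datetime import date

cohort_start = date(2012, 8, 26)   # participant cohort start date (from the example)
grant_end = date(2014, 9, 30)      # grant period of performance end date

days = (grant_end - cohort_start).days
print(days)  # 765

def quarter_index(d):
    """Number the calendar quarter containing date d (e.g., Jul-Sep 2012)."""
    return d.year * 4 + (d.month - 1) // 3

quarters = quarter_index(grant_end) - quarter_index(cohort_start) + 1
print(quarters)  # 9 reporting quarters
```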

Sample duration timeline for cohorts with the same start date

Sample duration timeline for cohorts with different start dates

Sample duration timeline for cohorts with a comparison cohort start date that is prior to the grant start date

Using Recent Students as Your Comparison Cohort: Other Requirements
- You must have a level of detail available for the recent students that is sufficient to report on all fields in Annual Performance Report Table 2 (Sections B and C, and Section A if applicable).
- This information must not only be cumulative through the grant period of performance, but also be reportable on a year-by-year basis. In other words, the information must have been captured in a way that allows for reporting snapshots of the comparison cohort after each reporting year (see the sketch below).
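One way to satisfy both the cumulative and the year-by-year reporting needs is to store outcomes per reporting year and derive cumulative totals from those snapshots. The sketch below is a minimal illustration; the outcome fields shown are placeholders, not the actual APR Table 2 fields.

```python
# Minimal sketch: per-year snapshots that also support cumulative reporting.
# Outcome fields are illustrative placeholders for the APR Table 2 fields.

snapshots = {
    # reporting year -> outcomes observed for the cohort during that year
    1: {"completers": 5, "credentials_earned": 4},
    2: {"completers": 9, "credentials_earned": 7},
}

def cumulative_through(year):
    """Sum each outcome across all reporting years up to and including `year`."""
    totals = {}
    for y, outcomes in snapshots.items():
        if y <= year:
            for field, value in outcomes.items():
                totals[field] = totals.get(field, 0) + value
    return totals

print(cumulative_through(2))  # {'completers': 14, 'credentials_earned': 11}
```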

APR Table 2 Field Overview
The specific definitions for each of these fields are important to determining whether historical information is sufficient. Definitions of each field are forthcoming in Module 3 in 2012; you can e-mail the TAACCCT mailbox at taaccct@dol.gov for a copy of the OMB reporting package currently under review. Here is a brief overview of the fields:
- Enrollments
- Completion
- Retained in Program
- Retained in Other Program
- Credit Hours Completed
- Earned Credentials
- Further Education After Graduation
- Employment After Graduation
- Employment Retention
- Earnings
- Demographics

Sample reporting intervals for cohorts with the same start dates

Sample reporting intervals for cohorts with different start dates

Sample reporting intervals for cohorts with comparison cohort start and end dates that occurred prior to the grant start date

Suggestions for Selecting a Comparison Cohort to Meet the Program of Study Match
- Partner with another college to use a cohort of students in the same program of study at that college as the comparison group for the participants in your grant-funded program of study.
- Remember: the students in the comparison cohort must not be enrolled in grant-funded courses!
- This strategy works for either a current or a recent comparison cohort (a recent cohort can only be used if the conditions previously described are met).

Suggestions for Selecting a Comparison Cohort to Meet the Program of Study Match
- Start tracking the comparison cohort immediately while you develop your program (only if your grant-funded program of study will not be brand new to your college).
- Remember: the timeline for your comparison cohort must be equivalent to the timeline for your participant cohort and must include the same durations between reporting periods, with reportable information at each period end date!

Where to Get Additional Help
- Your FPO
- Evaluation experts you plan to work with to perform an individual program evaluation
- Additional technical assistance materials, which will be provided in the form of FAQs and fact sheets as needed

Summary
- Every participant cohort should have its own comparison cohort of students who are not grant-funded and will be tracked for the same period of time.
- Comparison cohorts should be matched to participant cohorts based on program of study, length of the training program, number of students, age, and gender.
- In addition, comparison cohorts should ideally match on other demographic characteristics and start training at or around the same time as the participant cohort.
- If using a recent cohort, the length of time to achieve outcomes must be the same as the length of time the participant cohort will have to achieve them, all outcomes in the APR must have been collected, and the durations between reporting years should be the same.

Comparison Cohort Pilot Study

What You Will Learn in This Chapter
- The requirements and parameters for the comparison cohort pilot study
- How to use current students in your comparison cohort pilot
- How and where to report comparison cohort pilot data

Year 1 Comparison Cohort Pilot Study Requirement
- ETA is requiring the reporting of a "test" cohort during Year 1 in order to ensure that grantees are prepared to identify a cohort and report on it in preparation for the second annual report.
- Data is only collected for the test cohort for Year 1.
- If the participant cohort is not enrolled in Year 1: grantees must identify and report on a comparison cohort pilot study.
- If participants are enrolled during Year 1: grantees must report on the participant cohort and the comparison cohort in lieu of a comparison cohort pilot study.

Comparison Cohort Pilot Study Parameters
- Identify programs of study: grantees should select a test comparison cohort for each grant-funded program in which they will eventually enroll participants.
- Identify demographics: for each program of study, grantees should try to predict the age and gender makeup of students in order to meet ETA's minimal matching requirements.
- When APR Year 1 is over, grantees should discontinue reporting on the test cohort and begin reporting on the actual participant and comparison cohorts (only if participant enrollments did not already start in Year 1).

If You Are Using Your Year 1 Students in the Comparison Cohort
- The comparison cohort and the test comparison cohort are both required and are tracked separately and simultaneously.
- In Year 1, grantees will only report on the test comparison cohort.
- Grantees will report on the actual comparison cohort once the participant cohort is enrolled.

How to Report Comparison Cohort Pilot Study Data
Data for the test comparison cohort should be reported in Column A of Table 2 (in the first APR, due November 14, 2012). All sections of APR Table 2 should be completed for the comparison cohorts (with the possible exception of Section A):
- Section A: Acceleration of Progress for Low-skilled and Other Workers
- Section B: Participant Progress by Program
- Section C: Summary Student Information
- Section D: Comparison Cohort Description

Tracking Your Test Comparison Cohort
DOL is developing an optional tracking tool for your comparison cohort pilot study that will help you have the proper numbers for the first Annual Performance Report, due in November 2012.

Summary
- ETA is requiring a comparison cohort pilot study during Year 1 of the grant, unless the grantee enrolls actual participants during Year 1.
- Comparison cohorts for the pilot study should be selected and reported on for each program of study, based on expectations of who will enroll in each.
- In Year 2, the grantee will discontinue reporting on the test comparison cohort and report actual cohort data.
- Report on the test comparison cohort in Column A of Table 2 in the first APR only.
- The test comparison cohort may not be used as the actual comparison cohort.

Contact Information
After finishing this recording, submit any questions to the TAACCCT mailbox (taaccct@dol.gov) by COB on December 2, 2011, with the subject line "TACT FPO Performance Training Questions." Please copy your FPO.
Mark your calendar and join us for a live 60-minute Q&A with ETA program and grant office staff to answer your questions.

THANKS!