
Path to Accelerated Completion and Employment Evaluation Meeting July 31, 2012.

1 Path to Accelerated Completion and Employment Evaluation Meeting July 31, 2012

2 New Growth Group New Growth is a full-service evaluation firm specializing in postsecondary education and workforce development. – Christopher Spence, Evaluation Project Manager – Joel Elvery, PhD, Data Analysis Partnering with the Corporation for a Skilled Workforce for the implementation assessment – Holly Parker, Ed Strong, Leise Rosman

3 Goals Measure the impact of strategies on student outcomes Capture the variety of approaches implemented for each strategy in the state Contribute to continuous improvement Comply with USDOL evaluation requirements

4 Approach Two parts: – Impact assessment – Implementation assessment Two audiences – USDOL – Reporting and compliance – PACE colleges – More detail to identify best practices Approach tailored for each strategy in the original proposal

5 Impact Assessment Before-and-after research design Key measures (captured before and after): academic progress, program completion rates, employment outcomes

6 Implementation Assessment Documentation of approaches at each college and how they were implemented – Interviews, questionnaires, etc. Informs initiative continuous improvement efforts in later stages Sets the stage for a future student success agenda

7 The Strategies
Strategy | Approach | Cycle
1.1a Assessment/placement practice and retake | Assessment score increase; improved placement outcomes | Term
1.1b Prior learning credit | # credits earned by PLA type; relationship with academic progress | Term
1.2 & 1.3 Developmental course redesign/elimination (before-and-after comparison) | Developmental requirement completion; program English/math requirement completion | Term
2.1 Actively engage employers | Implementation assessment | TBD
2.2 Streamline targeted programs (comparison cohort approach) | Student retention, academic progress, completion, and employment | Term
3.1 New guidance technologies | Implementation assessment | TBD
3.2 Partnerships with WIBs to develop Virtual Career Centers | Implementation assessment | TBD

8 First Year Timeline
Aug 2012: 8/3 surveys (employer engagement, developmental education, streamlining)
Sep 2012: 9/1 comparison cohorts defined; program launch
Nov 2012: 11/14 quarterly report; 11/14 annual report
Jan–Mar 2013: 1/21 college data reports; 2/14 quarterly report; 3/15 first-semester roll-up
May–Jun 2013: 5/15 quarterly report; 6/21 college data reports
Aug–Sep 2013: 8/14 quarterly report; 8/16 second-semester roll-up; 9/21 college data reports
Notes: Colleges still provide monthly progress reports to NWACC. Expect implementation assessment activities closer to the end of the first semester.

9 Questions?

10 Contact Information Project Manager: Chris Spence, 216.767.6262, Impact Assessment: Joel Elvery, PhD: 216.375.6777, Implementation Assessment: Holly Parker, 734.769.2900,

11 PACE Impact Assessment

12 Data plan Quantitative evaluation design Comparison cohort plan Data requested Data submission

13 Quantitative evaluation design Two purposes – Meet DOL requirements – Inform stakeholders whether new approaches are increasing student success Using before-and-after comparison – Focusing on cohorts engaged in targeted programs in Fall 2012 vs. those in Fall 2010 – More than what’s needed for DOL requirements
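The before-and-after comparison boils down to computing the same outcome measure for the pre-PACE and post-launch cohorts and looking at the difference. A minimal sketch in plain Python, using made-up student records (the field names and values are illustrative, not the actual ADHE schema):

```python
# Hypothetical student records for the before-and-after comparison:
# targeted-program students from the Fall 2010 (pre-PACE) and
# Fall 2012 (post-launch) cohorts.
students = [
    {"id": 1, "cohort": "Fall 2010", "completed": False},
    {"id": 2, "cohort": "Fall 2010", "completed": True},
    {"id": 3, "cohort": "Fall 2012", "completed": True},
    {"id": 4, "cohort": "Fall 2012", "completed": True},
]

def completion_rate(records, cohort):
    """Share of a cohort's students who completed their program."""
    group = [r for r in records if r["cohort"] == cohort]
    return sum(r["completed"] for r in group) / len(group)

# Compare the outcome before and after the initiative launched.
before = completion_rate(students, "Fall 2010")
after = completion_rate(students, "Fall 2012")
change = after - before
```

The same pattern applies to any of the key measures (academic progress, employment outcomes): hold the measure fixed and vary only the cohort.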

14 Comparison cohort plan Where possible, we will use past cohorts from targeted programs as the comparison group Gathering comparison data from ADHE – Except for some developmental education metrics not in ADHE data New programs or dramatically shortened programs will have to be matched to other similar programs DOL convening in early August

15 Comparison cohort plan Next steps on comparison cohort plan – Learn about targeted programs & their duration – Develop groupings of programs – Write up cohort strategy for DOL – Get DOL approval – Inform colleges of any additional data they need to provide

16 Means of Data Collection
Source | Type
ADHE | Student-level course and program data, including data for comparison cohorts
Arkansas Research Center | Individual-level employment and earnings data
Colleges | See next slides; individual-level data not captured by ADHE

17 Data required from colleges
1. Test scores & placement of students involved in assessment test preparation
2. Prior learning assessments
3. Demographics of students in targeted programs of study
4. Completion of developmental education requirements for students in targeted programs
5. Historic data on developmental education progress for past cohorts

18 Data on PREP Participants Who should be included – Every student who uses assessment test preparation provided in conjunction with the PACE grant, regardless of whether in a targeted program What we need to know – Identifying variables – Type of assessment test, placement before & after the readiness course, for math, reading, & English assessments

19 Data on PLA Participants Who should be included – Everyone who attempts to get credit through a prior learning assessment What we need to know – Identifying variables – Total credit hours earned through PLA – Credit hours earned through each of the following: portfolio, standardized test, local test, training
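The PLA sheet asks for both a per-student total and a breakdown by assessment type. A small sketch of that aggregation, with hypothetical records (the field names are assumptions, not the PACE template):

```python
from collections import defaultdict

# Hypothetical PLA attempts: one row per student per assessment type.
pla_records = [
    {"student_id": "A1", "pla_type": "portfolio", "credit_hours": 6},
    {"student_id": "A1", "pla_type": "standardized test", "credit_hours": 3},
    {"student_id": "B2", "pla_type": "training", "credit_hours": 9},
]

def credit_hours_by_type(records):
    """Total PLA credit hours earned through each assessment type."""
    totals = defaultdict(int)
    for r in records:
        totals[r["pla_type"]] += r["credit_hours"]
    return dict(totals)

def total_pla_hours(records, student_id):
    """Total PLA credit hours earned by one student, all types combined."""
    return sum(r["credit_hours"] for r in records
               if r["student_id"] == student_id)
```

Keeping one row per attempt makes both views (per type and per student) derivable from the same raw data.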

20 Questions on PREP or PLA data?

21 One-time student data Who should be included – All students enrolled in a targeted program of study – Includes students who began prior to Fall 2012 who are still enrolled What we need to know – Identifying variables – Student demographics from intake form – Developmental ed. placements – Whether they have completed developmental ed.

22 Term data Data to be reported each term for each student Who should be included – All students enrolled in a targeted program of study – Includes students who began prior to Fall 2012 who are still enrolled What we need to know – Identifying variables – Whether they are taking Technical Math & the number of modules they have to take – Whether they changed program of study & what the new program of study is – Whether they completed developmental ed. requirements

23 Program-level data What programs should be included – Each targeted program included in PACE – A separate row for each program What we need to know – Identifying variables – Credit hours before & after redesign – 2-year dev. ed. math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010 – 2-year college-level math, reading, & English completion rates for cohorts from Fall 2008, Fall 2009, & Fall 2010

24 Developmental education worksheet One for each targeted program of study We need to know course numbers for – Redesigned dev. ed. classes – Technical Math – Past courses that students would have taken in place of these courses Will be used to gather data on student progress through developmental courses

25 How is your college using Technical Math? Will you have a modular Technical Math course this Fall? Is it replacing only developmental math? Is it replacing only college-level math? Is it replacing both? If it is replacing both, will some students have to do remediation prior to Technical Math? Do your programs have additional math requirements on top of Technical Math?

26 Questions on targeted program participant & program data?

27 Spreadsheets First sheet has the list of variables, their definitions, & required format Other sheets are data table shells to be completed by colleges
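Because the first sheet defines each variable's required format, a college could check a row against those definitions before submitting. A minimal sketch, assuming placeholder variable names and formats (not the actual PACE template):

```python
import re

# Hypothetical stand-in for the definitions sheet:
# (variable name, required?, pattern for the required format).
VARIABLE_SPEC = [
    ("student_id", True, r"\d{9}"),                    # 9-digit ID
    ("term", True, r"(Fall|Spring|Summer) \d{4}"),     # e.g. "Fall 2012"
    ("dev_ed_complete", True, r"[YN]"),                # Y/N flag
]

def validate_row(row):
    """Return a list of problems for one data row (a dict of strings)."""
    problems = []
    for name, required, pattern in VARIABLE_SPEC:
        value = row.get(name, "")
        if required and not value:
            problems.append(f"{name}: missing")
        elif value and not re.fullmatch(pattern, value):
            problems.append(f"{name}: bad format {value!r}")
    return problems
```

Running every row through a check like this before submission catches format errors early, instead of during the evaluator's data cleaning.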

28 Submission Submissions will contain confidential data Each college will be given a password & will use the password protection built into Excel Submission via secure Dropbox Timing of submissions – Fall semester data – January 21 – Spring semester data – June 21 – Summer semester data – September 21

29 Wrap up data plan We are only asking colleges for information we cannot get from other sources Your help is crucial because changes to dev. ed. are a large part of the PACE initiative This is especially true of historic dev. ed. completion data & PLA data

30 PACE Implementation Assessment

31 Why do an Implementation Evaluation? Tell the story behind the data Contribute to continuous improvement Share learning across locations Stay on track with goals and funding requirements USDOL requirement

32 Overall Objectives Ultimate objective: capture lessons and best practices from your experiences that contribute to your ongoing efforts and the field in general
1. Understand how you plan to implement the strategies
2. Track early outcomes (findings and challenges) from initial implementation
3. Describe and share adaptations made in response to these early outcomes
4. Document lessons learned from modifications and final outcomes

33 Our Approach to Evaluating Implementation Greater focus on qualitative information Evaluation plan must be fluid and responsive Each phase builds on the prior phase – Start-up and the end of the grant period usually see the heaviest information-gathering push Timing is frequently subject to course corrections

34 Key Topics of Inquiry For each of the three strategies outlined to USDOL: how has the strategy been implemented and how have students utilized/experienced it? – Describe key redesign features and approaches used in implementing them, for example: Personnel changes/additions Professional development and peer learning activities Specific models employed (e.g., CAEL, El Paso PREP) Curricula and/or delivery innovations New uses of technology Involvement of external partners (employers, WIBs, etc.) New roles for staff or faculty

35 Methods of Evaluating Implementation Document review – Relevant institutional policies – Curricula materials – Scheduling information – Informational/outreach materials Surveys Interviews – Phone and/or in person On-site observation Focus groups – On site

36 Implementation Evaluation Information Gathering Timeline Fall 2012 semester – Analyze information from initial surveys (due Aug. 3) – Document review Winter 2013 semester – Second round of surveys on planning progress (first half of semester) and early lessons/challenges – Site visits (end of semester) Academic year 2013-14 – Surveys to track implementation progress, adaptations – Phone interviews or other follow-up if needed Fall 2014 semester – Final document review – Final surveys and close-out site visit

37 Before we go to lunch… Any questions about the implementation evaluation approach? Lunch discussion topics: – Reflect on data plan – What are the key student success priorities at your institution? – What would be most useful (for your institution) to learn during and after PACE implementation?

38 Next Steps Updates based on today’s discussion Questions and clarifications Cohort definitions Rolling out analyses during the semester

39 Contact Information Project Manager: Chris Spence, 216.767.6262, Impact Assessment: Joel Elvery, PhD: 216.375.6777, Implementation Assessment: Holly Parker, 734.769.2900,

40 We look forward to working with you!
