
1 GEAR UP Evaluation 101 NCCEP/GEAR UP Capacity-Building Workshop, Caesars Palace, Las Vegas, February 4, 2013. Presenter: Chrissy Y. Tillery, NCCEP Director of Evaluation.

2 National GEAR UP Objectives
 National Objective 1: Increase the academic performance and preparation for postsecondary education for GEAR UP students.
 National Objective 2: Increase the rate of high school graduation and participation in postsecondary education for GEAR UP students.
 National Objective 3: Increase GEAR UP students’ and their families’ knowledge of postsecondary education options, preparation and financing.

3 Evaluation Terminology: Qualitative Analyses
Analysis that involves descriptions and narrative; data are observed. Qualitative analysis can take several forms, including interpretive and narrative analysis, critical theory, participatory action research, and phenomenology. Some examples include:
 Focus groups
 Case studies
 Interviews
 Ethnography

4 Evaluation Terminology: Quantitative Analyses
Analysis that involves numbers and inferential statistics; data are measured for growth or significance. Embedding quantitative analysis into specific research studies within the overall evaluation is a way to measure more specific outcomes. Some examples include:
 Descriptive statistics (frequencies, averages, percentages)
 t-tests
 ANOVA
 Regression
 Propensity score matching

5 Evaluation Terminology: Formative Evaluation
Evaluation conducted and reported on an ongoing basis throughout the project to continuously assess it. Provides program staff with knowledge of how the quality and impact of project activities can be improved, allowing ongoing data-driven decisions to be made.

6 Evaluation Terminology: Summative Evaluation
Evaluation conducted at the conclusion of the project to assess its overall impact in terms of meeting goals and using resources efficiently. Used to report final program outcomes.

7 Evaluation Terminology
 National GEAR UP Objective
National Objective 1: Increase the academic performance and preparation for postsecondary education for GEAR UP students.
National Objective 2: Increase the rate of high school graduation and participation in postsecondary education for GEAR UP students.
National Objective 3: Increase GEAR UP students’ and their families’ knowledge of postsecondary education options, preparation and financing.
 Project Objective – GPRA (Government Performance and Results Act) Performance Indicators
Individualized by grant. Each Project Objective should fall under one of the three National GEAR UP Objectives.
 Performance Measure
Should include the following:
 Baseline Data
 Target Benchmarks
 Performance Indicators

8 Types of Data
Baseline/Pre-Intervention Data: data collected on students in target schools prior to the GEAR UP intervention.
Intervention Data: data collected on students in target schools receiving the GEAR UP intervention.
Post-Intervention Data: data collected on students in target schools after the GEAR UP intervention.

9 A Model for Program Evaluation
 Continuous Data Collection
 Formative Data Analyses
 Program Implementation and Revisions
 Summative Data Analyses
 Policy Recommendations

10 Data Collection Partners
 State Education Agency
 Local Education Agencies
 University System
 Community College System
 Private/Independent Colleges and Universities
 State Education Assistance Authority
 Business Partners
 Standardized Testing Agencies (ACT/College Board)
 National Student Clearinghouse

11 Evaluation 101: Worksheet 1

12 Characteristics of Effective Data Collection
 A relational database linked by a unique identifier.
 A data system that defines all variables consistently, allowing for comparisons.
 A data system that allows for customization related to grant activities.
 A data system that allows for formative and summative evaluation and longitudinal data tracking.
 A data system compliant with FERPA regulations.
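The first characteristic above, a relational database linked by a unique identifier, can be sketched in a few lines. This is an illustrative example only: the table names, columns, and sample IDs are invented for the sketch and are not drawn from any actual GEAR UP data system.

```python
import sqlite3

# Two illustrative tables that share the same unique student identifier,
# so records can be joined across demographics and service participation.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE students (student_id TEXT PRIMARY KEY, grade INTEGER)")
cur.execute("CREATE TABLE services (student_id TEXT, service TEXT, hours REAL)")
cur.execute("INSERT INTO students VALUES ('S001', 9), ('S002', 10)")
cur.execute("INSERT INTO services VALUES ('S001', 'tutoring', 12.5), ('S001', 'mentoring', 4.0)")

# A join on the unique identifier links each student to their services;
# a LEFT JOIN keeps students with no services visible rather than dropping them.
cur.execute("""
    SELECT s.student_id, s.grade, COUNT(v.service), COALESCE(SUM(v.hours), 0)
    FROM students s LEFT JOIN services v ON v.student_id = s.student_id
    GROUP BY s.student_id, s.grade
""")
rows = sorted(cur.fetchall())
print(rows)
```

Because every table carries the same identifier, the same join pattern extends to attendance, assessment, and survey tables as the system grows.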

13 Levels of Data Collection
 Student Level Data
 School Level Data
 State Level Data
 National Data

14 Student Level Data
 GEAR UP Student Services
 GEAR UP Parent/Family Services
 GEAR UP Professional Development Services
 Student-level demographic data
 Student-level attendance and discipline data
 Student-level academic data, including GPA, state assessment scores, and course data
 Student-level dropout and promotion data
 Standardized assessment data
 Survey data
 FAFSA data
 National Student Clearinghouse data for enrollment, persistence, and graduation
 Postsecondary data (e.g., remediation data)
*Link data using a unique identifier.
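The starred note, linking data using a unique identifier, matters because student-level extracts typically arrive from different sources as separate files. A minimal sketch of combining two such extracts follows; the file contents, field names, and student IDs are hypothetical.

```python
import csv
import io

# Hypothetical extracts from two sources: an attendance feed and an
# assessment feed, each keyed by the same unique student identifier.
attendance_csv = "student_id,days_absent\nS001,3\nS002,11\n"
assessment_csv = "student_id,math_score\nS001,82\nS003,74\n"

def index_by_id(text):
    """Read one CSV extract into a dict keyed by the unique identifier."""
    return {row["student_id"]: row for row in csv.DictReader(io.StringIO(text))}

attendance = index_by_id(attendance_csv)
assessment = index_by_id(assessment_csv)

# Outer-join the two sources: every student seen in either file gets a
# combined record, and missing fields stay None so data gaps are visible.
combined = {}
for sid in sorted(set(attendance) | set(assessment)):
    combined[sid] = {
        "days_absent": attendance.get(sid, {}).get("days_absent"),
        "math_score": assessment.get(sid, {}).get("math_score"),
    }
print(combined["S001"])
```

The same pattern scales to the longer list of sources above: each new feed is indexed by the identifier and merged into the combined longitudinal record.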

15 School Level Data
 Percentage of students receiving free and reduced-price lunch
 Percentage of advanced college-preparatory courses
 Cohort graduation rate
 Average daily attendance
 Percentage of fully licensed teachers
 Percentage of highly qualified teachers
 Teacher turnover rate
 Percentage of GEAR UP dollars spent relative to each school’s allocation
 College-going culture data

16 Evaluation 101: Worksheet 2

17 Setting Up Your Data
Non-Technical
 Build Relationships
 Define Legal Agreements (MOA)
 Define Data Elements
 Test & Validate Data
 Train Staff & Document
Technical
 Data System
 Linking Tables of Data
 Web Interface
 Data Entry
 Data Loading
 Reporting

18 Data Exchange Considerations
 Define file layouts
Various layout options: CSV, XML, etc.
Clearly define the file layout.
Insist on precision from the data provider, i.e., the feed requires no manual manipulation on your end.
Insist on consistency across data feeds, i.e., the file layout does not change.
Ensure clarity in communication.
 Define the data exchange protocol
Secure FTP, direct access to the partner’s system to extract data, a secure website, etc.
*Define the data change process, i.e., how changes to the data outline will be addressed.
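The consistency point above, that the file layout must not change, is easiest to enforce with an automated check before each load. The sketch below validates one CSV feed against an agreed layout; the expected column names and sample data are assumptions for illustration, not part of any actual GEAR UP feed specification.

```python
import csv
import io

# Hypothetical agreed-upon layout for an incoming student data feed.
EXPECTED_LAYOUT = ["student_id", "school_code", "grade", "gpa"]

def validate_feed(text):
    """Return (ok, problems) for one CSV feed checked against the layout."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader, [])
    problems = []
    # Catch a silent layout change by the data provider before loading.
    if header != EXPECTED_LAYOUT:
        problems.append(f"header changed: expected {EXPECTED_LAYOUT}, got {header}")
    # Catch ragged rows that would need manual manipulation to load.
    for lineno, row in enumerate(reader, start=2):
        if len(row) != len(EXPECTED_LAYOUT):
            problems.append(f"line {lineno}: {len(row)} fields, expected {len(EXPECTED_LAYOUT)}")
    return (not problems, problems)

good = "student_id,school_code,grade,gpa\nS001,042,9,3.2\n"
bad = "student_id,school,grade,gpa\nS001,042,9,3.2\n"
ok_good, _ = validate_feed(good)
ok_bad, issues = validate_feed(bad)
print(ok_good, ok_bad, issues)
```

Rejecting a feed that fails this check, and reporting the problems back to the provider, keeps layout drift from quietly corrupting the database.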

19 Data Inputs and Outputs

20 Legal Considerations
 Guidance from legal counsel
 Institutional Review Board (IRB) review
 Family Educational Rights and Privacy Act (FERPA)
 Confidentiality Agreements
Confidentiality Agreements for GEAR UP personnel (GEAR UP staff, coordinators, etc.)
Confidentiality Agreements for external consultants (consultants, external evaluators, etc.)

21 Security Considerations
 Encryption: Make sure steps are taken to encrypt sensitive data elements.
 Efficiency: Monitor databases to ensure data are cleaned and linked.
 Security: Keep the number of users with direct database access to a minimum, and have users sign a Confidentiality Agreement.
 Disaster Recovery: Make sure your databases are backed up nightly and that a clear plan for restoration and recovery is outlined. Decide now how long you intend to store data and put measures in place to ensure that can happen.
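One concrete safeguard related to the encryption point is replacing a sensitive identifier with a keyed pseudonym before it leaves the secure database. The sketch below uses an HMAC, which is one-way pseudonymization rather than reversible encryption; a real system would also encrypt data at rest and in transit. The key and sample identifier are placeholders.

```python
import hmac
import hashlib

# Placeholder key: in a real system this would come from a secure vault,
# never from source code.
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"

def pseudonymize(identifier: str) -> str:
    """Derive a stable, non-reversible token from a sensitive identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# The same input always yields the same token, so records still link
# across tables, but the original value cannot be recovered from the
# token without the secret key.
token = pseudonymize("123-45-6789")
print(token)
```

Used as the join key in shared extracts, such tokens preserve the unique-identifier linking described earlier while keeping the raw sensitive value inside the secure system.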

22 National Student Clearinghouse: Postsecondary Data Tracking
StudentTracker for High Schools answers the following questions:
 Which of your high school graduates enrolled in college?
 Where did they enroll?
 Did they enroll where they applied? Was it their first choice?
 Did they graduate after six years?
The National Student Clearinghouse’s database is the only nationwide collection of collegiate enrollment and degree data. These are actual student records provided to the Clearinghouse regularly by its more than 3,300 participating postsecondary institutions, which enroll over 92% of all U.S. higher education students. After StudentTracker matches your records against the Clearinghouse database, you receive a comprehensive report containing the information you need to better assess the college attendance, persistence, and achievement of your graduates.

23 National Student Clearinghouse
Interpreting National Student Clearinghouse data and setting up files with a unique identifier.

24 Internal and External Evaluation
 GEAR UP must have “implementation of a mechanism to continuously assess progress toward achieving objectives and outcomes, and to obtain feedback on program services and provisions that may need to be altered.”
 Internal Evaluator(s):
Important to continuously assess the program.
Important to have a complete understanding of and connection to the program.
Important as a trainer for GEAR UP Coordinators and staff in the schools.
Important to continuously manage the data for data integrity.
Important for day-to-day oversight of evaluation activities.
 External Evaluator(s):
Important to assess the program from an outside perspective.
Important to conduct parallel or independent analysis separate from internal evaluator(s) for integrity of results.
Important that they have knowledge of one or more of the following: (1) GEAR UP; (2) long-term program evaluation; (3) best practices in research methodologies for accurate analysis; and (4) longitudinal analysis.

25 Evaluation Points to Consider
 The research design should match and be appropriate for the data collection and analysis.
 The evaluation framework should be built around already-known local, state, and national data on college access.
 Use prior GEAR UP data to build on what was successful or what could be strengthened.
 Research projects embedded within the overall evaluation can strengthen your proposal and program outcomes.

26 Evaluation Resources
 The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users (3rd Edition), published by the Joint Committee on Standards for Educational Evaluation (2011)
 The Institute of Education Sciences (IES) Practice Guides
 The What Works Clearinghouse
 American Educational Research Association (AERA)
 American Evaluation Association (AEA)

27 Thank you for attending the NCCEP/GEAR UP Capacity-Building Workshop. For additional information regarding the Evaluation 101 session, please contact Chrissy Tillery at , extension 108.

