
1 DATA TRACKING AND EVALUATION

2 Goal of the STEP program: To increase the number of STEM graduates within the five-year period of the grant.
– Keep this goal in mind as you design your project and evaluation activities.
– This may mean changing your way of thinking about project evaluation.

3 For example: REUs
The goal of the NSF REU program is to increase the number and quality of STEM undergraduates who pursue advanced degrees in STEM.
– This is NOT the goal of STEP.
Your project activities should focus REUs on students not fully committed to STEM undergraduate degrees, or students who may leave STEM programs
– NOT students who will pursue a STEM degree “no matter what”.

4 Evaluation – REUs
Should focus on demonstrating that you have convinced students to stay in STEM or attracted new students to STEM:
– Pre/post surveys
– Tracking (with a comparison cohort) – see the sketch below
– Focus groups
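To make the “tracking (with a comparison cohort)” bullet concrete, here is a minimal sketch, in Python, of how a participant-versus-comparison retention comparison might be computed. The record fields (cohort label, STEM-retention flag) are hypothetical and would come from your institutional research office; the slides do not prescribe any particular implementation.

from dataclasses import dataclass
from typing import List

@dataclass
class StudentRecord:
    student_id: str
    cohort: str             # "participant" (took part in the activity) or "comparison"
    retained_in_stem: bool  # still enrolled in a STEM major the following fall

def retention_rate(records: List[StudentRecord], cohort: str) -> float:
    """Fraction of the named cohort still enrolled in STEM."""
    group = [r for r in records if r.cohort == cohort]
    return sum(r.retained_in_stem for r in group) / len(group) if group else 0.0

def retention_gap(records: List[StudentRecord]) -> float:
    """Participant retention minus comparison retention: the quantity the evaluation would report."""
    return retention_rate(records, "participant") - retention_rate(records, "comparison")

A matched comparison cohort (similar majors, preparation, and entry term) makes the gap easier to attribute to the activity rather than to differences between the two groups.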

5 Another example – Calculus reform
The goal of many TUES projects would be to increase student learning in calculus.
The goal of STEP is that students stay in STEM because you have reformed calculus.
Evaluation for TUES might include changes in standardized test scores.
Evaluation for STEP would include changes in pass rates, changes in the rate of students who go on to take Calc II, persistence rates, etc. (see the sketch below).
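As a worked illustration of the STEP-style metrics named on the slide above (pass rates, rates of students who go on to Calc II, persistence), the following sketch shows one way they might be computed. The dictionary keys are hypothetical; the slide does not specify any data format.

def _rate(numerator: int, denominator: int) -> float:
    """Safe ratio helper: returns 0.0 when the denominator is zero."""
    return numerator / denominator if denominator else 0.0

def calculus_reform_metrics(students: list) -> dict:
    """`students` is a list of dicts with hypothetical boolean keys:
    'passed_calc1', 'took_calc2', and 'persisted_in_stem'."""
    passers = [s for s in students if s["passed_calc1"]]
    return {
        "calc1_pass_rate": _rate(len(passers), len(students)),
        # Of the students who passed Calc I, how many went on to Calc II?
        "calc2_continuation_rate": _rate(sum(s["took_calc2"] for s in passers), len(passers)),
        "stem_persistence_rate": _rate(sum(s["persisted_in_stem"] for s in students), len(students)),
    }

Comparing these rates before and after the calculus reform (or against a pre-reform cohort) is what connects the TUES-style intervention to the STEP goal of keeping students in STEM.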

6 Evaluation of STEP projects
One of the most difficult challenges is to identify the “impact” of the STEP funding.
– You have to think about this creatively.
– Try to separate the impact of the project from the impact of other things that may be going on at your institution.

7 Data Tracking and Evaluation
For the key activity described in the previous exercise, state an expected outcome for the activity and give an example of data to be tracked.
Think, Share, Report

8 Data Tracking and Evaluation (cont’d)
Outcomes
– Importance of goals, outcomes, and questions in the evaluation process
– Cognitive and affective outcomes
» For STEP, you want to address the impact of these outcomes on persistence
Types of evaluation tools
– Advantages, limitations, and appropriateness
Data interpretation issues
– Variability, alternate explanations

9 Data Tracking and Evaluation (cont’d)
Project evaluation
– Formative – monitoring progress to improve the approach
– Summative – characterizing final accomplishments

10 Data Tracking and Evaluation (cont’d)
Effective evaluation starts with carefully defined project goals and expected outcomes.
– Goals and expected outcomes relate to:
Project management – initiating or completing an activity
Student behavior – modifying an attitude or a perception
» In the case of STEP, this means persistence and graduation

11 Data Tracking and Evaluation (cont’d)
Goals → Expected outcomes
Expected outcomes → Evaluation questions
Questions form the basis of the evaluation process.
The evaluation process collects and interprets data to answer the evaluation questions.

12 Data Tracking and Evaluation (cont’d)
Write a question for the expected outcome from the previous exercise.
– For example: Did the survey show a change in the students’ attitude about …?
Think, Share, Report

13 Tools for Evaluating Student Outcomes
 Surveys
◦ Forced-choice or open-ended responses
 Concept inventories
◦ Multiple-choice questions to measure conceptual understanding
 Rubrics for analyzing student work products
◦ Guides for scoring student reports, tests, etc.
 Interviews
◦ Structured (fixed questions) or in-depth (free-flowing)
 Focus groups
◦ Like interviews, but with group interaction
 Observations
◦ Actually monitor and evaluate behavior
Olds et al., JEE 94:13, 2005
User-Friendly Handbook for Project Evaluation, NSF, 2002 (http://www.nsf.gov/pubs/2002/nsf02057/nsf02057.pdf)

14 Example – Interviews
Use interviews to answer questions such as:
 What does the program look and feel like?
 What do stakeholders know about the project?
 What are stakeholders’ and participants’ expectations?
 What features are most salient?
 What changes do participants perceive in themselves?
 For STEP – ultimately, did your activity lead to increased enrollment or persistence?
User-Friendly Handbook for Project Evaluation, NSF, 2002 (http://www.nsf.gov/pubs/2002/nsf02057/nsf02057.pdf)

15 Choosing a Tool
– Relevance and design of the tool
– Prior testing and validation of the tool
– Experience of others with the tool

16 Learn: Summary of Best Practices in STEP Data Tracking and Evaluation
– Key roles for institutional research and IT services
– Roles/responsibilities of external evaluators and/or evaluation expert(s)

17 Data Tracking and Evaluation (cont’d)
Measuring the “STEP effect”
– Tools and databases to support the evaluation process
– Intermediate metrics and effects
– Disaggregated data (see the sketch below)
– Corrective action
– Institutional impact
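For the “disaggregated data” bullet, here is a minimal sketch (with hypothetical field names, not taken from the slides) of breaking a persistence metric out by a grouping variable such as gender, race/ethnicity, Pell eligibility, or transfer status, so that an aggregate “STEP effect” does not hide differences between groups.

from collections import defaultdict

def disaggregated_persistence(records: list, group_field: str) -> dict:
    """`records`: list of dicts containing `group_field` and a boolean 'persisted'.
    Returns {group value: persistence rate}."""
    counts = defaultdict(lambda: [0, 0])  # group -> [persisted count, total count]
    for r in records:
        counts[r[group_field]][0] += int(r["persisted"])
        counts[r[group_field]][1] += 1
    return {group: persisted / total for group, (persisted, total) in counts.items()}

For small subgroups, report counts alongside the rates; a rate computed over a handful of students is easy to over-interpret.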

18 Data Tracking and Evaluation (cont’d)
Recall that expected outcomes are typically related to:
– Project management
– Student behavior
Types of data
– About the project
– About students
– Of special interest to NSF

19 Data Tracking and Evaluation (cont’d)
Types of data
– About the project: the “STEP effect”
» All metrics may not be known in advance.
» Are the data relevant and meaningful?
» Do the data inform next steps?

20 Data Tracking and Evaluation (cont’d)
Types of data (cont’d)
– About students
» Definitions of metrics should be clear and consistent. Some may be problematic (e.g., “majors” at a community college).
» Some data are difficult to track (e.g., transfer student data may be incomplete, or FERPA rules may limit sharing across institutions).
» Surveys help to confirm other data and observations.

21 Data Tracking and Evaluation (cont’d)
Types of data (cont’d)
– Of special interest to NSF
» Talk to your NSF program officer.
» Highlights, stories, successes, and shortcomings
» “People change because of the story.”
Karan Watson, “Can we accelerate the rate of change in engineering education,” Main Plenary, 2010 ASEE Annual Conference, June 21, 2010

22 Data Tracking and Evaluation (cont’d)
Your questions?

