Presentation transcript: "Capacity Building: Drafting an Evaluation Blueprint"

1 Capacity Building: Drafting an Evaluation Blueprint
Moderator: Jennifer Gonzales, State Systemic Improvement Plan Coordinator, Arkansas Department of Education
Panelists:
- Sarah Heinemeier, Partner, Compass Evaluation and Research, IDC evaluator
- Gretta Hylton, Director, Division of Learning Services, Kentucky Department of Education
- Robert Horner, Co-Director, Positive Behavioral Interventions & Supports (PBIS)
- Brian Megert, Director of Special Programs, Springfield Public Schools, Springfield, OR

2 Capacity Building: Drafting an Evaluation Blueprint
OSEP Project Directors' Conference, August 1-3, 2016
Sarah Heinemeier, IDC Formative Evaluation Team Member, Compass Evaluation & Research
Gretta Hylton, State Special Education Director, Kentucky Department of Education

3 Purpose
- Describe project
  - Build state capacity to accurately collect, report, and use IDEA data
- Describe IDC evaluation approach
  - Explore qualitative methodology used in the evaluation
- Describe state perspectives on capacity building
- Provide initial feedback and results

Notes: In this slide we provide an introduction to the presentation. In brief, we will spend 1-2 sentences talking about IDC (more information to follow on the next slide). Then, we introduce the focus of the presentation: the IDC evaluation and its use of qualitative methods. Finally, we indicate that Gretta will be providing the state's perspective on the services and initial feedback and results from the TA efforts.

4 IDC Logic Model
See handout

Notes: This slide provides the IDC logic model, our map of services related to outputs and outcomes.

5 Kentucky Department of Education
Staff changes within the past year:
- Commissioner
- State Director of Special Education
- Special Education Branch Manager
- IDEA Financial Analyst
Staff changes within the past three years:
- Three Part B Data Managers
- Three Associate Commissioners

Notes: In this slide, Gretta will present Kentucky's story: what brought Kentucky to IDC, what Kentucky is hoping to achieve, what barriers must be overcome, and what organizational and cultural features can be leveraged.

6 Kentucky's Vision
- Build capacity at the SEA
- Build capacity within local school districts
- Document data processes
- Open lines of communication across divisions
- Create agency ownership for the data
- Identify needed changes or modifications to processes
- Create a culture of high-quality data

Notes: Same as slide 5.

7 Kentucky's Vision (cont.)
Leverage:
- Priority of the Commissioner, Associate Commissioner, and Director
- Accurate data for improved decision making
- Confidence in data being used for a purpose
- Support from IDC in the form of a data specialist, TA specialist, meeting recorder, and facilitator

Notes: Same as slide 5.

8 Kentucky's Vision (cont.)
Barriers:
- High turnover rates among SEA leadership, program staff, and data managers
- Lack of institutional knowledge and experience
- Organizational structure of the SEA
- Time

Notes: Same as slide 5.

9 Intensive TA in Kentucky
Pilot for IDC's IDEA Part B Data Processes Toolkit:
- Child Count and Educational Environments
- Exiting
- Discipline
- MOE/CEIS
- Dispute Resolution
- Personnel
- Assessment

Notes: In this slide, both IDC and Gretta discuss the specific IDC TA work in Kentucky. What TA has happened so far?

10 Intensive TA in Kentucky (cont.)
Monthly onsite support from the IDC State Liaison:
- Data process mapping
- Protocol development
- Privacy and confidentiality training
- Facilitation of meetings with various offices within the SEA

Notes: Same as slide 9.

11 Intensive TA in Kentucky (cont.)
Monthly calls with division leadership, program staff, and the data manager:
- State-driven conversations
- FAQs
- Troubleshooting

Notes: Same as slide 9.

12 Evaluating Intensive TA
Formative evaluation strategies for gathering information about the effectiveness of IDC intensive TA efforts:
- Document review
- Regular communication with the IDC State Liaison
- Observations of TA work
- Interviews with state staff

Notes: In this slide, Sarah introduces the IDC formative evaluation approach for intensive TA efforts. Note that while there are some quantitative elements, the focus of the intensive TA evaluation is qualitative; there is an investment in getting to know the state's context for change.

13 Evaluating Intensive TA (cont.)
Document review:
- Service-related documents, such as the TA plan, logic model, and meeting minutes
- Products produced through TA
Regular communication with the IDC State Liaison:
- Contextual information
- Perspective on the past months' activities
- Successes, challenges, and barriers
- Upcoming activities

Notes: In this slide, Sarah provides more details about the formative evaluation work.

14 Evaluating Intensive TA (cont.)
Observations of TA work (onsite, calls):
- Rich contextual information
- State participation
- IDC team engagement
- Alignment of work with TA goals
- TA strategies

Notes: In this slide, Sarah provides more details about the formative evaluation work. Observations contribute rich contextual information, including insight into state confounds and challenges. Who is participating, and what is the nature and "quality" of the participation? What is the level of engagement with the IDC team (distractions, adherence to the agenda)? Are the goals of the TA work present as overarching themes or guides? What TA strategies, tools, or products were used during TA sessions? Does the TA (strategies, tools, etc.) appear appropriate for the participants? For their needs? For their current capacity level?

15 Evaluating Intensive TA (cont.)
Interviews with state staff:
- State perspective on TA
- Better understanding of individual capacity building
- Better understanding of barriers and challenges
- Feedback on the alignment of TA with organizational culture and expectations

Notes: In this slide, Sarah provides more details about the formative evaluation work: the state's perspective on the nature and quality of TA received; a better understanding of individual capacity-building needs; a better understanding of barriers and challenges to (a) receiving TA and (b) building capacity; and feedback on the alignment of TA with organizational culture and expectations.

16 Making Sense of the Data
Key formative questions:
- Do results align with our theory of change?
- Are the barriers and challenges what we expected?
- Is the level of progress what we expected?
Bottom line:
- Short and intermediate term: Has the capacity of Kentucky to collect, report, and use high-quality IDEA data improved? According to whom? OSEP, the IDC TA team, Kentucky?
- Long term: Does Kentucky have higher-quality IDEA data?

Notes: In this slide, Sarah presents guiding questions, which are used to provide formative feedback to the IDC team; the actual bottom line is whether Kentucky has high-quality IDEA data (the long-term outcome).

17 Making Sense of the Data (cont.)
Analytic tools: IDC Intensive TA Quality Rubric
- Developed by the IDC Evaluation Team
- Four domains:
  - Clarity (state commitment, discovery, and planning)
  - Integrity (faithfulness of implementation)
  - Intensity (frequency, type, and duration)
  - Accountability (outputs and outcomes)
- NEW: Management of the TA effort

Notes: In this slide, Sarah introduces the TA Quality Rubric, a tool designed by the IDC evaluation team for scoring the nature and quality of TA provided to a state. The components and quality indicators for the rubric are based on the State Implementation and Scaling-up of Evidence-based Practices (SISEP) core features of effective intensive TA, the TA and implementation research literature, and the TA experiences of the IDC TA team leaders.

18 Benefits of TA in Kentucky
- Emphasis on systems thinking with a problem-solving approach
- Development of consistent, clear process requirements for the collection and reporting of data
- Documentation of clearly defined roles, responsibilities, and expectations for all staff
- Creation of a collaborative culture
- Increased attention on continuous improvement

Notes: In this slide, Gretta presents feedback, from her perspective, on how well the IDC TA efforts are going in Kentucky.

19 Benefits of TA in Kentucky (cont.)
- Increased confidence among staff
- Accurate data used to make decisions
- Partners and stakeholders have more confidence in the data
- More reflection and conversations around the data
- Less loss of institutional knowledge
- Data being used for a purpose

Notes: Same as slide 18.

20 Using Data to Improve Services
Feedback to IDC State Liaisons and Part B/C TA Leads
- Example: observations about the interactions of IDC TA staff with each other
- Example: observations about the adherence of IDC staff to the TA plan
Refining plans with the state
- Example: pilot testing the use of a standard protocol for developing data-quality guidelines

Notes: In this slide, we discuss, from IDC's perspective, a few details of how data are used to provide feedback and improve the quality of TA services.

21 Lessons Learned
- Better understanding from an evaluation standpoint as to what makes TA effective
  - TA must serve two agendas: the state's needs and OSEP's needs.
  - Management is important.
- Importance of contextual information
  - Observations and conversations have been critical.

Notes: Sarah and Gretta

22 For More Information
Visit the IDC website: http://ideadata.org/
Follow us on Twitter

23 The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Y. However, the contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government.
Project Officers: Richelle Davis and Meredith Miceli

