Regulated Health Professions Network Evaluation Framework
January 15, 2014
Presented by: Nancy Carter, NSHRF REAL Evaluation Services
Introduction

- Joint development of the framework by NSHRF and the Network
- Building capacity
- Informed framework
- Flexible involvement
- Collective impact as a guide for describing the work

Purpose of the evaluation:
- Primary purpose is accountability: being responsive to the Act
- Learning: early stages and a developmental approach
- Demonstrating value: engagement in the Network and its evaluation
- Public accountability through the Network

Learning is also an important goal of the evaluation given that the Network is in the early stages of implementation and continuous learning is key during this developmental phase. A developmental approach to the evaluation will be taken, whereby the findings of early phases will be used to inform the development of the Network. Demonstrating the value of the Network is important for creating engagement in the Network and its evaluation. Accountability through the Network enhances members' ability to be accountable to the public, and therefore evaluation findings related to accountability can be used to demonstrate value to members. The public accountability piece matters to members and is achieved through accountability to government and the legislation. In summary, the evaluation will focus on understanding the extent to which the goals of the Network have been achieved, the impact of the Network at a system level, key successes and failures that can be used to improve the Network, and engaging Network members in the evaluation.
Contextual Issues

- Barriers and challenges
  - Engagement in evaluation
  - Multiple perspectives
  - Acceptance of the evaluation
  - Lack of common language
  - Attribution of impact to the Network
- Demand for the evaluation and legislative review
- Diversity among member organizations
  - Size
  - Private vs. public

Secondly, the demand for the evaluation was identified as a contextual consideration. A unique part of the context for this evaluation is that legislation directs the activities of the Network and requires regular review. Members suggested that, in developing and implementing evaluation plans, it would be important to consider the needs of the legislative review and to take advantage of opportunities to leverage the work of the evaluation for reporting.
Three types of evaluation

Implementation
- Has the implementation of the Network happened as intended?

Process
- Is the Network operating in the manner intended?
- Are the operations consistent with the Network's design and related policies?
- Valuable for improving Network operations

Outcomes
- Requires that the initiative be operating for a sufficient period for goals to be achieved
- What is the impact of the Network?
- Is the impact consistent with the Network's mission and intended outcomes?
- Should the Network continue in its current form, or is modification required?
Contribution Analysis approach

- Evaluation intended to capture the contribution of the Network
- Requires development of program theory and a theory of change
- Assumptions should be reasonable
- Theory generally makes sense, is plausible, is supported by evidence, and is agreed upon by key stakeholders
- Activities are implemented as intended in the program theory (process evaluation)
- Evidence confirms/supports the theory
- Role of other factors is considered and either recognized or determined to be insignificant

Using a generative perspective on causality to infer that a program made an important contribution to an expected result that has been observed, contribution analysis argues that a reasonable contribution causal claim can be made if:
- There is a reasoned theory of change for the intervention: the key assumptions behind why the intervention is expected to work make sense, are plausible, may be supported by evidence and/or existing research, and are agreed upon by at least some of the key players.
- The activities of the intervention were implemented as set out in the theory of change.
- The theory of change, or key elements thereof, is supported and confirmed by evidence on observed results and underlying assumptions; that is, the chain of expected results occurred and the theory of change has not been disproved.
- Other influencing factors have been assessed and either shown not to have made a significant contribution, or their relative role in contributing to the desired result has been recognized.
Collective Impact Approach for the Network

- Common agenda
- Shared measurement system
- Mutually reinforcing activities
- Backbone infrastructure
- Continuous communication
Data Collection Tools

- Document Review Checklist
- Executive Director Interview or Questionnaire
- RHPN Network Member Data Collection Tool
Implementation

Evaluation of the Network is intended to be ongoing and to support legislative review of the RHPN Act. This requires staged implementation.

Stage 1 – Implementation and process (Sept 2015)
- Primarily focused on the five components of Collective Impact and whether activities are being implemented as intended
- Consideration of operations as compared to the intended operations defined by the Act
- Identification of any early impacts (limited)
- Revision of the Network based on Stage 1 findings

Stage 2 – Process and early outcomes (Fiscal 2016-17)
- Considers activities and outputs given revisions
- Considers early outcomes for collective impact
- Revision of the Network based on Stage 2 findings

Stage 3 – Legislative review and outcomes (End of fiscal 2017-18)
- Legislative review according to guidelines
- Outcomes evaluation to demonstrate the contribution of the Network to the intended outcomes
- Revision based on Stage 3 findings