How to design programs that work better in complex adaptive systems


1 How to design programs that work better in complex adaptive systems
Ann Larson, PhD
Presented at the Australasian Evaluation Society Annual Meeting, Perth, September 2016

2 Presented as part of a session titled: Effective proactive evaluation
How can the evidence base influence the design of interventions?
John Owen, Ann Larson, Rick Cummings
AES Annual Conference, Perth, 2016

3 Outline
Good design principles for creating change in complex settings, derived from evaluation findings
Reasons why so many designs do not incorporate complexity-sensitive design features
Examples of good and bad strategies to incorporate into designs
A last word on the role of evaluators

4 What works in creating positive change in complex settings?
Evaluations tell us what really happened. The accumulation of evaluation findings amounts to an evidence base that is more sensitive to context than research findings.

5 Obtain flexible, long-term funding – because change is neither linear, continuous nor predictable
Situate new behaviour within relevant history and the salience of existing priorities and concerns – because cultures and organisations remain committed to their ways of working
Build coalitions around a vision for change – because external shocks will require new partners to support long-term change
Understand different actors' motivations for behaviour change: introduce accountability and incentives
Start small, be flexible and experiment before attempting wide-scale change
Balance the need for fidelity with opportunities for plenty of local initiative to promote genuine institutionalisation
Monitor, review and act in a timely manner to be adaptive

6 If it were easy, everyone would do it.

7 Organisational blind spots make these factors difficult to integrate into designs
Command-and-control culture of central planning
Structural limitations in processing and responding to large amounts of data with nuanced implications
Epistemologies, especially in the health sector

8 And human nature is also a barrier
Designing interventions to change systems is more difficult when there are profound distances between the designer and the intended beneficiaries – whether that distance is geographic, cultural, economic, religious or linguistic. Evaluation methods can 'translate' beneficiaries' voices so that decision makers can understand them.

9 Some common approaches to deal with complexity … that are not working

10 Careful planning does not reduce the likelihood of programs encountering unexpected obstacles and opportunities
Reliance on monitoring and evaluation plans with high-level annual reviews to guide program implementation and oversight does not facilitate timely adaptation when programs are not working well
Emphasis on celebrating success rather than learning from failure makes it hard to recognise what needs to be changed
Use of short time frames and rigid budgets to reduce risk actually makes it more difficult to achieve an outcome

11 And examples of good design trends, using evaluation skills and knowledge

12 Heavily invest in regular review, such as the approaches employed by TAF/DFAT and USAID's active monitoring … but this may require a large investment of time and may be overly structured for simple projects or projects that rarely encounter problems

13 Determine where the bottlenecks for adoption are and delegate responsibility to that level, giving plenty of local autonomy, coaching and fostering self-organisation. This may require tailoring approaches to different areas or different types of facilities.

14 Learn from good pilots and use their findings to inform expansion, while continuing to provide necessary support
One such example is on the next two slides …

15 Traditional expansion
Sarriot et al. 2011, Health Policy and Planning

16 vs staged expansion capitalising on lessons learned

17 Payment by results gives implementing agencies and communities the responsibility to find their own solutions by giving people incentives to do what they want to do. This gives implementers the incentive to experiment until they find a successful strategy.
It only works when there is untapped capacity and a meaningful outcome to be achieved.
It also requires investment in verification.

18 Model likely scenarios during design and at critical junctures
Using human-centred design, agent-based modelling or complex system modelling. All of these approaches need insights from evaluation.
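To make the agent-based modelling idea concrete, here is a minimal illustrative sketch, not taken from the presentation: the agent rule, threshold and parameter values are assumptions chosen only to show how simple behavioural rules plus peer influence can be simulated during design.

```python
# Minimal, illustrative agent-based sketch (hypothetical; not part of the
# original presentation): each agent decides whether to adopt a new practice
# based on a private motivation and the share of peers who have adopted.
import random


def simulate_adoption(n_agents=100, n_rounds=20, peer_weight=0.5, seed=1):
    random.seed(seed)
    motivation = [random.random() for _ in range(n_agents)]  # private motivation, 0..1
    adopted = [False] * n_agents
    history = []
    for _ in range(n_rounds):
        peer_share = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i]:
                # Adoption becomes more likely as more peers adopt (self-organisation).
                pressure = motivation[i] + peer_weight * peer_share
                adopted[i] = pressure > 0.8
        history.append(sum(adopted))
    return history


if __name__ == "__main__":
    print(simulate_adoption())  # number of adopters after each round
```

Running scenarios like this with different assumed incentives or peer weights is one way a design team could explore likely adoption paths before committing to a full roll-out.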

19 A simpler approach is to model path dependency, looking at the conditions necessary for the design to work
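As a hypothetical illustration of this simpler approach (the condition names and the helper below are invented for the example, not drawn from the presentation), path dependency can be expressed as an ordered chain of conditions where later stages cannot be expected to work once an earlier one fails.

```python
# Illustrative sketch (hypothetical, not from the presentation) of a
# path-dependent chain of conditions: each stage of a program design is only
# expected to work if every earlier condition on the path has held.
PATHWAY = [
    ("flexible long-term funding secured", True),
    ("local coalition supports the change", True),
    ("frontline staff trained and incentivised", False),
    ("timely monitoring data acted upon", True),
]


def first_broken_link(pathway):
    """Return the first condition that fails, or None if the whole path holds."""
    for name, holds in pathway:
        if not holds:
            return name
    return None


if __name__ == "__main__":
    broken = first_broken_link(PATHWAY)
    if broken is None:
        print("All conditions on the pathway hold; the design can work.")
    else:
        print(f"Pathway breaks at: {broken}; later stages cannot be expected to work.")
```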

20 Last words about the role of evaluators
"They sentenced me to twenty years of boredom / For trying to change the system from within" – Leonard Cohen
Evaluators do not need to become implementers internal to an organisation to promote designs that are responsive to complex contexts. As external actors we can contribute evidence for flexible and adaptive designs and evaluate whether these design approaches are more likely to achieve positive, sustainable change.

