
Performance Decisions Civil Engineering Systems University of Bristol.



Presentation transcript:

1

2 Performance Decisions Civil Engineering Systems University of Bristol

3 Recognise these questions?
- Have we really taken into account all the factors affecting our decision?
- Tell me again about the assumptions?
- Just what are our levels of risk?
- But where did these numbers come from?
- Have we got any evidence to prove that?
- Where are we vulnerable?
- Are we really ready to make that decision yet?
- What is the most important factor?

4
- Specialist analysis hard to understand / interrogate / test
- Not enough analysis
- Clash of personal / departmental cultures
- Input numbers are shaky
- Assumptions not clearly identified / tested
- Decision factors / influences poorly / incompletely framed
- Uncertainty, risks and unknowns hidden or forgotten

5 Yet why?
- Breadth, depth & balance of issues not fully grasped
- Undue trust in quantitative analysis
- Different influencers see only part of the big picture
- Delays due to lack of consensus
- Poor knowledge management or corporate learning
- Qualitative judgement and quantitative analysis done in isolation of each other
- Lack of framework

6 What is the PeriMeta Approach?
A means to:
- Enhance decision making in the context of incomplete, sparse and conflicting information
- Communicate complex systems simply
- Manage uncertainty explicitly
- Integrate hard and soft influences
By:
- Modelling systems as hierarchies of processes
- Recording the attributes of the processes
- Embedding a rich uncertainty calculus
- Using all available evidence in whatever form

7 Added Value
- Shared understanding of the state of the asset
  - across teams
  - up and down the organisation
  - with all stakeholders... simply
- Meta-level system health overview
- Identify success, failure and vulnerability
- Sensitivity and value of information
- Vehicle for testing intervention strategies
- Decision recording
  - corporate memory, transparency, auditability

8 Programme Development
[Diagram: programme development around Civil Engineering Systems, Bristol – Juniper; oil industry (Texaco, CSIRO); TMX & HLS; PeriMeta; FaberMaunsell; Highways Agency; Halcrow; new applications; Design Information Group, Bristol; CMAM; industrial partners; EPSRC]

9 Political risk & City vuln 3.02 & CMAM

10 At its heart...
A way to communicate uncertainty and its converse, dependability:
- Sn(A) = evidence that A is successful
- 1 - Sp(A) = evidence that A is not successful
- Sp(A) - Sn(A) = lack of evidence
[Diagram: the 0-1 evidence scale, with evidence that A is successful up to Sn(A), uncertainty between Sn(A) and Sp(A), and evidence that A is not successful above Sp(A)]

11 Interval Probability Theory
[Diagram: intervals on the 0-1 scale comparing a coin toss (heads) treated classically and in an open world, and the question "Life elsewhere?"]
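The [Sn(A), Sp(A)] interval of slides 10-11 can be carried around as a small data structure. A minimal sketch, assuming a simple Python representation; the class name and the open-world coin bounds are illustrative, not from PeriMeta:

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    """An evidence interval [sn, sp] on the 0-1 scale (slide 10)."""
    sn: float  # evidence that the process is successful
    sp: float  # 1 - sp is the evidence that it is not successful

    def __post_init__(self):
        assert 0.0 <= self.sn <= self.sp <= 1.0

    @property
    def against(self) -> float:
        """Evidence against success: 1 - sp."""
        return 1.0 - self.sp

    @property
    def uncertainty(self) -> float:
        """Lack of evidence: the gap sp - sn."""
        return self.sp - self.sn

# Classical coin toss: the chance of heads is exactly 0.5, so sn = sp.
classical = Evidence(sn=0.5, sp=0.5)

# Open world: we admit an interval (the 0.4-0.6 bounds are invented here).
open_world = Evidence(sn=0.4, sp=0.6)

print(classical.uncertainty)  # 0.0
```

A point probability is just the degenerate interval sn = sp; widening the interval is how the open-world lack of evidence is made explicit rather than hidden.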

12 The asset - a precious resource
[Diagram: a hierarchical model of the system (Pitlochry system: reservoir, dam, water control, tail race, towers, turbines, drum gates, ...) gives views of performance in selected areas of interest (safety, environment, cost) for the M&E assets. A model manager draws on a database of performance indicators (inspections, reports, instrumentation, analysis) and a library of value functions built from values and objectives and regulatory standards.]

13 Key principles: 1/2
- Simple - only one kind of blob and link!
- The blobs are processes with objectives
- The degree to which a process meets its objectives is expressed explicitly
  - its performance or dependability
- The asset or system of interest is described hierarchically
- Each hierarchy layer represents a fairly complete description of the system of interest

14 Key principles: 2/2
- Layers in the hierarchy can be related to different levels of decision-making & to the contingency planning process
- Evidence of performance is assembled from all available sources
  - from expert judgement, visual inspection reports, instrumentation, model analysis etc.
- Best built as a challenged group activity

15 Generating specific views
Multi-attribute weights can be changed to emphasise specific points of view on system performance, e.g. safety or economics. If any performance indicator is irrelevant to that aspect of performance, it can be assigned a weight of zero.
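Zeroing a weight, as described above, removes the indicator from that view's aggregate. A minimal sketch of a weighted view; the function and indicator names are invented for the example:

```python
def weighted_view(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of performance scores; zero-weight indicators drop out."""
    total_w = sum(weights.get(k, 0.0) for k in scores)
    if total_w == 0.0:
        raise ValueError("all indicators weighted zero for this view")
    return sum(scores[k] * weights.get(k, 0.0) for k in scores) / total_w

scores = {"safety": 0.9, "cost": 0.4, "environment": 0.7}

# A safety-focused view: cost is irrelevant to it, so its weight is zero.
safety_view = weighted_view(scores, {"safety": 0.7, "environment": 0.3, "cost": 0.0})
print(safety_view)  # (0.9*0.7 + 0.7*0.3) / 1.0 = 0.84
```

Changing only the weight vector, not the model, is what lets one hierarchy serve several stakeholders' points of view.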

16 Key features
- Rapid prototyping of models for exploring decision scenarios
- Comprehensive model building process for operational models
- It is not just a software package
- Rapid communication, exploration and modification with the built model

17 Presidential safety

18 The Human Decision Maker
- Creative
- Responsible
- Operating in an open world
- Achieving: satisfaction, self-esteem
Therefore: enhance rather than replace

19 Evidence for a Decision
[Diagram: decision cycle - determining the objectives; generating the options; modelling option performance; assembling the evidence; assessing risks & values; comparing options with objectives and states of nature; making the choice; taking action to maximise value and mitigate risk]

20 [Diagram: the decision cycle of slide 19 annotated with supporting tools - mind mapping, Monte-Carlo, fuzzy methods, decision trees, AHP, risk management, QRA etc. - with PeriMeta placed at assessing risks & values]

21 Philosophical problems
"Seventeenth century natural scientists dreamed of uniting the ideas of rationality, necessity and certainty into a single mathematical package, and the effect of that dream was to inflict on Human Reason a wound that remained unhealed for three hundred years - a wound from which we are only recently beginning to recover."
Stephen Toulmin (2001), Return to Reason, Harvard University Press, p. 13

22 Culture informs process, which defines tools - Why before How
- Culture (WHY), e.g.: partnering, best value, managing value & risk
- Processes, e.g.: getting agreements, incentivising success, building the team, empowering the supply chain
- Tools (HOW), e.g.: contracts, profiles, procedures, process maps, workshops, rich pictures

23 Building the model
- Model the process
  - why before how
- Assemble the evidence
  - all sources - mapped to a common expression
- Record the attributes
  - why, who, what, when, where, how?
- Use a rich uncertainty calculus
  - gives a powerful handle on the dependability of each of the processes
The process of constructing the model encourages creative collective reflection on how the asset system performs.

24 Evidence from sub-processes
- Propagation of uncertainty requires input of a set of conditional probabilities
- PeriMeta maps the conditional probabilities to linguistic variables
[Diagram: hypothesis H supported by evidence from sub-processes E1, E2, E3]

25 Summary of Judgements Required
- Evidence - for and against separated
- Sufficiency - how much of the evidence is directly relevant to the parent process?
  - 1 = it is sufficient on its own to fully determine the success of the parent
- Dependency - how much overlap of evidence is there between the sub-processes?
  - issues of bias and redundancy
- Necessity - will the parent fail if the sub-process fails?
  - higher necessity puts more weight on the red
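The sufficiency and necessity judgements above can be given a toy roll-up rule. This is an illustrative stand-in under invented weighting assumptions, not PeriMeta's interval-probability calculus, and it ignores dependency between sub-processes:

```python
def propagate(children, sufficiency, necessity):
    """Toy roll-up of child evidence intervals [sn, sp] into a parent interval.

    Sufficiency weights how much each child's evidence transfers to the parent;
    necessity lets a necessary child's evidence against success cap the parent.
    Dependency (overlap between children) is ignored in this sketch.
    """
    w = sum(sufficiency)
    sn = sum(s * c[0] for s, c in zip(sufficiency, children)) / w
    sp = sum(s * c[1] for s, c in zip(sufficiency, children)) / w
    for n, c in zip(necessity, children):
        # Higher necessity puts more weight on the child's "red" (1 - sp).
        sp = min(sp, 1.0 - n * (1.0 - c[1]))
    return min(sn, sp), sp

# Two sub-processes: one well-evidenced and fully sufficient,
# one weaker and only half-sufficient.
parent = propagate(children=[(0.6, 0.9), (0.3, 0.5)],
                   sufficiency=[1.0, 0.5],
                   necessity=[1.0, 0.2])
print(parent)  # sn = 0.5, sp ≈ 0.767
```

The point of the sketch is only the shape of the judgements: support aggregates by sufficiency, while a highly necessary failing child drags the parent's upper bound down.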

26 The Essence of Risky Decisions

27 Examples of Use
- Oil industry
  - expert interpretation of sparse data managing the oilfield asset
  - political risk
- Water sector
  - assessing sustainability of supply
  - assessing safety of contract
- Civil engineering
  - flood defence decisions
  - measuring Egan performance
  - Highways Agency PFI MAC
  - assessment of terrorist risk

28 Reservoir estimate dependability Things coming out of the woodwork

29 Field Development Success?

30 Spend?

31 Value of Information Testing strategies

32

33 Key Drivers for the Highways Agency
- Procurement have asked for a generic performance specification to apply to new MAC contracts
- The modernising government initiative (OGC/Treasury/NAO) requires outcome-based processes to improve efficiency and effectiveness
- SSR are responding to the HA's business needs
- Mapping of the rate of progress showed that the standard approach would be too slow

34 Proposition
To develop a performance regime that connects outcomes to what people do (i.e. process) through consultation and learning, in order to:
- provide a framework for high-level performance specification and decision support
- motivate people by helping them to understand where they fit and their contribution to outcomes
- provide confidence and competence through systematic rigour and recognition of uncertainty

35 Evidence
[Diagram: the asset (reservoir system) feeds a database of performance indicators; HA knowledge and asset knowledge feed a library of value functions]

36 Alignment - we are not re-inventing the wheel!
- WOOs (Work on Outputs and Outcomes) - purpose: to match route-based outputs to network-level outcomes/targets
- SUNS (Setting up Network Strategy) - purpose: to identify business improvements and best working practices; output: desk instructions (John Bagley, Leeds)
- Network Strategy + KPIs - purpose: publish KPIs against which to measure performance of key Agency asset management and service delivery activities
- PRIDe Performance Measurement Group - purpose: to develop better metrics for Area Performance Indicators to facilitate benchmarking across the network, and to develop CCC compliance indicators
- Maintenance contractors - individual providers have various systems for demonstrating good performance
- Performance Regime (Keith Shaw, Glynn Harrison, John Fitch, Nick Harding, Halcrow) - purpose: to develop the performance regime for PFMAC; output: performance specification to allow greater freedom to innovate and improve
- OD Quality Management (OD Process Mapping) - Keith Shaw, Simon Smith
- Corporate Planning Team - Lisa Scott / Dick Tyson
- PPDG (Performance Planning Development Group) - HA performance management framework for the HA
- Supply Chain Management - integrated teams and continuous improvement (David Parker)
- Cabinet Office and OGC are promoting the recognition of uncertainty

37 Stakeholder Diagram
[Diagram: stakeholders around the HA and MAC - internal (HA, MAC); road users (commuters, freight, many others); government; suppliers (consultants, contractors, others); other stakeholders (local authorities, emergency services, many others)]

38 Measurement Boundaries
Measurement needs to be at the contract boundary.
[Diagram: the HA service boundary encloses the MAC contract boundary; within the contract boundary the MAC's service delivery slice covers deliver, demonstrate and detail]

39 Measured Process
Indicators used to measure delivery of outcomes at the contract boundary:
- Deliver - outcomes
- Demonstrate - process demonstrating competence
- Detail - procedures / QMS
- Develop - continuous improvement over time

40 Where are we going?
No right answer, so how can you be sure you have the right answer!?
We are searching for a model that:
- is robust, and with which the owners are comfortable
- enables exploration of the performance regime by all sorts of stakeholders
- will drive improvement in outcomes
- enables alignment to payment

41 Journey to Process Model
- Who? Process owners are a useful guide
- When? Reflecting process cycles
- Why before how

42 [Diagram: Winter Service process card. Process: winter service; owner: network manager; purpose: safe mobility in winter. Relevant outcome measures: journey times, accident rates, material used (a proxy for sustainability). Each sub-process records procedure/process, owner, purpose, customer perspective, safety and measurement attributes. Highways Agency key objectives (HA Balanced Scorecard perspectives shown in blue): customer perspective, safety, staff (a good employer), innovation & learning, delivering in partnership, best value (finance), sustainability.]

43 Types of Evidence
Types of evidence considered:
- Measured
  - Performance Measurement Group (PMG) measures
  - existing contract Area Performance Indicators
  - HA high-level KPIs (e.g. Balanced Scorecard, Government objectives, etc.)
  - instrumentation, analytical and computer models
- Linguistic
  - expert judgement from form reports or interviews

44 Value Functions
- Performance targets are expressed as value functions
- These translate PIs onto a non-dimensional 0-1 scale
  - 0 = failure
  - 1 = total success
- This allows different types of evidence to be brought together
Example - Safer Travel: reduce the number of people killed and seriously injured on trunk roads by 694 to 4297 (compared with the average 4991).
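A value function of this kind can be sketched as linear interpolation between a success threshold and a failure threshold. The linear shape is an assumption for illustration; the thresholds below use the slide's KSI figures:

```python
def value_function(pi: float, success: float, failure: float) -> float:
    """Map a performance indicator onto [0, 1]: 1 = total success, 0 = failure.
    Linear interpolation between the thresholds (an assumed shape)."""
    if success < failure:  # "lower is better" indicators, e.g. casualties
        if pi <= success:
            return 1.0
        if pi >= failure:
            return 0.0
        return (failure - pi) / (failure - success)
    else:                  # "higher is better" indicators
        if pi >= success:
            return 1.0
        if pi <= failure:
            return 0.0
        return (pi - failure) / (success - failure)

# Safer Travel target from the slide: success at 4297 KSI, baseline 4991.
print(value_function(4297, success=4297, failure=4991))  # 1.0
print(value_function(4991, success=4297, failure=4991))  # 0.0
print(value_function(4644, success=4297, failure=4991))  # 0.5 (halfway)
```

Because every indicator lands on the same 0-1 scale, measured and linguistic evidence can then be combined in one model.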

45 Mapping Value Functions
[Chart: value function mapping performance indicator (x-axis) to value (y-axis)]

46 Highways Agency measurement
Library of value functions - HA KPIs & targets, 2003/04:
- Effective Maintenance: maintain at least 85% of the network in good condition
- Safer Travel: reduce the number of people killed and seriously injured on trunk roads by 694 to 4297 (compared with the average 4991)

47 Winter service Worked example

48 Contractor's process model for a MAC
[Diagram: hierarchy linking HA objectives, core processes, functional processes, procedures and tasks, with winter service as the example]

49 Winter service process
[Diagram: the winter service process hierarchy - why, how, when - with evidence propagated up the model]

50 Outcome measures of success
Current total number: 29
- 8 key performance measures
- 12 covered in quality plan
- 9 other measures
Reduced to 3 key measures:
- accident rate
- average journey time through system
- quantity of grit used (environmental constraint)
Plus conformance with plan, to fulfil public duty.

51 Benefit of outcome measures
- Simple measures common to many processes
- Moderated to be relevant to process context, e.g. MOORI (Met Office open road indicator)
- Benchmarked to achieve continuous improvement
- Empowers those doing the job to improve performance by:
  - better measurement of their process
  - continuous improvement
  - innovating

52 Conclusions
- Progress is on programme
- Strong evidence of need to improve measurement
- Key generic measures to be tested:
  - safer travel - KSI reduction
  - mobility - hours of congestion
  - customer satisfaction - local surveys
  - minimise adverse environmental impact
- Robust process model to enable context-based measurement at the MAC boundary
- Next stage: wider engagement - validation

53 Overall PeriMeta Conclusions
- Shared understanding of the state of the asset
  - across teams
  - up and down the organisation
  - with all stakeholders... simply
- Performance regime
  - health & vulnerability overview
- Sensitivity and value of information
- Decision recording
  - corporate memory, transparency, auditability

54

55 Terminology
[Diagram: uncertainty pervading models, prediction, monitoring, systems and information - hazards, environment, vulnerability, risk, decision, surprise, outcome]
- Uncertainty is a property of information - fuzziness, incompleteness and randomness
- Risk is the likelihood of an uncertain event or behaviour, and its consequences for our intended purpose or objectives, set in a context that needs to be understood
- Hazard - set of incubating preconditions for failure
- Vulnerability - susceptibility to disproportionate damage from an event or behaviour
- Surprise - an unexpected event - an unrecognised risk

56 The Nature of Uncertainty
- Fuzziness - imprecision of definition
- Incompleteness - that which we do not know, choose not to include or cannot afford to include
- Randomness - lack of a specific pattern

57 [Diagram: fuzziness, incompleteness and randomness]

58 Tools for Uncertainty Management
- Analogue studies
- Case-based reasoning
- Parametric studies
- Safety factors
- Monte Carlo simulation
- Bayesian reasoning
- Fuzzy methods
- Neural nets
- Genetic algorithms
- Evidential reasoning
- Process models
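As a flavour of one tool on this list, here is a minimal Monte Carlo simulation estimating a failure probability from an assumed load/capacity model. All distributions and parameters are invented for illustration:

```python
import random

def monte_carlo_failure_prob(trials: int = 100_000, seed: int = 1) -> float:
    """Estimate P(load > capacity) for assumed normal load and capacity."""
    rng = random.Random(seed)  # fixed seed for a repeatable estimate
    failures = 0
    for _ in range(trials):
        load = rng.gauss(100.0, 15.0)      # assumed load distribution
        capacity = rng.gauss(140.0, 20.0)  # assumed capacity distribution
        if load > capacity:
            failures += 1
    return failures / trials

p = monte_carlo_failure_prob()
print(p)  # roughly 0.055: load - capacity ~ N(-40, 25), so P(Z > 1.6)
```

Note the contrast with the rest of this talk: Monte Carlo handles randomness well but says nothing about incompleteness, which is exactly the gap the interval-evidence approach targets.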

59 "Just because past futures have resembled past pasts, it does not follow that future futures will resemble future pasts."
Bryan Magee, Popper (1973)

60 Understanding risks
[Diagram: 2x2 grid of occurrence (frequent/infrequent) against consequence (high/low) - predict from history; understand what we do not know; look out for change; priority action]
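The occurrence/consequence grid on this slide can be read as a simple lookup. Which response sits in which quadrant is reconstructed here, so treat the mapping as an assumption rather than the slide's definitive layout:

```python
def risk_response(occurrence: str, consequence: str) -> str:
    """Classify a risk by the 2x2 occurrence/consequence grid.
    The quadrant assignments below are an assumed reconstruction."""
    assert occurrence in ("frequent", "infrequent")
    assert consequence in ("high", "low")
    if consequence == "high":
        # Frequent high-consequence risks demand action now; rare ones
        # demand vigilance for changing conditions.
        return "priority action" if occurrence == "frequent" else "look out for change"
    # Frequent low-consequence risks leave a statistical record; rare
    # low-consequence risks sit in what we do not know.
    return "predict from history" if occurrence == "frequent" else "understand what we do not know"

print(risk_response("frequent", "high"))  # priority action
```

The grid makes the talk's point compactly: only one quadrant is well served by frequency data, which is why purely statistical risk tools cover so little of the picture.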

61 Vulnerability
- Now done in many areas manually
- Complexity of systems means that it is now unreliable to depend on unaided human identification
- Need a systems approach as opposed to a reductionist paradigm

62 Juniper
"There is a need for generating new processes for imagining imaginative outcomes."
Schneider, "The future of climate: potential for interaction and surprises", in Downing (ed.), Climate Change and World Food Security, Springer, Berlin (1996), p. 79

63 Risk Management Plans (KnowRisk - Australian RM software)
[Screenshot: uncertainty measurements, and a layout that gives a view on uncertainty]

64 QRA
- Limited scope - applicable in tightly constrained physical environments
- Needs a frequency database
- Makes bold assumptions
- Difficult to bring in soft systems
Yet we need to mix hard and soft...

65 HA Balanced Scorecard (7 Oct 02)
HA Vision: Safe Roads, Reliable Journeys, Informed Travellers - Travelling with Confidence
- Staff Perspective - how effective are we at managing, developing and motivating our workforce?
- Efficiency & Finance - how good are we at managing our financial resources?
- Customer Perspective - how effective are we at delivering customer requirements?
- Delivering in Partnership - do we work in synergy with our partners?
- Innovation & Learning - how good are we at preparing for the future and learning from the past?

