
1 Varieties of System Dynamics Practice Alan K. Graham, PhD Presented at the International System Dynamics Conference in Washington, D.C. July 25 – 29, 2011

2 What is system dynamics? What is not system dynamics? What is it that we have in common?
Defined by foundations?
–The theory of information-feedback systems
–A knowledge of decision-making processes
–The experimental model approach to complex systems
–The digital computer as a means to simulate realistic mathematical models
-> Broad and indecisive
Defined by current practices?
–10 steps
–25 validation categories
–33 questions model users should ask
–982 pages of just one of many textbooks
-> Whose practices? Complex and unchangeable
Shouldn't the definition include purpose: the uses for which SD models are intended?

3 A simple dynamic hypothesis to account for non-explosive growth in a field
[Causal-loop diagram. Variables: complexity of SD presentation (10 steps, 25 validation categories, 33 questions model users should ask, 982 pages of textbook); pressure on instructors to cover material quickly; effectiveness and thoroughness in teaching SD; number of qualified SD learners; number and capability level of SD practitioners; exposure of potential client users to SD at university; execution of applied SD studies; ability to explain and differentiate SD to potential client users; body of real applications; known precedents relevant to potential client users. Growth in practitioners reinforces applied studies, real applications, and exposure, while complexity of the SD presentation pressures instructors, erodes teaching effectiveness, and restrains growth.]
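To make the loop above concrete, here is a minimal Python sketch of how such a hypothesis could be expressed as a stock-and-flow simulation. The structure, variable names (practitioners, complexity), and parameter values are all illustrative assumptions, not Graham's actual model.

```python
# Minimal stock-and-flow sketch of the growth-limiting hypothesis.
# All structure and parameters are illustrative assumptions.

DT = 0.25             # time step, years
STEPS = int(40 / DT)  # simulate 40 years

practitioners = 10.0  # stock: qualified SD practitioners
complexity = 1.0      # stock: perceived complexity of SD presentation

for _ in range(STEPS):
    # Reinforcing loop: practitioners execute applied studies,
    # exposing potential users and recruiting new learners.
    recruitment = 0.10 * practitioners / complexity
    # Balancing loop: accumulated material raises complexity,
    # pressuring instructors and eroding teaching effectiveness.
    complexity += 0.002 * practitioners * DT
    practitioners += recruitment * DT

print(f"Practitioners after 40 years: {practitioners:.0f}")
```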

4 A model of SD modeling processes to show variations
[Seven-step process diagram, reconstructed from the slide:]
Steps: 1 Problem; 2 System; 3 Recommendations; 4 Validation of problem; 5 Validation of system (structure, behavior); 6 Validation of recommendations (by modelers); 7 Validation of recommendations (by experts).
Getting facts: qualitative information-gathering (steps 1-3); quantitative (and additional qualitative) information-gathering (steps 4-7).
Creating hypotheses: client issues and needs; modeling purpose and scope; causal diagram (qualitative model); preliminary recommendations, scope & focus; quantitative model structure; quantitative model behavior; quantitative technical impact analysis; policy impact scoring by experts; analysis, review and challenge by experts; recommendations.
Format of hypotheses: analysis and use requirements, plus a diagram of what the model will and won't do; block diagram and causal diagram; causal tracing and scoring, with interpretation; equations and parameters; modelers' rough expectations in testing and comparison of simulated to observed behavior*; rough and expert expectations for behavior in policy testing.
Typical validation tests:
–Problem: kickoff meetings with stakeholders validate the purpose of modeling*
–System structure: boundary adequacy; structure assessment (consistency with known facts; level of aggregation consistent with purpose & facts; conservation laws represented; decisions mappable to specific actors or groups?); dimensional consistency; parameters have real-world counterparts & values; response to extreme conditions; expert review of structure & key assumptions
–System behavior: calibration; input/output; extreme conditions; behavior sensitivity; challenge by modelers and experts* of behavioral hypotheses (the "model of the model")
–Recommendations (by modelers): system improvement; policy combination; policy sensitivity; challenge of improvement hypotheses (model of policy impact)*; fit-constrained parameter Monte Carlo test of improvement*
–Recommendations (by experts): expert review of analysis summary and model of the model*
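As a concrete illustration of one test from the diagram, "comparison of simulated to observed behavior", here is a sketch scored with mean absolute percent error (MAPE). The one-line growth model and the observed series are invented placeholders, not data from any real study.

```python
# Sketch: "comparison of simulated to observed behavior", scored with
# mean absolute percent error (MAPE). Model and data are placeholders.

def simulate(growth_rate, initial, n_steps):
    """Toy first-order growth model standing in for a full SD model."""
    x, out = initial, []
    for _ in range(n_steps):
        x += growth_rate * x
        out.append(x)
    return out

observed = [105, 113, 118, 127, 134, 145, 152, 160]   # hypothetical data
simulated = simulate(growth_rate=0.06, initial=100.0, n_steps=len(observed))

mape = sum(abs(s - o) / o for s, o in zip(simulated, observed)) / len(observed)
print(f"MAPE: {mape:.1%}")   # a modeler might flag fits worse than ~5-10%
```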

5 Suppose we don't focus on specific hypothesis formats and tests…
[The slide 4 diagram with the "format of hypotheses" and "typical validation tests" rows removed, leaving the seven steps, the two information-gathering activities, and the chain of hypotheses from client issues and needs through modeling purpose and scope, the causal diagram (qualitative model), preliminary recommendations, quantitative model structure and behavior, quantitative technical impact analysis, policy impact scoring and review by experts, to recommendations.]

6 … and don't worry too much about specific steps, and moreover aggregate system (structure + behavior) and recommendations (modelers + experts). Then…
[The diagram reduces to getting facts and creating hypotheses about the problem, the system, and the recommendations.]

7 …even complex modeling can be described as testing only three kinds of hypothesis:
In system dynamics, we test hypotheses about only three things:
–The problem to be addressed
–The system it happens in
–The recommendations to address the problem

8 One format for testing the first type of hypothesis: our understanding of the problems to be addressed

9 We test our understanding of the second hypothesis, the relevant system, mostly by standard SD tests. For example:
–Boundary adequacy
–Extreme conditions
–Behavior reproduction
–Behavior sensitivity (sketched in code below)
For quantitative systems thinking (more about this shortly), the model is a causal diagram with qualitative or quantitative characterizations of each link. Tests are review and discussion with authoritatively experienced informants:
–Does the voice-over description of a given link adequately describe what goes on in real life?
–Do the characterizations (scoring, time delays) make sense alone and in comparison to characterizations of other links?
–(These are ALWAYS done on buildups, not the whole diagram at once)
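Here is a sketch of the behavior sensitivity test mentioned above, using a toy second-order stock-adjustment model. Everything in it (structure, parameter range, the choice of peak timing as the behavior of interest) is an invented illustration: rerun the model across a plausible parameter range and check whether the qualitative behavior is robust.

```python
# Sketch of a behavior sensitivity test: rerun a toy model across a
# plausible parameter range and check whether the qualitative behavior
# (here, the timing of the overshoot peak) is robust to the uncertainty.

def toy_overshoot(adjust_time):
    """Toy second-order stock-adjustment model; returns time of peak."""
    stock, rate, target, dt = 0.0, 0.0, 100.0, 0.1
    peak_t, peak_v = 0.0, 0.0
    for step in range(1000):
        rate += ((target - stock) / adjust_time - 0.1 * rate) * dt
        stock += rate * dt
        if stock > peak_v:
            peak_v, peak_t = stock, step * dt
    return peak_t

for at in (2.0, 4.0, 8.0):    # plausible range for the adjustment delay
    print(f"adjustment time {at}: peak at t = {toy_overshoot(at):.1f}")
```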

10 The third hypothesis is that we understand what recommendations are needed to improve the system's performance.
Some testing of policies is done by modelers as part of policy design:
–Policy sensitivity analysis
–Extreme conditions / scenario analysis
Some testing is sometimes done of the modelers' understanding of why a given set of policies is effective (sketched in code below):
–Starts with a simpler diagram of how policies create their good effects
–Implication: severing links on that diagram should reduce the effectiveness of policies
–Simulation test: does severing those links in the model reduce the effectiveness of policies?
Some testing of policies is done by subject matter experts:
–Is the explanation of policy impact consistent with their knowledge of cause and effect in the system?
–Are the changes in system behavior plausible?
I sometimes hear the argument that separate testing of policy outcomes is unnecessary; that if the model structure and behavior have been validated, the policy conclusions should be correct. But there's no such thing as perfect validation. Given the inevitable time and resource constraints, policy testing is a way of focusing time and resources on what matters most to the usefulness of the effort.
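The link-severing test described above can be sketched in simulation. In this invented toy model, a policy is hypothesized to work by funding training, which compounds skill; severing that link should erase the simulated policy benefit.

```python
# Sketch of the link-severing test: if the policy story says the policy
# works through the training -> skill link, cutting that link in the
# model should erase most of the simulated policy benefit.

def run(policy_on, link_intact):
    """Toy model: a policy funds training; training compounds skill."""
    output, skill, dt = 0.0, 1.0, 0.25
    for _ in range(200):
        training = 0.05 if (policy_on and link_intact) else 0.0
        skill += training * skill * dt     # the hypothesized causal link
        output += skill * dt               # cumulative performance
    return output

base = run(policy_on=False, link_intact=True)
policy = run(policy_on=True, link_intact=True)
severed = run(policy_on=True, link_intact=False)

print(f"Policy gain, link intact:  {policy - base:.0f}")
print(f"Policy gain, link severed: {severed - base:.0f}")   # expect ~0
```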

11 If we want to address the where, when, and how aspects of system dynamics, we can add one or two bookends to the three-hypotheses description:
System dynamics is especially useful in complex problems where causes and effects are intertwined, the implications of actions are unclear, and the concern is behavior over time, often aiming to improve future behavior. To do that reliably, we reality-check that we understand three things:
–The problem(s) to be addressed
–The system they happen in
–The recommendations to address the problem(s)
We usually use computer simulation to check the second and third hypotheses against both numerical data and expert knowledge of the system. Our hypotheses draw on a wide body of research about dynamic behavior (from feedback control theory) and about decision-making in many spheres of human activity, and in particular, on the extant body of system dynamics research.

12 With that definition in hand, let us examine varieties of system dynamics practice, and some activities we'd probably say aren't real SD
[Two-axis chart. Horizontal axis: scope of purpose, from narrow (a single relationship) to broad (internal detail, value chain, other organizations, etc.). Vertical axis: scope of validation, from low (little information used) to high (follows the scientific method extensively). At low validation sit "just using the software", exercise models, and unsupported systems thinking; the four varieties of SD practice (quantitative ST, classic SD, industrial strength SD, legal strength SD) occupy the higher-validation region.]

13 Varieties of SD practice differ in prominent ways. These are typical:
–Quantitative systems thinking (QST): explicit problem; many dynamic hypotheses, with uncertainty; information from expert cause & effect knowledge with scoring; system & recommendations testing by sensitivity testing and focused expert review.
–Classic SD: implicit problem; one dynamic hypothesis; mostly cause & effect information, little quantitative; testing within the modeler.
–Industrial strength SD: explicit problem; multiple competing dynamic hypotheses; quantitative & expert cause & effect knowledge; focused testing with experts.
–Legal strength SD: problem as impact quantification; multiple (adversarial) theories of the case; quantitative & expert cause & effect knowledge; extensive testing with experts, confidence bounding, and third-party review.
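To illustrate what the confidence bounding in the legal-strength row might involve, here is a sketch that Monte Carlo samples uncertain parameters and reports percentile bounds on a toy policy-impact estimate; the model and parameter ranges are invented for illustration.

```python
# Sketch of confidence bounding on a policy-impact estimate: sample
# uncertain parameters, re-run a toy model, report percentile bounds.
# Parameter ranges and the model itself are invented placeholders.

import random

def impact(effectiveness, delay):
    """Toy impact model: benefit builds toward a cap, slowed by delay."""
    benefit = 0.0
    for _ in range(40):                      # 40 periods
        benefit += effectiveness * (1 - benefit) / delay
    return benefit

random.seed(42)
samples = sorted(
    impact(random.uniform(0.5, 1.5), random.uniform(2.0, 6.0))
    for _ in range(1000)
)
lo, hi = samples[25], samples[974]           # ~95% interval
print(f"Impact estimate: {samples[500]:.2f} (95% bounds {lo:.2f}-{hi:.2f})")
```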

14 There are two contributions here. First, for modelers, there is a taxonomy of SD practice that describes what is otherwise quite complex and detailed…
[Repeats the detailed seven-step modeling-process and validation-test diagram from slide 4.]

15 …in terms of a simpler menu of varieties of system dynamics
[The slide 12 chart again, now showing only the four varieties: quantitative ST, classic SD, industrial strength SD, and legal strength SD.]

16 Second, for both modelers and non-modelers, there are simple yet properly encompassing definitions of system dynamics, e.g.:
System dynamics addresses complex, intertwined issues. To do that reliably, we reality-check that we understand three things:
–The problem(s) to be addressed
–The system they happen in
–The recommendations to address the problem(s)
We use computer simulation and lots of information and data about how people and organizations interact to do the reality-checks.
This is a framework that is useful with nearly anyone. To name a few we'd like to communicate more effectively with: fellow academics, potential clients, students, parents, and spouses.

17 Questions, comments, expansions?
Alan Graham
Greenwood Strategic Advisors AG
Zugerstrasse 40
CH-6314 Unterägeri, ZG
Switzerland
Alan.Graham@Greenwood-AG.com
Office Tel.: +41 41 754 7447
Office Fax: +41 41 754 7448
Home Office: +1 781 862 0866
Mobile: +1 617 803 6757
www.greenwood-ag.com

