1
Systems, Thinking, Fast and Slow
Two systems, biases, heuristics
Systems Engineers and Stakeholders
The works of Daniel Kahneman
Given by Kevan Boll
Copyright © 2016 by Kevan Boll. Published and used by INCOSE UK Ltd and INCOSE with permission.
2
Aim
Explore what we can learn and apply from Kahneman's work
Share ideas, anecdotes, experiences
Discover any advances?
3
Content
Describe ideas from the book
Reflect on those ideas
Draw some learning points
Q&A: ideas, anecdotes, experiences, advances?
4
Quick Biography: Daniel Kahneman
Born March 5, 1934, an Israeli-American psychologist notable for his work on the psychology of judgment and decision-making, as well as behavioural economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences (shared with Vernon L. Smith). His empirical findings challenge the assumption of human rationality prevailing in modern economic theory. With Amos Tversky and others, Kahneman established a cognitive basis for common human errors that arise from heuristics and biases and developed Prospect Theory (Kahneman & Tversky, 1979). In 2011 he published Thinking, Fast and Slow, which summarizes much of his research and became a best seller. He is professor emeritus of psychology and public affairs at Princeton University's Woodrow Wilson School. Kahneman is a founding partner of TGG Group, a business and philanthropy consulting company. [Ref 1]
5
What is the book?
The culmination of years of joint study into judgement and decision making
A summation of his own work, and the work of others, into a coherent story
6
Where is his work aimed?
Serious self-criticism
Enriching the vocabulary
Informal controls?
Kahneman aimed the book at people who, through everyday dialogue, can begin to think about and critically judge the decisions they or their colleagues make. By providing a vocabulary, it is hoped that the dialogue can become more specific. Eventually the ideas, concepts and facts from Kahneman's work seep into the knowledge and self-awareness of people in organisations, and effectively become informal controls within the organisation.
7
[Ref 8]
8
First Impressions
What do you see? Can you predict a future from this?
What is she going to do next?
What is she going to be like to talk with?
You did not intend to assess her; you were not asked or prompted by anyone else. But you did it anyway.
9
2 x 2
10
You knew immediately that this was a multiplication problem
You knew the answer was 4, almost instantly (hopefully)
11
17 x 26
You knew immediately that this was a multiplication problem
You may even think that you could solve it
You may even have some idea of what the answer is going to be, or rather what it is not going to be: '143 or 7 billion'
But you could not say immediately that the answer is not 462
Have a go at completing the calculation
12
You knew immediately that this was a multiplication problem
You may even think that you could solve it
You may even have some idea of what the answer is going to be, or rather what it is not going to be: '143 or 7 billion'
But you could not say immediately that the answer is not 462
Have a go at completing the calculation
13
17 x 26 = 442
The answer is 442. What did you feel?
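For the System 2 among us, one way to grind it out (purely illustrative): 17 x 26 = (17 x 20) + (17 x 6) = 340 + 102 = 442.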
14
What just happened?
2 x 2 and the face: Fast, Automatic Thinking
17 x 26: Slow, Effortful Thinking
There were two kinds of thinking going on: fast and slow
15
The simple model of two systems
Fast, Automatic Thinking: System 1
Slow, Effortful Thinking (17 x 26): System 2
Kahneman uses a simple model to encapsulate the two modes of thinking. Psychologists have for years offered many labels for the two modes. For simplicity Kahneman uses System 1 and System 2. He uses this model throughout the book and treats the two systems as 'characters'. The two systems work together, and have done so successfully over our evolution. It is important to remember this is a model; all models are wrong but some are useful, and I think this one is useful.
16
The two systems work together
Fast, Automatic Thinking: System 1
Slow, Effortful Thinking: System 2
They have been working together and evolving for as long as humans have been around. We need them both, but Kahneman suggests we need to be careful.
17
Conflict between the two systems
But the two ways in which the systems work do cause conflict.
18
Exercise
19
System 1 and 2 in conflict
Scan down both columns below, calling out (or whispering to yourself) "upper" or "lower" to indicate whether each word is printed in lowercase or uppercase. Repeat, saying "left" or "right" depending on whether the word is to the left or right of centre.
LEFT left right RIGHT RIGHT left right upper lower LOWER UPPER
[Ref 8]
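If you want to reproduce the exercise for yourself, here is a minimal Python sketch (not from the slides; the word list and spacing are assumptions) that prints a column of randomly cased, randomly positioned words:

import random

WORDS = ["LEFT", "RIGHT", "left", "right", "UPPER", "LOWER", "upper", "lower"]

def print_column(rows=8, width=20):
    # Each row: a word in its printed case, pushed to the left or right of centre.
    for _ in range(rows):
        word = random.choice(WORDS)
        indent = random.choice([0, width])   # 0 = left of centre, width = right of centre
        print(" " * indent + word)

print_column()
# Task 1: call out "upper" or "lower" for the printed case of each word.
# Task 2: call out "left" or "right" for the printed position of each word.
# The conflict arises when the word itself names the other answer.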
20
System 1 and 2 in conflict
(The same columns as on the previous slide.)
What did you notice? You were probably successful in saying the correct words in both columns, and found that some tasks were easier than others. Identifying upper and lower case in the left-hand column was relatively easy compared with doing it in the right-hand column, which caused you to slow down and even stumble. When you named the position of the words, the left-hand column felt more difficult and the right-hand column easier.
Conflict between System 1 and System 2, and the coerciveness of System 1. [Ref 8]
21
Which horizontal line is shorter?
A variation of the Müller-Lyer illusion, an optical illusion devised by Franz Carl Müller-Lyer (1889)
22
Which horizontal line is shorter?
Even those of you who know they are the same length will find that your initial (System 1) impression is that they are not. A very simple visual illusion, yes, but is our System 1 doing this in other situations? Are we jumping to conclusions based on 'cognitive illusions'?
Conflict between System 1 and System 2, and the coerciveness of System 1.
A variation of the Müller-Lyer illusion, an optical illusion devised by Franz Carl Müller-Lyer (1889)
23
Take-away point
System 1 will go early
System 1 will be the more dominant
This will always be the case!
24
The Two Systems and their Traits
System 1: Fast; Automatic; Looks for causes; Believes, is gullible; A coherent story is good enough; Intuitive; Subject to biases and heuristics (optimism bias, planning fallacy, priming and anchoring effect, availability heuristic, substitution, framing, confirmation)
System 2: Effortful; Lazy; Self-aware; Deliberate; Logical; Serial; Sceptical
Notes:
Fast, automatic, seeks a coherent story, looks for causes – part of looking for coherence is finding a pattern of cause, e.g. he hurt his leg because he fell over while running along a road, therefore he must have tripped on something (he might not have tripped, but it seems to make enough sense).
Believes, is gullible – to determine whether something is right or true, it must first be believed and then tested. The trouble is, it does not always get tested.
Intuitive – the fireman who, in a room with smoke but few flames, ordered his men out; shortly afterwards the floor collapsed. At the time he did not know why he gave the order, except that he must have seen the same situation before and recognised the signs as coherent and consistent with the floor falling in. Perhaps read Blink by Malcolm Gladwell.
Subject to heuristics and biases – substitution, framing, confirmation.
25
System 1: A Machine for Jumping to Conclusions
Sometimes helpful: the fight-or-flight decision
Sometimes not
26
System 1 Believes what it sees
27
System 2 - Slow Thinking
28
System 2: slow, logical, methodical
29
System 2: Effortful and Lazy
Pupils dilate (compared with normal) when it is working hard
Uses energy
Gets tired
30
Heuristics and Biases
Through his own studies and his analysis of the work of others, Kahneman has identified that the human mind is subject to biases and heuristics.
31
Heuristics and Biases
Heuristic: a rule of thumb; quick and dirty
Bias: an inclination or prejudice; tends towards
We all use heuristics – the last time we went for a curry it cost about £50; the last project of a similar size took about six months.
Biases are less obvious – I don't like anchovies, therefore anything with anchovies in it is going to be bad.
32
System 1 Heuristics and Biases
Anchoring and Priming: setting something up in the mind influences subsequent thinking
Availability Bias: whatever is easy to think of or recall from memory
Substitution Heuristic: substituting a hard question with the answer to an easier one
Optimism Bias & Loss Aversion: misplaced confidence in having control and in the effects of risk; dislike of a loss is greater than the pleasure of an equivalent win
Framing: the context set for decision-making tends to influence the decision (tick the box to opt out)
Sunk Cost: the tendency to take bad decisions based on what has already been spent, and to refuse to cut our losses
Confirmation Bias: the tendency to favour information that confirms our beliefs; even those who avow complete and total open-mindedness are not immune
Kahneman's biases and heuristics are the things he has found to influence human thinking and decision-making, and he has conducted experiments to demonstrate them. For anchoring, Daniel Kahneman and Amos Tversky once rigged a wheel of fortune, just like you'd see on a game show. Though labelled with values from 0 to 100, it would only stop at 10 or 65. As an experiment, they had unknowing participants spin the wheel and then answer a two-part question: "Is the percentage of African nations among UN members larger or smaller than the number you just wrote? What is your best guess of the percentage of African nations in the UN?" "The spin of a wheel of fortune… cannot possibly yield any useful information about anything, and the participants… should have simply ignored it. But they did not ignore it." The participants who saw the number 10 on the wheel estimated the percentage of African nations in the UN at 25%, while those who saw 65 gave a much higher estimate, 45%. Participants' answers were anchored by the numbers they saw, and they didn't even realise it. Any piece of information, however inconsequential, can affect subsequent assessments or decisions. That's why it's in a car dealer's best interest to keep list prices high: ultimately they'll earn more money, and when you negotiate down, you'll still think you're getting a good deal.
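One simple way to picture the reported result is an "anchor and insufficient adjustment" model. This sketch is illustrative only: the unanchored belief of 34% and the adjustment factor of 0.64 are assumptions chosen so that the model reproduces the 25%/45% figures above, not values from Kahneman.

def anchored_estimate(anchor, unanchored_belief=34.0, adjustment=0.64):
    # Start at the anchor and adjust only part of the way towards what you
    # would have said with no anchor at all (insufficient adjustment).
    return anchor + adjustment * (unanchored_belief - anchor)

for anchor in (10, 65):
    print(anchor, round(anchored_estimate(anchor)))   # prints: 10 25, then 65 45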
33
System 1 - Heuristics and Biases: Anchoring and Priming
Priming – an experiment
Split the room in half
One half close their eyes whilst the other half looks at the screen now
34
Eat
35
Experiment: Priming
Split the room in half
One half close their eyes whilst the other half looks at the screen now
Now change sides
36
Wash
37
Note to self: tell them to open their eyes!
38
So_p
39
System 1 - Heuristics and Biases: Anchoring and Priming
System 1 associates what it sees with the recent past
The association makes sense, i.e. it is coherent
Therefore System 1 can confidently state: Soup or Soap (depending on whether it was primed by 'Eat' or 'Wash')
40
Availability Bias
41
System 1 - Heuristics and Biases: Availability Bias
If you have trained in tanks, work with people who work with tanks, and are managing tanks, then the answer to your next problem is going to be a tank
Some say "solutioneering"
42
Optimism Bias
43
System 1 - Heuristics and Biases: Optimism Bias - Planning Fallacy
44
Seen this before?
Major Projects Review: "£5 billion – the extent to which project teams may be underestimating the financial risks within project budgets, according to the Department's independent Cost Assurance and Analysis Service" [NAO MPR 2015, Ref 6]
Key problems: "an inability to take tough, timely decisions in the Defence interest, particularly those necessary to ensure financial control and an affordable Defence programme, reflecting the 'conspiracy of optimism' between industry, the military and officials…" [Defence Reform, Ref 7]
"The schedule has slipped. Only one of four planned pilots went ahead according to the original schedule, and this pilot was restricted to extremely simple cases." (£12.8Bn)
Some say "Can do". Some say "Can't decide".
45
Two Systems Working a Scenario
46
The Two Systems and Memory
System 1: Fast; Automatic; Looks for causes; Believes, is gullible; A coherent story is good enough; Intuitive; Subject to biases and heuristics (optimism bias, planning fallacy, priming and anchoring effect, availability heuristic, substitution, framing, confirmation)
System 2: Effortful; Lazy; Self-aware; Deliberate; Logical; Serial; Sceptical
Plus: Memory
47
The Two Systems Working
(Diagram: Problem Situation – System 1 – System 2)
48
The Two Systems and Memory
System 1 lets System 2 know about the problem. System 2 might decide that it needs to be involved: Is it serious? Does it have to? What's the benefit?
System 1 looks in Memory for something coherent, similar, that will just about do.
(Diagram: Problem Situation – System 1 – Memory – System 2)
49
System 1 Associative Machine – Coherence & Confidence
System 1 looks in Memory for something coherent, similar, that will just about do.
(Diagram: the Problem Situation offers items of Info to System 1, which associates them with further Info drawn from Memory; System 2 sits alongside)
50
System 1 Associative Machine – Coherence & Confidence
Information is limited.
(Diagram: as before, but with only a few items of Info available from the Problem Situation and from Memory)
51
System 1 Associative Machine – Coherence & Confidence
System 1 has confidence and forms a coherent story with ease, but it is not always accurate: "If it hangs together it must be true?"
System 2 need not get involved.
Subject to biases and heuristics: Is there sufficient information? Is the information associated from memory the right information, or has it been substituted for something similar?
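A minimal sketch in Python of the flow these slides describe, offered purely as an illustration: System 1 grabs the most coherent association from Memory, and a lazy System 2 re-examines it only when pressed. Every name, score and threshold here is an invented assumption, not something from Kahneman or the slides.

from dataclasses import dataclass

@dataclass
class Impression:
    answer: str
    coherence: float   # how well the story "hangs together" (0..1)
    accuracy: float    # how well it actually fits the facts (0..1), unseen by System 1

def system1(memory):
    # Fast: grab the most coherent-looking association; accuracy is never consulted.
    return max(memory, key=lambda imp: imp.coherence)

def system2(impression, stakes):
    # Slow and lazy: only re-examine the impression if the stakes demand it.
    if stakes < 0.7 and impression.coherence > 0.8:
        return impression.answer            # "need not get involved"
    # Effortful checking: reject low-accuracy stories.
    return impression.answer if impression.accuracy > 0.6 else "gather more information"

memory = [
    Impression("it's like the last project", coherence=0.9, accuracy=0.4),
    Impression("this one is genuinely novel", coherence=0.5, accuracy=0.9),
]
first_guess = system1(memory)
print(system2(first_guess, stakes=0.3))  # the coherent but inaccurate story slips through
print(system2(first_guess, stakes=0.9))  # System 2 engages and pushes back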
52
Overconfidence and WYSIATI
What You See Is All There Is
The human trait of making use of whatever data is available
Neglecting to check that it is even the minimum necessary
Neglecting the "known unknowns" and the "unknown unknowns"
53
System 1 Expertise – Coherence & Confidence
Information from the problem situation is limited, but the expert has more: well practised over years ('000s of hours) and reinforced by previous System 2 effort, so Memory supplies much more Info. Read 'Blink' by Malcolm Gladwell.
(Diagram: Problem Situation – System 1 – System 2 – Memory, with many more items of Info)
54
Two Systems Summary
55
Discussion Point – discussion with the floor
As Systems Engineers, how does this relate to us?
How do we understand problems, stakeholders and their System 1s and 2s?
How do we see and 'mental model' the problems?
Are our mental models vulnerable to the System 1 traits?
Are we aware? Can we be aware of the System 1 traits as we work?
57
Systems Engineers – the people
58
Waters Foundation Habits of a Systems Thinker
Waters Foundation® Habits of a Systems Thinker: systems thinking is described and defined differently by a variety of individuals and organizations. One way to identify systems thinking as a practice is to consider related habits of thinking and potential strategies to develop those habits. [Ref 2]
Do these 'habits' mitigate against System 1? Are they resonant with the two-systems model?
59
Systems Engineering Theory and Practice
60
System Praxis - Systems Thinking
“Systems Thinking binds the foundations, theories and representations of systems science together with the hard, soft and pragmatic approaches of systems practice. Systems thinking is the ongoing activity of assessing and appreciating the system context (the system relationships and environment, resolved around a selected system-of-interest [Ref 3]) and guiding appropriate adaptation, throughout the praxis cycle” [Ref 3]
61
SEBoK Systems Praxis
Does this represent the set of mitigations that systems engineering has against System 1 thinking?
Is the sheer volume of elements in this praxis an indicator of the strength of System 1? [Ref 3]
62
My thoughts on it
Stakeholders tend to exhibit System 1
Systems Engineers tend to exhibit more of System 2
We have processes, tools and techniques to help us be logical and objective, and to mitigate the traits of System 1
We are not infallible; we are still human. We (systems engineers) are still subject to System 1
Notes:
Stakeholders tend to exhibit System 1 – it is inevitable that stakeholders will exhibit System 1 traits. You are looking for issues and needs, and each stakeholder will express their own needs with their own drivers, which will not always be objective; they will have availability biases and affect heuristics, suffer loss aversion and the conspiracy of optimism, and the engagement itself will be relatively short, thus only enabling fast thinking.
Systems Engineers tend to exhibit more of System 2 – they balance holism and reductionism. Reductionism favours System 2, whereas System 1 is more resonant of holism; our brains cannot deal with more than 7±2 issues simultaneously.
We have processes, tools and techniques to help us be logical and objective and to mitigate the traits of System 1 – see the technical processes of ISO 15288:2015 and the INCOSE Handbook, SSM, VSM, architecture modelling, system dynamics, measurement, experimentation, etc.
We are not infallible, we are still human, and we are still subject to System 1 – we still see the world through the System 1 traits. We may have disciplined ourselves to what we think is an objective place but, as we have seen, System 1 is fast and dominant. We should also acknowledge that System 1 is the creative side: the very fact that it brings information together, from memory and other sources, to form a coherent whole suggests there is potential for new ideas, innovation and adaptation. To know this, to monitor it, even to enhance it, is perhaps a strength of the systems engineer who is aware of the systems of his own mind.
63
Problem Definition – Systems Engineers, Stakeholders and the Two Systems
64
Example - A lifecycle Stage
Most stakeholder activity: 6.4.2 Stakeholder Needs & Requirements Definition [Ref 9]
65
Needs/Problem Definition – The Systems Engineer looks at the problem
(Diagram: Problem Situation – System 1 – System 2 – Memory)
The SE is seeing the problem via his System 1, with all of its fallibilities: fast, automatic, seeks a coherent story, intuitive (recall the fireman who, in a room with smoke but few flames, ordered his men out shortly before the floor collapsed; at the time he did not know why he gave the order, except that he must have seen the same situation before and recognised the signs as coherent with the floor falling in), and subject to heuristics and biases.
66
Need/Problem Definition – An SE looks at the problem through a stakeholder
(Diagram: Problem Situation – the Stakeholder's System 1, System 2 and Memory – the Systems Engineer's System 1, System 2 and Memory)
The SE is seeing the problem via the stakeholder's System 1 and his own.
67
A view through two System 1s
What are the consequences? What can be done to mitigate them?
68
What are the Consequences?
69
What can be done to Mitigate?
70
Other Parts of the Lifecycle?
Anywhere there are people. What about Decision Management?
Clearly the decision management processes in the lifecycle will be affected by what Kahneman has talked about. What can be done? Should more thought be put into the planning, preparation and methods used in decision-making? How do biases and heuristics get engineered out? How is System 2 thinking brought to the fore? [Ref 9]
71
Summary
72
Summary Our brains work in a particular way
Models of ‘System 1’ and ‘System 2’ are helpful
The heuristics and biases affect everyone, some less than others (Systems Engineers?)
73
What do we do with this knowledge?
We are aware of the phenomena
We have the language: System 1 and System 2; biases, heuristics, fallacies
Look out for System 1 thinking in your stakeholders, your colleagues and yourselves
74
Other related books to read
The Fifth Discipline – Peter Senge
The Black Swan – Nassim Nicholas Taleb
Blink – Malcolm Gladwell
The Chimp Paradox – Dr Steve Peters
75
Thank you
76
77
References
Ref 1: Thinking, Fast and Slow, Kahneman D. Retrieved 1 Jun 16.
Ref 2: Habits of a Systems Thinker (2015), Systems Thinking in Education. Retrieved 1 Jun 2016.
Ref 3: Foundations of Systems Engineering (2016, March 25), in BKCASE Editorial Board, Guide to the Systems Engineering Body of Knowledge (SEBoK), version 1.6, R.D. Adcock (EIC), Hoboken, NJ: The Trustees of the Stevens Institute of Technology, 2016. Retrieved 1 Jun 2016. BKCASE is managed and maintained by the Stevens Institute of Technology Systems Engineering Research Center, the International Council on Systems Engineering, and the Institute of Electrical and Electronics Engineers Computer Society.
Ref 4: System Context (glossary) (2016, March 25), in BKCASE Editorial Board, Guide to the Systems Engineering Body of Knowledge (SEBoK), version 1.6, R.D. Adcock (EIC), Hoboken, NJ: The Trustees of the Stevens Institute of Technology, 2016. Retrieved 1 Jun 2016.
Ref 5: Attribute Substitution in Systems Engineering, Smith D., Bahill T.A., Systems Engineering 13(2), January 2009. Retrieved 1 Jun 16.
Ref 6: National Audit Office Major Projects Report 2015. Retrieved 1 Jun 16.
Ref 7: Defence Reform, Levene. Retrieved 1 Jun 16.
Ref 8: Thinking, Fast and Slow, Kahneman D, 2011, Farrar, Straus and Giroux.
Ref 9: ISO 15288:2015, Systems and software engineering — System life cycle processes, International Organization for Standardization, Geneva, 2015.