Vulnerability of Human Organizations: Models & Methodological Synthesis. Adam Maria Gadomski.




1 Vulnerability of Human Organizations: Models & Methodological Synthesis. Adam Maria Gadomski, http://erg4146.casaccia.enea.it, 2 December 2005. Part III.

2 Presentation outline
- Recalls
- Top definitions: top-intuitive, action-reaction frames
- Models: key properties, model network
- Vulnerability (weak points): observability
- Methodology for problem investigation
- Strategies: soft and hard improvements
- Intelligent Infrastructure Network
- Conclusions

3 Objective of the work
This research is focused on the elaboration of models for computer simulation and what-if analysis of the vulnerability of human organizations. We focus on the vulnerability of h-organizations caused by the vulnerability of decision-making (D-M) processes: numerous studies lead to the conclusion that the main critical aspect of organizational vulnerability is decision-making (Part I). Decreasing vulnerability should reduce the possibility of losses caused by organizational decisional errors in situations of crisis and emergency. (Figure: the environment and the decisional processes network: individual D-M, organizational D-M, intra-organizational D-M, autonomous D-M.)

4 Main Concepts Definitions: Emergency, Crisis, Vulnerability. SoA: Human-Organizations Vulnerability (HOV)
Synthetic definition of vulnerability: a lack of immunity, or insufficient resistance, to unexpected but possible events.
Crisis: a complex situation or phenomenon in which routine management is no longer efficient. A crisis creates a system with unknown functionality [Addis, 1990] and unknown behavior. Crises can appear at various organizational levels; in an extreme crisis situation, routine control/management is no longer possible. Using a model-based interpretation: a crisis occurs when the model applied for the management is no longer adequate to the real organization's structures and processes.

5 Main Concepts Definitions: how these concepts look from the organization's perspective (HOV)
Emergency: focused on unacceptable levels of risk and loss generation caused by abnormal events; immediate interventions are, or have to be, performed during the whole emergency state. Crisis states usually activate, sooner or later, an emergency state [Gadomski, 1990].
Vulnerability is a readiness for a crisis state. We distinguish two basic types of vulnerability:
A. Vulnerability to external events: dangerous situations, attacks, intrusions; human-based, natural, technological, and market threats.
B. Vulnerability to internal events: internal crises, pathologies, improper reorganization.
The efficacy of the organization in the realization of its mission is considered its top attribute.

6 Visibility of Vulnerability: the action-reaction framework. A hypothetical qualitative curve of the efficacy of an organization over its lifecycle. Lifecycle phases: foundation, self-organization, proper activity, re-organization, proper activity. (Figure: efficacy Ef_i versus time, with the domain of interest marked.) Now we may talk about the links between vulnerability, crisis, and emergency.

7 Visibility of Vulnerability: the action-reaction framework. We have different levels of emergency; they depend on the nature of the emergency and of the risks. Emergency level i implies a necessary critical efficacy level i, Ef_i. We may distinguish the following three necessary critical efficacy levels:
- survival efficacy, Ef_0;
- emergency-critical efficacy, Ef_1;
- routine-critical efficacy, Ef_2 (enables bureaucratic functioning).
If the organization's efficacy ef < Ef_1, then the organization is vulnerable. (Figure: efficacy over time across the proper-activity and re-organization phases, showing the crisis, vulnerability, and pathological-organization regions relative to Ef_0, Ef_1, Ef_2.)
Modelling problems: the definition of metrics, and the measurement and estimation/assessment of ef(t), Ef_2, Ef_0, for t ∈ T. For this we need models.
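The threshold rule above can be sketched in code. A minimal illustration, assuming hypothetical numeric values for Ef_0, Ef_1, Ef_2 and a made-up efficacy trajectory; none of these numbers come from the presentation:

```python
# Classify an efficacy value ef against the slide's three critical levels.
# Thresholds and the sample trajectory are illustrative assumptions.

def classify_state(ef, ef0, ef1, ef2):
    """Map an efficacy value onto the qualitative bands of the slide."""
    if ef < ef0:
        return "collapse"          # below survival efficacy Ef_0
    if ef < ef1:
        return "vulnerable"        # slide's rule: ef < Ef_1 means vulnerable
    if ef < ef2:
        return "degraded-routine"  # above emergency-critical, below routine-critical
    return "proper-activity"

# Illustrative trajectory across a re-organization phase
trajectory = [0.9, 0.7, 0.45, 0.2, 0.5, 0.8]
states = [classify_state(ef, ef0=0.25, ef1=0.5, ef2=0.75) for ef in trajectory]
print(states)
```

Estimating ef(t) and the thresholds themselves is exactly the open modelling problem the slide names; the classifier only shows how the bands relate once those values exist.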

8 Vulnerability Identification Methodology: vulnerable objects and relations (TOGA methodology). How to model?
1. Identify the main organization observables (attributes).
2. Build models as an aggregate of networks (start from the recognition of objects, relations, and changes).
Then recognize:
1. critical functionalities (in the function network);
2. critical points (in the system network);
3. critical processes (in the process network);
4. dynamics (the propagation of vulnerability).

9 Existing approaches to HOV modeling
We distinguish three main types of modeling approaches in the state of the art (SoA):
1. Soft modeling: descriptive, partial, and intuitive; oriented toward the human user.
2. Hard mathematico-physical modeling: partial; continuous processes; difficulties with measurement; idealistic; for illustrative simulations; computer-oriented numerical and logical calculations (for example, Operations Research).
3. Flexible socio-cognitive modeling: computational; real-world conditions; systemic; AI techniques; external and internal observers; for simulation and decision support; interdisciplinary and human-computer oriented; under development.
Our domain of interest is the third approach. We need a formal theory.

10 Human-Organization Theory (HOT): Modeling Framework. HOT (a sub-theory of TOGA)
We need completeness and utility: HOT is a real-world theory, which means it has to be complete at the level of generality of a real-world description in order to fulfill utility requirements. Remark: every theory is knowledge. Most generally and formally speaking: let U* denote an infinite set of states of the real world W, and M_X denote a complete model of W. Then Th_Y is a real-world theory if there exists a set of states U in the attribute space Y, related to the goal X(A) of the model M_X, such that Th_Y(W) ⊆ M_X(U), where U ⊆ U* and n_A = n_Y.
Examples of a complete description of W: M1 = {A, B}, where A denotes all material objects and B all purely energetic objects. M2 = {A, B, C, D}, where A are all humans, B all their interactions, C the other components of W, and D the other interactions.

11 Methodological socio-cognitive framework: TOGA
TOGA will be introduced successively, according to the current needs. We will use TOGA's axioms, terminology, generic systemic computational models, and methodology. The acronym: Top-down (problem recognition & specification), Object-based (a fundamental conceptualization), Goal-oriented (problem recognition & specification) Approach.

12 HOT's Top-Ontology: first comprehension level. A modeler (M) perspective, valid for every problem.
Human organization Σ, interactions R, environment Ω: the triple (Σ, R, Ω), with a foundation goal Γ. A human organization is an artificial system which includes human components.
We use the G-S interrelation theorem: every component of the triple (Σ, R, Ω) is decomposable, i.e. there exist Σ1, Σ2, …, R1, …, Ω1, Ω2, … which are functionally, processually, and structurally connected components of Σ, R, and Ω respectively. Proof: it results from the TOGA axioms.

13 HOT Top-Ontology: definition of vulnerability
Ontology elements: human organization Σ; environment Ω; interactions R; domain of activity Δ; states S; foundation goal Γ; vulnerability to X, v.
Vulnerability v(Σ, X) is an attribute of Σ: there exists a class of states S(Σ, Δ) which may produce losses for Σ in the case of R|X, where X denotes a specific class of interactions R characterized by an unaccepted risk. Possible h-organization worlds include domains of activity Δ (abstract objects) with: a goal domain, a cooperation domain, and an intervention domain.

14 HOT Top-Ontology: identification of vulnerability
(Ontology elements as above, plus risk r and an observation time interval T.)
Identification of the vulnerability requires an identification of the objects and relations involved: v(Σ, X) ⇒ W(Σ, Δ, S(Σ, Δ), R|X), where W denotes the world of the problem. On the other hand, identification of v(Σ, X) is necessary for analyzing and reducing it. Therefore we need a problem-independent framework for a generic world of the problem: W(Σ, Δ, S(Σ, Δ), R|X(r, ·), T). Such a model has to be decomposed successively and should make it possible to observe and simulate the pathologies of organizations which lead to erroneous organizational decisions.
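One way to make the generic problem world concrete is a plain record type. This is a sketch under stated assumptions: the field names, the `is_vulnerable` test, and the example instance are illustrative inventions, not TOGA definitions:

```python
# Illustrative container for the components of the generic problem world
# W(organization, domain of activity, states, risky interactions, interval).
from dataclasses import dataclass, field

@dataclass
class ProblemWorld:
    organization: str                   # the human organization under study
    domain_of_activity: str             # where the organization acts
    states: list = field(default_factory=list)              # loss-producing states
    risky_interactions: list = field(default_factory=list)  # R|X, each tagged with a risk r
    observation_interval: tuple = (0, 0)                    # observation time interval T

    def is_vulnerable(self):
        """Crude reading of v(·, X): some observed state can produce losses under R|X."""
        return bool(self.states) and bool(self.risky_interactions)

w = ProblemWorld("emergency-service", "urban-flood-response",
                 states=["overloaded-dispatch"],
                 risky_interactions=[("flood-wave", 0.6)],
                 observation_interval=(0, 48))
print(w.is_vulnerable())
```

A real model would, as the slide says, decompose each field further and attach observable attributes; the record only fixes which components belong together.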

15 HOT Top-Ontology: identification of the problem world
W(Σ, Δ, S(Σ, Δ), R|X(r, ·)) … (*) is a carrier of organizational decision-making (ODM) processes. ODM is constructed on different levels of the h-organization. We need to identify a set of observable/measurable attributes (AW) which is common to the model of W and to ODM: the common space AW. In this case, a modification of ODM will change W and will lead to changes of v(Σ, X). In order to find AW, the components of the W model (*) have to be decomposed and individually modeled. Some separate models of the W-model components exist in the subject-matter literature.

16 HOT Top-Ontology: models of the components of the W-model
We have many specific models of organizations, of their domains of activity, of risky and loss-generating events (emergency, crisis, …), and of managerial decisional mechanisms, but these models have numerous different goals and conceptualizations (ontologies) and are not integrated or ordered for vulnerability modeling. Nevertheless, some critical relations between W-model components are recognized. The main ones are: ODM and the organization structure; individual risk, organization risks, and ODM; event types and ODM constraints.

17 Social Factors: Decomposition of the Domain
Identification of social factors requires decomposition of the organization's environment. (Figure: the organization world decomposed into objects Δ1, …, Δn and the relations intervention, cooperation, and dependence.)
Social factors:
a. development/lifecycle phase: new, old, …
b. structural constraints
c. preparedness: proper exercises
d. political influences
e. technological communication infrastructures

18 Cognitive Factors: Decomposition of an Organization
An organizational unit decomposes into human units (intelligent agents) and technological support units.
Cognitive organizational factors:
i. individual motivations
ii. accepted risk
iii. individual power and autonomy
iv. individual recognition
Critical relation: ODM (decision-making) and the organizational structure.

19 Critical relations: intelligent object and decision-making
The organization is seen as an abstract intelligent agent and an embedded complex object. Here, new cognitive, AI, and socio-cognitive perspectives are involved. (Systemic approach: information enters decision-making and produces a response; managerial IPK resources are structured according to the role network.)

20 Intelligent Agent Decomposition: IPK Paradigms (Abstract Intelligent Object Modelling, TOGA, 1993)
Information is processed by Knowledge: I' = K_j(I), j = 1, …, N, where the choice of j depends on Preferences.
- Information: how the situation looks; past/present/future states of the Domain of Activity (D-o-A).
- Preferences: a partial ordering of possible states of the D-o-A; they determine what is more important.
- Knowledge: what the agent is able to associate (descriptive/model knowledge: rules, models) and what the agent is able to do in the D-o-A (operational knowledge).
(Figure: the elementary IPK computational model, a "mind cell" acting on its domain of activity.)
Copyright High-Intelligence & Decision Research Group, CAMO, ENEA, http://erg4146.casaccia.enea.it, Adam M. Gadomski, 23/06/2005.
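The rule I' = K_j(I), with Preferences choosing j, can be sketched as follows. The guard functions, preference scores, and the emergency-flavored example are illustrative assumptions, not part of the IPK definition:

```python
# Elementary IPK step: Preferences select which Knowledge unit K_j
# transforms the Information I into I'.

def ipk_step(information, knowledge, preferences):
    """Apply the most-preferred applicable knowledge unit to the information."""
    applicable = [name for name, k in knowledge.items() if k["applies"](information)]
    if not applicable:
        return information  # no K_j applies: the information passes through unchanged
    j = max(applicable, key=lambda name: preferences.get(name, 0))
    return knowledge[j]["transform"](information)

knowledge = {
    "escalate": {"applies": lambda i: i["risk"] > 0.7,
                 "transform": lambda i: {**i, "action": "declare-emergency"}},
    "monitor":  {"applies": lambda i: i["risk"] > 0.2,
                 "transform": lambda i: {**i, "action": "increase-monitoring"}},
}
preferences = {"escalate": 2, "monitor": 1}  # partial ordering of importance

print(ipk_step({"risk": 0.8}, knowledge, preferences))
```

The separation matters: changing only the `preferences` dictionary changes which knowledge fires, without touching the knowledge itself, which is exactly the decomposition the "mind cell" model argues for.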

21 IPK Cognitive Architecture (figure).

22 IPK: Cooperative Intelligent Objects. Example [Balducelli, Gadomski, 1993]. (Figure: agents 1…N over a real emergency domain, each with I, P, and K systems, connected through an infrastructure network and an agent manager. I: information system; P: preferences system; K: knowledge system.)

23 IPK Bases: an example (figure).

24 Component Errors Modelling
Human errors:
- improper or insufficient Information;
- a lacking or improper importance scale (Preferences, risk assessment);
- improper or insufficient instructions and procedures (Knowledge);
- wrong cognitive and organizational factors (motivations).
Models are knowledge; problem specifications are requested and modified information; motivations create proper preferences, which activate adequate knowledge.
Basic modelling framework: the IPK cognitive computational model (Information, Preferences, Knowledge). Application: I2 = K_i(I1), where K_i = P{K} (the preferences P select the knowledge unit K_i from the knowledge base {K}).
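To illustrate the error classes above, here is a toy chooser in which a defect in Information, Preferences, or Knowledge each corrupts the outcome in its own way. The flood scenario, rule names, and thresholds are invented for the example:

```python
# Tiny IPK-style chooser used to show how each component error class
# (bad I, bad P, missing K) yields a different wrong decision.

def decide(info, preferences, knowledge):
    """Return the action of the most-preferred knowledge rule that fires on info."""
    fired = [name for name, (guard, _) in knowledge.items() if guard(info)]
    if not fired:
        return "no-response"
    best = max(fired, key=lambda n: preferences.get(n, 0))
    return knowledge[best][1]

knowledge = {
    "evacuate": (lambda i: i["water_level"] >= 2.0, "evacuate-zone"),
    "warn":     (lambda i: i["water_level"] >= 1.0, "issue-warning"),
}
good_prefs = {"evacuate": 2, "warn": 1}

baseline = decide({"water_level": 2.5}, good_prefs, knowledge)                  # correct behavior
info_err = decide({"water_level": 0.5}, good_prefs, knowledge)                  # bad Information: sensor under-reads
pref_err = decide({"water_level": 2.5}, {"evacuate": 0, "warn": 9}, knowledge)  # bad Preferences: wrong importance scale
know_err = decide({"water_level": 2.5}, good_prefs, {})                         # missing Knowledge: no procedures at all
print(baseline, info_err, pref_err, know_err)
```

Each injected defect produces a distinct pathology (silence, under-reaction, or paralysis), which is why the slide insists on modelling the three components separately.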

25 SOCIO-COGNITIVE ENGINEERING: an Intelligent Organization (TOGA theory framework)
An intelligent organization Σ is specified by: a set of roles; a structure; decisional mechanisms (ODM); and resources/means, such as the information network: Σ = (roles, structure, ODM, resources). All of them can be a cause of vulnerability v(Σ, X). (General functional frame: the organization realizes its mission/foundation goal through products/actions, under unexpected events.)

26 Components: Universal Management Paradigm (UMP)
Fig. 1. H-Organization: a graphical illustration of the Universal Management Paradigm (UMP), the cooperating-manager environment from the subjective perspective of a pre-selected decision-making manager [4]. UMP includes 6 canonical roles and their interrelations: the MANAGER (with the same relative internal structure at every level), a SUPERVISOR/COORDINATOR (tasks, information), COOPERATING MANAGERs (cooperation), ADVISORs (expertise), INFORMERs (information), and EXECUTORs (tasks, information), all acting on the domain of activity and the manager's goal-domain, supported by a knowledge & preferences repository.

27 Dynamic Role Model (computational): integration of IPK in the definition of a role (TOGA)
Role = (competences, duties, privileges):
- Competences: what he/she/it is able to do; the possessed models of the domain (knowledge).
- Duties: responsibility, tasks, and requested preferences.
- Privileges: access to information, which produces conceptual images of the domain; access to execution tools; organizational power.
The roles are specified by their own set of IPK bases:
- Information bases: how the situation looks, continuously updated.
- Preferences bases: importance scales/relations, ethics rules.
- Knowledge bases: required models and know-how.
Remark: the structure depends on the roles, and the roles depend on the IPKs.
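The role triple bound to its own IPK bases can be sketched as a record. The field names, the `can_perform` check, and the dispatcher example are assumptions for illustration, not TOGA's formal definition:

```python
# Role = (competences, duties, privileges), carrying its own I, P, K bases.
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    competences: set = field(default_factory=set)   # what the holder is able to do
    duties: set = field(default_factory=set)        # responsibilities and tasks
    privileges: set = field(default_factory=set)    # information/tool access
    information_base: dict = field(default_factory=dict)  # how the situation looks
    preferences_base: dict = field(default_factory=dict)  # importance scales, ethics rules
    knowledge_base: dict = field(default_factory=dict)    # required models and know-how

    def can_perform(self, task):
        """A task is admissible here only if it is both a competence and a duty
        (privilege/access checks omitted for brevity)."""
        return task in self.competences and task in self.duties

dispatcher = Role("dispatcher",
                  competences={"allocate-crews"},
                  duties={"allocate-crews", "report-status"},
                  privileges={"read:incident-feed"})
print(dispatcher.can_perform("allocate-crews"), dispatcher.can_perform("report-status"))
```

Note the second call: a duty without the matching competence fails the check, which is one concrete form of the role incongruence discussed on the next slide.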

28 Pathologies of Organizations: Examples
Complex situation: every human agent holds three roles at once:
1. the organizational role: requested/defined by the structure (fixed);
2. an informal role: applied, structure-independent (variable);
3. the personal/real role: the one actually realized (variable).
The dynamics of roles may create various lacks of congruence between them and conflicts of interests/motivations: social interest, organization interest, and personal interest, each with different risk-benefit modelling. Conflicts of roles lead to compromises and to inefficient, risky decisions, and create the necessity of negotiations. All of them influence organizational decision-making.

29 Decision-Making: cognitive definitions [TOGA]
- Decision-making: an individual or group reasoning implied by the request or necessity of a choice, caused by received information or a task, or by a delivered conclusion about possible risks/benefits. It starts when either the choice criteria or the alternatives are unknown, and it finishes when the choice is performed.
- Action-oriented decision-making: a decisional process whose alternatives represent possible actions in a pre-chosen physical domain.
- Mental decision-making: the final choice refers not to actions but to conceptual objects related to a preselected domain of activity of the intelligent agent.
- Group decision-making: responsibility for the decision is allocated to a group of intelligent agents and is based on a shared decision-making process.
(Diagram: new information or a task enters decision-making, which draws on the knowledge base and the preferences base and ends in one of three outcomes: no action/response; a meta-action/pseudo-action; or an action adequate to the decision-maker's role and situation.)
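The three outcome classes in the diagram can be sketched as a small router. The duty set, the alternative scores, and the routing conditions are illustrative assumptions:

```python
# Route an incoming task to one of the diagram's three outcome classes:
# no action, meta-action (alternatives must first be generated), or a proper action.

def respond(task, role_duties, known_alternatives):
    """Classify the response to a task given the role's duties and known alternatives."""
    if task not in role_duties:
        return "no-action"            # outside the decision-maker's role
    if not known_alternatives:
        return "meta-action"          # must first generate alternatives/criteria
    return "action:" + max(known_alternatives, key=known_alternatives.get)

duties = {"contain-spill", "report"}
print(respond("audit-budget", duties, {}))                       # not this role's task
print(respond("contain-spill", duties, {}))                      # alternatives unknown yet
print(respond("contain-spill", duties, {"boom": 3, "skim": 1}))  # proper, preference-ranked action
```

The middle branch is the interesting one: it captures the definition's point that decision-making begins precisely when criteria or alternatives are still unknown.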

30 Pathologies of Decision-Making (computational models)
Types of proper and pathological decisions. Main classes: meta-D-M, pseudo-D-M, and proper D-M. Pathologies are related to:
- the response to the source type ("safety" filters);
- the response to the subject: lack of competences, emotional reaction, outside the domain of interest;
- the response according to domain preferences (organizational/personal role): proper D-M.
If D-M autonomy increases, then the efficacy of control decreases and the importance of ethics and personal motivation increases. This rule indicates the importance of motivation management. (Figure: a reasoning path from d-m data through critical nodes and alternatives to a decision; controllability and updating of the ethics concept.)

31 Pathology of Bureaucracy: two iron laws
There are two iron laws of the bureaucratic behavior of self-aggrandizing managers:
1. They tend to maximize the resources they control, usually at the expense of their competitors within the organization [J. Wilson, 2005].
2. They tend, in different manners, to minimize their own personal risk [G. Ridman, 2001].
The first law holds because such actions increase subjective security and informal power. The second law implies that managers tend to take only unavoidable risks, and that all decisions which seem to carry some risk for the decision-maker will be passed on ("bucked up") as far as possible. These laws apply equally to the private and the public sector. Frequently the personal risk is hidden and officially "does not exist," but it strongly influences bureaucratic decision-making, and it is a significant component of the vulnerability of human organizations.

32 Strategies: continuous improvement
D. Keith Denton, "Creating a system for continuous improvement: improving an organization's decision-making process," Business Horizons, Jan-Feb 1995: To have continuous improvement, there has to be some factor that binds people together. There must be a common purpose, and each member must understand his or her role. If you want real, long-lasting change, then you must have a way of focusing people on the change. Building individual motivation is an essential factor for an organization's continuous improvement and robustness.

33 Strategies: Human-Oriented
Management of strategies: "A primary concern of every consumer of management theory is to understand where it applies, and where it does not apply." [Paul R. Carlile, 2005]
On November 14, 2005, KMCI will hold its one-day workshop on "Reducing Risk by Killing Your Worst Ideas." Most contemporary approaches to risk concentrate on assessing risk in the context of some model applied by the person or group assessing the risk, so if that model is false or illegitimate, the risk assessment is too. This workshop views risk assessment from this internal perspective. It tells you how to reduce risk, particularly in business, by using both creative learning and critical thinking.
The problem is the wrong choice of a strategy for coping with vulnerability (systemic approach): the "what" is clear, but the "how" is not yet well defined.

34 Response Strategies: TOGA
A strategy (A, B, C, D, E, F, …) is a pattern for a class of actions; it depends on the attributes of the organization Σ, its domain of activity Δ, and R|X. Components of a strategy in the different phases of the lifecycle of an organization (they have to decrease the vulnerability v(Σ, X)):
A. Learning: continuous knowledge acquisition
B. Training (real, simulated), games
C. Motivation building (individual, group), competition
D. IDSS functions (computerized, real-time)
E. Reorganization (in crisis)
F. Bottom-up local reasoning according to clear and accepted top-down rules (routine)

35 (figure)

36 Strategies/actions for decreasing vulnerability
The take-away: effective knowledge management goes beyond information technology or special, one-time efforts. Successful companies (as reckoned by financial and other performance indicators) set ambitious goals for product development and process innovation and provide a range of financial and nonfinancial incentives for employees who share knowledge with colleagues. http://www.mckinseyquarterly.com/article_abstract.asp?tk=352635:991:21&ar=991&L2
Risk and career-building strategies (CaBS), against socio-cognitive vulnerability: a natural process that gives the motivation to increase labor effort and to accept higher decisional risks.

37 Strategies: Intelligent Infrastructures (IIN)
IINs are highly autonomous systems which support services and industrial/production systems, enabling them to execute functions oriented toward human end-users. IINs are one of the emerging challenges of our new century, and they are feasible to realize. Recently, intelligent infrastructure networks, or intelligent networked infrastructures (a "multi-brain nervous system"), have been becoming emergent components of embedded, dependable computer and human-computer systems [EC, Unit G3: "Embedded Systems"]. They should lead to the building of different forms of "collective intelligence" (organization-human-computer).

38 Abstract Intelligent Kernel for Intelligent Infrastructures: functional requests
We need a software module with the capacity for:
- autonomy in decision-making
- reasoning/inferencing in problem solving
- learning from the environment and from communication
- modification of its own goal
- modeling/identification of its world (discovery)
- knowledge and information acquisition by communication
- interaction with the environment by effectors and by communication.
TOGA includes a preliminary framework of such abstract properties.
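The capacity list above reads naturally as an abstract interface. A minimal sketch, assuming hypothetical method names (TOGA does not prescribe this API):

```python
# Abstract interface mirroring the listed kernel capacities, plus a trivial
# concrete kernel that only demonstrates the interface is instantiable.
from abc import ABC, abstractmethod

class AbstractIntelligentKernel(ABC):
    """Capacities required of an intelligent-infrastructure kernel."""

    @abstractmethod
    def decide(self, situation): ...          # autonomy in decision-making

    @abstractmethod
    def reason(self, problem): ...            # reasoning/inferencing in problem solving

    @abstractmethod
    def learn(self, experience): ...          # learning from environment/communication

    @abstractmethod
    def revise_goal(self, feedback): ...      # modification of its own goal

    @abstractmethod
    def model_world(self, observations): ...  # modeling/identification of its world

    @abstractmethod
    def act(self, command): ...               # interaction via effectors/communication

class EchoKernel(AbstractIntelligentKernel):
    """Placeholder implementation; every method is a stub."""
    def decide(self, situation): return "wait"
    def reason(self, problem): return [problem]
    def learn(self, experience): return None
    def revise_goal(self, feedback): return "keep-goal"
    def model_world(self, observations): return dict(enumerate(observations))
    def act(self, command): return f"ack:{command}"

print(EchoKernel().decide("alarm"))
```

Writing the requests as an abstract base class makes the completeness of any candidate kernel checkable: a class missing one capacity simply cannot be instantiated.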

39 Applications: TOGA Methodology for Intelligent Kernel Design. From ENEA's technical proposals for the EU projects EIDA (1996) and EMIR (2004) (Abstract Managerial Intelligence), based on the SPG approach.

40 Top view of the Infrastructure Simulation Game System. (Figure: a World Editor and a World Simulator; the Intelligent Infrastructure Kernel; a human supervisor or manager; an "absolute observer" (the designer); interface service units; communication interfaces and communication services; and the functional units of the intelligent infrastructure.)

41 An example: an Intelligent Chip for m-Learning & m-IDSS (TOGA model)
(Figure: a MANAGER connected to INFORMERs (information), EXECUTORs (tasks), ADVISORs (expertise), COOPERATING MANAGERs (cooperation), and a SUPERVISOR (tasks, information), with knowledge and preferences bases; the I-Chip connects via USB to a PC and to the PC+Web, within an Intelligence Infrastructure Network acting on a domain of activity.)
An artificial organization: two mixed webs (Personoids, see http://erg4146.casaccia.enea.it/wwwerg26701/per-hom2.html).

42 Computer support & substitution of human functions. (Figure: the percentage contribution of humans and of computer personoids to activities/tasks, from life functions supported by information systems, DSSs, or robots to social functions supported by decision-making support systems; the trend is the development of autonomous computer infrastructure networks.)

43 Some References, 1
1. A.M. Gadomski, "TOGA: A Methodological and Conceptual Pattern for Modelling of Abstract Intelligent Agent," in Proc. of the First International Round-Table on Abstract Intelligent Agent, 25-27 Jan. 1993, ENEA print, 1994.
2. A.M. Gadomski, "Personoids Organizations: An Approach to Highly Autonomous Software Architectures," 11th International Conference on Mathematical and Computer Modeling and Scientific Computing, March 31 - April 3, 1997, Georgetown University Conference Center, Washington.
3. A.M. Gadomski et al., "Towards Intelligent Decision Support Systems for Emergency Managers: The IDA Approach," International Journal of Risk Assessment and Management (IJRAM), Vol. 2, No. 3/4, 2001.
4. A.M. Gadomski, Meta-Knowledge Engineering Server (since 1997): http://erg4146.casaccia.enea.it
5. M.T. Hannan and J. Freeman, "Structural Inertia and Organizational Change," American Sociological Review, 49 (1984): 149-164.
6. T.L. Amburgey, D. Kelly, and W.P. Barnett, "Resetting the Clock: The Dynamics of Organizational Change and Failure," Administrative Science Quarterly, 38 (1993): 51-73.
7. D. Levinthal, "A Survey of Agency Models of Organizations," Journal of Economic Behavior and Organization, 9 (1988): 153-185.
8. K.M. Eisenhardt, "Agency Theory: An Assessment and Review," Academy of Management Review, 14 (1989): 57-74.
9. H. Simon, Administrative Behavior (3rd edition), New York: The Free Press, 1976.
10. G. Allison, The Essence of Decision, Glenview, IL: Scott, Foresman & Co., 1997.
© ENEA, 2004. A.M. Gadomski, e-mail: gadomski_a@casaccia.enea.it

44 References, 2
1. A.M. Gadomski, "SOPHOCLES - EUREKA & MURST & ENEA: Intelligent Cognitive Systems Engineering," transparent sheets, 20/09/2000, updated 17/06/2001, ENEA, ITEA materials.
2. A.M. Gadomski, "SOPHOCLES Project - Cyber Virtual Enterprise for Complex Systems Engineering: Cognitive Intelligent Interactions Manager for Advanced e-Design," transparent sheets, 28/08/2001, ENEA, ITEA materials.
3. A.M. Gadomski, "TOGA: A Methodological and Conceptual Pattern for Modeling of Abstract Intelligent Agent," in Proceedings of the First International Round-Table on Abstract Intelligent Agent, A.M. Gadomski (ed.), 25-27 Jan., Rome, 1993; published by ENEA, Feb. 1994.
4. A.M. Gadomski, "The Nature of Intelligent Decision Support Systems," key paper of the Workshop on Intelligent Decision Support Systems for Emergency Management, Halden, 20-21 October 1997.
5. A.M. Gadomski, S. Bologna, G. Di Costanzo, A. Perini, M. Schaerf, "Towards Intelligent Decision Support Systems for Emergency Managers: The IDA Approach," International Journal of Risk Assessment and Management, 2001.
For more information: http://erg4146.casaccia.enea.it (High-Intelligence & Decision Research Group, HID).

45 Thank you

46 Crisis and Vulnerability
Change management: "Resistance to Change in Organisations," a summary of a survey among 245 individuals (Oliver Recklies), http://www.themanager.org
Crisis response: it is no longer a question of "if" an organization will face a crisis; it is, rather, a question of "when," "what type," and "how prepared" the company is to deal with it (Mitroff et al., 1996). No one person or organization, no country, and no system is immune from crisis (Coombs, 1999). Fink (1986) suggests that planning for a crisis "… is the art of removing much of the risk and uncertainty to allow you to achieve more control over your own destiny."
How to predict threats to ethical decision making during a crisis? S.L. Christensen, Business & Society, Vol. 42, No. 3, 328-358 (2003).
Vulnerability metrics are a key problem. Complexity science does not make the current terminology redundant but gives a new context to crisis response (A. Paraskevas, 2005).
"Implementing Vulnerability Scanning in a Large Organisation," a defence-in-depth strategy: "have I demonstrably improved the security of my organisation?" (R. Grime, 2003).

47 Presentation outline
- Objective of the work
- SoA: Human-Organizations Vulnerability (HOV)
- Human-Organization Theory (HOT)
- HOT's Top-Ontology
- Social and Cognitive Factors
- Critical Relations (weak points)
- IPK, UMP and Role frameworks
- Pathologies and Errors
- Organization Decision-Making
- Strategies
- Intelligent Infrastructure Network
- Conclusions

48 Critical relations: decision-making and organizational structure. Examples of socio-cognitive case studies:
1. "The Collapse of Decision Making and Organizational Structure on Storm King Mountain," T. Putnam, Ph.D., USDA Forest Service, Missoula Technology and Development Center, 1995.
2. Soft model: "Restrictive Control and Information Pathologies in Organizations," W. Scholl, Journal of Social Issues, Vol. 55, Issue 1, Spring 1999. Restrictive control is a form of power exertion in which one actor pushes his wishes through against the interests of another actor; promotive control is when an actor influences the other in line with his or her (common) interests. Restrictive control has negative consequences for the production of new or better knowledge, because it induces information pathologies that in turn lower the effectiveness of joint action. These two control hypotheses were tested in a study of 21 successful and 21 unsuccessful innovations.

