© Copyright QinetiQ limited 2007. QinetiQ Proprietary
Human Aspects of NEC: Decision-Making, Organisation and Information
Dr Andy Belyavin
A presentation to the Operational Research Society, Farnborough, 18 April 2007
NEC
The introduction of new IT to systems presents substantial challenges
The National Audit Office concluded that benefits are rarely realised if the previous system is maintained, i.e. IT is introduced while processes are kept constant
The introduction of IT is an enabler of organisational change
Analysis of the impact must take account of this key element
The focus of the analysis must be on the people dimensions of the system
If the focus is on the IT itself, the wrong conclusions will almost surely be drawn
Approaches to identifying a solution
Developing a strategy for organisational change is a hard problem
It tends to be done by constructing a plausible solution and then iterating by trial and error
That is not a good approach for military systems
It is clearly better if the problem can be approached analytically:
desirable elements of the solution identified
undesirable elements ruled out
the final polish put on the solution empirically
This presentation discusses models of human decision-making and measures of performance for C2 systems
NEC Human Challenges (overview diagram)
Getting NEC to work: manpower, skills, training, command; whole-life costs including manpower and training
Information exploitation: interfaces, visualisation, automation, resilience, agility, vulnerability
Command: motivation, messages, doctrine, intent, organisation, will, trust
Legacy: avoiding stovepipes, integration risks, non-technical interoperability, other barriers
Enabler of process change: interfaces, automation, situation awareness
Key elements of modelling
At an abstract level a C2 system can be regarded as a complex system that takes large volumes of data in at one end and puts out decisions at a number of levels
Critically, we need to be able to describe the human elements in the system
This includes the need to:
represent data flow in the system between human agents
model the process over time
represent the conversion of data into models that can be used for information processing and decision-making
Problems in human representation
Key issues in the people component of NEC that need to be described for long-term concept development:
decisions
information flow
organisation form and process
training and doctrine
…
The discussion here focuses on decisions, information flow and the assessment of organisation performance
Decision making
Basic principles and assumptions
It is assumed that low- to medium-level military decisions are trained decisions made under time pressure
We appeal to Klein's recognition-primed decision-making as the model: effectively, classify the inputs and map them directly to courses of action
From the statistical point of view this corresponds directly to the multivariate discrimination problem
The first approach was developed by Fisher in the 1930s: Fisher's Linear Discriminant (LDF)
It can be demonstrated that the solution to the classification problem is optimal if a weighted likelihood ratio is used
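The classifier described above can be sketched in a few lines. This is a minimal illustration, not the model used in the study: the two-dimensional "measures", class means and cost-free midpoint threshold are all assumptions for the example; a cost-weighted likelihood ratio would shift the cut-off.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical classes of situation patterns in a 2-D measure space.
n = 200
class_a = rng.normal([0.0, 0.0], 1.0, size=(n, 2))
class_b = rng.normal([2.0, 1.5], 1.0, size=(n, 2))

# Fisher's linear discriminant: w is proportional to Sigma^-1 (mu_b - mu_a),
# where Sigma is the pooled within-class covariance.
mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
pooled = (np.cov(class_a.T) + np.cov(class_b.T)) / 2.0
w = np.linalg.solve(pooled, mu_b - mu_a)

# Threshold at the midpoint of the projected means; unequal outcome
# costs would move this boundary, as on the discrimination diagram.
threshold = w @ (mu_a + mu_b) / 2.0

def classify(x):
    """Return 'B' if the projected score exceeds the threshold."""
    return "B" if x @ w > threshold else "A"

scores_a = class_a @ w
scores_b = class_b @ w
accuracy = (np.mean(scores_a <= threshold) + np.mean(scores_b > threshold)) / 2.0
```

Projecting onto a single discriminant direction is what lets the later slides treat "willingness to act" as a one-dimensional threshold that can be tuned per individual.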
Simple discrimination (diagram): two populations, "A appropriate" and "B appropriate", plotted against Measure 1 and Measure 2; the discriminant function slope separates them, and the boundary sits at c1 = c2, shifting as the costs c1 and c2 change
Components of the solution
Three key inputs to the classification:
the mental model used to classify outcomes (discriminant function)
the perceived costs and benefits of outcomes (individual characteristics)
the data on which the model is based (information)
A complicating factor is that the decision is not made at a single time
The decision may evolve with time, so its development needs to be modelled
We can solve the problem for the optimal classifier
In practice the classifier does not need to be optimal, just pretty good, and it varies from individual to individual under some conditions
The decision can be updated with time to correct imperfect decisions
Investigating model choices in decision-making: DECIDE
The objective of the trials was to investigate the use of information in decision-making
The DECIDE task was developed under the guidance of Neville Moray at Surrey University
The aim was to control the flow of troops through hostile territory, sending the largest number with minimum casualties
Casualties were high when enemy strength was high and low when it was low
The score was determined as a combination of the flow achieved and the casualties incurred
DECIDE task (1)
The task is to send troops through a hostile zone
Enemy strength varies in the zone and determines the number of casualties taken
Participants had to decide when to send and when to stop sending troops
The task is to send the greatest number of troops through the zone whilst incurring the fewest casualties
Information is initially hidden and participants must request it by clicking on the source
Each request for information is recorded in a data log
DECIDE task (2)
Participants can access four sinusoidal information sources (with added noise)
The four sources have different amplitudes and wavelengths
Participants must use these sources to infer enemy strength
The actual enemy strength is the sum of the four sine waves (without noise)
The best available indicator is the sum of the four noisy sources
The metric of task success was shown graphically on the original slide; it combines flow achieved and casualties incurred
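The structure of the information sources can be sketched as below. The amplitudes, wavelengths and noise level are assumptions for illustration; the slide does not give the values used in the actual DECIDE task.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical amplitudes and wavelengths for the four sources.
amplitudes = np.array([1.0, 0.8, 0.6, 0.4])
wavelengths = np.array([60.0, 45.0, 30.0, 20.0])  # time steps per cycle
noise_sd = 0.2

t = np.arange(500)

# Enemy strength is the sum of the four clean sine waves...
clean = amplitudes[:, None] * np.sin(2 * np.pi * t[None, :] / wavelengths[:, None])
enemy_strength = clean.sum(axis=0)

# ...while participants see only the noisy version of each source.
noisy_sources = clean + rng.normal(0.0, noise_sd, size=clean.shape)

# The best available indicator is the sum of the four noisy sources.
best_indicator = noisy_sources.sum(axis=0)

# Despite the noise, the indicator tracks the hidden strength closely.
correlation = np.corrcoef(enemy_strength, best_indicator)[0, 1]
```

This illustrates why the sum of all four noisy sources is the best indicator: the independent noise terms partially average out while the signal components add coherently.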
Participant performance
Performance differed between the three groups
Each group was given a different level of information about the task:
Group A: no information about the sources
Group B: basic information about how the sources relate to enemy strength, and an indication that two of the sources are better than the other two
Group C: the same information as Group B, but after a period of training
A score of 500 represents a good score
The best participant scored 1600 on a number of runs
Human variability
Three main sources of human variability:
the sources of information used to estimate enemy strength: deduced from the frequency of request of each source and the post-trial interviews
the frequency of use of each source: each participant had access to different information depending on their update frequencies
willingness to take casualties: some participants sent troops as enemy strength was just starting to drop, while others waited until it had reached a trough
The information value at each time step of the task was collected and used to fit classification models to the behaviour of the subjects
DECIDE task IPME model
DECIDE was simulated using the underlying equations governing the generation of the information sources
A simple probabilistic model of the monitoring of the information sources was created, based on the observed frequency of request for the individual sources
A two-state (sending/not sending) operator decision model was developed:
at the end of each iteration the state was re-evaluated using the classification model
the state was changed when there were two consecutive positive decisions to change state
the classification model was conditioned on the current state of the decision
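The two-state operator loop above can be sketched as follows. The thresholds, the sinusoidal strength signal and the use of raw strength as the classifier score are assumptions for the sketch; only the two-state structure and the two-consecutive-decisions rule come from the slide.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical state-dependent thresholds: start sending when estimated
# strength is low, stop when it is high.
SEND_THRESHOLD = -0.5
STOP_THRESHOLD = 0.5

def run_operator(strength_estimates):
    """Two-state (sending / not sending) operator: the state flips only
    after two consecutive positive decisions to change state."""
    sending = False
    consecutive_votes = 0
    states = []
    for s in strength_estimates:
        vote = (s > STOP_THRESHOLD) if sending else (s < SEND_THRESHOLD)
        consecutive_votes = consecutive_votes + 1 if vote else 0
        if consecutive_votes >= 2:
            sending = not sending
            consecutive_votes = 0
        states.append(sending)
    return np.array(states)

# A slow sinusoidal enemy strength with observation noise.
t = np.arange(400)
strength = np.sin(2 * np.pi * t / 100) + rng.normal(0, 0.1, t.size)
states = run_operator(strength)
```

Requiring two consecutive votes acts as a debounce: single noisy excursions across a threshold do not flip the state.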
Classification model
A classification model is used to separate data into a number of populations
In the DECIDE task there are two decisions:
to send when not sending
to stop when sending
The threshold of the decision was determined by coupling the model to an optimisation algorithm
The performance of the operator was used as the objective of the optimisation
The threshold was altered by the algorithm until the performance matched the observed performance
The threshold gives some indication of whether people were willing to send early (upper boundary) or late (lower boundary)
Performance of the model against the observed data
The classification models were able to reproduce performance scores well for 38 participants
The remainder did not appear to be using the information sources
The start decision was well modelled
The stop decision was more difficult to model: there was a tendency for simulated participants to stop sending too soon and then resend shortly afterwards
There was a relationship between personality and the timing of the start/stop decisions
Example: observed score = 633, simulated score = 560
Summary conclusions
The basic classification model can vary from individual to individual
A crude representation of the evolution of the decision with time can be quite effective
A rule of three was used in the DECIDE task modelling
The criterion is influenced by individual characteristics, personality in this case
The same principles were employed in the simulation of the behaviour of Anti-Air Warfare Officers in a naval simulation, with credible results
Information and organisation performance
Information in an organisational context
There are two aspects to system performance: time to perform and quality of output
Much analysis of processes focuses on time to perform, but quality of output is just as important
Decision making can be modelled at the pattern-matching level as described earlier
Can this be extended to provide an assessment of processes and procedures within a C2 system?
Ideally we need an approach that encapsulates these factors and can be used for engineering a system
The study described here was based on methods for measuring information
Two widely used measures of information content:
Shannon's information (entropy)
Fisher's Information
Shannon's entropy
Data and information are different, although they are often treated as the same
Data are part of the physical domain and are measured in bits; information is in the cognitive domain and is measured in models of the current and future state of the world
Shannon's entropy is strictly a measure of optimal coding for messages, and therefore of data
It has no concern with the meaning of a message (its information content)
It is concerned with the quantity of data, measured in bits
It provides a measure of data flow, given assumptions about the pattern of data elements in the stream
Fisher's Information
Fisher's Information measures the amount of information the data provide about a set of model parameters
It is expressed in terms of the precision of the parameter estimates provided by the data
It is derived from the maximum-likelihood estimation procedure
It can be viewed as a measure of the quality of the model in describing the data
It can be extended to describe the information content of the model
We decided to use Shannon's entropy as a measure of data flow and Fisher's Information as a measure of information content
The basic measures are not commensurate
We have used the approach of Cedilnik and Košmelj to bring them onto a common scale
Mathematical definition of the measures
Shannon's entropy e_p is defined by the equation shown on the original slide
If it is assumed that there are n possible values for the content and they are all equally likely, the measure simplifies
Fisher's Information I is based on the estimate of the variances of a set of k parameters θ
If it is assumed that the parameters lie in a range (a, b), the expression shown on the original slide provides a measure that is consistent with e_p
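The equations were rendered graphically on the original slide. The standard forms they refer to are given below; the exact commensurate expression of Cedilnik and Košmelj for the range (a, b) is not reproduced here.

```latex
% Shannon's entropy of a discrete source with probabilities p_i:
e_p = -\sum_{i=1}^{n} p_i \log_2 p_i,
\qquad e_p = \log_2 n \;\text{ when } p_i = 1/n .

% Fisher's information for a parameter \theta with log-likelihood \ell(\theta):
I(\theta) = -\,\mathrm{E}\!\left[\frac{\partial^2 \ell(\theta)}{\partial \theta^2}\right],
\qquad \operatorname{var}(\hat\theta) \;\ge\; I(\theta)^{-1} .
```

The second inequality (the Cramér-Rao bound) is what links Fisher's information to the precision of the parameter estimates, as described on the previous slide.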
Example data flow
Consider a sample of data that might be coming into the system: a series of pairs of numbers, with a sample shown below
Considered from the point of view of Shannon's entropy, the information content is the length of the message
The message comprises 20 numbers, each reported to a maximum of three decimal digits
The length of the message is a maximum of 20 x 7 bits = 140 bits
That is the data content…
(1.0, 1.0) (2.0, 1.7) (3.0, 3.3) (4.0, 4.1) (5.0, 4.9) (6.0, 5.5) (7.0, 7.2) (8.0, 8.3) (9.0, 8.9) (10.0, 9.9)
Develop context and model (1)
Suppose this sequence of pairs of numbers records the advance of an entity with time
Extra information: we can estimate the average speed
Develop context and model (2)
Suppose this sequence of pairs of numbers records the advance of an entity with time
Extra information: we can estimate the average speed
This is a model we are applying to the data
The speed is not exact, as the data contain noise
The extra information can be estimated using Fisher's information
Using basic assumptions, the information added is 5.46
Develop context and model (3)
Suppose this sequence of pairs of numbers records the advance of an entity with time
Extra information: we can estimate the average speed
This is a model we are applying to the data
The speed is not exact, as the data contain noise
The extra information can be estimated using Fisher's information
Using basic assumptions, the information added is 5.46
We can also estimate the position at times 15 and 18
Following the same logic, the further information added is 9.48
Develop context and model (4)
Suppose the underlying observations are twice as variable
Using basic assumptions, the information added is 4.66
We can estimate the position at times 15 and 18
Following the same logic, the further information added is 7.88
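The dependence of Fisher's information on data variability can be sketched with the example pairs. This is only an illustration of the sigma-squared dependence; the slide's figures of 5.46, 9.48, 4.66 and 7.88 rest on scaling assumptions that are not stated, and this sketch does not attempt to reproduce them.

```python
import numpy as np

# The ten (time, position) pairs from the example slide.
t = np.arange(1.0, 11.0)
x = np.array([1.0, 1.7, 3.3, 4.1, 4.9, 5.5, 7.2, 8.3, 8.9, 9.9])

def slope_information(times, positions):
    """Fisher information of the slope of a straight-line fit,
    I = sum((t - t_bar)^2) / sigma^2, with the residual variance
    used as the noise estimate."""
    slope, intercept = np.polyfit(times, positions, 1)
    residuals = positions - (slope * times + intercept)
    sigma2 = residuals.var(ddof=2)  # two fitted parameters
    return ((times - times.mean()) ** 2).sum() / sigma2

info = slope_information(t, x)

# Doubling the deviations of the data doubles the residuals, so sigma^2
# quadruples and the Fisher information falls by a factor of four.
info_doubled_noise = slope_information(t, x.mean() + 2 * (x - x.mean()))
```

This mirrors the pattern on the slides: when the observations are twice as variable, the information added by the model drops (5.46 to 4.66, 9.48 to 7.88), because precision, not message length, is what Fisher's information counts.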
Fisher and good and bad models
The previous example was developed using the true model
What happens if an inappropriate model is applied?
The appropriate model fit was shown in the upper graph on the original slide, the inappropriate model in the lower graph
The estimates of Fisher's information for the slopes in the two cases were shown alongside the graphs
If we used this for prediction, the added information would be small for the inappropriate model
Metrics, models and data
The examples on the previous slides illustrate three key points:
we can construct a methodology for measuring the effect of information transactions
the metrics are sensitive to data quality and model quality
they demand an understanding of how models are acquired
The simple example deals with a model constructed from data gathered as part of the information flow
For data fusion the model will have been constructed prior to system use
To apply the previous logic we need to know the quality of the model
In addition we will have to handle variability in the data to which we apply predictive models
Approach to testing the metrics in an organisational model
We selected a model with a repetitive decision that had been modelled previously, based on the DECIDE task
The original form comprised a single-person task with multiple information sources
The task was taken as the basis for a model of a headquarters with four streams of information and a simple decision to make
It permits an overall measure of effectiveness through the task score
Information use can be manipulated and the overall effect studied
It includes natural delays and a possible representation of corruption
The information flow resembles that of some Battlegroup headquarters
General behaviours in an organisation
Decision making is a special case of a process in which information is turned into an order
Structure
The structure of an organisation is determined by:
causality between processes
formal relationships between agents
informal relationships between agents
Basic building blocks in the HQ model
Information-processing behaviours:
gather data
process and fuse information
decide
order action
The impact of decisions is represented by closing the loop using a pseudo-military task
The original information pattern from the DECIDE task is used
Data observation and interpretation are abstracted as flows between cells in a notional HQ
Problems to be represented in the metrics as applied to the model
Quality of the decision-making procedure in information terms, reflecting training and experience
Impact of timeliness on decisions
Impact of unreliable information sources
Impact of inappropriate models
Two aspects must be addressed so that Fisher's Information can be calculated:
the precision of the fusion model
the variability of the data employed in the fusion
Acquisition of the data fusion models
In developing the statistics of the data fusion model it was assumed that the model was based on experience of the real system
This was represented by gathering data from the simulated task and fitting the fusion model to the observations
From the model fits, the variance characteristics of the model are described
It is assumed that training and experience are represented by a level of exposure to real situations
Observations of performance following training indicate a performance curve that follows a t^(-1/2) law, where t is the training time
The model assumes that exposure follows the same law statistically
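The t^(-1/2) law falls out of standard estimation theory: if training is represented as exposure to t noisy observations, the standard error of a fitted model parameter shrinks as t^(-1/2). A minimal sketch, with the underlying line y = 2x and the noise level assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

def slope_standard_error(t_samples, noise_sd=1.0, trials=2000):
    """Empirical standard deviation of the slope estimate from
    t_samples noisy points on a hypothetical 'real system' y = 2x."""
    x = np.linspace(0.0, 10.0, t_samples)
    slopes = []
    for _ in range(trials):
        y = 2.0 * x + rng.normal(0.0, noise_sd, t_samples)
        slopes.append(np.polyfit(x, y, 1)[0])
    return np.std(slopes)

se_small = slope_standard_error(25)
se_large = slope_standard_error(100)

# Quadrupling the exposure should roughly halve the error (t^-1/2).
ratio = se_small / se_large
```

The practical consequence for the HQ model is diminishing returns: each doubling of training exposure buys a smaller improvement in fusion-model precision than the last.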
Timeliness
The timeliness aspects of information are captured in two components of the model:
the rate at which enemy strength changes in the simulated world
time delays in the processing of information in the model
Unreliability of information and appropriateness of the model
In the simulated HQ, information sources can become corrupt
An extra step was inserted in the information processing to check the quality of the source vulnerable to corruption
Simple linear prediction was used to implement the check
For the construction of this model it was assumed that effectively unlimited experience would be available for own sensors
The variance of the model is therefore assumed to be small
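A linear-prediction check of the kind described can be sketched as below. The window length, tolerance and example readings are assumptions; the slide specifies only that simple linear prediction was used.

```python
import numpy as np

# Hypothetical quality check: predict the next reading by extrapolating
# a straight line through the recent history, and flag the source as
# suspect when the prediction error exceeds a tolerance.
TOLERANCE = 1.0

def is_suspect(history, new_value):
    """Flag new_value if it departs from the linear trend of history."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, np.asarray(history), 1)
    predicted = slope * len(history) + intercept
    return abs(new_value - predicted) > TOLERANCE

# A well-behaved source stays within the noise of its trend...
clean_history = [0.1, 1.0, 2.1, 2.9, 4.0]
flag_clean = is_suspect(clean_history, 5.1)

# ...while a corrupted reading jumps well off the trend.
flag_corrupt = is_suspect(clean_history, 9.0)
```

Because the check itself is a fitted model, its variance feeds into the Fisher information sums described later; the unlimited-experience assumption for own sensors is what keeps that contribution small.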
Conditions tested
Simulations of the HQ model were conducted varying the following conditions:
the amount of experience of the decision-maker
the level of noise on the data used for training the decision-maker
the level of noise on the data in the simulated decision making
the presence or absence of source corruption
We are effectively trying to measure three aspects of information handling:
the quality of the basic data
the quality of the models used in decision-making
the appropriateness of the decision-making models
Basic features of the demonstration
Data flow at the same rate under all circumstances
Noise on the data is used to modify the effective input information according to Shannon's entropy, assuming data are reported to the appropriate precision
Fisher's Information is summed from the analysis of potentially corrupt data and from the calculation of fused information
In general the information added in data fusion is of the same order as the information in the input data
The quality of training and experience contributes about the same amount as the data gathered from sensors
Effect of noise on performance and ModFI (chart)
Effect of training on performance and ModFI (chart)
Effect of information delay on performance and ModFI (chart)
ModFI as a predictor of performance (chart)
Conclusions
It is possible to describe transactions in a model C2 system using a combination of Shannon's entropy and Fisher's Information
The information metrics correlate with overall performance in the abstract example used in the study
The key to the approach is the description of the models applied in decision-making
An essential element is the description of the statistical properties of these models
Some of these elements can be estimated through additional simulation
It is also important to describe data accuracy and information content in the same terms
Overall summary
Human decision-making in a range of contexts can be represented using models from statistical classification
There is variability in the quality of the models employed by individuals as a function of training and experience
Individual characteristics can affect the decision taken through the perception of the outcomes
The impact of information-flow processes can be captured using Fisher's information
Sources of variability that affect Fisher's information include:
the quality of the decision-making model
the reliability of the basic data on which it is based
the influence of organisational processes that affect variability
Within the limits of the current study, Fisher's information is a passable predictor of organisational performance
© Copyright QinetiQ limited 2007 Independent expertise where it matters most.