Presentation transcript: "Revisiting the JDL Data Fusion Model II"
1 Revisiting the JDL Data Fusion Model II
James Llinas (a), Christopher Bowman (b), Galina Rogova (c), Alan Steinberg (d), Ed Waltz (e), and Frank White (f)
a: Research Professor, University at Buffalo, Buffalo, NY, USA
b: Consultant, Data Fusion & Neural Networks, Colorado, USA
c: Encompass Consulting, Honeoye Falls, NY, USA
d: Technical Director, Utah State University Space Dynamics Lab, Utah, USA
e: Technical Director, Intelligence Programs, General Dynamics - Advanced Information Systems, Ann Arbor, MI, USA
f: Director, Program Development, US Navy SPAWAR Systems Center, San Diego, CA, USA
2 Some History of Fusion Models
- JDL: original model, circa 1987
- Dasarathy, Data-Feature-Decision Layered Model, 1997
- Steinberg, Bowman, and White, Revision I to JDL, 1999
- Bedworth and O'Brien, Omnibus Model
- Salerno, Situation Awareness Model, 2002
- Blasch and Plano, Level 5, 2003
3 The Reference JDL Model*
* Steinberg, A.N., Bowman, C.L., and White, F.E., "Revisions to the JDL Data Fusion Model", in Sensor Fusion: Architectures, Algorithms, and Applications, Proceedings of the SPIE, Vol. 3719, 1999.
4 Motivations for Revisiting the JDL Model(s)
"External" Factors (driven by operational needs):
- "Common (or Consistent, or Relevant, or Single Integrated, or User-Defined) Operational Picture"
- "Network-Centric Warfare"
- "Dominant Battlespace Knowledge"
- "Operations Other Than War"
- "Asymmetric Warfare"
- "Information Warfare"
- "FORCEnet"
- Distributed, Service-Based Information Architectures
- Dynamically-Composable Data and Information Fusion Services
"Internal" Factors (driven by the need for deeper understanding):
- Better understanding of the "Levels"
- Insight into Inter-Level Processing: information operations; adjudication and conflict resolution; output management; effects of input reliability
- Pedigree, Metadata, Context Services
- Conventions and Standards
- Ontologies
- Integrating Inductive/Abductive Inferencing
Underlying implications for the primary conceptual and semantic DF model: the JDL Model
5 Discussion Topics
1) Reexamining our understanding of the "Levels"
2) Insight into Inter-Level Processing
   -- Information operations
   -- Adjudication and conflict resolution
   -- Output management
   -- Effects of Input Reliability
3) Integrating Inductive/Abductive Inferencing
4) Aspects of Distributed Fusion
5) Pedigree
6) Ontologically-based Data Fusion Processes
6 1) Revisiting the "Levels"
For Alan to do: some bullets on new perspectives regarding the Levels
8 2) Insight into Inter-Level Processing (a) Information operations
- The idea of inter-Level information and control flow is not made explicit in the traditional JDL Model
- Need to specify inter-Level "informing", controlling, and exploitation
- Trades off added value/utility against the cost of additional processing; raises the need for consistency
11 2) Insight into Inter-Level Processing (b) Adjudication and conflict resolution
- Both Atomic-Level and Meta-Level Adjudication
12 2) Insight into Inter-Level Processing (c) Output management
- The JDL Model is not specific about how output Quality and Consistency are controlled
- Expect a hierarchical Value system: within-process and system-level
- New State Estimate Quality and Consistency achieved per the operations in 2(a), 2(b)
- Output Inferencing Quality Control via additional information, using Level 4
- Output Inferencing Consistency via Belief Change
13 2) Insight into Inter-Level Processing (d) Effects of Input Reliability
- Reliability is akin to second-order uncertainty (in source inputs)
- Typically not accounted for in fusion algorithms
- Even if Source Reliability is specified, how do we compute Fused-Estimate Reliability?
Possible situations (Dubois and Prade, 1992):
- It is possible to assign a numerical degree of reliability to each source.
- A subset of sources is reliable, but we do not know which one.
- Reliabilities of the sources can be ordered, but no precise reliability values are known.
Strategies to be considered:
- Strategies for identifying the quality of data input to fusion processes and eliminating data of poor reliability.
- Strategies for modifying the data and information by considering their reliability before fusion.
- Strategies for modifying the fusion process to account for the reliability of the input.
- Combinations of the strategies mentioned above.
Note: the combination operator is context dependent; it depends on the strategy selected and is defined within the framework used for uncertainty representation.
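The second strategy above (modifying the data by their reliability before fusion) can be sketched in a few lines. This is a minimal illustrative example, not a method from the paper: each source's discrete belief is "discounted" toward uniform ignorance in proportion to its unreliability, and the discounted beliefs are then combined by a normalized pointwise product (a Bayesian-style combination rule). The function names and the uniform-ignorance choice are assumptions for illustration.

```python
# Hedged sketch: reliability discounting of source beliefs before fusion.
# A source with reliability r contributes r * (its belief) plus (1 - r) of
# total ignorance, modeled here as the uniform distribution.

def discount(belief, reliability):
    """Shift a discrete belief toward uniform ignorance as reliability drops."""
    n = len(belief)
    return [reliability * p + (1.0 - reliability) / n for p in belief]

def fuse(beliefs):
    """Combine (discounted) beliefs by normalized pointwise product."""
    fused = [1.0] * len(beliefs[0])
    for b in beliefs:
        fused = [f * p for f, p in zip(fused, b)]
    total = sum(fused)
    return [f / total for f in fused]

# Two sources report over hypotheses {H1, H2}; source 2 is less reliable,
# so its dissenting report is down-weighted in the fused estimate.
s1 = discount([0.9, 0.1], reliability=0.9)
s2 = discount([0.2, 0.8], reliability=0.4)
print(fuse([s1, s2]))
```

The point of the sketch is that the fused-estimate reliability question raised above does not disappear: discounting moves reliability into the first-order belief, but the choice of discounting and combination operators remains context dependent.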
14 3) Integrating Inductive, Abductive Inferencing
- Asymmetric adversaries are quite unpredictable in their behavior, tactics, weapons, and choice of targets.
- Induction is usually a precursor to Deduction, but requires knowledge of the relationship between observable signatures and Truth states
- Abduction forms the best plausible explanation for the observables and observable patterns
- A Hybrid Inferencing Process follows the typical sequence of scientific discovery and proof, using a sequence of steps to conjecture, hypothesize, generalize, and validate:
  Discovery (Abductive Phase): data mining tools locate patterns of meaningful relationships; correlated patterns are examined for relevance
  Generalization and Validation: applies inductive generalization; model parameters are estimated
  Detection: the validated model provides a target detection "template"
15 3) Integrating Inductive, Abductive Inferencing: Integrated Data Mining and Data Fusion Processes

Step 1: Discovery
- Process: Data Mining: discovery of a potential specific target and its characteristics in raw data sets
- Reasoning: Abduction: reason about a specific target, conjecturing and hypothesizing to discover the best explanation of relationships to describe a target (hypothesis creation)
- Example use of typical automated tools: An analyst uses data mining tools to locate patterns of relationships in contacts, financial exchanges, associates, and concurrent activities of a terrorist cell.

Step 2: Generalization and Validation
- Process: Target Modeling Generalization: characterize the target class in a general model
- Reasoning: Induction: generalize the fundamental characteristics of the target in a descriptive model; test and validate the characteristics on multiple cases (hypothesis validation)
- Example use of typical automated tools: An analyst develops and refines a quantitative model of the terrorist cell's behavior. The model is tested on additional data to evaluate its detection value using data mining tools.

Step 3: Detection
- Process: Data Fusion: detection of subsequent occurrences of the target based on comparison with target models
- Reasoning: Deduction: test real-time, massive-volume data against multiple target templates to detect (deduce) the presence of targets (hypothesis testing)
- Example use of typical automated tools: Real-time raw data are ingested by an automated data fusion tool to detect the presence of evidence for other, similar terrorist cells.
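The three steps above can be caricatured as a toy pipeline. This is an illustrative sketch only: events are (actor, action) pairs, the co-occurrence threshold stands in for real data mining tools, and all function names are assumptions, not anything defined in the paper.

```python
# Hedged sketch of the discover -> generalize -> detect loop in the table
# above, over toy (actor, action) event pairs.

from collections import Counter

def discover(events, min_support=2):
    """Abductive step: surface recurring (actor, action) relationships."""
    counts = Counter(events)
    return {pattern for pattern, n in counts.items() if n >= min_support}

def generalize(patterns):
    """Inductive step: abstract specific actors into an action template."""
    return {action for _actor, action in patterns}

def detect(template, stream):
    """Deductive step: test new data against the validated template."""
    return [event for event in stream if event[1] in template]

history = [("A", "wire_transfer"), ("A", "wire_transfer"),
           ("B", "rental"), ("A", "rental"), ("B", "rental")]
template = generalize(discover(history))
hits = detect(template, [("C", "wire_transfer"), ("C", "travel")])
print(hits)
```

The design point the sketch tries to make visible is the handoff: mining (discovery and generalization) runs offline over historical data and produces the template; fusion (detection) runs online over new data and consumes it.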
16 3) Integrating Inductive/Abductive Inferencing*
* Waltz, Edward L., "Information Understanding: Integrating Data Fusion and Data Mining Processes", Proc. of IEEE International Symposium on Circuits and Systems, Monterey, CA, May 31-June 4. For a more detailed description of the integration, see Waltz, Edward, Knowledge Management in the Intelligence Enterprise, Norwood, MA: Artech, 2003, Chapter 8.
17 4) Aspects of Distributed Fusion
- A requirement and framework for essentially all modern and future military and homeland security IT environments
- Architectural issues: need for empirical studies and architectural analysis tools
- Need for local and network fusion algorithms
- Specification of information-sharing strategies
- Design of adaptive network topologies
- Need for a "Distributed Fusion JDL Model"
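One concrete instance of the "local and network fusion algorithms" problem is that estimates arriving from different network nodes are usually correlated in unknown ways (shared priors, rumor propagation), so a naive product of estimates is overconfident. Covariance Intersection (CI) is a standard rule for exactly this case. The sketch below is an assumption-laden illustration (grid search over the CI weight, trace as the optimality criterion, a made-up two-node example), not a construct from the paper.

```python
# Hedged sketch: Covariance Intersection (CI), a conservative fusion rule
# for distributed nodes whose estimate cross-correlations are unknown.
import numpy as np

def covariance_intersection(x_a, P_a, x_b, P_b, steps=100):
    """Fuse two (mean, covariance) estimates without knowing their correlation."""
    Pa_inv, Pb_inv = np.linalg.inv(P_a), np.linalg.inv(P_b)
    best = None
    # Grid-search the convex weight omega; real systems use a 1-D optimizer.
    for w in np.linspace(0.0, 1.0, steps + 1):
        P = np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)
        x = P @ (w * Pa_inv @ x_a + (1.0 - w) * Pb_inv @ x_b)
        if best is None or np.trace(P) < np.trace(best[1]):
            best = (x, P)
    return best

# Node A is confident in dimension 0, node B in dimension 1.
x_a, P_a = np.array([0.0, 0.0]), np.diag([1.0, 4.0])
x_b, P_b = np.array([2.0, 2.0]), np.diag([4.0, 1.0])
x, P = covariance_intersection(x_a, P_a, x_b, P_b)
```

CI's guarantee is consistency of the fused covariance regardless of the (unknown) correlation, at the price of conservatism; that trade-off is one of the architectural issues a "Distributed Fusion JDL Model" would have to make explicit.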
19 5) Pedigree
We define Pedigree as "an attachment to a message or communication between nodes that includes any information necessary to the receiving node(s) such that the receiving node's fusion processing maintains its formal and mathematical processing integrity".
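As a concrete (and entirely illustrative) reading of that definition, a pedigree attachment might carry the chain of nodes an estimate has passed through, the fusion operations already applied, and the uncertainty framework in use. The field set below is an assumption about what a receiving node would need; the paper defines no schema.

```python
# Hedged sketch: a minimal pedigree attachment per the definition above.
# All field names are illustrative assumptions, not a proposed standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pedigree:
    node_chain: List[str]            # sources and fusion nodes traversed
    fusion_ops: List[str]            # operations already applied upstream
    uncertainty_model: str           # e.g. "gaussian", "belief_function"
    timestamps: List[float] = field(default_factory=list)

    def extend(self, node_id: str, op: str, t: float) -> "Pedigree":
        """Return a new pedigree recording this node's processing step."""
        return Pedigree(self.node_chain + [node_id],
                        self.fusion_ops + [op],
                        self.uncertainty_model,
                        self.timestamps + [t])

p0 = Pedigree(["radar_1"], ["detection"], "gaussian", [0.0])
p1 = p0.extend("node_A", "track_update", 1.5)
```

Making `extend` return a new object rather than mutating in place is one way to honor the "processing integrity" requirement: a node can never silently rewrite the history it received, only append to it.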
20 6) Ontologically-based Data Fusion Processes
- One important foundation toward achieving Interoperability and Shared Understanding, especially for Higher-Level Fusion states
- Ontological relationships as a basis for development of new theoretical constructs for the "True" world
- Theory as a basis for algorithm development in the observed world
21 What about a L2 Ontology?
- What is a "Situation"? Not adequately specific; no common understanding
- How is it "Refined"? No metrics or quality measures
- What kind of algorithmic process yields a "Situation Estimate"?
- Llinas assertion: "Situation" is too coarse/abstract to engineer to; we MUST get more specific (e.g., observability: what IS a "Convoy"?)
[Diagram: some set of components, in some relationship, = a "Situation". Models (theories) of thing-components in the Real World inform and bound algorithms over thing-components in the Observed World, in the application (task) context. Elements shown: partitioning and labeling; the nature of, and models of, observational processes; the nature of aggregated objects, behaviors, and events; assumptions, approximations, and application needs; an aggregated-object ("Convoy") tracking algorithm as the example; analysis/ontology must have sufficient specificity to develop theories.]
- Situations are inherent in the Real World: an attack (situation) may be occurring even if the user's task-at-hand has no interest in an attack state
22 Summary
- There is a clear need for expanding and enhancing the JDL Model to deal with and incorporate the effects of the various issues raised herein
- The Model has been an anchor point for communication and understanding in the Fusion Community and has served us well, but it needs contemplative review and a consensus-based modernization