Automated Dynamic Symbology for Visualization of Higher Level Fusion T. Kesavadas and Youngseok Kim Virtual Reality Laboratory Center for Multisource Information Fusion (CMIF) The State University of New York at Buffalo www.vrlab.buffalo.edu
Preview
1. Features of Fusion Visualization (related to COP)
2. Visualization Tasks for Fusion
3. Low Level Fusion
4. High Level Fusion
5. Future Work
Features of Fusion Visualization (related to COP)
1. Common Operational Picture (COP): A correlated and fused near-real-time picture of a battle space, including geo-locational track information on friendly, hostile, and neutral land, sea, and air forces… The goal is to create a general approach to COP interoperability and adaptability while not replacing or recreating existing national systems… (COP for Coalition Forces: A Software Architecture-based Approach, July 2002, ISA & SPAWAR)
2. Features of Fusion Visualization
- Dynamism: Icons/symbols/graphics are NOT pre-determined; they are generated at run time.
- Automation: Visual elements/effects are parameterized to plug in to fusion outputs (e.g., tactical symbols/graphics based on MIL-STD-2525B, Common Warfighting Symbology).
- Human Factors: Goes beyond traditional data/information display to include visual effects for better SA (e.g., blurred symbols, seamless display transitions).
These features not only contribute to the traditional COP at the system/interface level, but also expand its concept to the human-factors/cognitive level.
Our Simulation Environment & Goal
Simulation Environment: Natural/man-made post-disaster situation.
Simulator: The Run-Time Infrastructure (RTI) for the 1994 earthquake in Northridge, CA. (There are several federates in the RTI, such as report generator, data fusion, walk-in casualty, medical facility, dispatcher-router, and visualization.)
The goal of the visualization federate is to develop an effective display and user interface for monitoring, situation awareness (SA), and decision support for command and control.
Visualization Tasks for Low & High Level Fusion
Fusion Levels & Goals -> Visualization Tasks
- Level 1 (Position, Identification): GIS vector map matching, UTM coordinates, raster map; positioning identifying symbols on the map.
- Levels 2 & 3 (Situation Awareness, Threat Assessment): dynamic symbology, viz. of casualty groups, tactical & seamless graphics; showing the pattern and its meaning.
Post-disaster Visualization
Earthquake simulation for the Northridge area, CA.
There are too many cluttered and overlapping icons, making the situation hard to recognize.
Viz. of high level fusion started with a "Symbology" study.
Visualization for High Level Fusion: Situation Awareness (Level 2)
Goal: Display patterns and tactical information.
Related Tasks:
1. Automated Dynamic Symbology
   - Tactical Symbols
   - Tactical Graphics
2. Seamless Transition in Display
   - Transparency Control
   - Application to Casualty Grouping
Early Symbology
► Related Research ('60s ~ early '90s)
Mostly conducted by the U.S. Army Research Institute for the Behavioral and Social Sciences.
Difficult to apply to today's displays with variable symbol features (size, color, visual effects, etc.).
Various types of symbol: (a) tactical symbol, (b) image mapped on 3D cubes, (c) detailed icon, (d) blocked outline for larger-pixel CRTs, (e) filled silhouette, (f) outlined silhouette, and (g) detailed 3D model.
Early Symbology
► References on Symbols
- Department of the Army. Military Symbols. U.S. Army Field Manual FM 21-30. Headquarters, Department of the Army, 1965.
- Michael G. Samet, Ralph E. Geiselman and Betty M. Landee. An experimental evaluation of tactical symbol-design features. Technical Report 498. U.S. Army Research Institute for the Behavioral and Social Sciences, April 1980.
- Beverly G. Knapp. The precedence of global features in the perception of map symbols. Technical Report 803. ARI Field Unit at Fort Huachuca, Arizona, USA, 1988.
- Elizabeth Wheatley. An experiment on coding preference for display symbols. Ergonomics, 20(5), pages 543-552, 1977.
- Philip Bersh, Franklin L. Moses and Richard E. Maisano. Investigation of the strength of association between graphic symbology and military information. Technical Report 324. U.S. Army Research Institute for the Behavioral and Social Sciences, September 1978.
Modern Symbology
► Recent Research
Establishes a parametric method for remote applications (based on an XML-type architecture).
MIL-STD-2525B (1999). Common Warfighting Symbology. Department of Defense Interface Standard. U.S. Department of Defense.
Two types of symbols are suitable for the strategic level:
► Tactical Symbols
► Tactical Graphics
(a) Tactical Symbols (b) Tactical Graphics
Our Dynamic Symbology
► Definition
Dynamic Symbology is a parameterized symbol system in which the values of symbolic entities and graphical properties are generated/calculated at run time.
► Goal: System Automation, Flexibility and Remoteness
No need to maintain a large number of fixed symbol sets.
Dynamic Symbology
► Automation
Tactical symbols and graphics are generated by composing their components.
► Flexibility
Symbol/graphic images are not pre-defined, but generated at run time.
The display can be changed dynamically to fit a user's needs during the simulation.
Tactical Symbol Components | Tactical Graphics Components
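The composition idea above can be sketched in a few lines of Python (the original federate was C++/OpenGL). The class, field names, and color mapping below are illustrative assumptions in the spirit of MIL-STD-2525B's frame/fill/icon decomposition, not taken from the standard itself:

```python
# Minimal sketch: build a draw specification at run time from
# independent symbol components (affiliation fill, icon, modifiers)
# instead of looking up a pre-rendered image. All names here are
# hypothetical, for illustration only.
from dataclasses import dataclass, field

# Hypothetical affiliation -> frame fill mapping
AFFILIATION_FILL = {
    "friendly": "cyan",
    "hostile": "red",
    "neutral": "green",
    "unknown": "yellow",
}

@dataclass
class TacticalSymbol:
    affiliation: str                     # selects frame fill color
    icon: str                            # inner pictogram identifier
    modifiers: list = field(default_factory=list)

    def render_spec(self) -> dict:
        """Compose the components into a renderable specification."""
        return {
            "fill": AFFILIATION_FILL[self.affiliation],
            "icon": self.icon,
            "modifiers": list(self.modifiers),
        }

spec = TacticalSymbol("hostile", "armor", ["reinforced"]).render_spec()
print(spec["fill"])  # red
```

Because the specification is computed rather than stored, any number of affiliation/icon/modifier combinations can be displayed without maintaining a fixed symbol set.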
Dynamic Symbology
► Controlling Visual Effects
Can be used for various visual effects and human factors.
Example: Uncertainty visualization with blurred symbols
► 100 levels of degraded images are generated from a single icon image
► Dynamically generated at run time
► Saves the time of manual image processing
Uncertainty 0% … Uncertainty 50%
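The slides do not say which blur the system used, so the following is only one plausible sketch: generating the graded uncertainty levels from a single icon by repeatedly applying a 3x3 box filter, here in NumPy rather than the original C++/OpenGL:

```python
# Hedged sketch: derive N progressively blurred copies of one icon
# image, so uncertainty level k can be shown by drawing frame k.
# The actual degradation method in the system is not specified.
import numpy as np

def blur_once(img: np.ndarray) -> np.ndarray:
    """One pass of a 3x3 box filter (edges handled by padding)."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    return sum(padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0

def blur_levels(icon: np.ndarray, levels: int = 100) -> list:
    """Level 0 is the original icon (0% uncertainty); each later
    level is slightly more degraded than the one before it."""
    frames = [icon.astype(float)]
    for _ in range(levels - 1):
        frames.append(blur_once(frames[-1]))
    return frames
```

Generating the frames once at startup (or lazily at run time) replaces the manual image processing that a fixed symbol library would require.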
Video Clips for Dynamic Icons (Casualties and Ambulances)
Tactical Graphics
► Tactical Graphics for Fusion Visualization
"Tactical graphics often produce better results in situation understanding, especially for experienced users." (Feibush, Gagvani & Williams. Visualization for situation awareness. IEEE Computer Graphics and Applications, pp. 38-45, September/October 2000)
Tactical graphics are useful for visualization of higher level fusion, since they provide a strategic view.
Automation of tactical graphics can improve situation awareness.
Can be applied to visualization of a large number of casualties.
Tactical Graphics
► Problem in Visualization
Current visualization shows the location & identification of each casualty from RTI reports.
However, with a large number of similar icons, the display is overloaded with cluttered and overlapping icons.
Tactical Graphics
► Proposed Solution
Grouping can be an efficient tactical graphic:
Identification (Lower Fusion Level) -> Seamless Transition -> Situation Awareness (Higher Fusion Level)
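The slides do not specify how the casualty groups are formed, so the sketch below shows one simple possibility: merging report positions into groups whenever they are linked by a chain of neighbors within a fixed radius (single-linkage clustering), in Python for brevity:

```python
# Hedged sketch: distance-based grouping of casualty positions.
# The radius and the clustering rule are illustrative assumptions,
# not the system's documented method.
from math import dist

def group_casualties(points, radius):
    """Two points end up in the same group if a chain of neighbors,
    each closer than `radius`, connects them."""
    groups = []
    for p in points:
        # Find every existing group that p is close to...
        hits = [g for g in groups if any(dist(p, q) < radius for q in g)]
        merged = [p]
        # ...and merge them all into one group containing p.
        for g in hits:
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups

pts = [(0, 0), (1, 0), (10, 10), (10.5, 10)]
print(len(group_casualties(pts, radius=2.0)))  # 2
```

Each resulting group can then be drawn as a single outlined region over the raster map instead of dozens of overlapping icons.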
Seamless Transition
► Why Seamless Transition?
- Too much detail or high realism can distract the user
- Retains focus on objects of interest
- Saves time when searching across different displays
- Alleviates work stress
Common Display: Level 0 & 1 (Position, Identification) <-> Level 2 (Situation Awareness)
Seamless Transition
► Transparency (Opacity) Control
Transparency blends two or more states.
The level of blending can be obtained from interpolation curves, such as Bernstein blending curves or sinusoidal curves, whose weights always sum to 1.0.
[Figure: opacity axis from None (0) to 1.0; display ranges from Abstractness (Black & White) to Realism (Color)]
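The two curve families mentioned above can be written out directly; a minimal Python sketch (function names are illustrative):

```python
# Blending weights for seamless display transitions. In both cases
# the weights sum to 1.0 for every transition parameter t in [0, 1],
# which is the property the slide requires.
from math import comb, cos, pi

def bernstein_weights(t: float, n: int) -> list:
    """Degree-n Bernstein weights: n+1 values that sum to 1.0 by the
    binomial theorem, suitable for blending n+1 display states."""
    return [comb(n, k) * t**k * (1 - t)**(n - k) for k in range(n + 1)]

def sinusoidal_blend(a: float, b: float, t: float) -> float:
    """Two-state blend along a cosine ease; the two weights
    w and (1 - w) always sum to 1.0."""
    w = 0.5 * (1.0 + cos(pi * t))   # weight of state a: 1 at t=0, 0 at t=1
    return w * a + (1.0 - w) * b
```

Feeding the opacity of each display layer through such a curve as t advances produces the smooth fade between the realistic raster map and the abstract analysis view.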
Seamless Transition
► Transition from Raster Map to Analysis Space
Visual attributes (color, B&W, transparency) are manipulated for a seamless transition from realism/position (Level 0 & 1) to abstractness/pattern (Level 2 & 3).
Visualization of Casualties
► Features
- C++ & OpenGL program in a 3D environment
- Raster map made from aerial photos of the Northridge region
- Uses casualty data from RTI reports
- Groups are sorted by three severity levels (1, 2 & 3)
- Outlined groups are shown over the raster map
- GUI for seamless transition
Visualization of Casualty Groups
Raster map + casualty groups, with casualties shown at three severity levels (1, 2 and 3).
Visualization of Casualty Groups More opacity added
Visualization of Casualty Groups Removed the realism of raster map
Visualization of Casualty Groups Level 3 groups (in red) added The overlapped regions in purple can be regarded as dense areas.
Visualization of Casualty Groups Added the context of raster map
Future Work
► Continue Situation Awareness Development: casualty groups
► Real Data from Fusion Federate: interface design for fast identification; visualization of statistical data
► 3D Urban City Modeling and VR Simulation of Earthquake Impact: fusion representation; damage and fire simulation
► Time-aggregated data (statistical information)
► Performance evaluation of the Viz. & UI system
Acknowledgement
This work was supported by the AFOSR under award F49620-01-1-0371. The authors gratefully acknowledge valuable input from Dr. Peter D. Scott, Dr. James Llinas and Dr. Ann M. Bisantz.
Contact Dr. T. Kesavadas, or visit us online: Virtual Reality Laboratory, www.vrlab.buffalo.edu