Military Simulation Case Study

Presentation transcript:

1 Military Simulation Case Study
Victor Jimenez, Sr. Engineer, Northrop Grumman

2 Overview
Military Simulation Architecture
Distributed Training Problems
Implemented Solution
Towards a Common Operational Picture

3 Simulation Architecture
Different types of simulations: Live, Constructive, Virtual
We will limit ourselves to Virtual Simulations

4 Virtual Simulations
Place the warfighter into the midst of the action
Also called man-in-the-loop simulation
May or may not control the path through the world
Has a first-person perspective (sort of)
Flight sims are virtual

5 Flight Simulation Architecture
The ownship is a distributed application
Physics, haptics, weather, and combat resolution may run on separate computers
Graphics is usually a cluster of computers
May have a motion platform
Computers within an ownship usually talk a private protocol over a separate network
External platforms are usually reached with a standard protocol on an external network

6 Standard Protocols
Distributed Interactive Simulation (DIS)
Broadcasts the entire state of an entity
Has a different packet (PDU) for each action
IEEE 1278.1a is the latest version
High Level Architecture (HLA)
Sends only an update of what changed
Keeps Objects and Interactions separate
IEEE 1516 is the latest version
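To make the DIS half of this concrete, here is a minimal sketch of the full-state broadcast idea. The packet layout and port are drastically simplified assumptions; the real Entity State PDU defined by IEEE 1278.1 carries far more fields (dead reckoning, markings, articulation, and so on).

```python
# Minimal sketch of the DIS idea: broadcast an entity's full state every tick.
# The field layout below is a simplified stand-in, not the real IEEE 1278.1 PDU.
import socket
import struct

DIS_PORT = 3000  # assumption: an exercise port; real exercises configure this per site

def send_entity_state(sock, entity_id, x, y, z, vx, vy, vz):
    # One self-describing packet per entity per tick; receivers need no prior state.
    pdu = struct.pack("!Hdddddd", entity_id, x, y, z, vx, vy, vz)
    sock.sendto(pdu, ("<broadcast>", DIS_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
send_entity_state(sock, entity_id=42, x=1200.0, y=3400.0, z=900.0, vx=150.0, vy=0.0, vz=-2.0)
```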

7 Distributed Training Problems
Has to be fast
Has to be reliable
Has to work in a WAN configuration
Has to be AFFORDABLE! In terms of bandwidth and in cost to implement
Problems with protocols
Problems with the conception of the battlespace

8 Protocol Issues - DIS
DIS broadcasts – can't send over a long-haul WAN unless some type of bridge is set up
A bridge is very hard to set up for more than two sites
Additional bandwidth costs
Hard to manage what is sent
Usually, you just didn't want the data anyway
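A minimal sketch of the kind of bridge alluded to above, assuming placeholder peer addresses and port: every PDU heard on the local LAN is re-sent as unicast to each remote site. Loop suppression and filtering are omitted, which is part of why this becomes unmanageable beyond two sites.

```python
# Sketch of a DIS broadcast-to-unicast bridge (illustrative, not the fielded software).
import socket

DIS_PORT = 3000                                            # assumption: exercise port
REMOTE_SITES = [("10.1.1.1", 3000), ("10.2.2.2", 3000)]    # assumption: peer bridge addresses

recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
recv_sock.bind(("", DIS_PORT))                             # hears the LAN broadcast traffic

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    pdu, _addr = recv_sock.recvfrom(2048)
    # Every PDU goes to every remote site whether they want it or not -- the
    # bandwidth and "you didn't want the data anyway" problems noted above.
    for site in REMOTE_SITES:
        send_sock.sendto(pdu, site)
```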

9 Protocol Issues - HLA
Assumes that multicast groups are free
ATM doesn't support multicast
HLA is an interface
IEEE 1516 support is just starting
No IIOP specified
Data sent is defined on the fly – doesn't necessarily match what the other site uses

10 Common Battlespace Issues
Your training mission defines your battlespace
Differences in fidelity crop up
Differences in what you implement appear
Differences in how you use the protocol
IMPEDANCE MISMATCH OCCURS!

11 AHA!
Even though the simulators are built distributed, THEY DON'T PLAY WITH EACH OTHER!

12 Integrate vs Interoperate
Integrate means that things are reworked to create one common way to do things
Everybody becomes "the same"
Interoperate means that disparate systems can talk to each other – with limitations
Everybody stays "the same"

13 The Path Less Followed
We decided to interoperate
Couldn't change other companies' simulations
Didn't want to change, since this would lead to continuously changing sims
Didn't make sense, since each has a different battlespace
We wanted to isolate each site from the others
Oh, the horror!

14 Implemented Solution
Created an ATM cloud
Tie individual sites into the cloud with an appropriate edge device
Created an architectural device to handle the impedance problems
Customized software
Runs on 'normal' computers under W2K
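A minimal sketch of what that architectural device does at its core, using illustrative names rather than the project's actual software: everything a site emits is translated into the shared representation, and filtered, before it reaches the cloud.

```python
# Sketch of the gateway role: translate and filter between a site and the cloud.
class Gateway:
    def __init__(self, translate, send_to_wan):
        self.translate = translate      # site-local format -> common representation
        self.send_to_wan = send_to_wan  # delivery onto the ATM cloud

    def on_local_packet(self, packet):
        common = self.translate(packet)
        if common is not None:          # drop data the other sites don't need
            self.send_to_wan(common)
```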

15 Standards, Standards, Standards
Defined standards for:
Network services and setup
Protocol representation
Object representation
Certify the different sites as to their level of compatibility
Sites work with other sites, with some limits

16 Object Model TIMs
Had a series of Technical Information Meetings (TIMs) to discover what is important to each site
Figured out the mapping from local representations to global objects
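A minimal sketch of what such a local-to-global mapping looks like; the platform names, field names, and enumeration values here are hypothetical, standing in for the mapping tables the TIMs actually produced.

```python
# Hypothetical local-to-global object mapping (illustrative values only).
GLOBAL_ENTITY_TYPES = {"F16C": 1, "F15E": 2, "SA-10": 3}

def local_to_global(local_track):
    # Each site keeps its own representation; only the gateway knows this mapping.
    return {
        "entity_type": GLOBAL_ENTITY_TYPES[local_track["platform_name"]],
        "position_geocentric": local_track["pos_xyz_m"],
        "velocity": local_track["vel_xyz_mps"],
    }
```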

17 Lost in the Translations
We realized we needed to create "never-fail" software
The key is test, test, and test some more
Sequence of testing steps:
Unit tests
Factory acceptance
Scenario tests with CGFs
Integrate on site

18 Unit Tests
Just for the particular software unit – written first!
Created by the programmer to test:
Normal running conditions
Boundary conditions
Tricky calculations
Whatever else fails later on
Automated, runs before check-in
Requires discipline – lots of it
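A minimal sketch of a programmer-written unit test of the kind described, exercising the hypothetical local_to_global mapping from the earlier sketch (the module name is an assumption); it covers one normal case and one boundary case.

```python
# Sketch of a unit test for the hypothetical mapping function.
import unittest

from object_mapping import local_to_global  # assumption: the earlier hypothetical mapping

class LocalToGlobalTests(unittest.TestCase):
    def test_known_platform_maps_to_global_type(self):
        # Normal running condition: a platform every site agreed on.
        track = {"platform_name": "F16C",
                 "pos_xyz_m": (1.0, 2.0, 3.0),
                 "vel_xyz_mps": (100.0, 0.0, 0.0)}
        self.assertEqual(local_to_global(track)["entity_type"], 1)

    def test_unknown_platform_is_rejected(self):
        # Boundary condition: a platform outside the agreed object model.
        track = {"platform_name": "NOT-A-PLATFORM",
                 "pos_xyz_m": (0.0, 0.0, 0.0),
                 "vel_xyz_mps": (0.0, 0.0, 0.0)}
        with self.assertRaises(KeyError):
            local_to_global(track)

if __name__ == "__main__":
    unittest.main()  # automated: run before every check-in
```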

19 Factory Acceptance Testing
Test cases created by the test team
Inputs and outputs defined by requirements
Programming team involved only to create tools
Uses carefully crafted data
Runs every night in an automated fashion
Immediate feedback to the programmers
Not necessarily representative – (oops!)
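A minimal sketch of how such a nightly, data-driven acceptance run can be structured; the directory layout, file naming, and run_gateway_on() hook are illustrative placeholders, not the project's actual tooling.

```python
# Sketch of a nightly acceptance harness: compare actual output to the test
# team's expected output for each crafted data case.
import json
from pathlib import Path

def run_acceptance_suite(case_dir, run_gateway_on):
    failures = []
    for case in sorted(Path(case_dir).glob("*.input.json")):
        expected_path = Path(str(case).replace(".input.json", ".expected.json"))
        expected = json.loads(expected_path.read_text())
        actual = run_gateway_on(json.loads(case.read_text()))
        if actual != expected:
            failures.append(case.name)
    return failures  # reported back to the programmers the next morning
```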

20 Scenario Testing
Needed something to cover the data gap between the CGF from the MTC and other standard CGFs
Run manually by the test team
Slowly being automated; requires deep knowledge and understanding
Longer cycle (2 weeks) to run

21 On-Site Integration
Install lines, equipment, software
Testing on the site, using actual equipment
Typical small scenario
Surprises still occur!
Mini development cycle – we can change things with impunity

22 Towards a C.O.P.
Common Operational Picture
The subset of the battlespace that everybody understands well enough to work with
Standards continue to evolve:
Protocols
Capabilities
New simulators
Updated simulators
New training objectives

23 Change Review Board
Found out we need this to manage complexity
Too many changes occurring due to different development schedules
Need to coordinate the rollout of capabilities and needs

24 Conclusion
Networks are message-passing architectures
Provide abstraction
Contracts between participants are formalized in standards and models
Not perfect, but "close enough"
Very useful to have something in the middle:
Manage bandwidth
Prioritize data
Translate to/from the COP
Hide network details from users
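A minimal sketch of one way the "something in the middle" can manage bandwidth and prioritize data; the priority classes and byte budget are illustrative assumptions, not the fielded design.

```python
# Sketch of priority-based bandwidth management in the gateway.
import heapq

PRIORITY = {"weapon_fire": 0, "entity_state": 1, "environment": 2}  # lower value = sent first

def drain_within_budget(updates, byte_budget):
    """Pick the updates to send this tick, highest priority first, within a byte budget."""
    heap = [(PRIORITY[kind], i, payload) for i, (kind, payload) in enumerate(updates)]
    heapq.heapify(heap)
    sent, used = [], 0
    while heap:
        _, _, payload = heapq.heappop(heap)
        if used + len(payload) > byte_budget:
            break  # budget exhausted; lower-priority traffic is shed first
        sent.append(payload)
        used += len(payload)
    return sent
```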

