DDDAMS-based Surveillance and Crowd Control via UAVs and UGVs


DDDAMS-based Surveillance and Crowd Control via UAVs and UGVs
Sponsor: Air Force Office of Scientific Research, under FA9550-12-1-0238
Program Manager: Dr. Frederica Darema
PIs: Young-Jun Son1, Jian Liu1, Jyh-Ming Lien2
Students: A. Khaleghi1, D. Xu1, Z. Wang1, M. Li1, and C. Vo2
1Systems and Industrial Engineering, University of Arizona
2Computer Science, George Mason University
PI Contact: son@sie.arizona.edu; 1-520-626-9530
ICCS 2013, June 7, 2013

Motivating Problem: TUCSON-1 (Border Patrol)
A 23-mile-long area surrounding the Sasabe port of entry in Arizona.
Major components: sensor towers, communication towers, ground sensors, UAVs (Arducopter; Parrot AR.Drone), UGVs (sonar, laser), crowds, terrain (NASA).
Eagle Vision Ev3000-D-IR capabilities (up to):
- IR laser illuminator: 3 km
- Wireless link: 50 km
- IR thermal camera range: 20 km
- Uncooled thermal camera range: 4 km
Trident Sentry node capabilities:
- Life span: 30-45 days
- Radio frequency range: 300 m
- Personal detection and tracking range: 30-50 m

Overview of Project
Problem: a highly complex, uncertain, dynamically changing border patrol environment.
Goal: create robust, multi-scale, and effective urban surveillance and crowd control strategies using UAVs/UGVs.
Modules: detection, tracking, motion planning.
Approach: a comprehensive planning and control framework based on DDDAMS, involving an integrated simulation environment for hardware, software, and human components.
Key questions: what data to use, when to use it, and how to use it [1].
Framework diagram components: surveillance and crowd control; surveillance region selection; vehicle assignment and searching; target detection; target tracking; vehicle motion planning (formation control, swarm coordination); DDDAMS.

Proposed DDDAMS Framework

Fidelity Representation via Crowd Dynamics
- Low fidelity (crowd view by UAV): coarser visibility cells; view of dense crowd regions; low-density regions are invisible to the UAV, high-density regions are visible; output: next waypoints of UAVs
- High fidelity (crowd view by UGV): finer visibility cells; view of individuals; representation of crowd individuals; output: next waypoints of UGVs
- Hybrid fidelity: combines both for control command generation for UAV/UGV motion planning

Framework of Proposed Methodology

UAV/UGV for Crowd Detection
Detection algorithms (using OpenCV, an open-source computer vision library):
- Human detection (upright humans): Histogram of Oriented Gradients (HOG) [5]
- Motion detection: background subtraction algorithms [6]
Sensors on the Parrot AR.Drone 2.0:
- Front camera: CMOS sensor with a 90-degree lens
- Ultrasound sensor: emission frequency 40 kHz, range 6 m
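To make the HOG idea above concrete, the following is a minimal numpy sketch of a histogram of gradient orientations over one image cell, weighted by gradient magnitude. It is a conceptual illustration only, not OpenCV's full HOG person detector; the cell size and bin count are illustrative.

```python
# Conceptual HOG building block: orientation histogram of one
# grayscale cell, weighted by gradient magnitude (illustrative only).
import numpy as np

def hog_cell_histogram(cell, n_bins=9):
    """Orientation histogram over [0, 180) degrees for one cell."""
    gy, gx = np.gradient(cell.astype(float))   # image gradients
    mag = np.hypot(gx, gy)                     # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), mag.ravel())  # magnitude-weighted votes
    return hist
```

A full detector concatenates many such cell histograms into a descriptor and feeds it to a trained SVM, as in OpenCV's people detector.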

Framework of Proposed Methodology

Crowd Tracking Module: crowd dynamics identification and crowd motion prediction
- Low resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: aggregation model -> prediction; UGV (with observation error): state space model -> filtering
- High resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: state space model -> filtering

UAV with Low Resolution
- C_{i,j}(t): unit cell at the i-th row and j-th column at time t; visibility state 0 (invisible) or 1 (visible)
- Neighborhood state: S_{i,j}(t) = [C_{i-1,j-1}(t), C_{i-1,j}(t), C_{i-1,j+1}(t), C_{i,j-1}(t), C_{i,j}(t), C_{i,j+1}(t), C_{i+1,j-1}(t), C_{i+1,j}(t), C_{i+1,j+1}(t)]^T
- Example (crowd view by UAV at time t, cells C_{2,3}(t) through C_{4,5}(t)): C_{2,4}(t) = 1 (visible) and C_{3,4}(t) = 0 (invisible), so S_{3,4}(t) = [0,1,1,0,0,0,0,0,0]^T

UAV with Low Resolution – Visibility Prediction
- Let p_s(i,j) = Pr(C_{i,j}(t) = 1 | S_{i,j}(t-1) = s), with s ranging from [0,0,0,0,0,0,0,0,0]^T to [1,1,1,1,1,1,1,1,1]^T
- Estimate p_s(i,j), then predict the visibility of C_{i,j}(t+1): predict C_{i,j}(t+1) = 1 if p_s(i,j) > 0.5, and C_{i,j}(t+1) = 0 if p_s(i,j) < 0.5
- Example: from the observation at time t, p_s(3,4) = Pr(C_{3,4}(t+1) = 1 | S_{3,4}(t) = s), where s = [0,1,1,0,0,0,0,0,0]^T
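The neighborhood encoding and the 0.5-threshold prediction rule above can be sketched as follows; the grid layout and helper names are illustrative, and p_s is passed in as a lookup table of estimated probabilities.

```python
# Sketch of the neighborhood-based visibility prediction rule:
# 0 = invisible, 1 = visible, threshold 0.5 as on the slide.
def neighborhood_state(grid, i, j):
    """S_{i,j}(t): the 3x3 neighborhood of cell (i, j), row-major."""
    return tuple(grid[r][c] for r in (i - 1, i, i + 1)
                            for c in (j - 1, j, j + 1))

def predict_visibility(grid, i, j, p_s):
    """Predict C_{i,j}(t+1): 1 iff p_s(s) > 0.5 for the current state s."""
    s = neighborhood_state(grid, i, j)
    return 1 if p_s[s] > 0.5 else 0
```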

Crowd Prediction by UAV
- Bayesian estimation of p_s(i,j); example: p_s(3,4), s = [0,1,1,0,0,0,0,0,0]^T
- Beta prior: encodes prior knowledge of visibility
- Data: over the N time steps at which the neighborhood was observed in state s, count how often the cell was visible vs. invisible one step later

Crowd Prediction by UAV (Cont'd)
- Posterior: the updated belief of cell (3, 4) being visible, combining the UAV's prior with the UAV's data
- Bayesian estimator: the expected (posterior mean) probability for cell (3, 4) to be visible, used for the prediction at t+2
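The Beta-prior update above is the standard Beta-Bernoulli conjugate update; a minimal sketch follows. The hyperparameters a and b and the counts are illustrative, since the slides give no numeric values.

```python
# Beta-Bernoulli update for a cell's visibility probability p_s(i, j):
# prior Beta(a, b), data = y visible outcomes out of N observations of
# neighborhood state s, posterior Beta(a + y, b + (N - y)).
def posterior_mean_visibility(a, b, n_visible, n_total):
    """Posterior mean of the visibility probability."""
    return (a + n_visible) / (a + b + n_total)

# Example: uniform prior Beta(1, 1), cell visible 7 of 10 times.
p = posterior_mean_visibility(1, 1, 7, 10)
predict_visible = p > 0.5
```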

Crowd Prediction by UAV with UGVs' Information Aggregation
- Individual dynamics captured by the UGVs yield the UGVs' prediction at t+2
- At t+1, the UAV's prediction at t+2 and the UGVs' prediction at t+2 are combined into a single prediction

Crowd Prediction by UAV with UGVs' Information Aggregation (Cont'd)
- With aggregated information, the prediction combines the UGVs' information (the individual dynamics predicted by the UGVs, from crowd dynamics identification) with the UAV's information
- Balance weight: close to one if the UGVs' prediction is accurate, close to zero if it is inaccurate

Crowd Dynamics Identification - Model
State space model for linear dynamic systems (written here in the standard linear-Gaussian form the slide describes):
- State equation: x_t = A x_{t-1} + w_t
- x_t: stack of the N_I individual state vectors (location and speed); w_t: Gaussian state noise
- A: state transition matrix (interaction among individuals)
- The observation is a stack of the N_I observation vectors, with Gaussian measurement noise

Crowd Dynamics Identification – Model (Cont'd)
- Observation equation: y_t = H x_t + v_t, where H is the observation matrix (linking state with observation) and v_t is the measurement noise
- The state equation additionally includes an environmental-effect term (a function of environmental factors) and a UAV/UGV-effect term (a function of the UAV/UGV states); the model can be evaluated with or without the UAV/UGV effect

Tracking Algorithms
Objective: identify the crowd motion dynamics and predict crowd location.
State-of-the-art methods:
- Kalman filter [7]: linear dynamic systems, Gaussian noise, continuous state space; advantage: exact solution; limitation: normality and linearity assumptions
- Extended Kalman filter [8]: first-order Taylor-series linearization; advantage: handles nonlinear systems; limitation: computational complexity
- Unscented Kalman filter [9]: sampling-based linearization; advantage: handles nonlinear systems; limitation: uni-modal assumption

Tracking Algorithms (Cont'd)
State-of-the-art methods (continued):
- Switching Kalman filter [10]: dynamic switching; advantage: handles nonlinear systems; limitation: computationally intensive, approximate solutions
- Grid-based filter [11]: discrete state space; advantage: no linearity/normality assumption; limitation: computationally intensive
- Particle filter [12]: Monte Carlo estimation; advantage: no linearity/normality assumption; limitation: computationally intensive
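The state-space filtering rows above reduce, in the linear-Gaussian case, to one Kalman predict/update cycle; a minimal numpy sketch follows. The matrices A, H, Q, R are the generic textbook quantities, not the project's exact crowd model.

```python
# One Kalman filter predict/update cycle for a linear-Gaussian
# state space model x_t = A x_{t-1} + w, z_t = H x_t + v.
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """Return the filtered state and covariance after measurement z."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

A constant-velocity tracker, for instance, would use A = [[1, dt], [0, 1]] and H = [[1, 0]] to filter noisy position measurements.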

Crowd Dynamics Identification for UGV
- Assumption: negligible measurement noise in the state space model
- Under this assumption, the state space model is equivalent to an autoregressive model with exogenous input (ARX model [13])

Crowd Dynamics Identification for UGV (Cont'd)
ARX model fitting:
- Data: the most recent observations (observed locations up to time t) for each detected individual
- Estimation method: ordinary least squares
- Triggered when a cell is predicted to become dense
- Output: the predicted location at time t+1, used to update the UAV's prior
- Benefit: effective, with lower computational cost in identification
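As a sketch of the least-squares fit above, the following fits a first-order autoregression x_{t+1} = a x_t + c to an individual's recent positions and predicts the next location. The model order and names are illustrative; the project's ARX model also has exogenous inputs, omitted here.

```python
# OLS fit of a first-order AR model to one coordinate of an
# individual's trajectory, then one-step-ahead prediction.
import numpy as np

def fit_ar1(positions):
    """Least-squares fit of x_{t+1} = a * x_t + c; returns (a, c)."""
    x = np.asarray(positions, dtype=float)
    X = np.column_stack([x[:-1], np.ones(len(x) - 1)])  # regressors [x_t, 1]
    a, c = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    return a, c

def predict_next(positions):
    """Predicted location at time t+1 from observations up to t."""
    a, c = fit_ar1(positions)
    return a * positions[-1] + c
```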

Framework of Proposed Methodology

Motion Planning Module - Algorithm
- Find the optimal path, given a map, a start location, and a destination
- A grid is constructed from the physical obstacles in the environment for vehicle movements: 0 for empty cells, 1 for occupied cells
- The A* algorithm with 8-point connectivity is used, considering two objectives each for the UAV (f_1^A, f_2^A) and the UGV (f_1^G, f_2^G)
Graph search algorithms:
- Dijkstra: exact cost from the start point to any vertex n; high complexity; not fast enough
- Best-first search: estimated cost from vertex n to the end point; low complexity; not optimal
- A*: total cost from the start point to the end point; low complexity; guarantees the shortest path in a reasonable time
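The grid search described above can be sketched as A* on an occupancy grid with 8-point connectivity and a Euclidean heuristic. This is a generic sketch with unit-cost cells, not the project's terrain-aware objective functions.

```python
# A* path search on an occupancy grid (0 = free, 1 = occupied)
# with 8-point connectivity and Euclidean distance heuristic.
import heapq, math

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    moves = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
             if (dr, dc) != (0, 0)]
    def h(p):  # admissible Euclidean heuristic
        return math.hypot(p[0] - goal[0], p[1] - goal[1])
    open_set = [(h(start), start)]
    g_best = {start: 0.0}
    parent = {start: None}
    closed = set()
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur in closed:
            continue
        closed.add(cur)
        if cur == goal:  # reconstruct path
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in moves:
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]]:
                continue  # occupied cell
            ng = g_best[cur] + math.hypot(dr, dc)  # diagonal steps cost sqrt(2)
            if ng < g_best.get(nxt, float("inf")):
                g_best[nxt] = ng
                parent[nxt] = cur
                heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None
```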

Motion Planning Module - Objectives
Objective 1: minimize the Euclidean traveling distance of UAVs and UGVs.
Objective 2: minimize energy consumption, considering the UAV's above-ground-level (AGL) change and the UGV's downhill/uphill movement.

f_1^A = \sum_{t=t_0}^{T-1} [ (x^A_{t+1} - x^A_t)^2 + (y^A_{t+1} - y^A_t)^2 + (z^A_{t+1} - z^A_t)^2 ]^{0.5}

f_1^G = \sum_{t=t_0}^{T-1} [ (x^G_{t+1} - x^G_t)^2 + (y^G_{t+1} - y^G_t)^2 + (elev(x^G_{t+1}, y^G_{t+1}) - elev(x^G_t, y^G_t))^2 ]^{0.5}

f_2^A = \sum_{t=t_0}^{T-1} elevPenalty(angle) * [ (x^A_{t+1} - x^A_t)^2 + (y^A_{t+1} - y^A_t)^2 + (z^A_{t+1} - z^A_t)^2 ]^{0.5}

f_2^G = \sum_{t=t_0}^{T-1} elevPenalty(angle) * [ (x^G_{t+1} - x^G_t)^2 + (y^G_{t+1} - y^G_t)^2 + (elev(x^G_{t+1}, y^G_{t+1}) - elev(x^G_t, y^G_t))^2 ]^{0.5}

where:
- [x^A_t, y^A_t, z^A_t]: Cartesian coordinates of the UAV at time t
- [x^G_t, y^G_t]: Cartesian coordinates of the UGV at time t
- elev(x^G_t, y^G_t): terrain elevation at time t
- angle = arctan( (elev(t+1) - elev(t)) / [ (x^G_{t+1} - x^G_t)^2 + (y^G_{t+1} - y^G_t)^2 ]^{0.5} )
- elevPenalty(angle): slope-angle penalty for UGV and UAV (the slide's table of penalty values by angle is not recoverable from the transcript)
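As a sketch of the UGV distance objective f_1^G above, the following sums 3-D Euclidean steps along a waypoint path using a terrain elevation lookup; the elev callable is a placeholder for the project's terrain data.

```python
# Sketch of f_1^G: summed 3-D Euclidean step lengths for a UGV path,
# with the vertical component taken from terrain elevation.
import math

def ugv_path_length(waypoints, elev):
    """f_1^G for a path of (x, y) waypoints; elev(x, y) gives height."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dz = elev(x1, y1) - elev(x0, y0)
        total += math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + dz ** 2)
    return total
```

The energy objective f_2^G would multiply each step by elevPenalty(angle) before summing.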

Control Strategies
Multi-objective optimization of the evaluation function of candidate paths, based on the chosen control strategy:
- minimize travel time
- minimize energy consumption
- minimize a linear combination of both objectives: min alpha_1 \bar{f}_1 + alpha_2 \bar{f}_2
Weighting method for the multi-objective motion planning problem:
- Normalize each objective f_k(x) to \bar{f}_k(x), using inf f_k(x) = m_k and sup f_k(x) = M_k
- Compute the convex combination of the normalized objectives using weights alpha_k
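The weighting method above can be sketched as follows, assuming the standard min-max normalization \bar{f}_k(x) = (f_k(x) - m_k) / (M_k - m_k) and weights that sum to one; the function names are illustrative.

```python
# Sketch of the weighting method: normalize each objective with its
# bounds (m_k, M_k), then take a convex combination with weights alpha_k.
def normalize(f, m, M):
    """Min-max normalization of one objective value to [0, 1]."""
    return (f - m) / (M - m)

def evaluate_path(objectives, bounds, alphas):
    """Scalar evaluation of one candidate path from its objective values."""
    return sum(a * normalize(f, m, M)
               for a, f, (m, M) in zip(alphas, objectives, bounds))
```

Setting the weights to emphasize f_1 or f_2 recovers the travel-time-only and energy-only control strategies.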

Assembled Arduino-based UAV (Arducopter)
Specifications:
- Additional payload: up to 1200 g
- Flying time: up to 30 minutes (with one 5000 mAh, 11.1 V lithium polymer battery)
- Radio range: approx. 1 mile (more with an amplifier)
- Air data rates: up to 250 kbps
- Radio frequency: 915 MHz (US), 433 MHz (Europe)
A radio control training simulator (AeroSIM®) was used for practice, to minimize the risk of damaging the real hardware. The flight test path is shown on a Google map.

Agent-based Hardware-in-the-loop Simulation (another simulator)

Finite State Automata based UAV/UGV Execution

Crowd tracking module (fidelity overview):
- Low resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: aggregation model -> prediction; UGV (with observation error): state space model -> filtering
- High resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: state space model -> filtering

Experiment (1): effect of detection range on crowd coverage performance
Setting:
- A crowd of 40 individuals
- 1 UAV and 1 UGV, with the same detection range
- Fixed tracking horizon and motion planning frequency
Result: as the detection range increases, crowd coverage performance improves significantly, and the coverage percentage becomes smoother over time.

Experiment (2): effect of the number of grid cells on crowd coverage performance (e.g., GPS resolution)
Setting:
- 1 UAV and 1 UGV; detection range of 15 m for both
- A crowd of 40 individuals
Result: performance fluctuates considerably with a 20x20 grid; as the number of grid cells increases, performance improves significantly and the fluctuation diminishes over time.

Experiment (3): effect of fidelity level on crowd coverage performance and computational resource usage (e.g., CPU usage)
Setting:
- 2 UAVs and 2 UGVs
- A crowd of 80 individuals, separated into two groups
Result: high simulation fidelity improves system performance but consumes more computational power; low fidelity reduces performance.

Crowd tracking module (fidelity overview):
- Low resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: aggregation model -> prediction; UGV (with observation error): state space model -> filtering
- High resolution: UGV (negligible observation error): autoregressive model -> forecasting; UAV: state space model -> filtering

Numerical Simulation Study
Simulation setting:
- 50 individuals with changing dynamics (e.g., changing speed); the dynamics change over three phases, at times 1-4, 5-6, and 7-15
- One UAV (monitors the whole crowd) and two UGVs (track two subgroups of individuals), each with its own view range
- UGV tracking: AR model prediction
- UAV tracking: the proposed Bayesian updating approach

Crowd Tracking Outcome

Crowd Tracking Outcome

Summary and Ongoing Work
Summary:
- A DDDAMS-based planning and control framework for surveillance and crowd control via UAVs and UGVs
- A series of algorithms for crowd tracking
- Motion planning algorithms
- An integrated agent-based simulation platform with UAVs/UGVs
Ongoing work:
- Address different control architectures (hierarchical, heterarchical, hybrid), with coordinators such as a ground controller and a UAV team leader
- Further develop vision-based crowd detection
- Enhance crowd motion dynamics via extended BDI
- Enhance crowd motion dynamics identification and crowd motion prediction (e.g., multiple visibility levels)
- Expand system automation testing

QUESTIONS
Young-Jun Son; son@sie.arizona.edu
http://www.sie.arizona.edu/faculty/son/index.html
1-520-626-9530

References – (1)
[1] Darema, F., "Dynamic Data Driven Applications Systems: A New Paradigm for Application Simulations and Measurements," International Conference on Computational Science, 2004, pp. 662-669.
[2] Hays, R., Singer, M., Simulation Fidelity in Training System Design: Bridging the Gap Between Reality and Training, Springer-Verlag, 1989.
[3] Celik, N., Lee, S., Vasudevan, K., Son, Y., "DDDAS-based Multi-fidelity Simulation Framework for Supply Chain Systems," IIE Transactions, 2010, 42(5), 325-341.
[4] Celik, N., Son, Y.-J., "Sequential Monte Carlo-based Fidelity Selection in Dynamic-data-driven Adaptive Multi-scale Simulations," International Journal of Production Research, 2012, 50, 843-865.
[5] Dalal, N., Triggs, B., "Histograms of Oriented Gradients for Human Detection," Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), 2005, 886-893.
[6] Sheikh, Y., Javed, O., Kanade, T., "Background Subtraction for Freely Moving Cameras," Proceedings of the IEEE 12th International Conference on Computer Vision, 2009, 1219-1225.
[7] Welch, G., Bishop, G., "An Introduction to the Kalman Filter," 1995.
[8] Einicke, G.A., White, L.B., "Robust Extended Kalman Filtering," IEEE Transactions on Signal Processing, 1999, 47(9), 2596-2599.

References – (2)
[9] Julier, S.J., Uhlmann, J.K., "Unscented Filtering and Nonlinear Estimation," Proceedings of the IEEE, 2004, 92(3), 401-422.
[10] Pavlovic, V., Rehg, J.M., MacCormick, J., "Learning Switching Linear Models of Human Motion," Advances in Neural Information Processing Systems, 2001, 981-987.
[11] Silbert, M., Mazzuchi, T., Sarkani, S., "Comparison of a Grid-based Filter to a Kalman Filter for the State Estimation of a Maneuvering Target," SPIE Conference Series, 2011.
[12] Ristic, B., Arulampalam, S., Gordon, N., Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House, 2004.
[13] Lütkepohl, H., New Introduction to Multiple Time Series Analysis, Springer, 2006.
[14] Patel, A., "Amit's Thoughts on Path-finding: Introduction to A*," 2003, accessed June 2, 2013, http://theory.stanford.edu/~amitp/GameProgramming/AStarComparison.html.
[15] Sinnott, R.W., "Virtues of the Haversine," Sky & Telescope, 1984, 68, 159.
[16] Moussaïd, M., Helbing, D., Theraulaz, G., "How Simple Rules Determine Pedestrian Behavior and Crowd Disasters," Proceedings of the National Academy of Sciences, 2011, 108(17), 6884-6888.