
**DDDAMS-based Surveillance and Crowd Control via UAVs and UGVs**

Sponsor: Air Force Office of Scientific Research under FA
Program Manager: Dr. Frederica Darema
PIs: Young-Jun Son1, Jian Liu1, Jyh-Ming Lien2
Students: A. Khaleghi1, D. Xu1, Z. Wang1, M. Li1, and C. Vo2
1 Systems and Industrial Engineering, University of Arizona
2 Computer Science, George Mason University
PI Contact:
ICCS 2013, June 7, 2013

**Motivating Problem: TUCSON-1 (Border Patrol)**

A 23-mile-long area surrounding the Sasabe port of entry in AZ.
Major components: sensor towers, communication towers, ground sensors, UAVs (Arducopter; Parrot AR.Drone), UGVs (sonar, laser), crowds, terrain (NASA).
Eagle Vision EV3000-D-IR capabilities (up to):
- IR laser illuminator: 3 km
- Wireless link: 50 km
- IR thermal camera range: 20 km
- Uncooled thermal camera range: 4 km
Trident Sentry node capabilities:
- Life span: 30–45 days
- Radio frequency range: 300 m
- Personal detection and tracking range: 30–50 m

**Overview of Project**

Problem: a highly complex, uncertain, dynamically changing border patrol environment.
Goal: create robust, multi-scale, and effective urban surveillance and crowd control strategies using UAVs/UGVs.
Modules: detection, tracking, and motion planning.
Approach: a comprehensive planning and control framework based on DDDAMS, involving an integrated simulation environment for hardware, software, and human components.
Key questions: what data to use, when to use it, and how to use it [1].
Framework components: surveillance and crowd control; formation control and swarm coordination; target tracking; vehicle assignment and searching; target detection; surveillance region selection; vehicle motion planning; DDDAMS.

**Proposed DDDAMS Framework**

**Fidelity Representation via Crowd Dynamics**

Low fidelity — crowd viewed by the UAV on coarser visibility cells: an aggregate view of dense crowd regions; high-density regions are visible to the UAV, while low-density regions are invisible.
High fidelity — crowd viewed by the UGV on finer visibility cells: a view of individual crowd members, from which the UGV's next waypoints are derived.
Hybrid fidelity — both views combined to generate control commands for UAV/UGV motion planning, including the next waypoints of the UAVs.

**Framework of Proposed Methodology**

**UAV/UGV for Crowd Detection**

Detection algorithms (using the open-source computer vision library OpenCV):
- Human detection (upright humans): Histogram of Oriented Gradients (HOG) [5]
- Motion detection: background subtraction algorithms [6]
Sensors on the Parrot AR.Drone 2.0:
- Front camera: CMOS sensor with a wide-angle lens
- Ultrasound sensor: emission frequency 40 kHz, range 6 meters
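As a rough illustration of the motion-detection idea, the sketch below flags pixels whose intensity departs from a running-average background model. It is a deliberately simplified, static-camera stand-in for the freely-moving-camera algorithm the slide cites [6]; the function name, parameters, and one-dimensional "frames" are all illustrative.

```python
# Simplified background-subtraction motion detection sketch
# (the slide cites a more sophisticated moving-camera algorithm [6];
# this static-camera running-average version only illustrates the idea).

def detect_motion(frames, alpha=0.5, threshold=10):
    """Return, per frame, the indices of pixels flagged as moving.

    frames: list of equal-length lists of grayscale intensities
    (one row of pixels per frame, to keep the sketch one-dimensional).
    """
    background = list(frames[0])          # initialize the background model
    moving = []
    for frame in frames[1:]:
        flags = [abs(p - b) > threshold for p, b in zip(frame, background)]
        moving.append([i for i, f in enumerate(flags) if f])
        # update the running-average background model
        background = [alpha * p + (1 - alpha) * b
                      for p, b in zip(frame, background)]
    return moving

frames = [[0, 0, 0, 0], [0, 0, 50, 0], [0, 0, 50, 0]]
print(detect_motion(frames))   # pixel 2 is flagged in both later frames
```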

**Framework of Proposed Methodology**

**Crowd Tracking Module**

The crowd tracking module performs crowd dynamics identification and crowd motion prediction:
- UGV, negligible observation error: autoregressive model → forecasting
- With observation error, low resolution: UAV aggregation model → prediction; UGV state space model → filtering
- With observation error, high resolution: UGV autoregressive model → forecasting; UAV state space model → filtering

**UAV with Low Resolution**

C_{i,j}(t): unit cell at the ith row and jth column at time t, with visibility state 0 (invisible) or 1 (visible).
S_{i,j}(t) = [C_{i-1,j-1}(t), C_{i-1,j}(t), C_{i-1,j+1}(t), C_{i,j-1}(t), C_{i,j}(t), C_{i,j+1}(t), C_{i+1,j-1}(t), C_{i+1,j}(t), C_{i+1,j+1}(t)]^T.
Example (crowd view by the UAV at time t over cells C_{2,3} through C_{4,5}): C_{2,4}(t)=1 (visible) and C_{3,4}(t)=0 (invisible), so S_{3,4}(t)=[0,1,1,0,0,0,0,0,0]^T.
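A minimal sketch of building the neighborhood visibility state S_{i,j}(t), assuming the grid is stored as a sparse mapping from (row, column) to 0/1 — a representation chosen here purely for brevity:

```python
# Sketch of the 3x3 neighborhood visibility state S_{i,j}(t).

def neighborhood_state(grid, i, j):
    """Stack the 3x3 block of visibility cells centered at (i, j).

    grid: dict mapping (row, col) -> 0/1 visibility; cells absent from
    the mapping default to 0 (invisible).
    """
    return [grid.get((i + di, j + dj), 0)
            for di in (-1, 0, 1) for dj in (-1, 0, 1)]

# Example from the slide: only C_{2,4} and C_{2,5} are visible.
grid = {(2, 4): 1, (2, 5): 1}
print(neighborhood_state(grid, 3, 4))   # [0, 1, 1, 0, 0, 0, 0, 0, 0]
```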

**UAV with Low Resolution – Visibility Prediction**

Let p_s(i,j) = Pr(C_{i,j}(t)=1 | S_{i,j}(t-1)=s), with s ranging from [0,0,0,0,0,0,0,0,0]^T to [1,1,1,1,1,1,1,1,1]^T.
To predict C_{i,j}(t+1)'s visibility, estimate p_s(i,j) and threshold it: predict C_{3,4}(t+1)=1 if p_s(3,4) > 0.5, and C_{3,4}(t+1)=0 otherwise.
From the observation at time t: p_s(3,4) = Pr(C_{3,4}(t+1)=1 | S_{3,4}(t)=s), where s=[0,1,1,0,0,0,0,0,0]^T.

**Crowd Prediction by UAV**

Bayesian estimation of p_s(i,j). Example: p_s(3,4), with s=[0,1,1,0,0,0,0,0,0]^T.
A Beta prior encodes the prior knowledge of visibility; the data are counts, out of the total number of observations, of transitions in which neighborhood state s at some time was followed by a visible (or invisible) cell at the next time step.

**Crowd Prediction by UAV (Cont’d)**

Posterior: the updated belief of cell (3, 4) being visible, based on the UAV's information (the UAV's prior combined with the UAV's data).
Bayesian estimator: the expected (posterior mean) probability for cell (3, 4) to be visible, used for the prediction at t+2.
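The Beta-Bernoulli update behind this estimator can be sketched as follows. The prior parameters and observation counts below are hypothetical; the closed-form posterior mean (a + n_visible)/(a + b + n_total) is the standard conjugate result:

```python
# Sketch of the Bayesian estimator for p_s: a Beta(a, b) prior on the
# probability that a cell becomes visible given neighborhood state s,
# updated with Bernoulli observation counts (hypothetical counts below).

def beta_posterior_mean(a, b, n_visible, n_total):
    """Posterior mean of a Beta(a, b) prior after observing n_visible
    'visible' outcomes among n_total transitions with state s."""
    return (a + n_visible) / (a + b + n_total)

# Uniform prior Beta(1, 1); 7 of 10 observed transitions ended visible.
p_hat = beta_posterior_mean(1, 1, 7, 10)
print(round(p_hat, 3))           # 0.667
# Predict the cell visible at t+2, since p_hat > 0.5.
print(1 if p_hat > 0.5 else 0)   # 1
```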

**Crowd Prediction by UAV with UGVs’ Information Aggregation**

The individual dynamics captured by the UGVs yield the UGVs' prediction at t+2; combining it with the UAV's prediction at t+2 gives the combined prediction at t+2.

**Crowd Prediction by UAV with UGVs’ Information Aggregation (Cont’d)**

With aggregated information, the estimator combines the UGVs' information (the individual dynamics predicted by the UGVs via crowd dynamics identification) with the UAV's information through a balance weight:
- a close-to-one weight is assigned if the UGV's prediction is accurate;
- a close-to-zero weight is assigned if the UGV's prediction is inaccurate.
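A toy sketch of the aggregation step. The specific weight form 1/(1 + error) is an assumption used only to exhibit the close-to-one / close-to-zero behavior described above:

```python
# Sketch of the information-aggregation step: a convex combination of
# the UGV-based and UAV-based predictions. The balance-weight form
# 1 / (1 + error) below is an illustrative assumption, not the paper's.

def combined_prediction(p_ugv, p_uav, ugv_error):
    w = 1.0 / (1.0 + ugv_error)      # close to 1 when the UGV is accurate
    return w * p_ugv + (1 - w) * p_uav

# Accurate UGV (small error): the combined estimate leans on the UGV.
print(round(combined_prediction(0.9, 0.5, 0.1), 3))
# Inaccurate UGV (large error): the combined estimate leans on the UAV.
print(round(combined_prediction(0.9, 0.5, 9.0), 3))
```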

**Crowd Dynamics Identification - Model**

State space model for linear dynamic systems, where:
- x(t): stack of the N_I individual state vectors (location and speed)
- w(t): state noise; v(t): measurement noise (Gaussian random variables)
- A: state transition matrix (interaction among individuals)
- y(t): stack of the N_I observation vectors

**Crowd Dynamics Identification – Model (Cont’d)**

State space model for linear dynamic systems (continued), where:
- H: observation matrix (links state with observation)
- an environmental-effect term, driven by environmental factors
- a UAV/UGV-effect term, driven by the state of the UAVs/UGVs; the model can be evaluated with or without the UAV/UGV effect

**Tracking Algorithms**

Objective: identify the crowd motion dynamics and predict crowd location.

State-of-the-art methods:

| Algorithm | Feature | Advantage | Limitation |
|---|---|---|---|
| Kalman filter [1] | Linear dynamic systems; Gaussian noise; continuous state space | Exact solution | Normality and linearity assumptions |
| Extended Kalman filter [2] | First-order Taylor series linearization | For nonlinear systems | Computational complexity |
| Unscented Kalman filter [3] | Sampling-based linearization | For nonlinear systems | Uni-modal assumption |

**Tracking Algorithms (Cont’d)**

State-of-the-art methods (continued):

| Algorithm | Feature | Advantage | Limitation |
|---|---|---|---|
| Switching Kalman filter [] | Dynamic switching | For nonlinear systems | Computationally intensive; approximate solutions |
| Grid-based filter [4] | Discrete state space | No linearity/normality assumption | Computationally intensive |
| Particle filter [5] | Monte Carlo estimation | No linearity/normality assumption | Computationally intensive |
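For concreteness, the first row of the table — a Kalman filter for a linear-Gaussian state space model — can be sketched in its scalar form. The model coefficients and noise variances below are illustrative:

```python
# Minimal scalar Kalman filter sketch for the linear state-space model
#   x_{t+1} = a*x_t + w,   y_t = h*x_t + v
# with Gaussian noise variances q and r, illustrating the filtering step.

def kalman_step(x, p, y, a=1.0, h=1.0, q=0.01, r=0.25):
    """One predict/update cycle; returns the updated mean and variance."""
    # Predict
    x_pred = a * x
    p_pred = a * p * a + q
    # Update with measurement y
    k = p_pred * h / (h * p_pred * h + r)      # Kalman gain
    x_new = x_pred + k * (y - h * x_pred)
    p_new = (1 - k * h) * p_pred
    return x_new, p_new

# Track a stationary target from noisy measurements.
x, p = 0.0, 1.0
for y in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, y)
print(round(x, 2))   # the estimate converges toward ~1.0
```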

**Crowd Dynamics Identification for UGV**

Assumption: negligible measurement noise in the state space model. Under this assumption, the state space model is equivalent to an autoregressive model with exogenous input (ARX model [7]).

**Crowd Dynamics Identification for UGV (Cont’d)**

ARX model fitting:
- Data: the most recent observations for each detected individual (observations up to time t)
- Estimation method: ordinary least squares
- Triggered when a cell becomes dense according to the prediction
- Benefit: effective identification at lower computational cost
The observed location at time t yields a predicted location at time t+1, which updates the UAV's prior.
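A minimal sketch of the OLS identification step, reduced to a scalar AR(1) model without the exogenous input for brevity (the slide's ARX model adds that input term):

```python
# Sketch of identifying an AR model by ordinary least squares, as the
# UGV tracking module does. Reduced to scalar AR(1) with no exogenous
# input, so the OLS estimate has a simple closed form.

def fit_ar1(series):
    """OLS estimate of phi in x_t = phi * x_{t-1} + noise."""
    xs, ys = series[:-1], series[1:]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def forecast(last, phi, steps=1):
    """Iterate the fitted model forward for multi-step prediction."""
    for _ in range(steps):
        last = phi * last
    return last

# Noise-free example: a coordinate decaying geometrically.
series = [8.0, 4.0, 2.0, 1.0]
phi = fit_ar1(series)
print(phi)                   # 0.5
print(forecast(1.0, phi))    # one-step-ahead prediction: 0.5
```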

**Framework of Proposed Methodology**

**Motion Planning Module - Algorithm**

Find the optimal path given a map, a start location, and a destination. A grid is constructed from the physical obstacles in the environment for vehicle movements: 0 for empty cells, 1 for occupied cells. The A* algorithm with 8-point connectivity is used, considering two objectives each for the UAV (f1^A, f2^A) and the UGV (f1^G, f2^G).

Graph search algorithms:

| Method | Cost evaluated | Complexity | Features |
|---|---|---|---|
| Dijkstra | Exact cost from the start point to any vertex n | High | Not fast enough |
| Best-first search | Estimated cost from vertex n to the end point | Low | Not optimal |
| A* | Total cost from the start point to the end point | Low | Guarantees the shortest path in a reasonable time |
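A compact sketch of A* with 8-point connectivity on such an occupancy grid. The Euclidean heuristic and step costs (1 or √2) are conventional choices, not taken from the slide:

```python
# Sketch of A* on an occupancy grid with 8-point connectivity, as used
# by the motion-planning module (0 = free cell, 1 = occupied cell).
import heapq
import math

MOVES = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
         if (di, dj) != (0, 0)]

def astar(grid, start, goal):
    """Return the shortest free path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: math.hypot(p[0] - goal[0], p[1] - goal[1])
    frontier = [(h(start), 0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for di, dj in MOVES:
            nxt = (node[0] + di, node[1] + dj)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] == 1:          # occupied cell
                continue
            ng = g + math.hypot(di, dj)            # step cost 1 or sqrt(2)
            if ng < best.get(nxt, float("inf")):
                best[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))   # detours around the obstacle row
```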

**Motion Planning Module - Objectives**

Objective 1: minimize the Euclidean traveling distance of the UAVs and UGVs. Objective 2: minimize energy consumption, considering the UAV's above-ground-level (AGL) change and the UGV's downhill/uphill movement.

$$f_1^A = \sum_{t=t_0}^{T-1} \left\{ (x^A_{t+1}-x^A_t)^2 + (y^A_{t+1}-y^A_t)^2 + (z^A_{t+1}-z^A_t)^2 \right\}^{0.5}$$

$$f_1^G = \sum_{t=t_0}^{T-1} \left\{ (x^G_{t+1}-x^G_t)^2 + (y^G_{t+1}-y^G_t)^2 + \left(\mathrm{elev}(x^G_{t+1},y^G_{t+1})-\mathrm{elev}(x^G_t,y^G_t)\right)^2 \right\}^{0.5}$$

$$f_2^A = \sum_{t=t_0}^{T-1} \mathrm{elevPenalty}(\mathrm{angle}) \left\{ (x^A_{t+1}-x^A_t)^2 + (y^A_{t+1}-y^A_t)^2 + (z^A_{t+1}-z^A_t)^2 \right\}^{0.5}$$

$$f_2^G = \sum_{t=t_0}^{T-1} \mathrm{elevPenalty}(\mathrm{angle}) \left\{ (x^G_{t+1}-x^G_t)^2 + (y^G_{t+1}-y^G_t)^2 + \left(\mathrm{elev}(x^G_{t+1},y^G_{t+1})-\mathrm{elev}(x^G_t,y^G_t)\right)^2 \right\}^{0.5}$$

where $[x^A_t, y^A_t, z^A_t]$ are the Cartesian coordinates of the UAV at time $t$, $[x^G_t, y^G_t]$ are those of the UGV, $\mathrm{elev}(x^G_t, y^G_t)$ is the terrain elevation at time $t$, $\mathrm{elevPenalty}$ is a penalty function of the slope angle (with separate curves for the UGV and UAV), and

$$\mathrm{angle} = \arctan\frac{\mathrm{elev}(t+1)-\mathrm{elev}(t)}{\left\{(x^G_{t+1}-x^G_t)^2+(y^G_{t+1}-y^G_t)^2\right\}^{0.5}}$$
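The two UGV objectives can be sketched as a single pass over a candidate path. The elevPenalty curve below (linear in the slope angle) is an assumed stand-in for the penalty table on the slide:

```python
# Sketch of the two UGV path objectives: f1 (3-D travel distance with
# terrain elevation) and f2 (the same distance scaled by an elevation
# penalty on the slope angle). The penalty function is an illustrative
# stand-in for the slide's elevPenalty curve.
import math

def elev_penalty(angle_deg):
    return 1.0 + abs(angle_deg) / 45.0    # assumed: steeper -> costlier

def ugv_objectives(path, elev):
    """path: list of (x, y); elev: dict (x, y) -> terrain elevation."""
    f1 = f2 = 0.0
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dz = elev[(x1, y1)] - elev[(x0, y0)]
        horiz = math.hypot(x1 - x0, y1 - y0)
        step = math.sqrt(horiz ** 2 + dz ** 2)      # 3-D segment length
        angle = math.degrees(math.atan2(dz, horiz)) # slope angle
        f1 += step
        f2 += elev_penalty(angle) * step
    return f1, f2

# Path with one 45-degree climb followed by flat ground.
path = [(0, 0), (1, 0), (2, 0)]
elev = {(0, 0): 0.0, (1, 0): 1.0, (2, 0): 1.0}
f1, f2 = ugv_objectives(path, elev)
print(round(f1, 3), round(f2, 3))   # f2 > f1: the climb is penalized
```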

**Motion Planning Module - Control Strategies**

Multi-objective optimization of the path evaluation function, based on control strategies:
- minimizing travel time
- minimizing energy consumption
- minimizing a linear combination of both objectives: $\min \alpha_1 f_1 + \alpha_2 f_2$

Weighting method for the multi-objective motion planning problem:
- Normalize each objective $f_k(x)$ using its bounds $\inf f_k(x) = m_k$ and $\sup f_k(x) = M_k$.
- Compute the convex combination of the normalized objectives using weights $\alpha_k$.

**Assembled Arduino-based UAV (Arducopter)**

Specifications:
- Additional payload: up to 1200 g
- Flying time: up to 30 minutes (with one 5000 mAh 11.1 V lithium polymer battery)
- Radio range: approx. 1 mile (more range with an amplifier)
- Air data rates: up to 250 kbps
- Radio frequency: 915 MHz (US), 433 MHz (Europe)

A radio control training simulator (AeroSIM®) was used for practice to minimize the risk of damaging the real hardware. Flight test path shown on a Google map.

**Agent-based Hardware-in-the-loop Simulation**

Another simulator

**Finite State Automata based UAV/UGV Execution**

**Crowd Tracking Module**

The crowd tracking module, recapped:
- UGV, without observation error: autoregressive model → forecasting
- With observation error, low resolution: UAV aggregation model → prediction; UGV state space model → filtering
- With observation error, high resolution: UGV autoregressive model → forecasting; UAV state space model → filtering

**Experiment (1): Effect of Detection Range**

Experiment (1): investigate the effect of detection range on crowd coverage performance.
Setting: a crowd of 40 individuals; 1 UAV and 1 UGV with the same detection range; fixed tracking horizon and motion planning frequency.
Result: as the detection range increases, the crowd coverage percentage improves significantly and becomes smoother over time.

**Experiment (2): Effect of Grid Resolution**

Experiment (2): investigate the effect of the number of grid cells on crowd coverage performance (e.g., GPS resolution).
Setting: 1 UAV and 1 UGV, each with a 15 m detection range; a crowd of 40 individuals.
Result: performance fluctuates considerably with a 20×20 grid; as the number of grid cells increases, performance improves and the fluctuation diminishes over time.

**Experiment (3): Effect of Fidelity Level**

Experiment (3): investigate the effect of different fidelity levels on crowd coverage performance and computational resource usage (e.g., CPU usage).
Setting: 2 UAVs and 2 UGVs; a crowd of 80 individuals separated into two groups.
Result: high simulation fidelity improves system performance but consumes more computational power; low fidelity reduces performance.

**Crowd Tracking Module**

The crowd tracking module, recapped:
- UGV, without observation error: autoregressive model → forecasting
- With observation error, low resolution: UAV aggregation model → prediction; UGV state space model → filtering
- With observation error, high resolution: UGV autoregressive model → forecasting; UAV state space model → filtering

**Numerical Simulation Study**

Simulation setting:
- 50 individuals with changing dynamics (e.g., changing speed)
- One UAV (monitors the whole crowd) and two UGVs (each tracks a subgroup of individuals)
- UGV tracking: AR model prediction
- UAV tracking: the proposed Bayesian updating approach

The crowd dynamics and the UAV/UGV view ranges change over three phases: time 1–4, time 5–6, and time 7–15.

**Crowd Tracking Outcome**

**Crowd Tracking Outcome**

**Summary and Ongoing Works**

Summary:
- A DDDAMS-based planning and control framework for surveillance and crowd control via UAVs and UGVs
- A series of crowd tracking algorithms
- Motion planning algorithms
- An integrated agent-based simulation platform with UAVs/UGVs

Ongoing work:
- Address different control architectures (hierarchical, heterarchical, hybrid), with coordinators such as a ground controller or a UAV team leader
- Further develop vision-based crowd detection
- Enhance crowd motion dynamics via extended BDI agents
- Enhance crowd dynamics identification and crowd motion prediction (e.g., multiple visibility levels)
- Expand system automation tests

**Questions**

Young-Jun Son; son@sie.arizona.edu

**References – (1)**

[1] Darema, F., "Dynamic Data Driven Applications Systems: A New Paradigm for Application Simulations and Measurements," International Conference on Computational Science, 2004, pp. 662–669.
[2] Hays, R., Singer, M., Simulation Fidelity in Training System Design: Bridging the Gap Between Reality and Training, Springer-Verlag.
[3] Celik, N., Lee, S., Vasudevan, K., Son, Y., "DDDAS-based Multi-fidelity Simulation Framework for Supply Chain Systems," IIE Transactions, 2010, 42(5).
[4] Celik, N., Son, Y.-J., "Sequential Monte Carlo-based Fidelity Selection in Dynamic-data-driven Adaptive Multi-scale Simulations," International Journal of Production Research, 2012, 50.
[5] Dalal, N., Triggs, B., "Histograms of Oriented Gradients for Human Detection," Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), 2005.
[6] Sheikh, Y., Javed, O., Kanade, T., "Background Subtraction for Freely Moving Cameras," Proceedings of the IEEE 12th International Conference on Computer Vision, 2009.
[7] Welch, G., Bishop, G., An Introduction to the Kalman Filter.
[8] Einicke, G.A., White, L.B., "Robust Extended Kalman Filtering," IEEE Transactions on Signal Processing, September 1999, 47(9), pp. 2596–2599.

**References – (2)**

[9] Julier, S.J., Uhlmann, J.K., "Unscented Filtering and Nonlinear Estimation," Proceedings of the IEEE, (3).
[10] Pavlovic, V., Rehg, J.M., MacCormick, J., "Learning Switching Linear Models of Human Motion," Advances in Neural Information Processing Systems, 2001.
[11] Silbert, M., Mazzuchi, T., Sarkani, S., "Comparison of a Grid-based Filter to a Kalman Filter for the State Estimation of a Maneuvering Target," Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series.
[12] Ristic, B., Arulampalam, S., Gordon, N., Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
[13] Lütkepohl, H., New Introduction to Multiple Time Series Analysis, Berlin: Springer.
[14] Patel, A., "Amit's Thoughts on Path-finding – Introduction to A*," accessed June 2, 2013.
[15] Sinnott, R.W., "Virtues of the Haversine," Sky & Telescope, 1984, 68, 159.
[16] Moussaïd, M., Helbing, D., Theraulaz, G., "How Simple Rules Determine Pedestrian Behavior and Crowd Disasters," Proceedings of the National Academy of Sciences, 2011, 108(17).
