
1 Collaborative Processing in Sensor Networks
Lecture 4 - Distributed In-network Processing
Hairong Qi, Associate Professor
Electrical Engineering and Computer Science, University of Tennessee, Knoxville
http://www.eecs.utk.edu/faculty/qi Email: hqi@utk.edu
Lecture Series at ZheJiang University, Summer 2008

2 Research Focus - Recap
Develop energy-efficient collaborative processing algorithms with fault tolerance in sensor networks:
- Where to perform collaboration? Computing paradigms
- Who should participate in the collaboration? Reactive clustering protocols; sensor selection protocols
- How to conduct collaboration? In-network processing; self-deployment

3 A Signal Processing and Fusion Hierarchy for Target Classification
(Diagram: at each node 1...i...N, local processing runs per modality 1...M, followed by temporal fusion per modality and multi-modality fusion at the node; multi-sensor fusion across nodes is coordinated by the mobile-agent middleware.)

4 Signal - Acoustic/Seismic/IR

5 Local Signal Processing
Feature extraction: from the time-series signal, compute amplitude statistics, power spectral density (PSD) shape statistics with peak selection, and wavelet analysis coefficients, forming 26-element feature vectors.
Classifier design: feature normalization and principal component analysis (PCA), then target classification with kNN.
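The classifier stage above can be sketched as a plain k-nearest-neighbor vote. The 2-D feature vectors and labels below are hypothetical stand-ins for the real 26-element, PCA-reduced features:

```python
import math
from collections import Counter

def knn_classify(train, query, k):
    """Label `query` with the majority class among its k nearest
    training vectors (Euclidean distance), as in the slide's kNN stage."""
    by_distance = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors with class labels; the real features are
# 26-element PSD/wavelet vectors reduced by PCA.
train = [((0.1, 0.2), "AAV"), ((0.2, 0.1), "AAV"),
         ((0.9, 0.8), "DW"), ((0.8, 0.9), "DW")]
print(knn_classify(train, (0.15, 0.15), 3))  # AAV
```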

6 A Signal Processing and Fusion Hierarchy for Target Classification (hierarchy diagram repeated from slide 3)

7 Temporal Fusion
Fuse all the 1-second sub-interval local processing results corresponding to the same event (an event usually lasts about 10 seconds).
Majority voting: with k possible local processing results and n_c the number of occurrences of output c, the fused label is the c with the largest n_c.
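A minimal sketch of this majority vote over the roughly ten 1-second windows of one event; the label sequence is invented:

```python
from collections import Counter

def temporal_fuse(sub_interval_labels):
    """Majority vote over the per-second local classification outputs
    belonging to one event (roughly ten 1-second windows)."""
    return Counter(sub_interval_labels).most_common(1)[0][0]

# Hypothetical per-second outputs for a single 10-second event.
labels = ["AAV", "AAV", "DW", "AAV", "AAV",
          "HMMWV", "AAV", "AAV", "DW", "AAV"]
print(temporal_fuse(labels))  # AAV
```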

8 A Signal Processing and Fusion Hierarchy for Target Classification (hierarchy diagram repeated from slide 3)

9 Multi-modality Fusion
Different sensing modalities can compensate for each other's sensing capabilities and provide a comprehensive view of the event.
Treat results from different modalities as independent classifiers - classifier fusion. Majority voting won't work.
Behavior-Knowledge Space algorithm (Huang & Suen). Assumption: 2 modalities, 3 kinds of targets, 100 samples in the training set. Then there are 9 possible classification combinations (c1, c2):

c1, c2 | samples from each class | fused result
1,1    | 10/3/3                  | 1
1,2    | 3/0/6                   | 3
1,3    | 5/4/5                   | 1,3
...    | ...                     | ...
3,3    | 0/0/6                   | 3
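The Behavior-Knowledge Space lookup can be sketched as a table indexed by the tuple of per-modality decisions; the training counts below merely mimic the slide's (1,2) cell, not the real data:

```python
from collections import Counter, defaultdict

# Behavior-Knowledge Space sketch: each cell is indexed by the tuple of
# per-modality decisions; the fused label is the true class seen most often
# in that cell during training. The counts below only mimic the slide's
# (1,2) cell (3/0/6 over classes 1..3), not the real training set.
bks = defaultdict(Counter)

def bks_train(decisions, true_label):
    bks[tuple(decisions)][true_label] += 1

def bks_fuse(decisions):
    cell = bks.get(tuple(decisions))
    return cell.most_common(1)[0][0] if cell else None

for true_label, count in ((1, 3), (3, 6)):
    for _ in range(count):
        bks_train((1, 2), true_label)
print(bks_fuse((1, 2)))  # 3
```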

10 A Signal Processing and Fusion Hierarchy for Target Classification (hierarchy diagram repeated from slide 3)

11 Value-based vs. Interval-based Fusion
Interval-based fusion can provide fault tolerance.
Interval integration - overlap function: assuming each sensor in a cluster measures the same parameters, the integration algorithm constructs a simple function (the overlap function) from the output intervals of the sensors in a cluster, and can resolve it at different resolutions as required.
(Figure: seven sensor output intervals w1 s1 ... w7 s7 and the resulting overlap function O(x), taking values 0-3.)
Crest: the highest, widest peak of the overlap function.
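A sketch of the overlap function (the count of sensor intervals covering a point) and a simple crest search over the segments between interval endpoints; the interval values are hypothetical:

```python
def overlap(intervals, x):
    """O(x): the number of sensor output intervals [lo, hi] containing x."""
    return sum(lo <= x <= hi for lo, hi in intervals)

def crest(intervals):
    """Height and extent of the highest peak of the overlap function,
    found by scanning the segments between sorted interval endpoints."""
    points = sorted({p for interval in intervals for p in interval})
    best_h, best = 0, None
    for lo, hi in zip(points, points[1:]):
        h = overlap(intervals, (lo + hi) / 2)
        if h > best_h:
            best_h, best = h, [lo, hi]
        elif h == best_h and best is not None and best[1] == lo:
            best[1] = hi  # widen a peak that continues at the same height
    return best_h, best

# Three hypothetical sensor intervals agreeing around [2, 3].
print(crest([(0, 4), (1, 3), (2, 5)]))  # (3, [2, 3])
```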

12 Fault Tolerance Comparison of Different Overlap Functions
n: number of sensors; f: number of faulty sensors
M: find the smallest interval that contains all the intersections of n-f intervals
S: return the interval [a, b] where a is the (f+1)-th left endpoint and b is the (n-f)-th right endpoint
Ω: overlap function
N: the interval over which the overlap function ranges in [n-f, n]
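Of the estimators above, S is the easiest to state directly in code; a sketch with made-up sensor intervals, one of which plays the faulty outlier:

```python
def s_estimator(intervals, f):
    """The 'S' variant from the slide: with n sensor intervals, of which at
    most f may be faulty, return [a, b] where a is the (f+1)-th smallest
    left endpoint and b is the (n-f)-th smallest right endpoint."""
    lefts = sorted(lo for lo, _ in intervals)
    rights = sorted(hi for _, hi in intervals)
    n = len(intervals)
    return [lefts[f], rights[n - f - 1]]

# Five hypothetical sensors, one faulty outlier ([9, 10]); with f = 1 the
# outlier cannot drag the fused interval away from the consensus.
readings = [(1, 4), (2, 5), (1.5, 4.5), (2.5, 5.5), (9, 10)]
print(s_estimator(readings, 1))  # [1.5, 5.5]
```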

13 Multi-sensor Fusion for Target Classification
Generation of local confidence ranges (for example, at each node i, use kNN for each k in {5,...,15}); the confidence range for a class is the smallest and largest vote fraction in its column:

      | Class 1      | Class 2     | ... | Class n
k=5   | 3/5          | 2/5         | ... | 0
k=6   | 2/6          | 3/6         | ... | 1/6
...   | ...          | ...         | ... | ...
k=15  | 10/15        | 4/15        | ... | 1/15
range | {2/6, 10/15} | {4/15, 3/6} | ... | {0, 1/6}

Apply the integration algorithm to the confidence ranges generated from each node to construct an overlap function.
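A sketch of how the confidence ranges fall out of the kNN vote table: each class's range is the minimum and maximum vote fraction down its column (only the slide's three listed rows are used here):

```python
from fractions import Fraction as F

def confidence_ranges(votes_per_k):
    """Per class, the confidence range is the smallest and largest kNN vote
    fraction seen as k sweeps its range (one row of votes per k)."""
    return [(min(col), max(col)) for col in zip(*votes_per_k)]

# Rows k = 5, 6, 15 from the slide's (abridged) table; columns are classes.
votes = [[F(3, 5), F(2, 5), F(0, 1)],
         [F(2, 6), F(3, 6), F(1, 6)],
         [F(10, 15), F(4, 15), F(1, 15)]]
print(confidence_ranges(votes))
```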

14 Distributed Integration Scheme
Integration techniques must be robust and fault-tolerant to handle uncertainty and faulty sensor readouts, and should provide progressive accuracy.
A 1-D array serves to represent the partially-integrated overlap function.
A mobile-agent-based framework is employed.

15 Mobile-agent-based Multi-sensor Fusion for Target Classification
Node 1 generates confidence range CR1: Class 1 [2/6, 10/15], Class 2 [3/15, 3/6], ...
Node 2 generates CR2: Class 1 [3/6, 4/5], Class 2 [0, 5/15], ...
Node 3 generates CR3: Class 1 [3/5, 10/12], Class 2 [1/15, 2/8], ...
The mobile agent carries CR1 to node 2, where the fusion code generates the partially integrated confidence range CR12: Class 1 [3/6, 10/15], Class 2 [3/15, 5/15], ... Fusion result at node 2: class 1 (58%).
The agent then carries CR12 to node 3, where fusion yields CR123: Class 1 [3/5, 10/15], Class 2 [3/15, 2/8], ... Fusion result at node 3: class 1 (63%).
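The slide does not spell out the fusion rule, but a per-class interval intersection, scored by the interval center, reproduces its numbers (CR12, CR123, 58%, 63%); the full method builds the overlap-function crest rather than a bare intersection:

```python
def fuse(cr_a, cr_b):
    """Intersect the two confidence ranges class by class."""
    return {c: (max(cr_a[c][0], cr_b[c][0]), min(cr_a[c][1], cr_b[c][1]))
            for c in cr_a}

def decide(cr):
    """Pick the class whose range center is highest; report it as a percent."""
    best = max(cr, key=lambda c: sum(cr[c]) / 2)
    return best, round(100 * sum(cr[best]) / 2)

# Confidence ranges CR1-CR3 for classes 1 and 2, copied from the slide.
cr1 = {1: (2/6, 10/15), 2: (3/15, 3/6)}
cr2 = {1: (3/6, 4/5), 2: (0, 5/15)}
cr3 = {1: (3/5, 10/12), 2: (1/15, 2/8)}

cr12 = fuse(cr1, cr2)           # the agent fuses at node 2
print(decide(cr12))             # (1, 58)
print(decide(fuse(cr12, cr3)))  # (1, 63) after the hop to node 3
```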

16 An Example of Multi-sensor Fusion
Confidence intervals from nodes 1-4: [0.10, 0.29], [0.46, 0.65]; [0.10, 0.21], [0.05, 0.14]; [0.05, 0.41], [0.22, 0.58]; [0.05, 0.15], [0.49, 0.59]; [0.08, 0.16], [0.51, 0.60]
h: height of the highest peak; w: width of the peak; acc: confidence at the center of the peak

17 Example of Multi-sensor Integration Result at Each Stop
(Table: crest center c and confidence acc for classes 1-3 at each of stops 1-4.)

18 Performance Evaluation of Target Classification Hierarchy
(Map of the test field: North-South road, East-West road, Center.)

19 SITEX02 Scenario Setup
Acoustic sampling rate: 1024 Hz; seismic sampling rate: 512 Hz
Target types: AAV, DW, and HMMWV
Collaborative work with two other universities (Penn State, Wisconsin)
The data set is divided into 3 partitions: q1, q2, q3
Cross validation:
- M1: training (q2+q3); testing (q1)
- M2: training (q1+q3); testing (q2)
- M3: training (q1+q2); testing (q3)
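The M1-M3 scheme is ordinary 3-fold cross validation over the partitions; a sketch with placeholder partitions standing in for the SITEX02 splits:

```python
def three_fold_splits(partitions):
    """The slide's M1-M3 scheme: each partition takes one turn as the test
    set while the remaining two form the training set."""
    for i, test in enumerate(partitions):
        train = [p for j, p in enumerate(partitions) if j != i]
        yield train, test

# Placeholder partitions standing in for the data splits q1-q3.
q1, q2, q3 = ["a"], ["b"], ["c"]
for train, test in three_fold_splits([q1, q2, q3]):
    print(train, test)
```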

20 Confusion Matrices of Classification on SITEX02
Acoustic (75.47%, 81.78%); seismic (85.37%, 89.44%); multi-modality fusion (84.34%); multi-sensor fusion (96.44%)

      AAV  DW  HMV
AAV    29   2    1
DW      0  18    8
HMV     0   2   23

21 Result from BAE-Austin Demo
Targets: Suzuki Vitara, Ford 350, Ford 250, Harley motorcycle

22 Confusion Matrices: acoustic, seismic, multi-modal

23

24 Demo
Mobile-agent-based multi-modal multi-sensor classification:
- Mobile agent migrates with a fixed itinerary (Tennessee)
- Mobile agent migrates with a dynamic itinerary realized by distributed services (Auburn)

25 Run 1: car, walker; run 2: car

26 Target Detection in Sensor Networks
Single target detection:
- Energy threshold
- Energy decay model
Multiple target detection:
- Why is multiple target detection necessary?
- Requirements: energy-efficient and bandwidth-efficient
- Multiple target detection vs. source number estimation
(Diagram: source matrix -> mixing -> observation matrix -> unmixing matrix; target separation is posed like speaker separation.)
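A minimal sketch of the energy-threshold detector; the inverse-square decay model below is an assumed form for illustration, not taken from the slide:

```python
def detect(window, threshold):
    """Declare a target present when the mean energy of a sample window
    exceeds the threshold."""
    energy = sum(s * s for s in window) / len(window)
    return energy > threshold

def received_energy(source_energy, distance, noise=0.0):
    """Assumed inverse-square decay model (illustrative only): received
    energy falls off with the squared distance to the source."""
    return source_energy / distance ** 2 + noise

print(detect([0.9, -1.1, 1.0, -0.8], 0.5))  # True
print(received_energy(4.0, 2.0))  # 1.0
```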

27 Source Number Estimation (SNE)
Task: algorithms for source number estimation
- Heuristic techniques
- Principled approaches: Bayesian estimation, Markov chain Monte Carlo methods, etc.
Limitations:
- Centralized structure
- Large amount of raw data transmission
- Computation burden on the central processing unit
(Diagram: sensors 1...n send raw data to a central processing unit, which outputs the decision.)

28 Distributed Source Number Estimation Scheme
System architecture: sensors are grouped into clusters 1...L, each with a cluster head; cluster heads report to a fusion center, which makes the decision.
Algorithm structure: with m targets present in the sensor field, each cluster runs a progressive local estimation; the fusion center combines the clusters' posterior probabilities (versus a fully centralized estimate).

29 Centralized Bayesian Estimation Approach
Notation: the sensor observation matrix; the hypothesis of the number of sources; the source matrix; the mixing matrix and unmixing matrix; the latent variable; the non-linear transformation function; the noise and its variance; the marginal distribution.
Source: S. J. Roberts, "Independent component analysis: source assessment & separation, a Bayesian approach," IEE Proceedings on Vision, Image, and Signal Processing, 145(3), pp. 149-154, 1998.

30 Progressive Bayesian Estimation Approach
Motivation: reduce data transmission and conserve energy.
With m targets present in the sensor field, each cluster runs a progressive local estimation, followed by posterior probability fusion across clusters (versus the centralized approach).

31 Progressive Bayesian Estimation Approach
The agent visits sensors 1, 2, ..., i, ..., n in turn; the transmitted information accumulates at each hop, so the estimation accuracy is progressively improved.

32 Progressive Update at Local Sensors
The progressive update of the log-likelihood function depends only on the updating rule of the mixing matrix.

33 Progressive Update at Local Sensors
Progressive update of the mixing matrix: BFGS method (a quasi-Newton method); the estimation error is then updated progressively as well.

34

35 An Example

36 Mobile-agent-based Implementation of Progressive Estimation
Sensor 1: initialization.
Sensor 2: update the terms in the log-likelihood function, compute the estimated latent variable, update the estimation error.
Sensor 3: repeat the same updates and output the result.

37 Distributed Fusion Scheme
Bayes theorem, or Dempster's rule of combination: utilize probability intervals and uncertainty intervals to determine the likelihood of hypotheses based on multiple pieces of evidence.
With m targets present in the sensor field: progressive local estimation at each of clusters 1...L, then posterior probability fusion (versus the centralized approach).
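A sketch of Dempster's rule for combining two basic probability assignments; the hypothesis masses below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments
    whose focal elements are frozensets; conflicting mass is normalized out."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            meet = a & b
            if meet:
                combined[meet] = combined.get(meet, 0.0) + wa * wb
            else:
                conflict += wa * wb
    scale = 1.0 - conflict
    return {s: w / scale for s, w in combined.items()}

# Invented evidence from two clusters over "2 targets" vs "3 targets";
# Theta carries each cluster's uncertainty (mass committed to neither).
H2, H3 = frozenset({"2"}), frozenset({"3"})
Theta = H2 | H3
m1 = {H2: 0.6, Theta: 0.4}
m2 = {H2: 0.7, H3: 0.2, Theta: 0.1}
fused = dempster_combine(m1, m2)
print(fused[H2] > fused[H3])  # True
```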

38 Performance Evaluation
Target types; sensor node clustering; signals: 1-second acoustic, 500 samples each.
Experiment design:
- Experiment 1: centralized
- Experiment 2: centralized + Bayesian fusion
- Experiment 3: centralized + Dempster's rule
- Experiment 4: progressive
- Experiment 5: progressive + Bayesian fusion
- Experiment 6: progressive + Dempster's rule

39 Average Log-likelihood Comparison
Experiment 1 (centralized); Experiment 2 (centralized + Bayesian); Experiment 3 (centralized + Dempster's rule); Experiment 4 (progressive); Experiments 5 & 6 (progressive (cluster 1) vs. fusion; progressive (cluster 2) vs. fusion)

40 Output Histogram Comparison
Experiment 1 (centralized); Experiment 2 (centralized + Bayesian fusion); Experiment 3 (centralized + Dempster's rule); Experiment 4 (progressive); Experiment 5 (progressive + Bayesian fusion); Experiment 6 (progressive + Dempster's rule)

41 Performance Comparison: kurtosis, data transmission, detection probability, energy consumption

42 Discussion
Developed a novel distributed multiple-target detection scheme with:
- a progressive source number estimation approach (first in the literature)
- a mobile-agent implementation
- cluster-based probability fusion
The approach with progressive intra-cluster estimation + Bayesian inter-cluster fusion has the best performance:

                        Kurtosis  Detection prob.  Data transmitted (bits)  Energy consumption (uW*sec)
Centralized             -1.2497   0.25             144,000                  486,095
Centralized + Bayesian  5.6905    0.65             128,160                  433,424
Progressive + Bayesian  4.8866    0.55             24,160 (16.78%)          89,242 (18.36%)

43 References
X. Wang, H. Qi, S. Beck, H. Du, "Progressive approach to distributed multiple-target detection in sensor networks," in Sensor Network Operations, S. Phoha, T. F. La Porta, and C. Griffin, eds., Wiley-IEEE Press, pp. 486-503, March 2006.
H. Qi, Y. Xu, X. Wang, "Mobile-agent-based collaborative signal and information processing in sensor networks," Proceedings of the IEEE, 91(8):1172-1183, August 2003.
H. Qi, X. Wang, S. S. Iyengar, K. Chakrabarty, "High performance sensor integration in distributed sensor networks using mobile agents," International Journal of High Performance Computing Applications, 16(3):325-335, Fall 2002.

