1 ECE 776 Project Information-theoretic Approaches for Sensor Selection and Placement in Sensor Networks for Target Localization and Tracking Renita Machado

2 Wireless sensor networks and their applications Wireless sensor networks (WSNs) are networks of a large number of nodes deployed over a region to sense, gather, and process data about their environment. The self-organizing capability of WSNs enables their use in applications ranging from surveillance and ecological monitoring to bio-monitoring and other smart-environment applications.

3 Key challenges in WSNs
Performance measures:
- Coverage
- Connectivity
- Optimal redundancy
- Reliability of network operation
Constraints:
- Power-limited nodes
- Economic constraints for dense deployments

4 Application: Target localization and tracking In localization/tracking, each sensor observation updates the probability distribution of the estimated target location. Each observation reduces the uncertainty about the target location or, equivalently, gains information about the target location.
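As a concrete illustration of this update, the sketch below performs one grid-based Bayes update and compares the entropy of the prior and the posterior. The 1-D grid, the flat prior, and the Gaussian range-observation model are illustrative assumptions, not taken from the slides.

```python
# Illustrative sketch (not from the slides): one grid-based Bayes update of the
# target-location distribution, showing that an observation reduces its entropy.
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution, ignoring zero bins."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# 1-D target-location grid and a flat prior p(x) (assumed for illustration).
x_grid = np.linspace(0.0, 100.0, 501)
prior = np.ones_like(x_grid) / x_grid.size

# Hypothetical range sensor at x_i = 20 observing z = |x - x_i| + noise (sigma = 5).
sensor_pos, sigma = 20.0, 5.0
z_observed = 45.0
likelihood = np.exp(-0.5 * ((z_observed - np.abs(x_grid - sensor_pos)) / sigma) ** 2)

# Bayes update: p(x | z) is proportional to p(z | x) p(x); renormalize on the grid.
posterior = prior * likelihood
posterior /= posterior.sum()

print(f"prior entropy    : {entropy(prior):.2f} bits")
print(f"posterior entropy: {entropy(posterior):.2f} bits")  # smaller: uncertainty reduced
```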

5 Preliminaries and problem formulation
Given:
- prior target location distribution p(x)
- set of candidate sensors for selection S
- locations of the candidate sensors x_i
- observation models of the candidate sensors p(z_i | x)
Find: the sensor whose observation minimizes the expected conditional entropy of the posterior target location distribution, i.e. i* = argmin_{i ∈ S} H(X | Z_i), where H(X | Z_i) = ∫ p(z_i) H(X | z_i) dz_i. Equivalently, the observation of this sensor maximizes the expected reduction of the target location entropy, i.e. the mutual information I(X; Z_i) = H(X) − H(X | Z_i).
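A minimal sketch of this selection rule on discretized grids is given below. It picks, from two hypothetical range sensors with Gaussian noise (both assumptions made only for illustration), the one that minimizes the expected posterior entropy H(X | Z_i); nothing here is taken verbatim from the reference.

```python
import numpy as np

def entropy(p, axis=None):
    q = np.where(p > 0, p, 1.0)          # replace zero bins so log() is safe; they contribute 0
    return -(p * np.log2(q)).sum(axis=axis)

x_grid = np.linspace(0.0, 100.0, 201)             # candidate target locations (1-D for simplicity)
p_x = np.exp(-0.5 * ((x_grid - 60.0) / 15.0) ** 2)
p_x /= p_x.sum()                                  # prior p(x)

z_grid = np.linspace(0.0, 120.0, 241)             # observation grid

def range_sensor_model(sensor_pos, sigma):
    """p(z | x) on the grid for a hypothetical range sensor with Gaussian noise."""
    mean = np.abs(x_grid[:, None] - sensor_pos)   # noise-free range for each x
    lik = np.exp(-0.5 * ((z_grid[None, :] - mean) / sigma) ** 2)
    return lik / lik.sum(axis=1, keepdims=True)   # normalize over z for each x

candidates = {"sensor_A": range_sensor_model(10.0, 2.0),
              "sensor_B": range_sensor_model(55.0, 8.0)}

expected_H = {}
for name, p_z_given_x in candidates.items():
    p_xz = p_x[:, None] * p_z_given_x             # joint p(x, z)
    p_z = p_xz.sum(axis=0)                        # predicted observation distribution p(z_i)
    p_x_given_z = p_xz / np.where(p_z > 0, p_z, 1.0)       # posterior for each possible z
    expected_H[name] = float((p_z * entropy(p_x_given_z, axis=0)).sum())  # H(X | Z_i)

best = min(expected_H, key=expected_H.get)
print(expected_H, "-> selected:", best)
```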

6 Entropy difference in minimizing the uncertainty of localization The reduction of localization uncertainty attributable to a sensor depends on the difference between:
A. the entropy of the noise-free sensor observation, and
B. the entropy of that sensor's observation model corresponding to the true target location.

7 A. Sensor observation model The sensor observation model corresponding to the true target location is the probability distribution of the sensor observation conditioned on the true target location, p(z_i | x). It incorporates observation error from all sources, including:
- target signal modeling error in the estimation algorithm used by the sensor
- inaccuracy of the sensor hardware
The amount of uncertainty in the sensor observation model may depend on the target location.

8 Determination of the sensor observation model Since the true target location is unknown during target localization and tracking, an estimated target location must be used in its place to determine the sensor observation model.

9 Single-modal target location distribution For a single-modal target location distribution p(x), i.e. one with a single peak, we can use the maximum likelihood estimate (MLE) of the target location, x' = argmax_x p(x), in place of the true target location, and the approximate sensor observation model is p(z_i | x) ≈ p(z_i | x'). When p(x) is single-modal, H(Z_i | x') is the entropy of the sensor observation model for the most likely target location estimate x'.

10 Multimodal target location distribution For a multimodal target location distribution p(x) with local maxima x'^(m), m = 1, 2, ..., M, the entropy of the observation model of sensor i can be approximated by a weighted average over those maxima: H(Z_i | x) ≈ sum_{m=1}^{M} w_m H(Z_i | x'^(m)), with weights w_m proportional to p(x'^(m)). When p(x) is multimodal, the entropy of the sensor observation model is thus averaged over all target locations with locally maximal likelihood.
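The sketch below computes both approximations of H(Z_i | x) on a grid: the MLE-based value H(Z_i | x') for the single-modal case, and the weighted average over local maxima for the multimodal case. The bimodal prior, the Gaussian range observation model, and the choice of weights proportional to p(x'^(m)) are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

x_grid = np.linspace(0.0, 100.0, 401)
# Assumed bimodal prior p(x): two Gaussian bumps of different height.
p_x = (np.exp(-0.5 * ((x_grid - 30.0) / 5.0) ** 2)
       + 0.6 * np.exp(-0.5 * ((x_grid - 75.0) / 5.0) ** 2))
p_x /= p_x.sum()

z_grid = np.linspace(0.0, 120.0, 241)

def H_obs_model(x, sensor_pos=20.0, sigma=3.0):
    """Entropy of the (assumed Gaussian range) observation model p(z_i | x) at location x."""
    lik = np.exp(-0.5 * ((z_grid - abs(x - sensor_pos)) / sigma) ** 2)
    return entropy(lik / lik.sum())

# Single-modal approximation: evaluate at the MLE x' = argmax p(x).
x_mle = x_grid[np.argmax(p_x)]
H_single = H_obs_model(x_mle)

# Multimodal approximation: weighted average over local maxima x'^(m), weights ~ p(x'^(m)).
interior = np.arange(1, x_grid.size - 1)
peaks = interior[(p_x[interior] > p_x[interior - 1]) & (p_x[interior] > p_x[interior + 1])]
weights = p_x[peaks] / p_x[peaks].sum()
H_multi = float(sum(w * H_obs_model(x_grid[k]) for w, k in zip(weights, peaks)))

print(f"H(Z_i | x') at the MLE          : {H_single:.3f} bits")
print(f"weighted average over the peaks : {H_multi:.3f} bits")
```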

11 Relationship of H(Z_i | x) to H(Z_i | x') H(Z_i | x) is actually the entropy of the sensor observation model averaged over all possible target locations. When the entropy of the sensor observation model H(Z_i | x) changes slowly with respect to the target location x, H(Z_i | x') reasonably approximates H(Z_i | x).

12 B. Noise-free sensor observation A noise-free sensor observation is one in which no error is introduced in the observation process. Let Z_i^v denote the noise-free observation of sensor i. Z_i^v assumes no randomness in the observation process with respect to the target location; hence it is a function of the target location X and the sensor location x_i. The target location X is a random variable and the sensor location x_i is a deterministic quantity, so the noise-free sensor observation is itself a random variable.

13 Distribution of the noise-free sensor observation The target location X could be three-dimensional, while the noise-free sensor observation Z_i^v could be two-dimensional. Writing Z_i^v = f(X, x_i), the distribution of the noise-free sensor observation is obtained by mapping the target location distribution through f: p(z_i^v) = ∫_{x : f(x, x_i) = z_i^v} p(x) dx, where f, the observation perspective of sensor i, largely depends on the sensor location x_i.

14 Computing the noise-free sensor observation distribution and its entropy
Let X be the set of target location grid values with non-trivial probability density, and let Z be the set of noise-free sensor observation grid values with non-trivial probability density.
1. For each grid point z_i^v ∈ Z, initialize p(z_i^v) to zero.
2. For each grid point x ∈ X, compute the corresponding grid point z_i^v ∈ Z using z_i^v = f(x, x_i), and update the probability as p(z_i^v) = p(z_i^v) + p(x).
3. Normalize p(z_i^v) so that the total probability over Z is 1.
4. From p(z_i^v), compute the noise-free sensor observation entropy H(Z_i^v).
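A direct rendering of this procedure in Python is sketched below. The 1-D target grid, the flat prior, and the choice of f as a range measurement z = |x − x_i| are illustrative assumptions; the initialize/accumulate/normalize steps follow the slide.

```python
import numpy as np
from collections import defaultdict

def noise_free_entropy(x_grid, p_x, sensor_pos, f, z_bin=1.0):
    """Grid procedure from the slide: accumulate p(z_i^v), normalize, then take the entropy."""
    p_z = defaultdict(float)                      # step 1: p(z_i^v) initialized to zero
    for x, px in zip(x_grid, p_x):
        z = f(x, sensor_pos)                      # step 2: z_i^v = f(x, x_i)
        z_cell = round(z / z_bin) * z_bin         # snap to the observation grid Z
        p_z[z_cell] += px                         # p(z_i^v) <- p(z_i^v) + p(x)
    total = sum(p_z.values())
    p_z = {z: p / total for z, p in p_z.items()}  # step 3: normalize to total probability 1
    H = -sum(p * np.log2(p) for p in p_z.values() if p > 0)   # step 4: H(Z_i^v)
    return p_z, H

# Usage with an assumed 1-D target grid, a flat prior, and f chosen as a range measurement.
x_grid = np.linspace(0.0, 100.0, 501)
p_x = np.ones_like(x_grid) / x_grid.size
_, H_v = noise_free_entropy(x_grid, p_x, sensor_pos=20.0, f=lambda x, xi: abs(x - xi))
print(f"H(Z_i^v) = {H_v:.3f} bits")
```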

15 Relationship of H(Z_i^v) to H(Z_i) H(Z_i) is the entropy of the predicted sensor observation distribution p(z_i) = ∫ p(z_i | x) p(x) dx. The predicted sensor observation distribution p(z_i) becomes the noise-free sensor observation distribution p(z_i^v) when the sensor observation model p(z_i | x) is deterministic, without any uncertainty. The uncertainty in the sensor observation model p(z_i | x) makes the predicted sensor observation entropy H(Z_i) larger than the noise-free sensor observation entropy H(Z_i^v).

16 Approximations to the mutual-information calculation Since I(X; Z_i) = H(Z_i) − H(Z_i | X) and H(Z_i) ≥ H(Z_i^v), the sensor with the maximum entropy difference H(Z_i^v) − H(Z_i | x') probably also has the maximum mutual information. When the sensor observation model has only a small amount of uncertainty, H(Z_i) ≈ H(Z_i^v) and H(Z_i | X) ≈ H(Z_i | x'), so the entropy difference closely approximates I(X; Z_i).
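The resulting selection rule can be sketched as follows: rank candidate sensors by the entropy difference H(Z_i^v) − H(Z_i | x') and pick the largest, as a cheap stand-in for ranking by I(X; Z_i). The grids, the two hypothetical sensors, and the Gaussian range model are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

x_grid = np.linspace(0.0, 100.0, 401)
p_x = np.exp(-0.5 * ((x_grid - 60.0) / 10.0) ** 2)
p_x /= p_x.sum()                                  # prior p(x), assumed single-modal here
z_grid = np.linspace(0.0, 120.0, 241)             # observation-grid bin edges
x_mle = x_grid[np.argmax(p_x)]                    # x': most likely target location

def entropy_difference(sensor_pos, sigma):
    # H(Z_i^v): push the prior through the noise-free mapping z = |x - x_i| onto the z-grid.
    z_v = np.abs(x_grid - sensor_pos)
    hist, _ = np.histogram(z_v, bins=z_grid, weights=p_x)
    H_v = entropy(hist / hist.sum())
    # H(Z_i | x'): entropy of the (assumed Gaussian) observation model at the MLE location.
    lik = np.exp(-0.5 * ((z_grid - abs(x_mle - sensor_pos)) / sigma) ** 2)
    H_model = entropy(lik / lik.sum())
    return H_v - H_model

candidates = {"sensor_A": (10.0, 2.0), "sensor_B": (55.0, 8.0)}   # (position, noise sigma)
scores = {name: entropy_difference(*params) for name, params in candidates.items()}
print(scores, "-> selected:", max(scores, key=scores.get))
```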

17 Why not select on the mutual information I(X; Z_i)? Consider the target location X and the predicted sensor observation Z_i. The target location could be 3-dimensional and the sensor observation could be 2-dimensional; then I(X; Z_i) could be a complex integral over a joint state space of 5 dimensions. With n grid points per dimension, the total cost to select one of K candidate sensors is O(n^5).

18 Complexity of the entropy approach H(Z_i^v) can be computed from p(z_i^v) with complexity O(n^2), but building p(z_i^v) requires a pass over the 3-dimensional target grid, which costs O(n^3). Computing H(Z_i | x') (for single- or multi-modal distributions) requires O(n^2). Thus the cost of computing the entropy difference for one candidate sensor is O(n^3), and the total cost to select one sensor out of K candidate sensors is O(n^3).

19 Reduction in complexity The computational complexity of the mutual information approach is greater than that of the entropy difference approach. Given the power and processing constraints of sensor nodes, the entropy difference approach fares better than the mutual information approach for selecting a sensor for target localization.

20 Results [Figures: localization results for TDOA sensors, and for a mix of TDOA, range, and DOA sensors.]

21 Conclusions The entropy difference approach is simpler to calculate than the mutual information criterion for sensor selection. A sub-optimal sensor can be selected without retrieving sensor data. Reference: H. Wang, K. Yao and D. Estrin, "Information-theoretic approaches for sensor selection and placement in sensor networks for target localization and tracking", CENS Technical Report #52, 2005.