1 Jie Gao, joint work with Amitabh Basu*, Joseph Mitchell, Girishkumar Sabhnani* @ Stony Brook. Distributed Localization using Noisy Distance and Angle Information. To appear in ACM MobiHoc 2006.

2 Localization in sensor networks Given local measurements –Connectivity –Distance measurements –Angle measurements Find –Relative positions –Absolute positions

3 Localization in sensor networks Location info is important for –Integrity of sensor readings –Many basic network functions: topology control, geographical routing, clustering and self-organization.

4 Localization problem Extensively studied. Anchor-based methods –Anchors know positions, e.g., via GPS. –Triangulation-type methods, e.g., [Savvides et al.] Anchor-free methods –Local measurements → global layout. –We use this approach.

5 Anchor-free localization Distance information only –Global optimization: MDS [Shang 03], SDP [Biswas & Ye 04] –Localized, distributed algorithms: mass-spring optimization, robust quadrilateral [Moore 04], etc. Graph rigidity!

6 Our approach Distance + angle information Measurements are noisy. Assume a global north. Upper/lower bounds on the distance and direction of each neighbor. Goal: find an embedding that satisfies all the constraints.
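
A hedged formalization of these constraints (the notation p_i, d_ij, θ_ij is mine, not from the slides): writing p_i = (x_i, y_i) for the unknown position of sensor i, and d_ij, θ_ij for the measured distance and measured direction (relative to the global north) of edge (i, j), a valid embedding must satisfy

    d_ij − ε ≤ ‖p_j − p_i‖ ≤ d_ij + ε,
    θ_ij − δ ≤ arg(p_j − p_i) ≤ θ_ij + δ,

for every edge (i, j), where ε and δ are the distance and angle noise bounds. The feasible set for the displacement p_j − p_i is a non-convex annular sector (the "frustum" of slide 10).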

7 Our results Finding a feasible solution with noisy distance + angle is NP-hard. A distributed, iterative algorithm for a relaxation.

8 Hardness results Accurate distance + angle: trivial. Infinite noise, non-neighbors at distance > 1 = unit disk graph embedding: NP-hard [Breu & Kirkpatrick]. Accurate angle, infinite noise in distance, non-neighbors at distance > 1: NP-hard [Bruck 05]. Accurate distance, infinite noise in angle, non-neighbors at distance > 1: NP-hard [Aspnes et al. 04].

9 This paper 1. ε noise in distance, δ noise in angle: for arbitrarily small ε, δ, finding a feasible solution is NP-hard. 2. Accurate distance, relative angle, non-neighbors at distance > 1: NP-hard. Reduction from 3SAT.

10 Solve a relaxation Use a convex approximation to the non-convex frustum, e.g., a trapezoid. All the constraints are linear. Use linear programming to solve for an embedding. The solution is not unique; compute all of them.
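
One concrete trapezoid (my construction for illustration; the slides only ask for a convex over-approximation such as a trapezoid): for the displacement v = p_j − p_i, keep the two angular half-planes and replace the outer arc by its tangent and the inner arc by its chord:

    sin(θ_ij − δ)·v_x − cos(θ_ij − δ)·v_y ≤ 0                (counterclockwise of the lower ray)
    −sin(θ_ij + δ)·v_x + cos(θ_ij + δ)·v_y ≤ 0               (clockwise of the upper ray)
    cos(θ_ij)·v_x + sin(θ_ij)·v_y ≤ d_ij + ε                 (outer arc → tangent line)
    (d_ij − ε)·cos δ ≤ cos(θ_ij)·v_x + sin(θ_ij)·v_y         (inner arc → chord)

Each inequality is linear in the coordinates of p_i and p_j, and each edge contributes a constant number of them, consistent with the 8m constraint count on slide 14.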

11 Weak deployment regions We solve for Regions of Deployment Weak deployment –All feasible solutions. Upper bound. –Fix a sensor anywhere in its weak region: ∃ a feasible solution for the other sensors.

12 Strong deployment regions We solve for Regions of Deployment Strong deployment –Inherent uncertainty. Lower bound. –Pick any point within each region independently ⇒ a feasible solution.
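
In symbols (my notation for the two notions just described): the weak deployment region of sensor i is W_i = { q ∊ R^2 : some feasible embedding places sensor i at q }, i.e., all positions sensor i can take in some feasible solution. Strong deployment regions S_1, …, S_n are regions such that every independent choice q_i ∊ S_i, made for all sensors simultaneously, is itself a feasible embedding. The W_i give an outer (upper-bound) estimate of the uncertainty, the S_i an inner (lower-bound) estimate.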

13 Linear programming We can also solve weak and strong deployment by LP. Let's look at weak deployment first.

14 Weak deployment and LP LP for feasibility of embedding. n sensors, m edges. Variables: (x_i, y_i) for each sensor i. # variables: 2n, # constraints: 8m. A valid embedding is a point in R^{2n}. The feasible polytope P ⊂ R^{2n}: the collection of all feasible solutions. Weak deployment region of sensor i = projection of P onto the (x_i, y_i)-plane.
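
A minimal runnable sketch of this LP view (not the paper's code; the two-sensor instance, the trapezoid relaxation from the note after slide 10, and all parameter values are illustrative assumptions). Sensor 0 is pinned at the origin and the weak deployment region of sensor 1 is bounded by optimizing a few linear objectives over the feasible polytope:

import numpy as np
from scipy.optimize import linprog

d, eps = 1.0, 0.1                        # measured distance, distance noise
theta, delta = np.pi / 6, np.pi / 12     # measured direction, angle noise

# Variables z = (x0, y0, x1, y1); the displacement v = p1 - p0 equals D @ z.
D = np.array([[-1.0, 0.0, 1.0, 0.0],
              [0.0, -1.0, 0.0, 1.0]])

rows, rhs = [], []
def add(a, b):                           # record the constraint a . v <= b
    rows.append(a @ D); rhs.append(b)

# Trapezoid relaxation of the frustum (see the note after slide 10).
add(np.array([np.sin(theta - delta), -np.cos(theta - delta)]), 0.0)
add(np.array([-np.sin(theta + delta), np.cos(theta + delta)]), 0.0)
u = np.array([np.cos(theta), np.sin(theta)])
add(u, d + eps)
add(-u, -(d - eps) * np.cos(delta))

A_ub, b_ub = np.vstack(rows), np.array(rhs)
A_eq = np.array([[1.0, 0.0, 0.0, 0.0],   # pin sensor 0 at the origin
                 [0.0, 1.0, 0.0, 0.0]])
b_eq = np.zeros(2)

# Bounding box of the projection of the feasible polytope onto (x1, y1),
# i.e., of the weak deployment region of sensor 1.
for name, sign, idx in [("min x1", 1, 2), ("max x1", -1, 2),
                        ("min y1", 1, 3), ("max y1", -1, 3)]:
    c = np.zeros(4); c[idx] = sign
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] * 4)
    print(name, sign * res.fun)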

15 Theory of convex polytopes The feasible polytope P has 8m faces. In general, the complexity of P (its number of vertices) and of its projection can be exponential in m.

16 Solve for weak deployment Our problem has special structure: the weak deployment region has O(m) complexity in the worst case; we can solve it in polynomial time by linear programming; and there is a distributed algorithm that finds the same solution as the global LP.

17 What next? A distributed, iterative algorithm for the weak deployment problem. Show why the complexity of the weak deployment region is O(m). Simulation results. Strong deployment.

18 Forward constraint propagation Each node keeps a current feasible region R_i. Region R_i shrinks region R_j: R_j ← R_j ∩ (R_i ⊕ F_ij), where F_ij is the (relaxed) constraint region for the displacement p_j − p_i. Minkowski sum: X ⊕ Y = {p + q | p ∊ X, q ∊ Y}.

19 Backward constraint propagation When R_j shrinks, R_i can also shrink: R_i ← R_i ∩ (R_j ⊕ (−F_ij)).

20 Iterative algorithm Pin down one node at the origin. Initialize all other regions to R^2. Until all regions stabilize: –For each sensor, compute new regions from all neighbors' regions (both forward & backward propagation). –Shrink its current region to the common intersection.
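
A minimal runnable sketch of slides 18-20 (not the paper's implementation): it specializes the regions and the constraint sets F_ij to axis-aligned rectangles, which slide 26 notes is a legitimate special case, so Minkowski sums and intersections reduce to interval arithmetic. Boxes are (xlo, xhi, ylo, yhi); the toy F_ij used at the end is an assumed rectangle bound on p_j − p_i.

BIG = 1e9                                # stands in for "the whole plane R^2"

def minkowski(a, b):
    # Minkowski sum of two boxes: add interval endpoints coordinate-wise.
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3])

def intersect(a, b):
    # Intersection of two boxes; a result with lo > hi means the constraints conflict.
    return (max(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), min(a[3], b[3]))

def negate(f):
    # -F = {-v : v in F}
    return (-f[1], -f[0], -f[3], -f[2])

def localize(n, F, tol=1e-6):
    """F maps an edge (i, j) to a box constraint on the displacement p_j - p_i."""
    R = {i: (-BIG, BIG, -BIG, BIG) for i in range(n)}
    R[0] = (0.0, 0.0, 0.0, 0.0)          # pin node 0 at the origin
    changed = True
    while changed:                        # until all regions stabilize
        changed = False
        for (i, j), f in F.items():
            # forward: R_j <- R_j ∩ (R_i ⊕ F_ij); backward uses -F_ij.
            for src, dst, g in ((i, j, f), (j, i, negate(f))):
                new = intersect(R[dst], minkowski(R[src], g))
                if any(abs(new[k] - R[dst][k]) > tol for k in range(4)):
                    R[dst], changed = new, True
    return R

# Toy usage: node 1 lies roughly one unit east of node 0.
print(localize(2, {(0, 1): (0.9, 1.1, -0.1, 0.1)})[1])   # -> (0.9, 1.1, -0.1, 0.1)

With the trapezoid constraints the loop is unchanged, but the regions are convex polygons and the two primitives become polygon Minkowski sum and polygon intersection, whose boundary slopes stay within the 8m original slopes (slide 26).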

21 Iterative algorithm correctness The iterative algorithm computes the weak deployment regions. Proof sketch: –Regions always shrink. –When the shrinking stops, the regions have converged to the weak deployment regions. –The algorithm stops after a finite number of steps.

22 Convergence Prove by contradiction. Assume a point p ∉ R_i^* survives in sensor i's region. For every sensor j, propagate the constraints from i to j along all possible paths, and take the common intersection of the resulting regions, say P_j.

23 Convergence Recall p ∉ R_i^*. Thus either 1. some region P_j is empty, or 2. the origin k lies outside P_k. Case 1 is not possible: –The shape of P_j does not depend on p (only its translation does). –So starting instead from a point p* ∊ R_i^*, P_j would still be empty and the LP would be infeasible, a contradiction.

24 Convergence Recall p ∉ R_i^*. Thus either 1. some region P_j is empty, or 2. the origin k lies outside P_k. If case 2 happens: –Reverse the paths, propagating from k back to i. –The point p will be eliminated. –So the algorithm had not yet converged, a contradiction.

25 Why are the regions O(m)? All the operations are Minkowski sums and intersections. Minkowski sum X ⊕ Y: its boundary comes from the boundaries of X and Y.

26 Why are the regions O(m)? All the operations are Minkowski sums and intersections. Slopes of the region boundaries come from the original constraints. There are only 8m different slopes. If we use rectangle constraints, then all the deployment regions are rectangles.
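
One way to make this counting explicit (my phrasing of the argument the slide sketches): for convex regions,

    slopes(X ⊕ Y) ⊆ slopes(X) ∪ slopes(Y)   and   slopes(X ∩ Y) ⊆ slopes(X) ∪ slopes(Y),

so every region the algorithm produces uses only the at most 8m distinct slopes of the original linear constraints. A convex polygon whose edges draw on at most k distinct slopes has at most 2k edges, which gives the O(m) bound.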

27 Convergence rate Nodes randomly deployed. Communication graph: unit disk graph.

28 Robustness to link variation Links switch on ↔ off with probability p ∊ [0, 1]. The deployment regions remain stable.

29 Robustness to link variation Links switch on ↔ off with probability p ∊ [0, 1]. Degradation is due to network disconnection: when p is small, it is slow for the network to get re-connected.

30 Comparison to SDP [Biswas & Ye] SDP only uses noisy distance measurements. We use angle range π/4. Less dependence on the number of anchors.

31 Comparison to SDP [Biswas & Ye] SDP only uses noisy distance measurements. We choose angle range π/4. Two metrics: center and furthest point. WD: weak deployment; SD: strong deployment.

32 Strong deployment –Inherent uncertainty. Lower bound. –Pick any point within each region independently ⇒ a feasible solution.

33 Strong deployment More subtle! One can shrink the region of one node to enlarge the regions of the others. We propose to find the same-shaped region for every node, e.g., a square, as large as possible. Formulate as an LP? Infinitely many constraints?

34 Strong deployment By convexity, if the constraints are satisfied for every pair of corners of the deployment regions, then they are satisfied for every pair of internal points. Formulate an LP with constraints on all pairs of corners. Maximize the region size r.
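
A hedged way to write this LP down (the square parameterization and notation are mine): let each strong deployment region be the square of half-width r centered at c_i, with corner offsets s ∊ {−r, +r}^2. The LP has variables c_1, …, c_n ∊ R^2 and r ≥ 0, maximizes r, and requires, for every edge (i, j) and every pair of corner offsets s, s', that

    (c_j + s') − (c_i + s) ∊ F_ij,

where F_ij is the relaxed (convex, linear) constraint region for p_j − p_i. Because s and s' are ±r in each coordinate, every such constraint is linear in (c_i, c_j, r).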

35 Strong deployment Reduce to weak deployment. Distributed algorithm. –Guess the size r. –Solve for the center of each strong deployment region. –Binary search on r.
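
A short sketch of the binary search (hypothetical helper: feasible(r) stands for one distributed weak-deployment-style solve with region size r, and is not defined here):

def largest_feasible_r(feasible, r_hi=1.0, tol=1e-3):
    # Assumes feasible(0) holds, i.e., the weak deployment problem itself is feasible.
    r_lo = 0.0
    while feasible(r_hi):                 # grow r_hi until the guess is infeasible
        r_lo, r_hi = r_hi, 2.0 * r_hi
    while r_hi - r_lo > tol:              # binary search between feasible / infeasible
        mid = 0.5 * (r_lo + r_hi)
        if feasible(mid):
            r_lo = mid
        else:
            r_hi = mid
    return r_lo                           # largest r known to be feasible (within tol)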

36 Conclusion Localization with noisy distance + angle measurements. Complete the hardness results. Upper/lower bound: weak/strong deployment regions. Linear programming and distributed implementation.

37 Future work Convergence rate of the distributed iterative algorithm. Bound the approximation error introduced by relaxing the non-convex constraints. Generalize the noise model to probability distributions.

38 Questions? Thank you!

