Localization in Wireless Networks

Presentation on theme: "Localization in Wireless Networks"— Presentation transcript:

1 Localization in Wireless Networks
David Madigan Columbia University

2 The Problem Estimate the physical location of a wireless terminal/user in an enterprise, based specifically on the radio wireless communication network

3 Example Applications
Use the closest resource, e.g., printing to the closest printer
Security: in/out of a building
Emergency 911 services
Privileges based on security regions (e.g., in a manufacturing plant)
Equipment location (e.g., in a hospital)
Mobile robotics
Museum information systems

4

5 Physical Features Available for Use
Received Signal Strength (RSS) from multiple access points
Angles of arrival
Time deltas of arrival
Which access point (AP) you are associated with
We use RSS and AP association; RSS is the only feature current commercial hardware can reasonably measure

6 Known Properties of Signal Strength
Signal strength at a location varies as a log-normal distribution with some environment-dependent standard deviation; the variation is caused by people, appliances, climate, etc.
[Histogram: frequency (out of 1000) of observed signal strength (dB)]
The physics: signal strength (SS, in dB) is known to decay with distance (d) as SS = k1 + k2 log d
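A quick simulation of this propagation model; k1, k2, and the standard deviation below are illustrative values, not the talk's estimates:

```python
import math
import random

rng = random.Random(0)

def simulate_rss(d, k1=-40.0, k2=-20.0, sigma=4.0):
    # SS (dB) decays log-linearly with distance d, per the slide:
    # SS = k1 + k2 log d; the environment-dependent variation enters as
    # Gaussian noise on the dB scale (i.e., log-normal in raw power).
    return rng.gauss(k1 + k2 * math.log10(d), sigma)

samples = [simulate_rss(10.0) for _ in range(1000)]
mean_ss = sum(samples) / len(samples)  # close to k1 + k2 = -60 dB at d = 10
```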

7

8 Location Determination via Statistical Modeling
Data collection is slow, expensive (“profiling”)
“Productization”: either the access points or the wireless devices can gather the data
Focus on predictive accuracy

9 Prior Work
Take signal strength measures at many points in the site and do a closest match to these points in signal-strength vector space (discrete, 3-D, etc.) [e.g., Microsoft’s RADAR system]
Take signal strength measures at many points in the site and build a multivariate regression model to predict location (e.g., Tirri’s group in Finland)
Some work has utilized wall thickness and materials
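The first approach (RADAR-style closest match in signal-strength vector space) amounts to a nearest-neighbor lookup against a surveyed signal map; the fingerprint data below are invented for illustration:

```python
def nearest_match(fingerprints, observed):
    # RADAR-style lookup: return the surveyed location whose stored
    # RSS vector is closest (Euclidean) to the observed RSS vector.
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(fingerprints, key=lambda fp: dist2(fp[1], observed))[0]

# Toy signal map: surveyed (x, y) -> RSS (dB) from three access points
fingerprints = [
    ((0, 0), [-40, -70, -80]),
    ((5, 0), [-55, -60, -75]),
    ((5, 5), [-70, -55, -60]),
]
loc = nearest_match(fingerprints, [-52, -62, -74])  # -> (5, 0)
```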

10

11 Krishnan et al. results (Infocom 2004): smoothed signal map per access point + nearest neighbor

12

13 Tree Models

14 What do you predict for the 56-year-old?
[Table of patient records with columns Age, MaxCharlson, Y1Hospital, NumClaims, Y2Hospital; the Y2Hospital value for the 56-year-old is unknown]
[Candidate splits: Age > 70? (1/5 vs 2/2), Age > 30? (3/3 vs 3/4)]

15 [Same table of patient records; candidate splits: Age > 70? (1/5 vs 2/2), NumClaims > 10? (1/6 vs 1/1), Y1Hospital = “yes”? (2/4 vs 1/3)]

16 [Same table; fitted tree: root split Y1Hospital = “yes”, with MaxCharlson > 3 and NumClaims > 10 splits in the subtrees and leaf counts 2/2, 1/1, 3/3, 1/1]
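A fitted classification tree is just nested conditionals. The sketch below uses the split variables from these slides, but the exact structure and thresholds are partly reconstructed from the transcript, so treat it as illustrative:

```python
def predict_y2hospital(age, max_charlson, y1_hospital, num_claims):
    # Nested-conditional form of the tree sketched on the slide; each
    # leaf returns the majority class of the training records reaching
    # it. Age is unused in this particular reconstructed tree.
    if y1_hospital == "yes":
        return "yes" if num_claims > 10 else "no"
    return "yes" if max_charlson > 3 else "no"

pred = predict_y2hospital(56, 22, "no", 11)  # the 56-year-old -> "yes"
```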

17 [Scatterplot of NumClaims vs. Age]

18 5 4 3 2 1 10 20 30 40 50 60 70 Age NumClaims NumClaims < 4
no yes Age > 30 Age > 60 yes no yes no Age < 42 blue NumClaims < 5 no yes no yes red blue NumClaims < 2 red no yes blue Age < 38 NumClaims 5 no yes red blue 4 3 2 1 10 20 30 40 50 60 70 Age

19 Regression Trees
Idea: pick splits to minimize the mean squared error Σ_i (y_i − ŷ_{R(i)})², where ŷ_R is the mean response over the training points falling in region R
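The split-selection step can be sketched for a single feature: try every candidate threshold and keep the one minimizing the summed squared error around the two child means. The data below are made up:

```python
def best_split(xs, ys):
    # Exhaustive search for the threshold on one feature that minimizes
    # the regression-tree criterion: SSE(left) + SSE(right), where each
    # side's prediction is its mean response.
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)
    pairs = sorted(zip(xs, ys))
    best_err, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        err = sse(left) + sse(right)
        if err < best_err:
            best_err, best_thr = err, thr
    return best_thr

thr = best_split([1, 2, 3, 10, 11, 12], [5, 6, 5, 20, 21, 19])  # -> 6.5
```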

20 Nearest Neighbor Methods

21 Nearest Neighbor Methods
k-NN assigns an unknown object to the most common class of its k nearest neighbors
Choice of k? Choice of distance metric?
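The majority-vote rule just described can be sketched in a few lines; the training points and labels are invented for illustration:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of (feature_vector, label). Assign the query the most
    # common label among its k nearest neighbors (Euclidean distance).
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    neighbors = sorted(train, key=lambda t: dist2(t[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

train = [([1, 1], "red"), ([2, 1], "red"), ([8, 9], "blue"),
         ([9, 8], "blue"), ([9, 9], "blue")]
label = knn_predict(train, [2, 2], k=3)  # -> "red"
```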

22 [Scatterplot: k-NN on the (age, numClaims) plane]

23

24

25

26 library(kknn)
myModel <- kknn(yesno ~ dollar + bang + money, spam.train, spam.test, k = 3)
> names(myModel)
[1] "fitted.values" "CL" "W" "D" "prob" "response" "distance" "call" "terms"
> myModel$fitted.values
[1] n y y n y n y y y y n n n n n y n n y n n n y n n n y n y n n y n y n n n n n n y y n n n n n n y n n n n y n n n n y y y n n n n y n y n y n n n n n y n

27 Bayesian Graphical Model Approach
[Graphical model: the location (X, Y) determines the distances D1–D5 to the five access points, which in turn determine the signal strengths S1–S5]

28 [The same model replicated over training locations (X1, Y1), …, (Xn, Yn), with per-access-point coefficients b0j, b1j (j = 1,…,5) shared across locations]

29 Plate Notation
[Plate diagram: Xi, Yi → Dij → Sij for i = 1,…,n, with coefficients b0j, b1j for j = 1,…,5]

30

31

32

33 Hierarchical Model
[Graphical model as before, with per-access-point coefficients b1–b5 drawn from a common hyperparameter b]

34 Hierarchical Model
[Plate diagram as before, with hyperparameters m0, t0 on b0j and m1, t1 on b1j]
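Read off the plates, the hierarchical model has roughly the following form; normal distributions are assumed here as the standard choice for this kind of model, since the slides do not spell them out:

```latex
S_{ij} \sim N\!\big(b_{0j} + b_{1j}\,\log d_{ij},\ \sigma^2\big), \qquad
d_{ij} = \sqrt{(X_i - x_j)^2 + (Y_i - y_j)^2},
```
```latex
b_{0j} \sim N(m_0, t_0), \qquad b_{1j} \sim N(m_1, t_1),
```

with hyperpriors on m0, t0, m1, t1 and priors on the unknown locations (Xi, Yi); (xj, yj) are the known access-point coordinates.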

35 Full joint: [m0][t0][m1][t1][b0|m0,t0][b1|m1,t1][S|b0,b1,D][D|X,Y][X][Y]
“Gibbs sampling”: Metropolis moves one variable at a time, drawing from its full conditional, e.g. for b1:
[b1 | b0,m1,t1,m0,t0,S,D,X,Y] ∝ [b1|m1,t1][S|b0,b1,D]
Alternatives for the individual draws: sampling importance resampling, rejection sampling, slice sampling
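A minimal Metropolis-within-Gibbs loop for one unknown location, assuming the propagation coefficients are known (in the talk they were sampled too, via WinBUGS); the access-point coordinates, coefficients, and observations below are all invented for illustration:

```python
import math
import random

rng = random.Random(1)

APS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # assumed known AP coordinates
B0, B1, SIGMA = -40.0, -20.0, 4.0             # assumed propagation parameters

def loglik(x, y, obs):
    # log p(S | X, Y): independent Gaussians around the path-loss mean
    total = 0.0
    for (ax, ay), s in zip(APS, obs):
        d = max(math.hypot(x - ax, y - ay), 0.1)
        mu = B0 + B1 * math.log10(d)
        total -= 0.5 * ((s - mu) / SIGMA) ** 2
    return total

def metropolis(obs, n=5000, step=1.0):
    # One-variable-at-a-time Metropolis moves, as on the slide
    x, y = 5.0, 5.0
    ll = loglik(x, y, obs)
    xs, ys = [], []
    for _ in range(n):
        for coord in (0, 1):
            px = x + rng.gauss(0.0, step) if coord == 0 else x
            py = y + rng.gauss(0.0, step) if coord == 1 else y
            pll = loglik(px, py, obs)
            if math.log(rng.random() + 1e-300) < pll - ll:  # accept/reject
                x, y, ll = px, py, pll
        xs.append(x)
        ys.append(y)
    half = n // 2  # discard the first half as burn-in
    return sum(xs[half:]) / (n - half), sum(ys[half:]) / (n - half)

# Signal strengths generated as if the device sat near (3, 4)
est = metropolis([-54.0, -58.1, -56.5])
```

The posterior mean returned here is the point estimate; keeping the whole chain instead gives the predictive distribution for location mentioned on the next slide.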

36

37 Pros and Cons
Bayesian model produces a predictive distribution for location
MCMC can be slow
Difficult to automate MCMC (convergence issues)
Perl-WinBUGS (Perl selects training and test data, writes the WinBUGS code, calls WinBUGS, parses the output file)

38 What if we had no locations in the training data?

39

40

41 Zero Profiling?
Simple sniffing devices can gather signal strength vectors from available WiFi devices
Can do this repeatedly
Locations of the access points

42 Why does this work?
Prior knowledge about the distance–signal strength relationship
Prior knowledge that access points behave similarly
Estimating several locations simultaneously

43 Corridor Effects
[Model as before, with corridor indicators C1–C5 added between the distances and the signal strengths]

44 Results for N=20, no locations
[Table: average error by corridor main effect and corridor–distance interaction]
With a mildly informative prior on the distance main effect:
[Table: average error by corridor main effect and corridor–distance interaction]

45

46 Discussion
Informative priors
Convenience and flexibility of the graphical modeling framework
Censoring (30% of the signal strength measurements)
Repeated measurements & normal error model
Tracking
Machine learning-style experimentation is clumsy with Perl-WinBUGS

47

48 Prior Work
Use physical characteristics of signal strength propagation and build a model augmented with a wall attenuation factor [RADAR; INFOCOM 2000], based on [Rappaport 1992]
Needs a detailed (wall) map of the building; model portability needs to be determined

