
UNIVERSITY OF MASSACHUSETTS, AMHERST – Department of Computer Science. Routing and Data Dissemination. Presented by: Li, Huan; Liu, Junning.

Presentation transcript:

1 Routing and Data Dissemination. Presented by: Li, Huan; Liu, Junning

2 Outline: motivation and challenges; the basic ideas of three routing and data dissemination schemes for sensor networks; some thoughts on comparing the data dissemination schemes

3 Differences with Current Networks: it is difficult to pay special attention to any individual node; information is collected within a specified region, and neighbors collaborate. Sensors may be inaccessible: embedded in physical structures, or thrown into inhospitable terrain.

4 Differences with Current Networks: sensor networks are deployed in a very large, ad hoc manner with no static infrastructure. They suffer substantial changes as nodes fail (battery exhaustion, accidents) and as new nodes are added.

5 Differences with Current Networks: user and environmental demands also contribute to dynamics (nodes move, objects move). Sensor networks are data-centric and application-centric, location-aware, and time-aware.

6 Overall Design of Sensor Networks. One possible solution: Internet technology coupled with an ad-hoc routing mechanism. Each node has an IP address; each node can run applications and services; nodes establish an ad-hoc network among themselves when deployed; application instances running on each node can communicate with each other.

7 Why Different and Difficult? A sensor node is not an identity (address); communication is content-based and data-centric. For example: "Where are the nodes whose temperatures will exceed 10 degrees in the next 10 minutes?" "Tell me the location of the object (with an interest specification) every 100 ms for 2 minutes."

8 Why Different and Difficult? Multiple sensors collaborate to achieve one goal. Intermediate nodes can perform data aggregation and caching in addition to routing: where, when, and how?

9 Why Different and Difficult? Not node-to-node packet switching, but node-to-node data propagation. High-level tasks are needed: "At what speed and in what direction was that elephant traveling?" "Is it time to order more inventory?"

10 Challenges: energy-limited nodes. Computation: aggregate data; suppress redundant routing information. Communication: bandwidth-limited and energy-intensive. Goal: minimize energy dissipation.

11 Challenges. Scalability: ad-hoc deployment at large scale; fully distributed operation without global knowledge; large numbers of sources and sinks. Robustness: unexpected sensor node failures. Dynamic change: no a-priori knowledge; sink mobility; moving targets.

12 Challenges: topological and geographic issues. Time: out-of-date data is not valuable; the value of data is a function of time, location, and the sensor reading itself. Is there a need for general techniques spanning different sensor applications? Small chip-based sensor nodes; large sensors, e.g., radar; moving sensors, e.g., robots.

13 SPIN: The Goal. Broadcast with minimum energy. W. R. Heinzelman, J. Kulik, H. Balakrishnan.

14 Conventional Approach: flooding, i.e., send to all neighbors (e.g., routing table updates). [Figure: example topology with nodes A through G.]
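The flooding behavior above can be sketched as a small Python simulation; the topology, node names, and function are my own illustration, not code from the talk. Counting transmissions shows the redundancy that SPIN later avoids:

```python
from collections import deque

def flood(adjacency, origin, data):
    """Flood `data` from `origin`: every node forwards to all its neighbors.

    Returns the total number of transmissions; duplicates are simply
    dropped by the receiver, but they still cost energy to send.
    """
    seen = {origin}
    queue = deque([origin])
    transmissions = 0
    while queue:
        node = queue.popleft()
        for neighbor in adjacency[node]:
            transmissions += 1          # every copy costs one send
            if neighbor not in seen:    # duplicate copies are discarded
                seen.add(neighbor)
                queue.append(neighbor)
    return transmissions

# Illustrative topology with nodes A-G (names are assumptions):
adj = {"A": ["B", "C"], "B": ["A", "D", "E"], "C": ["A", "F", "G"],
       "D": ["B"], "E": ["B"], "F": ["C"], "G": ["C"]}
```

On this 7-node topology, flooding from A costs 12 transmissions (two per edge) to reach 6 other nodes, so half the sends are wasted.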

15 Resource Inefficiencies: implosion, data overlap, and resource blindness. [Figure: implosion, where node A's data reaches D twice via B and C; overlap, where two sensors report overlapping data regions q, r, s.]

16 What is the optimum protocol? An "ideal" protocol uses shortest-path routes, avoids overlap, and spends minimum energy, but it needs global topology information. [Figure: example topology with nodes A through G.]

17 Two basic ideas: exchanging sensor data may be expensive, but exchanging data about sensor data may not be; nodes need to monitor and adapt to changes in their own energy resources.

18 SPIN Family (Sensor Protocols for Information via Negotiation). Data negotiation: meta-data (data naming), application-level control, modeling "ideal" data paths. SPIN messages: ADV (advertise data), REQ (request specific data), DATA (the requested data). Resource management. [Figure: A sends ADV to B, B replies with REQ, A sends DATA.]

19 SPIN-PP example: A sends ADV to B; B responds with REQ; A sends DATA. [Figure: the three messages exchanged between A and B.]

20 SPIN on Point-to-Point Networks. SPIN-PP: a 3-stage handshake protocol; advantages: simple, with minimal start-up cost. SPIN-EC: SPIN-PP plus a low-energy threshold; it modifies its behavior based on current energy resources.
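A minimal sketch of the 3-stage SPIN-PP handshake and the SPIN-EC energy threshold, in Python; the class, message tuples, and cost values are hypothetical illustrations of the negotiation described above, not the paper's actual message formats:

```python
class SpinNode:
    """Toy SPIN-PP/SPIN-EC node (hypothetical message formats and costs).

    Data is transferred only after an ADV/REQ negotiation over small
    meta-data; a SPIN-EC node stops participating below its energy threshold.
    """
    def __init__(self, name, energy=100.0, threshold=10.0):
        self.name = name
        self.energy = energy
        self.threshold = threshold      # SPIN-EC low-energy threshold
        self.store = {}                 # meta-data name -> data

    def advertise(self, meta):
        # ADV carries only meta-data (16 bytes vs. 500-byte data in the tests)
        return ("ADV", meta)

    def on_adv(self, meta):
        if self.energy <= self.threshold:
            return None                 # SPIN-EC: too low on energy to help
        if meta in self.store:
            return None                 # negotiation suppresses duplicates
        return ("REQ", meta)

    def on_req(self, meta):
        self.energy -= 1.0              # toy energy cost for sending DATA
        return ("DATA", meta, self.store[meta])

    def on_data(self, meta, data):
        self.store[meta] = data

# Three-stage handshake between A and B:
a, b = SpinNode("A"), SpinNode("B")
a.store["temp@region1"] = b"...500-byte reading..."
_, meta = a.advertise("temp@region1")
req = b.on_adv(meta)                    # B requests: it lacks this data
if req:
    _, _, payload = a.on_req(meta)
    b.on_data(meta, payload)
```

After the handshake, a second ADV for the same meta-data draws no REQ from B, which is exactly how negotiation avoids the implosion of flooding.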

21 Test Network: 25 nodes; antenna reach = 10 meters; average degree = 4.7 neighbors; 59 edges; network diameter = 8 hops; data = 500 bytes; meta-data = 16 bytes.

22 Unlimited Energy Simulations. Flooding converges first (no queuing delays). SPIN-PP reduces energy by 70% and sends no redundant DATA messages. [Plot: curves for SPIN-PP, Ideal, and Flooding.]

23 Limited Energy Simulations. SPIN-EC distributes an additional 20% of the data. [Plot: curves for Ideal, SPIN-EC, SPIN-PP, and Flooding.]

24 Conclusions: SPIN successfully uses meta-data negotiation to solve the implosion and overlap problems of simple flooding and gossiping; it adds resource-adaptive enhancements; and it is a simple scheme with small communication overhead, yet its performance is close to the ideal.

25 Future work: consider the cost not only of communicating data but also of synthesizing it, making the resource-adaptation protocols more realistic; queuing delay and the loss-prone nature of wireless channels can be incorporated into the experiments.

26 Limitations: the SPIN-EC (Energy-Constrained) strategy may be too simple; a topology-dependent strategy is needed, e.g., a node on a narrow bridge connecting two connected components should be more energy-conservative. The "ideal" baseline used for comparison is ideal only in terms of data dissemination rate, so it is not really "ideal" when energy or other resources are limited; a new goal function is needed.

27 Directed Diffusion: A Scalable and Robust Communication Paradigm for Sensor Networks. C. Intanagonwiwat, R. Govindan, D. Estrin.

28 Application Example: Remote Surveillance. E.g., "Give me periodic reports about animal locations in region A every t seconds." "Tell me in what direction the vehicle in region Y is moving."

29 Basic Idea: in-network data processing (e.g., aggregation, caching); distributed algorithms using localized interactions; application-aware communication primitives expressed in terms of named data.

30 Elements of Directed Diffusion. Naming: data is named using attribute-value pairs. Interests: a node requests data by sending interests for named data. Gradients: gradients are set up within the network to "draw" events, i.e., data matching the interest. Reinforcement: the sink reinforces particular neighbors to draw higher-quality (higher data rate) events.

31 Naming: content-based naming, where tasks are named by a list of attribute-value pairs and the task description specifies an interest for data matching those attributes. Animal-tracking example. Interest (task) description: Type = four-legged animal; Interval = 20 ms; Duration = 1 minute; Location = [-100, -100; 200, 400]. Reply (node data): Type = four-legged animal; Instance = elephant; Location = [125, 220]; Confidence = 0.85; Time = 02:10:35.
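The attribute-value matching in this example can be illustrated with a small Python check; the dictionary keys and the `matches` helper are assumptions mirroring the slide's example, not an API from the paper:

```python
def matches(interest, event):
    """Return True if a named event matches an interest.

    Hypothetical attribute-value scheme modeled on the animal-tracking
    example: the type must match and the event's location must fall
    inside the interest's rectangle.
    """
    if event["type"] != interest["type"]:
        return False
    x1, y1, x2, y2 = interest["rect"]   # [-100, -100; 200, 400]
    ex, ey = event["location"]
    return x1 <= ex <= x2 and y1 <= ey <= y2

interest = {"type": "four-legged animal", "interval_ms": 20,
            "duration_s": 60, "rect": (-100, -100, 200, 400)}
event = {"type": "four-legged animal", "instance": "elephant",
         "location": (125, 220), "confidence": 0.85, "time": "02:10:35"}
```

Here the elephant at (125, 220) matches, while the same animal reported outside the rectangle would not; this content-based test replaces any notion of a destination address.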

32 Interest: the sink periodically broadcasts interest messages to each of its neighbors. Every node maintains an interest cache; each item corresponds to a distinct interest; no information about the sink is kept. Interest aggregation: interests with identical types and completely overlapping rectangle attributes are aggregated. Each cache entry has several fields: a timestamp (the last received matching interest) and several gradients (data rate, duration, direction).

33 Setting Up Gradients. Interest = interrogation; gradient = who is interested (data rate, duration, direction). A neighbor's choices for propagating interests: 1. flooding; 2. geographic routing; 3. using cached data to direct interests. [Figure: interests travel from the sink toward the source, setting up gradients.]

34 Data Propagation: a sensor node computes the highest requested event rate among all its outgoing gradients. When a node receives data, it finds the matching interest entry in its cache, examines the gradient list, and sends out data at each gradient's rate. The cache keeps track of recently seen data items (loop prevention). Data messages are unicast individually to the relevant neighbors.
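The interest-cache and data-propagation steps above can be sketched as follows; the `DiffusionNode` class and its fields are hypothetical bookkeeping, a simplified illustration rather than the paper's data structures:

```python
class DiffusionNode:
    """Sketch of directed-diffusion data propagation (fields assumed).

    The interest cache maps an interest type to per-neighbor gradients;
    incoming data is forwarded along matching gradients and remembered
    for loop prevention.
    """
    def __init__(self):
        self.gradients = {}     # interest type -> {neighbor: data rate}
        self.seen = set()       # recently seen (type, event id) pairs

    def on_interest(self, itype, neighbor, rate):
        # a received interest sets up a gradient toward that neighbor
        self.gradients.setdefault(itype, {})[neighbor] = rate

    def sensing_rate(self, itype):
        # sense at the highest rate requested across outgoing gradients
        return max(self.gradients.get(itype, {}).values(), default=0)

    def on_data(self, itype, event_id):
        if (itype, event_id) in self.seen:      # loop prevention
            return []
        self.seen.add((itype, event_id))
        # unicast individually to each neighbor with a matching gradient
        return [(nbr, rate) for nbr, rate in
                self.gradients.get(itype, {}).items() if rate > 0]
```

A node holding a 2 events/sec gradient toward one neighbor and a 0.02 events/sec gradient toward another senses at 2 events/sec and down-converts per gradient; a repeated event id forwards nowhere.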

35 Reinforcing the Best Path. Reinforcement = increased interest sent along a low-rate event path. To reinforce a path, a node chooses: 1. at least one neighbor; 2. the neighbor from whom it first received the latest event (low delay); or 3. all neighbors from which new events were recently received. [Figure: the sink reinforces one path back toward the source.]
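Rule 2 above (reinforce the neighbor that delivered the latest event first) reduces to a one-line selection; `arrival_log` is an assumed bookkeeping structure mapping each neighbor to the arrival time of the latest event from it:

```python
def pick_reinforced_neighbor(arrival_log):
    """Choose the neighbor to reinforce: the one whose copy of the
    latest event arrived earliest (hence the lowest-delay path)."""
    return min(arrival_log, key=arrival_log.get)
```

For example, if B's copy of the event arrived at t = 0.8 s and C's at t = 0.3 s, C is reinforced.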

36 Local Behavior Choices. For propagating interests: in the example, flooding; more sophisticated behaviors are possible, e.g., based on cached information or GPS. For setting up gradients: data-rate gradients are set up toward neighbors who send an interest; others are possible: probabilistic gradients, energy gradients, etc.

37 Local Behavior Choices. For data transmission: multi-path delivery with selective quality along different paths; probabilistic forwarding; single-path delivery, etc. For reinforcement: reinforce paths based on observed delays, losses, variances, etc.

38 Initial Simulation Study of Diffusion. Key metric: average dissipated energy per event delivered, which indicates energy efficiency and network lifetime. Diffusion is compared to flooding and to a centrally computed tree (omniscient multicast).

39 Diffusion Simulation Details. Simulator: ns-2. Network size: 50-250 nodes. Transmission range: 40 m. Constant density: 1.95x10^-3 nodes/m^2 (9.8 nodes within a radius). MAC: a modified contention-based MAC. Energy model: mimics a realistic sensor radio [Pottie 2000]: 660 mW in transmission, 395 mW in reception, and 35 mW when idle.

40 Diffusion Simulation: a surveillance application. 5 sources are randomly selected within a 70 m x 70 m corner of the field; 5 sinks are randomly selected across the field. High data rate: 2 events/sec; low data rate: 0.02 events/sec. Event size: 64 bytes; interest size: 36 bytes. All sources send the same location estimate in the base experiments.

41 Average Dissipated Energy. [Plot: average dissipated energy (Joules/node/received event) versus network size, with curves for Diffusion, Omniscient Multicast, and Flooding.] Diffusion can outperform flooding and even omniscient multicast by suppressing duplicate location estimates.

42 Conclusions: directed diffusion can leverage data processing and aggregation inside the network; it achieves the desired global behavior through localized interactions; and it empirically adapts to the observed environment.

43 Comments: the primary concern is energy; the evaluation uses simulations only; and with only five sources and five sinks, how is scalability examined?

44 TTDD: A Two-Tier Data Dissemination Model for Large-Scale Wireless Sensor Networks. Haiyun Luo, Fan Ye, Jerry Cheng, Songwu Lu, Lixia Zhang. UCLA CS Dept.

45 Assumptions: fixed source and sensor nodes; mobile or stationary sinks; nodes densely deployed in a large field; position-aware sensor nodes (sinks not necessarily). Once a stimulus appears, the sensors surrounding it collectively process the signal, and one becomes the source that generates the data reports.

46 Sensor Network Model. [Figure: a stimulus sensed near a source node, with data delivered to a sink.]

47 Mobile Sinks cause excessive power consumption, increased wireless transmission collisions, and state-maintenance overhead.

48 Goal and Idea: efficient and scalable data dissemination from multiple sources to multiple mobile sinks. A two-tier forwarding model: the source proactively builds a grid structure, localizing the impact of sink mobility on data forwarding, so that only a small set of sensor nodes maintains forwarding state.

49 Grid Setup: the source proactively divides the plane into α x α square cells, with itself at one of the crossing points of the grid. The source calculates the locations of its four neighboring dissemination points and sends a data-announcement message to reach those neighbors using greedy geographic forwarding. The node serving a point is called a dissemination node. This continues recursively across the grid.
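The grid setup can be sketched as a geometric computation; this is an illustrative Python sketch, assuming a field with lower-left corner at (0, 0) (the function names and the field representation are my assumptions, not the paper's):

```python
def dissemination_points(source, alpha, field):
    """All grid crossing points a source's grid would cover.

    The source sits at one crossing point, and the plane is tiled with
    alpha x alpha cells; `field` = (xmax, ymax) bounds the sensor field.
    """
    sx, sy = source
    xmax, ymax = field
    points = []
    x = sx % alpha                      # leftmost crossing column
    while x <= xmax:
        y = sy % alpha                  # bottom crossing row
        while y <= ymax:
            points.append((x, y))
            y += alpha
        x += alpha
    return points

def four_neighbors(point, alpha):
    """The four neighboring dissemination points a node announces to."""
    x, y = point
    return [(x + alpha, y), (x - alpha, y), (x, y + alpha), (x, y - alpha)]
```

A source at (5, 5) with alpha = 10 in a 25 m x 25 m field yields a 3 x 3 grid of dissemination points; in TTDD the announcement recursion visits exactly these points, with the sensor node nearest each point serving as the dissemination node.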

50 TTDD Basics. [Figure: the source's grid of dissemination nodes, built by data announcements; a sink's query reaches its immediate dissemination node and is forwarded along the grid toward the source, and data flows back.]

51 TTDD with Mobile Sinks. [Figure: as a sink moves, trajectory forwarding relays data from its immediate dissemination node to the sink's new location.]

52 TTDD with Multiple Mobile Sinks. [Figure: two sources, each with its own grid of dissemination nodes, delivering data to multiple mobile sinks via trajectory forwarding.]

53 Grid Maintenance. Issues: efficiency, and handling unexpected dissemination-node failures. Solutions: the source sets a grid lifetime in the data announcement; DN replication, where each dissemination node recruits several sensor nodes from among its one-hop neighbors and replicates the location of its upstream DN to them; a failed DN is detected and replaced on demand by ongoing query and data flows.

54 Grid Maintenance. [Figure: a dissemination node fails, interrupting the data path.]

55 Grid Maintenance (cont'd). [Figure: the failed dissemination node is replaced and data forwarding resumes.]

56 ns-2 Simulation. Metrics: energy consumption, delay, and success rate. Impacts studied: cell size; number of sources and sinks; sink mobility; node failure rates.

57 Conclusions. TTDD, a two-tier data dissemination model, exploits sensor nodes being stationary and location-aware to construct and maintain a grid structure with low overhead. It is the first infrastructure approach in semi-stationary sensor networks, and it is efficient and effective in supporting mobile sinks: sources are proactive, and the impact of sink mobility is localized.

58 Limitations and Future Work: the required knowledge of cell size; greedy geographic routing failures (it is not clear how greedy geographic forwarding behaves with respect to neighbor range, which may lead to finding two dissemination nodes for one point); mobile stimuli; mobile sensor nodes; sink mobility limited in speed; data aggregation.

59 Comparison of routing algorithms

Algorithm          | Data efficiency | Energy efficiency (data/energy ratio)                          | State complexity
Flooding           | Fastest         | Low, because of implosion                                      | Small, upstream
Gossiping          | Slowest         | Lowest (No. 7): random walk                                    | None
Rumor Routing      | Very slow       | Very low (No. 6)                                               | Some
SPIN               | Very fast       | Higher than the above; SPIN-EC close to ideal                  | Data-neighbor pairs
Directed Diffusion | Quite fast      | No. 3: higher than TTDD (global flooding + strong aggregation) | Complex: neighbor x interest
TTDD               | Very fast       | No. 2: reasonable (local flooding + reasonable aggregation)    | OK: four neighbors, constant
IP Multicast       | Fastest         | Low, because of heavy machinery ("big" nodes)                  | Most complex

60 Discussion: source-initiated or sink-initiated, and why?

61 Discussion (cont'd): should we build more infrastructure or not, and what is the trade-off?

62 U NIVERSITY OF M ASSACHUSETTS, A MHERST – Department of Computer Science The End Thank you!

