1 GDI Environmental monitoring app: Data & lessons learned
Robert Szewczyk, Joe Polastre, Alan Mainwaring, David Culler
January 15, 2002

2 Outline
–Application overview
–Sensor node analysis
–Network analysis
–Conclusions

3 Great Duck Island Petrel monitoring
Goal: build ecological models for breeding preferences of Leach’s Storm Petrel
–Burrow (nest) occupancy during incubation
–Differences in the micro-climates of active vs. inactive burrows
–Environmental conditions during the 7-month breeding season
Inconspicuous operation
–Reduce the “observer effect”
Sensor network
–Lifetime, size, quantity requirements
–Environmental conditions
Data
–As much as possible in the power budget
Predictable operation
–Confidence in collected readings
Unattended, off-the-grid operation

4 Application context

5 Application architecture
[Architecture diagram: sensor nodes in a sensor patch connect over the patch network to a gateway; a transit network and base-remote link carry data to the basestation, whose data service makes it available over the Internet to clients for data browsing and processing.]

6 Sensor Patch Network
Transmit-only network
Single hop
Repeaters
–2 hops initially
–Most energy challenged
Adheres to the power budget
Nodes:
–Approximately 50
–Half in burrows, half outside
–RF unpredictable
»Burrows
»Obstacles
»Drop packets or retry?
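Below is a minimal sketch (in Python, purely illustrative, not the deployed TinyOS/nesC code) of the transmit-only behavior this slide describes: each node periodically samples, broadcasts a single unacknowledged packet, and goes back to sleep. The ~70 s period is an assumption, chosen to be consistent with the per-node packet counts reported later in the deck.

    import random
    import time

    SAMPLE_PERIOD_S = 70    # assumed; consistent with ~90,000 packets in 2.5 months
    JITTER_S = 2            # small random offset so nodes do not stay phase-locked

    def sample_sensors():
        # Placeholder for reading light, temperature, humidity, and thermopile.
        return {"light": 0, "temp_c": 0.0, "rh": 0.0, "thermopile": 0}

    def broadcast(reading):
        # Placeholder for a single unacknowledged CSMA broadcast toward a repeater.
        pass

    while True:
        broadcast(sample_sensors())   # transmit only: drop on failure, no retries
        time.sleep(SAMPLE_PERIOD_S + random.uniform(-JITTER_S, JITTER_S))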

7 Transit Network
Two implementations
–Linux (CerfCube), 802.11b
–Relay mote, RF antennae
Evaluation criteria:
–Reliability
–Power budget
Duty cycle of sensor nodes dictates transit network duty cycle
Implementation of transit network depends on:
–Distance
–Obstacles

8 Transit Network
Renewable energy sources
–CerfCube needs 60 Wh/day
–Assuming an average peak of 1 direct-sunlight hour per day:
–Panel must be 924 in² (30” x 30”) for a 5” x 5” device!
–A mote only needs 2 Wh per day, or a 6” x 6” panel
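A minimal sketch of the panel-sizing arithmetic on this slide. The assumed panel output of about 100 W/m² in direct sun (roughly 10% efficiency at 1000 W/m² insolation) is not stated on the slide; with that assumption the result comes out very close to the quoted 924 in² and 6” x 6” figures.

    SQIN_PER_SQM = 1550.0                       # square inches per square meter
    PANEL_W_PER_SQIN = 100.0 / SQIN_PER_SQM     # ~0.065 W per in² in full sun (assumed)

    def panel_area_sqin(daily_wh, peak_sun_hours=1.0):
        """Panel area (in²) needed to harvest daily_wh given the peak sun hours."""
        required_watts = daily_wh / peak_sun_hours
        return required_watts / PANEL_W_PER_SQIN

    print(panel_area_sqin(60.0))   # CerfCube: ~930 in², roughly 30" x 30"
    print(panel_area_sqin(2.0))    # mote: ~31 in², roughly 6" x 6"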

9 Base Station / Wide Area Network
Disconnected operation and multiple levels of state
–Laptop
»DirecWay satellite WAN
»PostgreSQL
»47% uptime
–Redundancy and replication
»Increases the number of points of failure
–Remote access
»Physical access limited
–Keep state in all areas of the network
–Resiliency to
»Disconnection
»Network failures
»Packet loss
–Potential solution: keep local caches with synchronization
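A minimal sketch of the “local caches with synchronization” idea: each tier appends received packets to a local log and replays the backlog when the uplink comes back. The file name and functions here are illustrative assumptions, not the deployed software.

    import os

    CACHE = "packet_cache.log"

    def cache_packet(line):
        """Append every received packet locally so WAN outages lose no data."""
        with open(CACHE, "a") as f:
            f.write(line + "\n")

    def synchronize(upload):
        """When the satellite link is up, push the backlog and clear the cache."""
        if not os.path.exists(CACHE):
            return
        with open(CACHE) as f:
            for line in f:
                upload(line.rstrip("\n"))   # e.g., INSERT into the remote PostgreSQL DB
        os.remove(CACHE)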

10 Sensor data – building confidence
Data collected
–Light levels, temperature, relative humidity, thermopile
–No new science about petrel breeding behavior, but many insights into how to build sensors that would yield that science

11 Temperature and light measurements
Light measurements
–Uncalibrated
–Little more than a binary reading
–Important in building confidence in the sensor readings
Temperature measurements
–Digital sensor
–Observed resolution of 2 °C, rather than the expected 0.5 °C
–Measured temperature inside the enclosure, rather than ambient temperature
–On cloudy days, good correlation with Coast Guard measurements
–Collected plausible readings in the −10 to 60 °C range
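A minimal sketch of the kind of plausibility check implied by the last bullet, flagging converted temperature readings that fall outside the −10 to 60 °C range the slide treats as plausible; the raw-to-Celsius conversion is assumed to have been done already.

    PLAUSIBLE_C = (-10.0, 60.0)   # range the slide treats as plausible

    def plausible(temp_c, lo=PLAUSIBLE_C[0], hi=PLAUSIBLE_C[1]):
        """True if a converted temperature reading falls in the plausible range."""
        return lo <= temp_c <= hi

    readings = [12.5, 71.0, -3.0, -40.0]
    flagged = [t for t in readings if not plausible(t)]   # [71.0, -40.0]
    print(flagged)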

12 Humidity measurements
Sensor characteristics
–Analog sensor, the largest sensor on the board
–Special packaging characteristics
Problems encountered
–Wet sensors short out the battery; in extreme cases they crash motes
–Off-the-scale readings – 13% outside the expected range
Design direction
–Switch to a much smaller digital sensor with an integrated heater

13 Thermopile measurements
Sensor characteristics
–Passive IR and ambient temperature sensor
–Packaging requirements
Encountered problems
–Calibration coefficients
Interpretation of data
–Good correlation between the thermistor and the digital temperature readings
–Difficult to interpret the data with confidence, even as an occupancy detector
»Absence of any periodic patterns
»Absence of ground truth
Future directions
–Re-implement using a digital module
–Investigate alternative occupancy sensors

14 Thermopile data (cont.)

15 Power Management
Expected 6+ months at a 3% duty cycle
–Real-world performance MUCH worse – the best node lasted only 2.5 months
Correlation between packet success rate and battery voltage
–Boost converter provides less consistency than expected
–Batteries can be drained down to 0.8 V per cell; reliability is poor below 1.1 V per cell
[Figure: battery voltage at node 57, the most reliable mote from the initial deployment. The last packet from that node on that set of batteries was received on 9/24; node reliability declined drastically after 9/22.]
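A back-of-the-envelope version of the “6+ months at a 3% duty cycle” expectation. The battery capacity and current draws below are typical textbook figures for Mica-class motes on a pair of AA cells, not numbers from the slide.

    CAPACITY_MAH = 2800.0      # assumed capacity of a pair of AA cells
    ACTIVE_MA = 20.0           # assumed draw with CPU and radio on
    SLEEP_MA = 0.030           # assumed deep-sleep draw
    DUTY = 0.03                # 3% duty cycle

    avg_ma = DUTY * ACTIVE_MA + (1 - DUTY) * SLEEP_MA
    lifetime_months = CAPACITY_MAH / avg_ma / 24 / 30
    print(round(lifetime_months, 1))   # ~6.2 months in theory; ~2.5 in the field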

16 Global statistics
–43 distinct nodes reporting data between July 13 and November 18
–1,132,548 packets logged in the DB
–3 major maintenance events, roughly one every month
–Best nodes: nearly 90,000 packets on a pair of AA batteries, over 2.5 months of unattended operation
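A quick consistency check on the “nearly 90,000 packets over 2.5 months” figure: it implies a per-node reporting period on the order of 70 s.

    packets = 90_000
    days = 2.5 * 30
    period_s = days * 24 * 3600 / packets
    print(round(period_s))   # ~72 s between packets from the best node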

17 Global network performance
Heavy daily losses
–Between 3 and 5% daily
Reliability of packet delivery improves with time
–Good motes survive?
–Collisions?
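A minimal sketch of how a daily loss figure like the 3–5% above can be derived from the database: compare the packets actually logged per node per day against the count expected from the reporting period. The ~72 s period is inferred from the previous slide, not stated here.

    EXPECTED_PERIOD_S = 72                              # inferred from ~90k packets / 2.5 months
    EXPECTED_PER_DAY = 24 * 3600 / EXPECTED_PERIOD_S    # 1200 packets/node/day

    def daily_loss_rate(packets_logged):
        """Fraction of expected packets that never reached the database."""
        return 1.0 - packets_logged / EXPECTED_PER_DAY

    print(round(daily_loss_rate(1150), 3))   # ~0.042, i.e. about 4% loss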

18 Loss distribution

19

20 Packet loss analysis
Available data
–Periodic and predictable behavior expected from the base station
–Packet arrival times recorded by the database
Several underlying causes for packet loss
–Laptop / database crash
»Solved with the addition of a replicated database/base station
–Low battery levels
»Only a problem towards the end of the lifetime
–Environmental conditions – wind blowing antennas out of alignment, rain affecting the humidity sensor and short-circuiting the battery
–Collisions
Packet loss distribution
–Not an independent distribution
–Packet loss likely a result of global conditions, rather than a function of the link
Fail-stop model does not describe node behavior well
–Dead nodes reappear, sometimes with reliable characteristics
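One way to probe the “not an independent distribution” claim, sketched below: mark each expected reporting slot as received or missed and look at the lengths of the loss runs. Independent losses would give mostly short runs; long correlated gaps point to global causes such as base-station crashes or weather. Reconstructing the per-node slot timeline from the recorded arrival times is assumed here.

    def loss_runs(slot_received):
        """Lengths of consecutive missed slots in a per-node boolean timeline."""
        runs, current = [], 0
        for ok in slot_received:
            if ok:
                if current:
                    runs.append(current)
                current = 0
            else:
                current += 1
        if current:
            runs.append(current)
        return runs

    # e.g. [True, False, False, False, True, True, False] -> [3, 1]
    print(loss_runs([True, False, False, False, True, True, False]))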

21 Phase stability

22 Phase stability (cont.)

23 Collision analysis
Relevant network characteristics
–Periodic application behavior
–Low network utilization
–CSMA MAC without acknowledgements
Results
–Coupling behavior between motes induced by the transit network
–No significant collisions
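A rough estimate of why collisions were not significant, under assumptions not on the slide (about 50 unsynchronized senders, a ~72 s period, and a packet airtime of ~30 ms): channel utilization is around 2%, and even an ALOHA-style bound that ignores carrier sensing keeps the per-packet collision probability to a few percent.

    NODES = 50
    PERIOD_S = 72.0
    AIRTIME_S = 0.030       # assumed ~30 ms per packet at early-mote bit rates

    utilization = NODES * AIRTIME_S / PERIOD_S          # fraction of the channel in use
    # Unslotted-ALOHA-style bound: a packet is vulnerable if any other sender
    # starts within +/- one airtime of it; CSMA carrier sensing lowers this further.
    p_collision = 1 - (1 - 2 * AIRTIME_S / PERIOD_S) ** (NODES - 1)
    print(round(utilization, 3), round(p_collision, 3))  # ~0.021, ~0.04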

24 Conclusions
Iterative, application-driven design
–Delivered on several expected requirements – low power, unattended operation
–Application data analysis drives future design and verification procedures, which carry over to other applications
Sensor design and packaging insights
–Part selection
–Sensor calibration and data interpretation
–Confidence in readings is crucial
–Packaging for the application
Low-power operation
–Very difficult to consistently meet the power budget
–Boost converter tradeoffs
»Mediocre efficiency
»Improved behavior on transient battery problems
»Significantly reduced performance on weak batteries

25 Conclusions (cont.)
Application design
–Use watchdogs to recover from a wide class of failures
–Design applications to minimize coupling behavior between motes
–Incorporate more diagnostic information (link quality, routing, etc.)
–Many concerns with this app have been addressed in newer versions of TinyOS components – link-layer acks, channel monitoring component, watchdog timer, etc.
Future work
–Application redeployment in a more controlled environment
–Incorporating the lessons learned into the Generic Sensor Kit and second-generation weather board
Accessing GDI data
–http://www.greatduckisland.net
–PostgreSQL database
»Server: webbie.berkeley.intel-research.net
»Username: reader
»Password: readonly
»Database: gdi
»Most interesting table: weather
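A minimal sketch of querying the public database with the credentials above, using psycopg2. The columns of the weather table are not listed on the slide, so the query just pulls a few rows; the server may of course no longer be reachable.

    import psycopg2

    conn = psycopg2.connect(
        host="webbie.berkeley.intel-research.net",
        dbname="gdi",
        user="reader",
        password="readonly",
    )
    cur = conn.cursor()
    cur.execute("SELECT * FROM weather LIMIT 10")   # column names not listed on the slide
    for row in cur.fetchall():
        print(row)
    cur.close()
    conn.close()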

26 Q & A
Thank you
http://www.greatduckisland.net

27

28

29 Mote 18: Outside

30 Mote 26: Burrow 115a

31 Mote 53: Burrow 115b

32 Mote 47: Burrow 88a

33 Mote 40: Burrow 88b

34 Mote 39: Burrow 84

