
1 An approach to dynamic control of sensor networks with inferential ecosystem models James S. Clark, Pankaj Agarwal, David Bell, Carla Ellis, Paul Flikkema, Alan Gelfand, Gabriel Katul, Kamesh Munagala, Gavino Puggioni, Adam Silberstein, and Jun Yang Duke University

2 Motivation
- Understanding forest response to global change: climate, CO2, human disturbance
- Forces at many scales
- Complex interactions, lagged responses
- Heterogeneous, incomplete data

3 Heterogeneous data: CO2 fumigation of forests, experimental hurricanes, remote sensing, individual seedlings

4 Some ‘data’ are model output Wolosin, Agarwal, Chakraborty, Clark, Dietze, Schultz, Welsh

5 Hierarchical models to infer processes, parameter values: p(unknowns|knowns), spatio-temporal (no cycles)
- Data: TDR, maturity observations, canopy photos, CO2 treatment, seed traps, climate, diameter increment, height increment, remote sensing, canopy models
- Processes: survival, allocation, dispersal, maturation, fecundity, height growth, die-back, diameter growth, mortality risk
- Hidden states: light, canopy status, soil moisture
- Parameters: observation errors, process uncertainty
- Hyperparameters: heterogeneity, dynamics
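The posterior p(unknowns|knowns) on this slide has the standard hierarchical factorization; written generically (the symbols theta and phi are illustrative, not the authors' notation):

```latex
p(\text{process},\ \theta,\ \phi \mid \text{data})
\;\propto\;
\underbrace{p(\text{data} \mid \text{process},\ \theta)}_{\text{observation errors}}
\;\underbrace{p(\text{process} \mid \theta)}_{\text{process uncertainty}}
\;\underbrace{p(\theta \mid \phi)}_{\text{parameters}}
\;\underbrace{p(\phi)}_{\text{hyperparameters}}
```

Each stage of the slide's graph (data, processes, parameters, hyperparameters) corresponds to one factor.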

6 Sources of variability/uncertainty in fecundity: random individual effects, year effects, model error (some example individuals shown). Clark, LaDeau, Ibanez, Ecol Monogr (2004)

7 Allocation Inference on hidden variables

8 Can emerging modeling tools help control ecosystem sensor networks? Capacity to characterize factors affecting forests, from physiology to population dynamics

9 Ecosystem models that could use it
- Physiology: PSN, respiration responses to weather, climate
- C/H2O/energy: atmosphere/biosphere exchange (pool sizes, fluxes)
- Biodiversity: differential demographic responses to weather/climate, CO2, H2O

10 Physiological responses to weather (fast, fine scales): PSN, respiration, sap flux, allocation; drivers include H2O, N, P, CO2, light, temperature

11 H2O/energy/C cycles respond to global change (fast, coarse scales); drivers include H2O, N, P, CO2, light, temperature

12 Biodiversity: demographic responses to weather/climate (slow, fine & coarse scales): reproduction, mortality, growth; drivers include H2O, N, P, CO2, light. Maps: Prasad and Iverson

13 Sensors for ecosystem variables: soil moisture W_{j,t}, precipitation P_t, evaporation E_{j,t}, transpiration Tr_{j,t}, drainage D_t, light I_{j,t}, temperature T_{j,t}, VPD V_{j,t}. These feed the physiology, C/H2O/energy, and demography/biodiversity models.

14 WisardNet
- Multihop, self-organizing network
- Sensors: light, soil & air temperature, soil moisture, sap flux
- Tower weather station
- Minimal in-network processing; transmission expensive

15 Capacity: unprecedented potential to collect data all the time; new insight that can only come from fine-grained data

16 The dynamic control problem
What is an observation worth?
- How to quantify learning? How to optimize it over competing models?
The answer recognizes:
- Transmission cost of an observation
- Need to assess value in (near) real time, based on model(s)
- Minimal in-network computation capacity
- Use (mostly) local information
- Potential for periodic out-of-network input

17 Patterns in ecosystem data: where could a model stand in for data? Candidates are slow variables and predictable variables; events and less predictable variables still need observation.

18 How to quantify learning? Sensitivity of estimate to observation. Model dependent: exploit spatiotemporal structure and relationships with other variables. Example: PAR at 3 nodes over 3 days; PSN/respiration modeling of observations.
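One concrete way to read "sensitivity of estimate to observation" is as the reduction in posterior variance under a Gaussian update. This is a minimal sketch under that assumption (the function and input values are illustrative, not the authors' code):

```python
# Hedged sketch: value one observation as the posterior-variance reduction
# of a latent state under a scalar Gaussian (Kalman-style) update.

def learning_from_obs(prior_var: float, obs_var: float) -> float:
    """Return the drop in state variance from assimilating one observation."""
    gain = prior_var / (prior_var + obs_var)   # Kalman gain, scalar case
    post_var = (1.0 - gain) * prior_var        # posterior variance
    return prior_var - post_var                # 'learning' from this reading

# A precise sensor teaches more than a noisy one about the same state.
precise = learning_from_obs(prior_var=1.0, obs_var=0.1)   # large reduction
noisy = learning_from_obs(prior_var=1.0, obs_var=10.0)    # small reduction
```

A model that already predicts the state well (small prior variance) gains little from another reading, which is the sense in which a model can stand in for data.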

19 Real applications: multiple users, multiple models; learning varies among models

20 Information needed at different scales: the C/H2O/energy balance wants fine scale

21 Biodiversity: seasonal drought & demography; models learn at different scales. [Figure: volumetric soil moisture (%), May through August, from soil moisture sensors in the EW network gap, showing the 2-mo drought of 2005]

22 Differential sensitivity among species

23 Why invest in redundancy?
- Shared vs unique data features (within nodes, among nodes)
- Exploit relationships among variables/nodes?
- Slow, predictable relationships?

24 'Data' can be modeled (Clark, Wolosin, Dietze, Ibanez, in review; i individual, j stand, t year)
- Data from multiple sources: diameter data D_{ij,t}, increment data
- Process: annual change in diameter (D_{ij,t-1} -> D_{ij,t} -> D_{ij,t+1}); mean growth, individual effects, year effects for t-1, t, t+1
- Parameters: diameter error, increment error, process error
- Hyperparameters: spatio-temporal structure, population heterogeneity
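The graphical model on this slide amounts to a state-space model for diameter; written out generically (an illustrative sketch, not the authors' exact specification):

```latex
\begin{aligned}
D_{ij,t} &= D_{ij,t-1} + \mu_{ij,t} + \epsilon_{ij,t}
  && \text{process: annual change in diameter}\\
\mu_{ij,t} &= \bar{\mu} + \alpha_{ij} + \gamma_t
  && \text{mean growth, individual effect, year effect}\\
D^{\mathrm{obs}}_{ij,t} &= D_{ij,t} + e_{ij,t}
  && \text{diameter data, with diameter error}\\
I^{\mathrm{obs}}_{ij,t} &= D_{ij,t} - D_{ij,t-1} + u_{ij,t}
  && \text{increment data, with increment error}
\end{aligned}
```

The two observation equations are why redundant data sources still help: each constrains the latent diameter through a different error structure.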

25 ‘Data’ can be modeled Clark, Wolosin, Dietze, Ibanez (in review) i individual j stand t year

26 Capacity vs value
- Data may not contribute learning
- A model can often predict data, which reduces data value
- Different models (users) need different data

27 Controlling measurement
- Inferential modeling out of network: ecosystem models have multiple variables, some global (transmission); data arrive faster than model convergence
- Periodic updating (from out of network): parameter values, state variables
- Simple rules for local control: use local variables; models use the most recent estimates from the gateway
- Basic model: point prediction vs most recent value

28 In-network data suppression
An 'acceptable error' ε, which considers competing model needs
- Option 1: did the value change?
- Option 2: was the change predictable?
Notation: {X}_j local information (no transmission); global information, periodically updated from the full model; M_I a simplified, in-network model
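The two options reduce to a single local rule: transmit only when the new reading differs from a prediction by more than the acceptable error. For Option 1 the "prediction" is just the last transmitted value; for Option 2 it is the point prediction of the simplified in-network model. Names and values here are illustrative, not the authors' implementation:

```python
# Hedged sketch of in-network suppression: a reading is transmitted only
# when it is 'surprising' relative to a prediction by more than epsilon.

def should_transmit(x_now: float, x_pred: float, epsilon: float) -> bool:
    """Transmit only if the prediction misses by more than epsilon."""
    return abs(x_now - x_pred) > epsilon

EPS = 0.02  # acceptable error, set out of network for competing model needs

# Option 1 ("did it change?"): predict with the last transmitted value.
send_opt1 = should_transmit(x_now=0.31, x_pred=0.25, epsilon=EPS)

# Option 2 ("was the change predictable?"): predict with the in-network
# model's point prediction; a well-predicted change is suppressed.
send_opt2 = should_transmit(x_now=0.31, x_pred=0.30, epsilon=EPS)
```

Under Option 2 the same change that Option 1 would transmit is suppressed, because the model anticipated it; that is the extra saving a model buys.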

29 Out-of-network model is complex
- Data: {W,E,Tr,D}_t observations; calibration data (sparse!)
- Process: {W,E,Tr,D}_{t-1} -> {W,E,Tr,D}_t -> {W,E,Tr,D}_{t+1}; time effects for t-1, t, t+1; location effects
- Parameters: process parameters, measurement errors, process error
- Hyperparameters: heterogeneity
Outputs: sparse data and 'posteriors'

30 Soil moisture example
- Simulated process, parameters unknown
- Simulated data: TDR calibration, error known (sparse); 5 sensors, error/drift unknown (often dense, but unreliable)
- Estimate process/parameters (Gibbs sampling)
- Use estimates for in-network processing: point estimate only, periodic updating
- Transmit only when predictions exceed threshold
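A toy version of this experiment (bucket dynamics, constants, and noise levels are all made up for illustration and stand in for the Gibbs-sampled estimates) shows how threshold transmission suppresses most readings when the in-network point prediction tracks the process:

```python
import numpy as np

# Toy soil-moisture simulation with assumed dynamics and made-up constants.
rng = np.random.default_rng(0)

def step(w, precip, evap_const=0.05, wilt=0.05, field_cap=0.45):
    """One time step: add rain, lose evapotranspiration, clip to bounds."""
    w = w + precip - evap_const * max(w - wilt, 0.0)
    return min(max(w, wilt), field_cap)

w_true, w_model = 0.30, 0.30   # true state and the node's model state
epsilon, sent = 0.02, 0        # acceptable error; transmissions made
for t in range(100):
    precip = rng.exponential(0.01) if rng.random() < 0.3 else 0.0
    w_true = step(w_true, precip) + rng.normal(0.0, 0.005)  # process noise
    w_pred = step(w_model, precip)        # in-network point prediction
    if abs(w_true - w_pred) > epsilon:    # transmit only on surprise
        sent += 1
        w_model = w_true                  # gateway learns the true state
    else:
        w_model = w_pred                  # both sides advance the model
```

Most steps are suppressed because prediction error only accumulates slowly between transmissions; `sent` counts the few surprising readings that must go over the radio.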

31 Model assumptions, stated for: process; sensor j; random effects; TDR calibration; inference. [Equations not captured in transcript]
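The slide's equations were images and did not survive into the transcript. Purely as an assumed reconstruction consistent with the surrounding slides (a soil moisture process, five drifting sensors, sparse TDR calibration), the model could take a form like:

```latex
\begin{aligned}
\text{Process:}\quad & w_t = g(w_{t-1}, P_t) + \varepsilon_t,
  \quad \varepsilon_t \sim \mathrm{N}(0,\sigma^2)\\
\text{Sensor } j\text{:}\quad & y_{j,t} = \beta_{0j} + \beta_{1j}\, w_t + \eta_{j,t},
  \quad \eta_{j,t} \sim \mathrm{N}(0,\tau_j^2)\\
\text{Random effects:}\quad & (\beta_{0j}, \beta_{1j}) \sim \mathrm{N}(\mu_\beta, \Sigma_\beta)\\
\text{TDR calibration:}\quad & z_t = w_t + \delta_t,
  \quad \delta_t \sim \mathrm{N}(0,\tau_z^2)\\
\text{Inference:}\quad & p(w_{1:T}, \beta, \theta \mid y, z)\ \text{by Gibbs sampling}
\end{aligned}
```

All symbols here are assumed; the sensor drift the authors mention would enter through the per-sensor coefficients β.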

32 [Figure: simulated process and data (dots); 'truth' y with 95% CI; 5 sensors and calibration; drift parameter estimates (colors) and truth (dashed lines); a period when the network was down]

33 Process parameters (field capacity, evaporation constant, wilting point): estimates and truth (dashed lines); keepers (40%). Prediction error is large, and increasing drift reduces predictive capacity. Lesson: the model stands in for data.

34 A framework
- Bayesification of ecosystem models: a currency for learning assessment
- Model-by-model error thresholds ε
- In-network simplicity: point predictions based on local info, periodic out-of-network inputs
- Out-of-network predictive distributions for all variables

