
1
Error and Uncertainty in Modeling
George H. Leavesley, Research Hydrologist, USGS, Denver, CO

2
Sources of Error and Uncertainty
- Model structure
- Parameters
- Data
- Forecasts of future conditions

3
Sacramento Conceptualization of Reality

4
Effective Parameters and States
[Figure: a heterogeneous real-world catchment vs. a model with homogeneous effective parameters and states; given measured input and output, are the two responses identical? After Grayson and Blöschl, 2000, Cambridge Univ. Press]

5
Precipitation Measurement
[Figure: mountain blockage of radar at 1000 m, 2000 m, and 3000 m, and sparse precipitation gauge distribution. Source: Maddox et al., Weather and Forecasting; SAHRA, NSF STC]

6
Streamflow Measurement Accuracy (USGS)
- Excellent: 95% of daily discharges are within 5% of the true value
- Good: 95% of daily discharges are within 10% of the true value
- Fair: 95% of daily discharges are within 15% of the true value
- Poor: does not meet the Fair criteria
Different accuracies may be attributed to different parts of a given record.

7
Dimensions of Model Evaluation (from Wagener, 2003, Hydrological Processes)

8
Dimensions of Model Evaluation (from Wagener, 2003, Hydrological Processes)

9
Performance Measures
- Mean (observed vs. simulated)
- Standard deviation (observed vs. simulated)
- Root Mean Square Error (RMSE)
- Mean Absolute Error (MAE)
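As a sketch, RMSE and MAE can be computed from paired observed and simulated series like this (plain Python; the example discharge values are hypothetical):

```python
import math

def rmse(obs, sim):
    """Root Mean Square Error: penalizes large errors quadratically."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean Absolute Error: weights all errors equally."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

# hypothetical daily discharge values
obs = [10.0, 12.0, 8.0, 15.0]
sim = [11.0, 11.0, 9.0, 13.0]
print(rmse(obs, sim), mae(obs, sim))
```

Because of the squaring, RMSE is always at least as large as MAE, and the gap between the two indicates how much large errors dominate the record.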

10
Performance Measures
Coefficient of Determination (R²)
Not a good measure: high correlations can be achieved for mediocre or poor models
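The warning is easy to demonstrate: a simulation with gross bias and the wrong scale can still have a perfect R². A minimal sketch with hypothetical data:

```python
def r_squared(obs, sim):
    """Squared Pearson correlation between observed and simulated."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    var_o = sum((o - mo) ** 2 for o in obs)
    var_s = sum((s - ms) ** 2 for s in sim)
    return cov * cov / (var_o * var_s)

obs = [1.0, 2.0, 3.0, 4.0]
sim = [3 * o + 5 for o in obs]  # grossly biased, yet perfectly linear
print(r_squared(obs, sim))  # 1.0 despite the bias
```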

11
Performance Measures
Coefficient of Efficiency (E) (Nash and Sutcliffe, 1970, J. of Hydrology)
- Widely used in hydrology
- Range: -∞ to +1.0
- Overly sensitive to extreme values
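A sketch of the Nash-Sutcliffe efficiency, which compares model error against the variance of the observations (hypothetical data):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 means no better than
    always predicting the observed mean; unbounded below."""
    mo = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mo) ** 2 for o in obs)
    return 1.0 - err / var

obs = [1.0, 2.0, 3.0]
print(nse(obs, obs))              # perfect model: 1.0
print(nse(obs, [2.0, 2.0, 2.0]))  # mean-only model: 0.0
```

The squared error term is also where the sensitivity to extreme values comes from: a single large flood error can dominate the sum.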

12
Seasonal Variability of Nash-Sutcliffe Efficiency

13
Performance Measures
Index of Agreement (d) (Willmott, 1981, Physical Geography)
- Range: 0.0 to 1.0
- Overly sensitive to extreme values
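A sketch of Willmott's index, which normalizes the squared error by the "potential error" around the observed mean so the result is bounded on [0, 1] (hypothetical data):

```python
def willmott_d(obs, sim):
    """Willmott's index of agreement: 1.0 is perfect agreement."""
    mo = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    pot = sum((abs(s - mo) + abs(o - mo)) ** 2 for o, s in zip(obs, sim))
    return 1.0 - err / pot

obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.5, 2.0, 2.5, 4.5]
print(willmott_d(obs, obs))  # 1.0
print(willmott_d(obs, sim))
```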

14
Analysis of Residuals

15
Performance Measure Issues
- Different measures have different sensitivities to different parts of the data.
- Are the assumptions regarding the nature of the error structures correct (e.g., zero mean, constant variance, normality, independence)?
- It is difficult to define what constitutes an acceptable level of performance for different types of data.

16
Dimensions of Model Evaluation (from Wagener, 2003, Hydrological Processes)

17
Definitions (EPA, CREM, 2003)
- Uncertainty analysis: investigation of the effects of lack of knowledge or potential errors on model components and output.
- Sensitivity analysis: computation of the effect of changes in input values or assumptions on model output.

18
Parameter Sensitivity The single, “best-fit model” assumption

19
Error Propagation
[Figure: percent change in prediction variance and standard error for parameter errors of 5%, 10%, 20%, and 50% in soil_moist_max, hamon_coef, and ssrcoef_sq, individually and jointly]

20
Objective Function Selection
[Figure: parameter sensitivities grouped by process (routing, groundwater recharge, soil moisture, ET, baseflow) under different objective functions]

21
Relative Sensitivity
S_R = (∂Q_PRED / ∂P_i) × (P_i / Q_PRED)
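The relative sensitivity S_R = (∂Q/∂P_i)(P_i/Q) can be estimated numerically by perturbing one parameter at a time. A sketch using a hypothetical toy model (runoff = precipitation × runoff coefficient); the names `toy_model`, `precip`, and `runoff_coef` are illustrative only:

```python
def relative_sensitivity(model, params, name, delta=1e-3):
    """Finite-difference estimate of S_R = (dQ/dP_i) * (P_i / Q)."""
    q0 = model(params)
    p = params[name]
    bumped = dict(params, **{name: p * (1 + delta)})
    dq_dp = (model(bumped) - q0) / (p * delta)
    return dq_dp * p / q0

def toy_model(params):
    # hypothetical: runoff proportional to precip via a runoff coefficient
    return params["precip"] * params["runoff_coef"]

params = {"precip": 100.0, "runoff_coef": 0.4}
print(relative_sensitivity(toy_model, params, "runoff_coef"))  # ~1.0
```

Because the normalization is dimensionless, sensitivities of parameters with very different units and magnitudes can be compared directly.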

22
Relative Sensitivity Analysis
- Soil available water holding capacity
- Evapotranspiration coefficient

23
Relative Sensitivity Analysis
- Snow/rain threshold temperature
- Snowfall adjustment

24
Parameter Sensitivity
The “parameter equifinality” assumption:
- Consider a population of models
- Define the likelihood that they are consistent with the available data

25
Regional Sensitivity Analysis (Spear and Hornberger, 1980, WRR)
- Apply a random sampling procedure to the parameter space to create parameter sets
- Classify the resulting model realizations as “behavioural” (acceptable) or “non-behavioural”
- A significant difference between the sets of “behavioural” and “non-behavioural” parameters identifies the parameter as sensitive
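The steps above can be sketched in a few lines: sample uniformly, split into behavioural and non-behavioural sets, and compare the marginal distributions of each parameter with a two-sample Kolmogorov-Smirnov distance. The likelihood function and the parameter names `k` and `s` are hypothetical stand-ins:

```python
import bisect
import random

def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov distance between parameter samples;
    a large value means the behavioural and non-behavioural marginals
    differ, i.e. the parameter is sensitive."""
    a, b = sorted(a), sorted(b)
    return max(abs(bisect.bisect_right(a, x) / len(a)
                   - bisect.bisect_right(b, x) / len(b)) for x in a + b)

random.seed(1)
samples = [{"k": random.random(), "s": random.random()} for _ in range(2000)]
# hypothetical likelihood that depends on k only; behavioural if k >= 0.5
behav = [t for t in samples if t["k"] >= 0.5]
non_behav = [t for t in samples if t["k"] < 0.5]
d_k = ks_distance([t["k"] for t in behav], [t["k"] for t in non_behav])
d_s = ks_distance([t["s"] for t in behav], [t["s"] for t in non_behav])
print(d_k, d_s)  # k is flagged sensitive, s is not
```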

26
Regional Sensitivity Analysis
[Figure: cumulative distributions of behavioural (B) and non-behavioural parameter samples; clearly separated distributions mark a sensitive parameter, overlapping distributions a non-sensitive one]

27
Generalized Likelihood Uncertainty Estimation (GLUE)
- Monte Carlo generated simulations are classified as behavioural or non-behavioural, and the latter are rejected.
- The likelihood measures of the behavioural set are scaled and used to weight the predictions associated with the individual behavioural parameter sets.
- The modeling uncertainty is then propagated into the simulation results as confidence limits of any required percentile.
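A minimal sketch of the last two steps: normalize the behavioural likelihoods into weights and read the required quantiles off the weighted empirical CDF of the predictions (the equal-likelihood ensemble below is hypothetical):

```python
def glue_limits(likelihoods, predictions, lo=0.025, hi=0.975):
    """GLUE sketch: weight behavioural predictions by scaled likelihood
    and return the lo/hi quantiles as confidence limits."""
    pairs = [(y, l) for y, l in zip(predictions, likelihoods) if l > 0]
    total = sum(l for _, l in pairs)
    pairs.sort()  # order predictions for the weighted CDF
    out, cum = {}, 0.0
    for y, l in pairs:
        cum += l / total
        for q in (lo, hi):
            if q not in out and cum >= q:
                out[q] = y
    return out[lo], out[hi]

# hypothetical behavioural ensemble: equal likelihoods, predictions 0..1
preds = [i / 1000 for i in range(1001)]
lims = glue_limits([1.0] * len(preds), preds)
print(lims)  # roughly (0.025, 0.975)
```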

28
Dotty Plots and Identifiability Analysis
[Figure: objective-function value plotted against each parameter for Monte Carlo samples, with the behavioural set highlighted]

29
GLUE-computed 95% confidence limits

30
Regional Variability
[Figure: uncalibrated estimates and parameter-equifinality ranges (deg F, inches) for the Rockies, Sierras, and Cascades]

31
Increasing the Information Content of the Data: Multi-criteria Analysis
A single objective function:
- cannot capture the many performance attributes that an experienced hydrologist might look for
- uses only a limited part of the total information content of a hydrograph
- when used in calibration, will tend to bias model performance to match a particular aspect of the hydrograph
A multi-criteria approach overcomes these problems (Wheater et al., 1986; Gupta et al., 1998; Boyle et al., 2001; Wagener et al., 2001).

32
Identifying Characteristic Behavior

33
Developing Objective Measures
- Peaks/timing
- Quick recession
- Baseflow

34
Pareto Optimality: Pareto Solutions
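With two or more objectives there is generally no single best parameter set, only a front of non-dominated trade-offs. A minimal sketch of extracting that front, minimizing both objectives; the (RMSE, |BIAS|) pairs are hypothetical:

```python
def dominates(q, p):
    """q dominates p if it is no worse in every objective and strictly
    better in at least one (minimization)."""
    return (all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p)))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical (RMSE, |BIAS|) pairs for candidate parameter sets
points = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0), (4.0, 4.0), (3.0, 3.0)]
print(pareto_front(points))  # (4,4) and (3,3) are dominated by (2,2)
```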

35
500 Pareto Solutions

36
SAC-SMA Hydrograph Range

37
Overall Performance Measures
- RMSE (minimize)
- BIAS (minimize)

38
Parameter Sensitivity by Objective Function

39
Dimensions of Model Evaluation (from Wagener, 2003, Hydrological Processes)

40
Evaluation of Model Component Processes
[Figure: observed vs. SCE-calibrated (final SCE) simulations by year and month for solar radiation, PET, annual runoff, percent groundwater, and Nash-Sutcliffe daily Q]

41
Integrating Remotely Sensed Data
Coupling snow-covered area (SCA) remote sensing products with point measurements and modeled snow water equivalent (SWE) to evaluate the snow component process

42
Identifiability Analysis
Identification of the model structure and a corresponding parameter set that are most representative of the catchment under investigation, while considering aspects such as modeling objectives and available data. (Wagener et al., 2001, Hydrology and Earth System Sciences)

43
Dynamic Identifiability Analysis - DYNIA Information content by parameter

44
Dynamic Identifiability Analysis - DYNIA Identifiability measure and 90% confidence limits

45
A Wavelet Analysis Strategy (John Schaake, NWS)
- Daily time series
- Seasonally varying daily variance (row sums)
- Seasonally varying variance frequency decomposition (column sums)
- Annual average variance frequency decomposition

46
Variance Decomposition
[Figure: precipitation and streamflow variance transfer functions at the 1-day and 8-day scales, using 8-day and 64-day windows]

47
Streamflow
[Figure: observed streamflow compared with Linear, PRMS, and SAC model simulations]

48
Variance Transfer Functions
[Figure: observed vs. Linear, PRMS, and SAC models]

49
Forecast Uncertainty

50
Ensemble Streamflow Prediction (NOAA, USGS, BOR)
Using history as an analog for the future:
- Simulate up to today
- Predict the future using historic data
- Compute the probability of exceedance
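The last step can be sketched simply: each historic year's forcing, run through the model from today's states, yields one forecast trace, and the exceedance probability of any volume is its empirical frequency in the ensemble. The seasonal volumes below are hypothetical:

```python
def exceedance_probs(ensemble_volumes, thresholds):
    """ESP sketch: empirical probability that the seasonal volume
    equals or exceeds each threshold, taken over the ensemble."""
    n = len(ensemble_volumes)
    return {t: sum(1 for v in ensemble_volumes if v >= t) / n
            for t in thresholds}

# hypothetical seasonal volumes, one per historic forcing year
volumes = [80.0, 95.0, 110.0, 120.0, 140.0]
print(exceedance_probs(volumes, [100.0, 130.0]))
```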

51
2005 ESP Forecast
Made 4/2/2005 for the forecast period 4/3–9/30
[Figure: forecast traces using all historic years vs. only El Niño years, with the observed 2005 hydrograph]

52
Ranked Probability Skill Score (RPSS) for each forecast day and month, using measured and simulated runoff (Animas River, CO), produced using (1) SDS output and (2) the ESP technique. A perfect forecast has RPSS = 1.
[Figure: RPSS by forecast day and by month (J through D) for the ESP and SDS forecasts]
Given current uncertainty in long-term atmospheric-model forecasts, seasonal to annual forecasts may be better with ESP.
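As a sketch, RPSS compares the ranked probability score (RPS) of a forecast against that of a reference forecast (typically climatology); the category probabilities below are hypothetical:

```python
def rps(probs, obs_cat):
    """Ranked Probability Score over ordered categories (lower is better):
    sum of squared differences between cumulative forecast probabilities
    and the cumulative observation indicator."""
    cum_f = cum_o = score = 0.0
    for k, p in enumerate(probs):
        cum_f += p
        cum_o += 1.0 if k == obs_cat else 0.0
        score += (cum_f - cum_o) ** 2
    return score

def rpss(fcst, ref, obs_cat):
    """RPSS = 1 - RPS_forecast / RPS_reference; 1 is a perfect forecast,
    0 is no better than the reference."""
    return 1.0 - rps(fcst, obs_cat) / rps(ref, obs_cat)

clim = [1 / 3, 1 / 3, 1 / 3]           # below / near / above normal
print(rpss([0.0, 1.0, 0.0], clim, 1))  # perfect forecast: 1.0
print(rpss(clim, clim, 1))             # climatology itself: 0.0
```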

53
Summary
This presentation has been a selective review of uncertainty and error analysis techniques. No single approach provides all the information needed to assess a model; the appropriate mix is a function of model structure, problem objectives, data constraints, and the spatial and temporal scales of application. We are still searching for the unified theory of uncertainty analysis.

54
Necessary Conditions for a Model to Be Considered Properly Calibrated (Gupta, H.V., et al., in review)
- Input-output behaviour of the model is consistent with measured behaviour (performance)
- Model predictions are accurate (negligible bias) and precise (prediction uncertainty relatively small)
- Model structure and behaviour are consistent with the understanding of reality

55
National and international groups are collaborating to assess existing methods and tools for uncertainty analysis and to explore potential avenues for improvement in this area.

56
A Federal Interagency Working Group is developing a Calibration, Optimization, and Sensitivity and Uncertainty Analysis Toolbox. International Workshop Proceedings describe this effort: available at

57
Future Model Development and Application
[Figure: coupled hydrological modelling systems linking hydrological modelling with GIS, land use, geochemical flowpaths, and aquatic, riparian, and terrestrial components, from inputs through model complexity, scale, and uncertainty analysis to predictions]
Increased model complexity brings more parameters, more spatial interactions, and more complex responses, but remains data limited: more modelling uncertainty.

58
Visual Uncertainty Analysis Framework (Freer et al., Lancaster Univ., UK)
[Figure: framework linking model structures, field measurements, and visualisations of models, measurements, and uncertainty with the USGS Modular Modelling System, toward improved representations of hydrological processes and predictions]
