
1 Hypothesis Testing and Model Complexity

2 Hypothesis Testing …..

3 Domain of groundwater model...

4 …topographic contours...

5 … a dam...

6 … irrigated area...

7 … channel system...

8 … extraction bores...

9 … native woodland...

10 … observation bores

11 Inflow from uphill Supplied “from outside”

12 Inflow from uphill Groundwater interaction with rivers Supplied “from outside”

13 Inflow from uphill Groundwater interaction with dam Groundwater interaction with rivers Supplied “from outside”

14 Inflow from uphill Groundwater interaction with dam Groundwater interaction with rivers Leakage from channels Supplied “from outside”

15 Inflow from uphill Groundwater interaction with dam Groundwater interaction with rivers Leakage from channels Aquifer extraction Supplied “from outside”

16 Inflow from uphill Groundwater interaction with dam Groundwater interaction with rivers Leakage from channels Groundwater recharge Aquifer extraction Supplied “from outside”

17 More often than not, a definitive model cannot be built. Recognize this, focus on the question that is being asked and, if necessary, use the model for hypothesis testing.

18 Remember that model calibration is a form of data interpretation. The whole modelling process is simply advanced data processing.

19 Cattle Ck.

20 Cattle Creek Catchment

21 Soils and current land use

22 Model grid; fixed head and drainage cells shown coloured

23 Groundwater levels in June 1996

24 Groundwater levels in January 1991

25 Modelled and observed water levels after model calibration.

26 Calibrated transmissivities

27 Cattle Creek Catchment

28 [Maps] CURRENT, New Development, CANE EXPANSION

29 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. (Run 46R10P8)

30 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. (Run 46R15P8)

31 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. Zone 17 absent. (Run 48R14P8)

32 Increased cane production. Leakage from balancing storage: 0.0 mm/d at calibration, 0.0 mm/d for prediction. (Run 46R3P7)

33 Increased cane production. Leakage from balancing storage: 0.0 mm/d at calibration, 0.0 mm/d for prediction. (Run 46R4P7)

34 Increased cane production. Leakage from balancing storage: 0.0 mm/d at calibration, 0.0 mm/d for prediction. Zone 17 absent. (Run 48R8P7)

35 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. (Run 46R10P10)

36 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. (Run 46R11P10)

37 Increased cane production. Leakage from balancing storage: 2.5 mm/d at calibration, 2.5 mm/d for prediction. Zone 17 absent. (Run 48R14P10)

38 [Figure] Simple model: PE, d, M, K_s; runoff.

39 Simple model parameters: M – soil moisture capacity (mm/m depth); d – effective rooting depth; K_i – initial loss; f_cap – field capacity; K_s – saturated hydraulic conductivity.

40 [Figure] Simple model with runoff. Parameters: M – soil moisture capacity (mm/m depth); d – effective rooting depth; K_i – initial loss; f_cap – field capacity; K_s – saturated hydraulic conductivity.
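To make the parameter list above concrete, here is a minimal soil-moisture “bucket” sketch in Python. It uses the symbols defined on the slide (M, d, f_cap, K_s, with PE as a forcing); the water-balance rules, default values and example series are illustrative assumptions rather than the model actually shown.

```python
# Minimal soil-moisture "bucket" sketch using the slide's symbols.
# The rules and default values are illustrative assumptions only.
def simple_bucket(rain, pe, M=150.0, d=1.0, f_cap=0.3, K_s=20.0):
    """rain, pe: daily rainfall and potential evaporation (mm).
    Returns daily runoff (saturation excess) and drainage (mm)."""
    capacity = M * d              # total soil moisture store (mm)
    store = 0.5 * capacity        # assumed initial moisture
    runoff, drainage = [], []
    for p, e in zip(rain, pe):
        store += p                          # add rainfall
        store -= min(e, store)              # actual ET, limited by storage
        drain = max(0.0, min(K_s, store - f_cap * capacity))
        store -= drain                      # drainage above field capacity, capped by K_s
        excess = max(0.0, store - capacity)
        store -= excess                     # saturation excess leaves as runoff
        runoff.append(excess)
        drainage.append(drain)
    return runoff, drainage

# Example with a short synthetic series
print(simple_bucket([0, 40, 80, 0, 10], [4, 3, 3, 5, 5]))
```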

41 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

42 [Figure] A probability contour in (p1, p2) parameter space. This has the potential to introduce bias into key model predictions.

43 [Figure] A probability contour in (p1, p2) parameter space. Also, what if this parameter is partly a surrogate for an unrepresented process?

44 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

45 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

46 Not only does uncertainty arise from parameter nonuniqueness; it also arises from lack of certainty in model inputs/outputs and model boundary conditions. The model can be used as an instrument for data interpretation, allowing various hypotheses concerning inputs/outputs and boundary conditions to be tested. Where did the idea ever come from that there should be one calibrated model?

47 [Diagram] modeller → construction → calibration → prediction (“the deliverable”)

48 prediction: “the deliverable”

49 prediction: “the deliverable”

50 [Diagram] modeller → construction → calibration → prediction

51 “Dual calibration”

52 [Figure] A river valley: observation bore and pumped bores; zones with K=5, Sy=0.1; K=5, Sy=0.1; K=25, Sy=0.3; inflow = 2750; fixed head = 50.

53 Recharge rate (×10^-3)

54 Pumping rate (discharge)

55 Borehole hydrographs (water level)

56 The finite-difference grid

57 and parameter zonation

58 Field and model-generated borehole hydrographs (field data vs model-calculated). Calibrated parameters: K=5, Sy=0.1; K=25, Sy=0.3.

59 Field and model-generated borehole hydrographs (field data vs model-calculated). Calibrated parameters: K=10.2, Sy=0.21; K=18.8, Sy=0.21.

60 Simulation of Drought Conditions. Decrease inflow from left from 2750 to 2200 m³/day. Increase pumping from left bore from (1500, 1000, 0, 1500) to 2000 m³/day. Increase pumping from right bore from (2000, 1000, 500, 1500) to 3000 m³/day. Run model for 91 days. Same initial heads, i.e. 54 m. For the “true” parameters, the water level in the right bore after this run is 43.9 m.

61 Is it possible that the water level in the right pumped bore will be as low as 42 m?

62 Methodology: use PEST with a “model” comprising two MODFLOW runs, one under calibration conditions and one under predictive conditions. In the latter case there is only one “observation”, viz. that the water level in the right pumped cell is 42 m at the end of the run (its weight is the sum of the weights used for all water levels over the calibration period).
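Under this methodology, the “model” that PEST runs can simply be a script that executes the two MODFLOW simulations in sequence. A minimal sketch follows; the executable and file names (mf2005, calib.nam, drought.nam) are assumptions about how such a project might be arranged, not files from the presentation.

```python
# Hypothetical composite "model" that PEST could execute as one model run:
# a calibration-period simulation followed by a predictive (drought-period)
# simulation. All executable and file names are illustrative assumptions.
import subprocess

def run_composite_model():
    # 1. Calibration conditions: heads from this run are matched against
    #    the historical borehole observations.
    subprocess.run(["mf2005", "calib.nam"], check=True)
    # 2. Predictive conditions: this run supplies the single extra
    #    "observation" -- the water level in the right pumped cell at the
    #    end of the drought, weighted as described on the slide.
    subprocess.run(["mf2005", "drought.nam"], check=True)

if __name__ == "__main__":
    run_composite_model()
```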

63 [Diagram] PEST writes model input files and reads model output files.

64 [Diagram] PEST writes input files for, and reads output files from, two model runs: one under calibration conditions and one under predictive conditions.

65 Field and model-generated borehole hydrographs over calibration period (field data vs model-calculated). Water level in right pumped bore at end of drought = 42 m. Calibrated parameters: K=22, Sy=0.14; K=16, Sy=0.16; K=9.8, Sy=0.28.

66 Is it possible that the water level in the right pumped bore will be as low as 40 m?

67 Field and model-generated borehole hydrographs over calibration period (field data vs model-calculated). Water level in right pumped bore at end of drought = 40 m. Calibrated parameters: K=22, Sy=0.14; K=16, Sy=0.16; K=9.8, Sy=0.28.

68 Field and model-generated borehole hydrographs over calibration period (field data vs model-calculated). Water level in right pumped bore at end of drought = 40 m. Calibrated parameters: K=5, Sy=0.099; K=14, Sy=0.11; K=20, Sy=0.32; K=4.6, Sy=0.090.

69 Is it possible that the water level in the right pumped bore will be as low as 36 m?

70 Field and model-generated borehole hydrographs over calibration period (field data vs model-calculated). Water level in right pumped bore at end of drought = 36 m. Calibrated parameters: K=8.8, Sy=0.13; K=15, Sy=0.14; K=18, Sy=0.29; K=2.7, Sy=0.19.

71 We are not calibrating a groundwater model. We are calibrating our regularisation methodology.

72 Some Lessons: if possible, include in the calibration dataset measurements of the type that you need to predict; intuition and knowledge of an area play just as important a part in modelling as the model itself; focus on what the model needs to predict when building the model…..

73 There should be no such thing as a model for an area, only for a specific problem.

74 So how should we model?

75 [Map] A model area: open cut mine, underground mine, waterholes, extraction bores.

76 [Map] A model area: open cut mine, underground mine, waterholes, extraction bores, monitoring bores, gauging stations.

77 A model area

78–86 [Image-only slides]

87 Sources of Uncertainty Close to Waterholes: conductance of bed (and heterogeneity thereof); change in bed conductance with wetted perimeter; change in bed conductance with inflow/outflow and season; relationship between area and level; relationship between level and flow; rate of evaporation; hydraulic properties of rocks close to ponds; behaviour during flood events; change in hydraulic characteristics after flood events; uncertainty in future flows; inflow to ponds from neighbouring surface catchment; lack of borehole data to define groundwater mounds; uncertainties in streamflow.

88 Let’s start again…..

89 Complexity leads to parameter uncertainty. Parameter correlation can be enormous due to inadequate data.

90 Parameter uncertainty may lead to predictive uncertainty. The more that the prediction depends on system “fine detail”, the more this is likely to occur.

91 Predictive uncertainty must be analysed.

92 Complexity must be “focussed” - dispense with non-essential complexity. No model should be built independently of the prediction which it has to make.

93 A model area

94–98 [Image-only slides]

99 [Map] Sensitive area: open cut mine, underground mine, waterholes.

100 [Map] Sensitive area: open cut mine, underground mine, waterholes.

101 [Map] Sensitive area: open cut mine, underground mine, waterholes.

102 A model is not a database! A model is a data processor.

103 Ubiquitous complexity in a “do-everything model”

104

105 Focussed complexity in a prediction-specific model

106

107 Model Complexity

108 For reasons which we have already discussed, a complex model is really a simple model in disguise.

109 Complex models: more parameters; longer run times; greater likelihood of numerical instability; more costly; destroy the user’s intuition.

110

111 The level of complexity is set by system properties to which the prediction is most sensitive.

112 [Figure] Objective function contours in (p1, p2) parameter space, showing the objective function minimum (linear model).

113 [Figure] A probability contour in (p1, p2) parameter space.

114 [Figure] A probability contour in (p1, p2) parameter space, with σ1 marked.

115 [Figure] A probability contour in (p1, p2) parameter space, with σ1 and σ2 marked.

116 [Figure] (p1, p2) parameter space.

117 [Figure] A probability contour in (p1, p2) parameter space, with σ(p1+p2) marked.

118 [Figure] A probability contour in (p1, p2) parameter space, with σ(p1+p2) and σ(p1−p2) marked. Ideally, simplification of a model should be done in such a way that only the parameters that “don’t matter” are dispensed with.
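The distinction between a combination such as p1+p2 that the data constrain well and a combination such as p1−p2 that they barely constrain is what a singular value decomposition of the model Jacobian exposes. A toy sketch, with an assumed two-parameter Jacobian:

```python
# Toy illustration: SVD of a Jacobian for two highly correlated parameters.
# The large singular value corresponds to a well-constrained combination
# (roughly p1 + p2); the small one to a poorly constrained combination
# (roughly p1 - p2) that calibration cannot resolve.
import numpy as np

J = np.array([[1.00, 0.98],
              [0.99, 1.01],
              [1.02, 0.97]])      # assumed sensitivities of 3 observations to p1, p2

U, s, Vt = np.linalg.svd(J, full_matrices=False)
print("singular values:", s)      # one large, one small
print("parameter combinations (rows of V^T):")
print(Vt)                         # ~[0.71, 0.71] (resolvable) and ~[0.71, -0.71] (not)
```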

119 There are many cases where a specific prediction depends on at least one individual parameter value, that is, on parameters that cannot be resolved by the parameter estimation process. In fact, that is often why we are using a physically based model; if calibration alone sufficed for full parameterisation, then a black box would be all we need.

120 [Figure] (p1, p2) parameter space. Over-simplified model design introduces bias, for we are effectively assuming values for unrepresented parameters.
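For a linearised model the bias introduced by fixing a parameter can be written down directly. A sketch with assumed notation (y1, y2 are the sensitivities of the prediction s to p1 and p2):

```latex
% Illustrative linearised sketch; notation assumed, not from the slides.
s \;\approx\; y_1 p_1 + y_2 p_2
% Fixing p_2 at an assumed value p_2^0 while estimating p_1 leaves in the
% prediction a bias of roughly
\Delta s \;\approx\; y_2 \left( p_2^0 - p_2^{\mathrm{true}} \right)
% which calibration of p_1 can offset only insofar as the calibration data
% correlate p_1 with p_2.
```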

121 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

122 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

123 [Figure] A probability contour in (p1, p2) parameter space: “fixing” a parameter.

124 [Graph: potential error in prediction vs. increasing model complexity, showing complexity bias] But we don’t know how much bias we are introducing.

125 [Graph: potential error in prediction vs. increasing model complexity, showing complexity bias and predictive uncertainty] These levels are equal.

126 [Graph: potential error in prediction vs. increasing model complexity, showing complexity bias and predictive uncertainty] These levels are equal.
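The graphs above can be read as an informal bias-variance decomposition of potential predictive error. In symbols, with assumed notation (s is the true value of the prediction, \hat{s} the model’s prediction):

```latex
% Standard bias-variance identity, used here only to formalise the graphs.
E\!\left[(\hat{s}-s)^2\right]
  \;=\; \underbrace{\left(E[\hat{s}]-s\right)^2}_{\text{complexity (simplification) bias}}
  \;+\; \underbrace{\operatorname{Var}(\hat{s})}_{\text{predictive uncertainty}}
% Adding warranted complexity lowers the bias term; once Var(\hat{s}) stops
% rising with further complexity, little is gained by adding more.
```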

127 The point where no further complexity is warranted is the point where the uncertainty of a specific model prediction no longer rises.

128 Essential and non-essential complexity are prediction-dependent.

129 Complexity does not guarantee the “right answer” - it guarantees that the right answer will lie within the limits of predictive uncertainty.

130 Complexity without uncertainty analysis is a waste of time. A complex model can be just as biased as a simple model. Use a simple model and add the “predictive noise” – far cheaper. A complex model allows you to replace “predictive noise” with science. But if you don’t do that analysis, what is the point of a complex model?

131 An Example….

132

133 Observed and modelled flows

134 Observed and modelled monthly volumes

135 Observed and modelled exceedance fractions

136
Parameter  Value
LZSN       2.0
UZSN       2.0
INFILT     0.0526
BASETP     0.200
AGWETP     0.00108
LZETP      0.50
INTFW      10.0
IRC        0.677
AGWRC      0.983

137 Observed and modelled flows

138 Observed and modelled monthly volumes

139 Observed and modelled exceedance fractions

140
Parameter  Set 1    Set 2    Set 3    Set 4    Set 5    Set 6
LZSN       2.0      2.0      2.0      2.0      2.0      2.0
UZSN       2.0      1.79     2.0      2.0      1.76     2.0
INFILT     0.0526   0.0615   0.0783   0.0340   0.0678   0.0687
BASETP     0.200    0.182    0.199    0.115    0.179    0.200
AGWETP     0.00108  0.0186   0.0023   0.0124   0.0247   0.0407
LZETP      0.50     0.50     0.20     0.72     0.50     0.50
INTFW      10.0     3.076    1.00     4.48     4.78     2.73
IRC        0.677    0.571    0.729    0.738    0.759    0.320
AGWRC      0.983    0.981    0.972    0.986    0.981    0.966
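Given several parameter sets that calibrate equally well, such as the six sets above, the spread of a prediction across those sets gives a crude lower bound on predictive uncertainty. A minimal Python sketch; run_model and its toy formula stand in for whatever actually executes the model, and only two of the six sets are typed out.

```python
# Propagate several equally well calibrated parameter sets through to a
# prediction and report the spread. run_model() is a hypothetical stand-in
# for the real model; its formula is a toy so the script runs end to end.
parameter_sets = [
    {"LZSN": 2.0, "UZSN": 2.0,  "INFILT": 0.0526, "INTFW": 10.0},
    {"LZSN": 2.0, "UZSN": 1.79, "INFILT": 0.0615, "INTFW": 3.076},
    # ... remaining sets from the table above
]

def run_model(params):
    # Toy placeholder for writing the parameters to model input files,
    # running the model over the prediction period and reading the result.
    return 100.0 * params["INFILT"] + 0.1 * params["INTFW"]

predictions = [run_model(p) for p in parameter_sets]
print("prediction range:", min(predictions), "to", max(predictions))
```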

141 Observed and modelled flows over validation period

142 Observed and modelled monthly volumes over validation period

143 Observed and modelled exceedance fractions over validation period

144 Observed and modelled flows over validation period

145 Parameterisation using PEST’s predictive analyser

146 Observed and modelled flows over calibration period

147 Parameters: LZSN, UZSN, INFILT, BASETP, AGWETP, LZETP, INTFW, IRC, AGWRC

148 Parameters: LZSN, UZSN, INFILT, BASETP, AGWETP, LZETP, INTFW, IRC, AGWRC, DEEPFR

149 Observed and modelled flows over validation period Parameterisation using PEST’s predictive analyser

150 Observed and modelled flows over calibration period

