1 Applying and Interpreting the SWAT Sensitivity Analysis and Auto-calibration Tools
by Mike Van Liew, Dept. of Biological Systems Engineering, University of Nebraska, Lincoln, NE. Heartland Regional Water Coordination Initiative

2 Available Auto-calibration tools in SWAT
Auto-calibration tools created by Ann van Griensven (2005)
Tools include:
--Sensitivity Analysis
--ParaSol: model calibration and parameter uncertainty
--SUNGLASSES: parameter uncertainty for the calibration and validation periods

3 Limitations of the ArcSWAT Interface Auto-calibration Tool
The ArcSWAT Interface Sensitivity Analysis/Auto-Calibration and Uncertainty Tools only allow calibration at a single point within a watershed In some cases, a multi-point, regional approach to calibration is highly desirable, especially for large watersheds

4 Running the Sensitivity Analysis/Auto-calibration Tool in the Project Directory
The ArcSWAT Interface provides a framework for constructing the files that are necessary for performing sensitivity analysis or a multi-gage, multi-parameter calibration.
Some files employed in the Interface tools must be modified by hand to perform a multi-gage or multi-parameter calibration.
This can be accomplished by working in the project directory instead of the ArcSWAT Interface.

5 Today’s Objectives:
Learn how to create and modify the necessary files for running the sensitivity analysis and auto-calibration tools in a project directory for multi-gage, multi-constituent configurations.
Learn how to interpret the output files generated from the sensitivity analysis and auto-calibration tools.

6 Parameter Sensitivity
The challenge is determining which parameters to calibrate so that the model response mimics the actual field, subsurface, and channel conditions as closely as possible.
The calibration process becomes complex and computationally expensive when the number of parameters in a model is substantial.
Sensitivity analysis can help identify and rank the parameters that have a significant impact on specific model outputs of interest.

7 Sensitivity Analysis in SWAT
Helpful to model users in identifying the parameters that are most influential in governing streamflow or water quality response.
Allows model users to conduct two types of analyses:
--The first analysis may help identify parameters that improve a particular process or characteristic of the model (it assesses the impact of adjusting a parameter value on some measure of simulated output, such as average streamflow).
--The second type of analysis uses measured data to provide an overall “goodness of fit” estimate between the modeled and the measured time series (it identifies the parameters that are affected by the characteristics of the study watershed and those to which the given project is most sensitive).

8 Sensitivity Analysis
Sensitivity analysis demonstrates the impact that a change to an individual input parameter has on the model response.
The method in SWAT combines Latin Hypercube (LH) and One-factor-At-a-Time (OAT) sampling.
LH sampling generates a distribution of plausible collections of parameter values from a multidimensional distribution.
During sensitivity analysis, SWAT runs (p + 1) * m times, where p is the number of parameters being evaluated and m is the number of LH intervals or loops.
For each loop, a set of parameter values is selected such that a unique area of the parameter space is sampled.

9 Sensitivity Analysis
That set of parameter values is used to run a baseline simulation for that unique area.
Then, using one-at-a-time (OAT) sampling, a parameter is randomly selected, and its value is changed from the previous simulation by a user-defined percentage.
SWAT is run on the new parameter set, and then a different parameter is randomly selected and varied.
After all the parameters have been varied, the LH algorithm locates a new sampling area by changing all the parameters. (A code sketch of this sampling scheme follows.)
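To make the run count and the sampling loop concrete, here is a minimal Python sketch of an LH-OAT sampler. It is illustrative only: the function name, the random perturbation sign, and the clipping to the parameter range are assumptions, not the SWAT Fortran code, but it produces the (p + 1) * m parameter sets described above.

```python
import numpy as np

def lh_oat_runs(param_ranges, m_loops, oat_fraction=0.05, seed=0):
    """Sketch of LH-OAT sampling: for each of m loops, draw one Latin
    Hypercube baseline point, then perturb each parameter one at a time
    by +/- oat_fraction of its range.  Total runs = (p + 1) * m."""
    rng = np.random.default_rng(seed)
    names = list(param_ranges)
    p = len(names)
    lows = np.array([param_ranges[n][0] for n in names])
    highs = np.array([param_ranges[n][1] for n in names])

    # Latin Hypercube: one stratified sample per loop for every parameter
    strata = (rng.permuted(np.tile(np.arange(m_loops), (p, 1)), axis=1).T
              + rng.random((m_loops, p))) / m_loops
    baselines = lows + strata * (highs - lows)

    runs = []
    for base in baselines:
        current = base.copy()
        runs.append(current.copy())               # baseline simulation
        for j in rng.permutation(p):              # one-at-a-time changes
            current = current.copy()
            sign = rng.choice([-1.0, 1.0])
            current[j] = np.clip(
                current[j] + sign * oat_fraction * (highs[j] - lows[j]),
                lows[j], highs[j])
            runs.append(current.copy())           # one new run per parameter
    return names, np.array(runs)                  # shape ((p + 1) * m, p)

names, runs = lh_oat_runs(
    {"ALPHA_BF": (0.0, 1.0), "ESCO": (0.0, 1.0), "CH_K2": (0.0, 150.0)},
    m_loops=5)
print(len(runs))   # (3 + 1) * 5 = 20 parameter sets
```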

10 Getting Started: Building Files to Conduct Sensitivity Analysis
ArcSWAT Interface Sensitivity Analysis Tool: Input and Output Windows
Manually modify the files in the project directory that are written from the Interface

11 Sensitivity Input Window
Analysis Location: select from the SWAT simulation list a simulation for performing the sensitivity analysis.
Subbasin: select a subbasin within the project where observed data will be compared against simulated output.

12 Sensitivity Input Window
Hypercube intervals (Alpha_Bf): 10 intervals of 0-0.1, 0.1-0.2, …, 0.9-1.0
OAT change (Alpha_Bf): changes by 5% x (1.0 - 0.0) = 0.05, so an initial value of 0.13 becomes 0.08 or 0.18

13 Sensitivity Input Window
Observed Data File Name
Select Parameters for conducting sensitivity analysis
Lower bound = 0.0, Upper bound = 10.0 (adjust if necessary)
Variation Method (sketched below):
--Replace by value
--Add to value
--Multiply by value (%)
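A tiny Python sketch of the three variation methods, under the assumption that the percent option is interpreted as a relative change; the helper name and method codes are made up for illustration.

```python
def apply_variation(current, change, method):
    """Sketch of the three variation methods (illustrative, not SWAT source)."""
    if method == "replace":          # substitute the value directly
        return change
    if method == "add":              # offset the current value
        return current + change
    if method == "multiply":         # relative change in percent
        return current * (1.0 + change / 100.0)
    raise ValueError(f"unknown method: {method}")

apply_variation(0.13, 20.0, "multiply")   # 0.13 increased by 20% -> 0.156
```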

14 Sensitivity Analysis Output Window
Output Evaluation: comparison variable(s)
--Select Average Modeled Output (e.g., streamflow) or Percent of Time the output is below a threshold value
--Select Concentrations or Loads for Water Quality
Objective Function: select the optimization method (a sample goodness-of-fit measure is sketched below)
Write Input Files to Project Directory
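As a concrete example of a goodness-of-fit measure that an objective function can minimize, here is the sum of squared residuals in Python. Whether this matches the method selected in the window depends on the project, so treat it as an illustration only.

```python
import numpy as np

def sum_of_squared_residuals(observed, simulated):
    """Sum of squared errors between observed and simulated series
    (illustrative helper, not SWAT source code)."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return float(np.sum((sim - obs) ** 2))

sum_of_squared_residuals([1.2, 3.4, 2.0], [1.0, 3.0, 2.5])   # 0.45
```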

15 Main Output: Sensout.out
Input Data: Objective and Response Functions List of Parameters

16 Sample of the senspar.out File (OAT = 0.05, Loops = 5)
[Table of sampled parameter sets for ALPHA_BF, ESCO, CH_K2, SOL_AWC, and GW_DELAY, with their allowable ranges in the header. Each loop begins with a full Latin Hypercube baseline set (e.g., runs 1 and 7), followed by one-at-a-time runs in which only the single perturbed parameter changes from the previous run.]

17 Main Output: Sensout.out
Parameter Ranking
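One plausible way to turn the paired baseline/perturbed runs into a ranking is to average a relative partial effect over all loops. The sketch below is a hedged approximation; the exact statistic computed by the SWAT tool and reported in sensout.out may differ.

```python
import numpy as np

def relative_partial_effect(y_base, y_pert, frac_change=0.05):
    # Relative change in the model output divided by the relative
    # change applied to the parameter (illustrative, not SWAT source).
    return abs((y_pert - y_base) / ((y_pert + y_base) / 2.0)) / frac_change

# Average the partial effects over all loops, then rank the parameters
# (the numbers below are made-up placeholders).
effects = {"ALPHA_BF": [0.8, 1.1], "ESCO": [0.1, 0.2], "CH_K2": [0.4, 0.5]}
ranking = sorted(effects, key=lambda name: -np.mean(effects[name]))
print(ranking)   # most to least sensitive
```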

18 Ranking of 16 Parameters for Mahantango Creek Watershed, PA

19 Ranking of 16 Parameters for Stevens Creek Watershed, NE

20 Mean Value Percent Difference in Objective Function Value with a 5% Change in Parameter Value for Stevens Creek Watershed, NE

21 Strengths of the Automated Approach to Calibration in SWAT
Manual calibration of a dozen or more parameters that govern streamflow can be a very time-consuming and frustrating process.
The auto-calibration procedure in SWAT provides a powerful, labor-saving tool that can substantially reduce the frustration and uncertainty often associated with manual calibration.
The ParaSol with Uncertainty Analysis tool in SWAT provides optimal parameter values that are determined through an optimization search. It also indicates how sensitive a parameter is to being precisely calibrated, based upon the user-supplied input range.

22 Shuffled Complex Evolution Algorithm (SCE-UA)
Calibration procedure based on the Shuffled Complex Evolution Algorithm (SCE-UA) and a single objective function.
In a first step, the SCE-UA selects an initial population of parameters by random sampling throughout the feasible parameter space for the p parameters to be optimized, based on the given parameter ranges.
The population is partitioned into several communities (complexes), each consisting of 2p + 1 points.

23 Shuffled Complex Evolution Algorithm (SCE-UA)
Each community is made to evolve based on a statistical “reproduction process” that uses the simplex method, an algorithm that evaluates the objective function in a systematic way with regard to the progress of the search in previous iterations.
At periodic stages in the evolution, the entire population is shuffled, and points are reassigned to communities to ensure information sharing.
As the search progresses, the entire population tends to converge toward the neighborhood of the global optimum, provided the initial population size is sufficiently large.

24 Shuffled Complex Evolution Algorithm (SCE-UA)
Flowchart of the SCE-UA loop: Initialize → Select Parents → Generate Offspring → Evolve → Assess → Replace Parents by Offspring (repeated several times within each complex) → Shuffle → check convergence (No: repeat; Yes: End). A simplified code sketch follows.
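The Python sketch below compresses the loop above into a few lines. The complex-evolution step is simplified (plain reflection/contraction on the worst point instead of the full CCE subcomplex selection), so treat it as a reading aid rather than the ParaSol implementation; the function name and step counts are assumptions.

```python
import numpy as np

def sce_ua(objective, lows, highs, ngs=2, maxn=2000, seed=0):
    """Minimal sketch of the SCE-UA search: sample -> partition into
    complexes of 2p+1 points -> simplex-style evolution -> shuffle."""
    rng = np.random.default_rng(seed)
    lows, highs = np.asarray(lows, float), np.asarray(highs, float)
    p = lows.size
    npg = 2 * p + 1                                   # points per complex
    pop = lows + rng.random((ngs * npg, p)) * (highs - lows)
    cost = np.array([objective(x) for x in pop])
    trials = pop.shape[0]

    while trials < maxn:
        order = np.argsort(cost)                      # rank whole population
        pop, cost = pop[order], cost[order]
        for k in range(ngs):                          # deal points to complexes
            idx = np.arange(k, ngs * npg, ngs)
            cx, cf = pop[idx].copy(), cost[idx].copy()
            for _ in range(npg):                      # evolve the complex
                worst = np.argmax(cf)
                centroid = np.delete(cx, worst, axis=0).mean(axis=0)
                new = np.clip(2.0 * centroid - cx[worst], lows, highs)  # reflection
                fnew = objective(new); trials += 1
                if fnew >= cf[worst]:                 # contraction if reflection fails
                    new = (centroid + cx[worst]) / 2.0
                    fnew = objective(new); trials += 1
                if fnew < cf[worst]:
                    cx[worst], cf[worst] = new, fnew
            pop[idx], cost[idx] = cx, cf              # shuffle back into population
    best = np.argmin(cost)
    return pop[best], cost[best]

# toy usage: recover the minimum of a quadratic bowl at (0.3, 0.3)
x_best, f_best = sce_ua(lambda v: float(np.sum((v - 0.3) ** 2)), [0, 0], [1, 1])
```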

25 Limitations of the ArcSWAT Interface Auto-calibration Tool
The ArcSWAT Interface Sensitivity Analysis/Auto-Calibration and Uncertainty Tools only allow calibration at a single point within a watershed In some cases, a multi-point, regional approach to calibration is highly desirable, especially for large watersheds

26 Building Files to Conduct Auto-calibration
ArcSWAT Interface Auto-calibration Tool: Input and Output Windows
Manually modify the files in the project directory that are written from the Interface

27 Auto-calibration Input Window
Analysis Location: select from the SWAT simulation list a simulation for performing the calibration.
Subbasin: select a subbasin within the project where observed data will be compared against simulated output.

28 Auto-calibration Input Window
Optimization Settings:
--MAXN = maximum number of trials before optimization is terminated
--NGS = number of complexes
--IPROB = sets the uncertainty threshold for ParaSol: 1 = 90% CI, 2 = 95% CI, 3 = 97.5% CI
Observed Data File Name
Calibration Method: ParaSol or ParaSol with Uncertainty Analysis
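The same settings restated as an annotated structure; the values are placeholders, and the parasolin file written by the Interface holds the layout SWAT actually reads.

```python
# Placeholder values only; consult the parasolin file written by the
# Interface for the exact format expected by the auto-calibration tool.
optimization_settings = {
    "MAXN": 5000,   # maximum number of trials before the search terminates
    "NGS": 5,       # number of complexes in the SCE-UA search
    "IPROB": 2,     # ParaSol threshold: 1 = 90% CI, 2 = 95% CI, 3 = 97.5% CI
}
```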

29 Auto-calibration Input Window: Observed Daily Record for Streamflow
Year of observed record
Julian day of observed record
Observed daily streamflow in cms

30 Input Window: Observed Monthly Record for Streamflow and Sediment
Year of observed record
Month of observed record
Observed monthly streamflow (cms)
Observed monthly sediment load (tons/day)
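For orientation, here is a hedged Python sketch of how observed-data files with these columns might be written. The sample values, file names, and column widths are invented for illustration; check a file produced by the Interface for the exact format the tool expects.

```python
# year, Julian day, observed daily streamflow (cms) -- placeholder values
daily_rows = [(2002, 1, 1.42), (2002, 2, 1.38)]
# year, month, observed monthly streamflow (cms), sediment load (tons/day)
monthly_rows = [(2002, 1, 2.10, 35.7), (2002, 2, 1.95, 28.4)]

with open("observed_daily.txt", "w") as f:
    for year, jday, flow in daily_rows:
        f.write(f"{year:6d}{jday:6d}{flow:12.3f}\n")

with open("observed_monthly.txt", "w") as f:
    for year, month, flow, sed in monthly_rows:
        f.write(f"{year:6d}{month:6d}{flow:12.3f}{sed:12.3f}\n")
```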

31 Auto-calibration Input Window
Select Parameters for calibration Adjust initial lower and upper bounds, if necessary (note: minimum lower bound for SURLAG = 0.5)

32 Auto-calibration input files: Changepar Parasolin
The parasolin file holds the MAXN, NGS, and IPROB optimization settings

33 Auto-calibration input: Multigage Changepar
A multi-gage changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project (e.g., one for the upper gage and one for the lower gage).

34 Auto-calibration input: Multigage Changepar
A multi-gage changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project.
For parameters that vary by HRU, select All Land Uses, Soils, and Slopes for the Subbasins that are relevant to a particular gage.
For parameters that vary by Subbasin, select All Subbasins that are relevant to a particular gage.

35 Auto-calibration input: Multigage Changepar
A multi-gage changepar file is created by combining two or more changepar files that are specific to certain subbasins or HRUs in the project, e.g.:
--Subbasins for gage 1
--Subbasins for gage 2
--HRUs for gage 1
--HRUs for gage 2
(A sketch of combining the files is given below.)
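If the gage-specific files are simply being combined, the concatenation step might look like the following Python sketch. The file names are examples only, and any header or parameter-count line in the combined changepar file may still need to be adjusted by hand.

```python
# Combine gage-specific changepar files into one multi-gage file.
# File names are illustrative; adjust to the files written by the Interface.
parts = ["changepar_gage1.dat", "changepar_gage2.dat"]
with open("changepar.dat", "w") as out:
    for path in parts:
        with open(path) as src:
            out.write(src.read())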

36 Auto-calibration Output Window
Output Evaluation: select the parameter to be calibrated
Objective Function: select the optimization method
Select Concentrations or Loads for Water Quality Calibration
Write Input Files to Project Directory

37 Auto-calibration input file: fig
Autocal Command Code and Observed Data Files for 2 Gage Locations

38 Auto-calibration input file: Filecio
Number of years simulated
ICLB = auto-calibration option:
--Default (no auto-calibration) = 0
--Sensitivity analysis = 1
--Optimization = 2
--Optimization with uncertainty = 3
--Bestpar = 4
NYSKIP = number of warm-up years to skip
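A heavily hedged helper for switching ICLB from the project directory: it assumes the relevant line in file.cio is tagged with the string "ICLB", which should be verified against the file written by the Interface before use.

```python
# Set ICLB = 2 (optimization) in file.cio.  The replacement line format is
# an assumption; preserve the spacing of the original line if it differs.
with open("file.cio") as f:
    lines = f.readlines()
with open("file.cio", "w") as f:
    for line in lines:
        if "ICLB" in line:
            line = "               2    | ICLB : auto-calibration option\n"
        f.write(line)
```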

39 Auto-calibration input file: Objmet
Code number for the calibration variable
Concentration or load
Weight given in the objective function
Code number for the Autocal file in the .fig file
Objective function method

40 Auto-calibration output file: Parasolout
Parameter Uncertainty Ranges

41 Auto-calibration output file: goodpar and bestpar
Listings of the good parameter sets (goodpar) and the best calibrated parameter set (bestpar)

42 Auto-calibration output file: Autocal
[Plots of the monthly streamflow calibration and the monthly sediment load, with parameter uncertainty ranges]

43 Measured versus Simulated Streamflow with Parasol Uncertainty CI

