Lessons Learned from Backcasting and Forecasting Exercises


1 Lessons Learned from Backcasting and Forecasting Exercises
16th TRB National Transportation Planning Applications Conference
Thomas Rossi, Cambridge Systematics, Inc.
Sarah Sun, Federal Highway Administration
May 16, 2017

2 Backcasting and Forecasting in Validation
- Base year validation demonstrates a model's ability to estimate travel behavior for a single point in time
- A second point would help demonstrate the sensitivity of the model to changing conditions that affect travel demand
- Backcasting and forecasting: compare model results to observed data for a year other than the base year
- For forecasting, choose a year after the base year but still in the past
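To make the comparison step concrete, a backcast or forecast run can be scored against observed data with the same statistics used in base year validation, such as percent root-mean-square error on link volumes. The sketch below is illustrative only; the data values and target are hypothetical, not from the BMC or OKI models.

```python
# Hypothetical sketch of a temporal validation check: score modeled link
# volumes from a backcast/forecast run against observed counts for that year.
# All numbers below are made up for illustration.

def percent_rmse(modeled, observed):
    """Percent root-mean-square error, a common link-volume validation metric."""
    if len(modeled) != len(observed) or not observed:
        raise ValueError("need equal-length, non-empty series")
    mse = sum((m - o) ** 2 for m, o in zip(modeled, observed)) / len(observed)
    mean_obs = sum(observed) / len(observed)
    return 100.0 * mse ** 0.5 / mean_obs

# Backcast-year link volumes (vehicles/day) vs. observed counts
modeled = [10400, 25100, 7800, 14250]
observed = [10000, 24000, 8100, 15000]
print(f"%RMSE = {percent_rmse(modeled, observed):.1f}")
```

Running the same statistic on both the base year and the backcast/forecast year makes the degradation in fit away from the base year directly visible.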

3 Study Approach
- Use two different model versions with different base years
- Backcast or forecast the base year scenario for the "other" model version
- Use models from two urban areas: Baltimore (BMC) and Cincinnati (OKI)

4 Key Observations
- Changes in input data may have effects that dominate the results
  - Examples: higher network speeds in the updated model, lower trip rates in the updated model
- Results for base year scenarios match observed data better than forecasts/backcasts
  - No surprise here, as validation was done mainly considering the base year
- There is more consistency between scenarios run using the same model than between scenarios run for the same travel conditions (analysis year)

5 What We Learned
- Impacts of changes in model parameters between model versions
  - Behavior changes, errors fixed, improved model form, better hardware, expanded analytical capabilities
  - Effects of changes can propagate, models cannot anticipate everything, and updating a model does not always make it more "correct"
- Accuracy of data inputs is important
  - Forecasting/backcasting can help identify hard-to-find errors
- Changes in assumptions can have unanticipated effects
- Effects of changes must be considered in validation

6 What Else We Learned
- Calibration changes should be made only to improve the model's predictive ability
- Earlier model components show greater accuracy in forecasting (maybe)
  - May reflect error propagating downstream from earlier components
- Use of fixed factors can make models insensitive to changes over time
  - When a model is updated, the fixed factors are reestimated…and results change
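The fixed-factor point can be illustrated with a minimal sketch: if a model applies fixed, base-year-calibrated factors (here, a hypothetical time-of-day split), its output for that step cannot respond to changed conditions in a forecast or backcast year, no matter what happens upstream. The factor values below are invented for illustration.

```python
# Illustrative sketch (hypothetical values, not from the study models) of why
# fixed factors freeze sensitivity: a time-of-day split using fixed base-year
# shares returns the same proportions regardless of forecast-year conditions.

FIXED_TOD_FACTORS = {"am_peak": 0.25, "midday": 0.35, "pm_peak": 0.28, "night": 0.12}

def split_by_tod(daily_trips):
    """Apportion daily trips to time periods using fixed base-year factors."""
    return {period: daily_trips * f for period, f in FIXED_TOD_FACTORS.items()}

# Whether the forecast year has heavy peak congestion or none, the AM peak
# always receives 25% of trips -- the model cannot shift travel off-peak.
print(split_by_tod(100000))
```

A sensitivity test for such factors could rerun the model with plausibly perturbed factor values and check whether downstream results change in a reasonable direction and magnitude.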

7 Recommendations
- Validation should always include temporal validation and sensitivity testing
- Temporal model validation should include a backcast and/or a forecast year application
- Recognize that changes in model procedures, assumptions, and input data can change model results
- Model inputs need to be thoroughly checked during model development and validation
- Estimate the effects of changes in calibrated parameters
- Recognize the effects of error propagation
- Test the sensitivity effects of fixed factors

8 Resources
- Project report link: /index.cfm
- Webinar (December 2016)
  - Summary: 1214/
  - Recording:
  - Slides: le_attachments/711336/TMIP%2Bwebinar%2B12%2B14%2B2016%2BFinal%2BComp.pdf

9 Acknowledgments
- Sarah Sun, FHWA
- Kazi Ullah, Cambridge Systematics
And especially…
- Baltimore Metropolitan Council (BMC): Charles Baber, Matt de Rouville
- Ohio-Kentucky-Indiana Council of Governments (OKI): Andrew Rohne

