Baseline Analysis: CBP, AMP, and DBP
Steve Braithwait, Dan Hansen, and Dave Armstrong
Christensen Associates Energy Consulting
DRMEC Spring Workshop, May 7, 2014

Presentation Outline
- Objectives
- Methodology
- Data
- Performance measures
- Aggregator program (CBP and AMP) results
- Demand Bidding Program (DBP) results

Objective: Assess Performance of Alternative Baseline Types
- For each utility and notice type:
  - All customers, with BL adjustment as chosen
  - All customers, simulated with universal selection of the BL adjustment
  - Sum of individual BLs vs. a portfolio BL (constructed from aggregated customer loads), for AMP and CBP only
- Examine unadjusted and day-of adjustments with 20%, 30%, 40%, and 50% caps, and uncapped
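Day-of adjustments typically rescale the baseline using load observed shortly before the event window, with the scaling factor limited by the stated caps. The following is a minimal Python/NumPy sketch of one such (multiplicative, capped) form; the two-hour adjustment window, the formula, and the function name are illustrative assumptions, not the exact tariff rules of CBP, AMP, or DBP.

```python
import numpy as np

def day_of_adjusted_baseline(baseline_kw, observed_kw, adj_hours, cap=0.40):
    """Apply a multiplicative day-of adjustment to an hourly baseline.

    baseline_kw, observed_kw : arrays of 24 hourly loads (kW) for the event day
    adj_hours : indices of the pre-event hours used to compute the adjustment
    cap : maximum fractional adjustment (None = uncapped)
    """
    baseline_kw = np.asarray(baseline_kw, dtype=float)
    observed_kw = np.asarray(observed_kw, dtype=float)

    # Ratio of observed to baseline load over the adjustment window
    ratio = observed_kw[adj_hours].mean() / baseline_kw[adj_hours].mean()

    # Cap the scaling factor at +/- cap (e.g., 0.40 for a 40% cap)
    if cap is not None:
        ratio = min(max(ratio, 1.0 - cap), 1.0 + cap)

    return baseline_kw * ratio
```

For example, `day_of_adjusted_baseline(bl, load, adj_hours=[11, 12], cap=0.40)` limits the scaling to plus or minus 40%, while `cap=None` corresponds to the uncapped case examined in the study.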

Analysis Details
- For actual program event days, the "true" baseline is the estimated reference load from the ex post evaluation.
- For event-like non-event days, the "true" baseline is the observed load.
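A minimal sketch of how that benchmark could be selected for each customer-day; the argument names are illustrative, not the evaluation's actual data model.

```python
def true_baseline(is_event_day, observed_kw, ex_post_reference_kw):
    """Select the benchmark ("true") baseline for one customer-day.

    is_event_day         : True if an actual program event was called that day
    observed_kw          : metered hourly load for the day
    ex_post_reference_kw : estimated reference load from the ex post evaluation
    """
    # On event days the metered load reflects curtailment, so the ex post
    # reference load serves as the benchmark; on event-like non-event days
    # the observed load itself is the benchmark.
    return ex_post_reference_kw if is_event_day else observed_kw
```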

Performance Measures (1): Percentage Baseline Error
- The percentage BL error for each customer/portfolio event day d is:
  percentage error = (L^P_d - L^A_d) / L^A_d
  where L^A_d is the actual, or "true", baseline load on day d and L^P_d is the "predicted" baseline being evaluated.
- A positive value means an over-estimated baseline (implying an over-stated program load impact).
- A negative value means an under-estimated baseline (implying an under-stated program load impact).
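A direct translation of this definition into code; inputs can be scalar event-day loads or arrays, and the variable names are illustrative.

```python
import numpy as np

def percentage_error(predicted_kw, true_kw):
    """Percentage baseline error: (L^P_d - L^A_d) / L^A_d.

    predicted_kw : the program ("predicted") baseline load being evaluated
    true_kw      : the benchmark ("true") baseline load for the same day
    Positive values mean the baseline (and the estimated load impact) is
    over-stated; negative values mean it is under-stated.
    """
    predicted_kw = np.asarray(predicted_kw, dtype=float)
    true_kw = np.asarray(true_kw, dtype=float)
    return (predicted_kw - true_kw) / true_kw
```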

Performance Measures (2): Accuracy
- Accuracy is measured as the median absolute percentage error (MAPE):
  - Calculate the absolute value of the percentage error for each customer/event-day.
  - Calculate the median of those values across customer/event-days (the mean can be misleading due to extreme values).
- Higher values correspond to larger baseline errors.
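A minimal sketch of the accuracy metric; the sample error values are hypothetical.

```python
import numpy as np

def median_abs_pct_error(pct_errors):
    """Accuracy metric: median of |percentage error| across customer/event-days.

    The median, rather than the mean, limits the influence of extreme errors.
    """
    return np.median(np.abs(np.asarray(pct_errors, dtype=float)))

# Hypothetical errors for five customer/event-days
print(median_abs_pct_error([0.05, -0.02, 0.10, -0.30, 0.01]))  # 0.05, i.e., a 5% median absolute error
```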

Performance Measures (3): Bias
- Bias is measured by the median percentage error, without taking the absolute value.
- Positive values indicate upward bias (i.e., the program baseline tends to over-state the "true" baseline).
- Negative values indicate downward bias (i.e., the program baseline tends to under-state the "true" baseline).
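The bias metric differs only in keeping the sign; again a minimal sketch with hypothetical values.

```python
import numpy as np

def baseline_bias(pct_errors):
    """Bias metric: median signed percentage error across customer/event-days."""
    return np.median(np.asarray(pct_errors, dtype=float))

# Hypothetical errors for five customer/event-days: bias can be near zero even
# when individual absolute errors are large, which is why both metrics are reported.
print(baseline_bias([0.05, -0.02, 0.10, -0.30, 0.01]))  # 0.01 -> slight upward bias
```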

Nominated Customers by Choice of BL Adjustment – CBP and AMP

Accuracy (Median Abs. % Error) – PG&E CBP-DO

Bias (Median % Error) – PG&E CBP-DO

Percentiles of % Errors – PG&E CBP-DO, Actual Events, by Adjustment Cap

Percentiles of % Errors – PG&E CBP-DO, Simulated Events, by Adjustment Cap

Summary: Accuracy & Bias (Aggregated Indiv.; Universal Adj.; 40% cap)

Summary: Percentiles of % Errors (Aggregated Indiv.; Universal Adj.; 40% cap)

Summary of Findings
- Accuracy and bias measures vary by utility, program, and notice type.
  - This suggests that factors other than baseline type and adjustment caps may be most important, such as the types of customers (e.g., highly variable loads) and event-day characteristics (e.g., an event on an isolated hot day).
- The day-of adjustment often improves accuracy and reduces bias, but the level of the cap is less important.
  - The largest errors typically occur for the unadjusted BL and the unlimited cap.
- A BL with a small median error (e.g., 1%) can still have errors greater than 10% in 20 percent of cases.

DBP Results: PG&E Distribution of % Errors

DBP Results: SCE Distribution of % Errors

Summary
- Day-of adjustments tend to improve baseline accuracy and reduce bias.
- The analysis provides support for making the day-of adjustment the default option.
- The effectiveness of the day-of adjustment is not very sensitive to the level of the cap.

Questions?
Contact: Steve Braithwait or Dan Hansen, Christensen Associates Energy Consulting, Madison, Wisconsin

Appendix
- SCE – CBP DO
- SDG&E – CBP DO
- PG&E – AMP DO
- SCE – AMP DO

Accuracy (Median Abs. % Error) – SCE CBP-DO

Bias (Median % Error) – SCE CBP-DO

Percentiles of % Errors – SCE CBP-DO, Actual Events, by Adjustment Cap

Percentiles of % Errors – SCE CBP-DO, Simulated Events, by Adjustment Cap

Accuracy (Median Abs. % Error) – SDG&E CBP-DO

Accuracy – Median Abs. Error (MW) – SDG&E CBP-DO

Bias (Median % Error) – SDG&E CBP-DO

Accuracy (Median Abs. % Error) – PG&E AMP-DO

Bias (Median % Error) – PG&E AMP-DO

Accuracy (Median Abs. % Error) – SCE AMP-DO

Bias (Median % Error) – SCE AMP-DO