Emergency Response Service Baselines


Emergency Response Service Baselines
Carl L Raish, Principal, Load Profiling and Modeling

ERS Resource Identification
Drop-by baseline options:
- Mid 8-of-10 (M810) – average of the 10 most recent days of the same day-type, excluding the highest and lowest
- Matching Day Pair (MDP) – a variation of a proxy-day routine; finds an average of the 10 best-fitting pairs of non-event days of the same day-type, based on intervals for the prior day and up to one hour before the event on the event day
- Regression (REG) – three-stage models are estimated (daily energy, hourly fraction, and 15-minute interval models) based on weather, calendar, and sunrise/sunset
- Meter-Before-Meter-After* (MBMA) – specific to Loads with flat load shapes; the baseline is the first full interval prior to the dispatch instruction
- Nearest 20 Like Days* (Near20) – already in use as the Alternate baseline for fleet/QSE portfolio performance; average of the 20 days occurring closest to the event that have the same day-type as the event day
- Control group (Residential only) – a random sample of sites in the Load is selected by ERCOT and withheld from deployment; the Load must be large enough to meet its obligation with the deployed sites
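The M810 option described above is simple enough to sketch in code. This is a minimal illustration, not ERCOT's implementation; the function and variable names are assumed:

```python
from statistics import mean

def mid_8_of_10(daily_profiles):
    """Mid 8-of-10 baseline sketch: average the 10 most recent
    same-day-type interval profiles after dropping the highest-
    and lowest-usage days.

    daily_profiles: list of 10 lists, each holding the kWh values
    for one day's metering intervals.
    """
    assert len(daily_profiles) == 10
    # Rank the days by total daily energy; drop the highest and lowest.
    ranked = sorted(daily_profiles, key=sum)
    kept = ranked[1:-1]  # the middle 8 days
    # Baseline is the interval-by-interval average of the remaining days.
    return [mean(day[i] for day in kept) for i in range(len(kept[0]))]
```

For example, with ten flat days at 1, 2, …, 10 kWh per interval, the days at 1 and 10 are dropped and every baseline interval comes out to the mean of 2 through 9.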

ERS Resource Identification – Day-of Adjustment
Applicable to:
- Mid 8-of-10 Like Days
- Matching Day Pair
- Regression
- Near 20 Like Days
Method:
- Scalar adjustment based on the ratio of actual kWh to baseline kWh for the two-hour window beginning three hours before the dispatch instruction
- The baseline is multiplied by the scalar adjustment factor
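The adjustment reduces to a single ratio applied to every baseline interval. A minimal sketch, assuming 15-minute intervals and hypothetical names (not ERCOT code):

```python
def day_of_adjustment(actual_kwh, baseline_kwh, event_start, intervals_per_hour=4):
    """Scalar day-of adjustment sketch.

    actual_kwh, baseline_kwh: interval kWh series for the event day
    event_start: index of the interval containing the dispatch instruction
    """
    # Two-hour window beginning three hours before the dispatch instruction.
    start = event_start - 3 * intervals_per_hour
    end = start + 2 * intervals_per_hour
    scalar = sum(actual_kwh[start:end]) / sum(baseline_kwh[start:end])
    # The baseline is multiplied by the scalar adjustment factor.
    return [scalar * b for b in baseline_kwh]
```

If actual load in the adjustment window runs 20% above the unadjusted baseline, every baseline interval is scaled up by 1.2 before the load reduction is computed.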

ERS Resource Identification

ERS Baseline Analysis
Historical interval data required to qualify for a default baseline:
- Regression – 270 days
- M810, MDP, MBMA, Near20 – 6 months
- Control group – none
Process:
- ERCOT simulates consecutive 2-hour events for the submitter-selected time periods across all available historical interval data
- Applies the baseline methodology (including day-of adjustment) for each simulated event
- Compares baseline interval estimates to actuals and computes goodness-of-fit statistics (R², MAPE, mean difference, 90th, 95th and 99th percentile differences)
- The Alternate baseline is always available as a choice
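The goodness-of-fit statistics named above can be computed along these lines. This is a generic sketch with assumed formulas (e.g., R² as 1 − SSE/SST and an empirical percentile of absolute errors); the slide does not give ERCOT's exact definitions:

```python
from statistics import mean

def fit_stats(actual, baseline):
    """Compare baseline interval estimates to actuals (a sketch)."""
    n = len(actual)
    errors = [b - a for a, b in zip(actual, baseline)]
    a_bar = mean(actual)
    # R-squared: 1 - SSE/SST.
    sse = sum(e * e for e in errors)
    sst = sum((a - a_bar) ** 2 for a in actual)
    r2 = 1 - sse / sst
    # Mean absolute percent error and mean (signed) difference.
    mape = mean(abs(e) / a for a, e in zip(actual, errors))
    mean_diff = mean(errors)
    # Empirical percentiles of the absolute errors.
    abs_sorted = sorted(abs(e) for e in errors)
    pct = {f"p{p}": abs_sorted[min(n - 1, int(p / 100 * n))] for p in (90, 95, 99)}
    return {"r2": r2, "mape": mape, "mean_diff": mean_diff, **pct}
```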

ERS Baseline Ranking
A ranking algorithm selects the "best" baseline method(s) from among the available choices: head-to-head scoring between all pairs of baselines … low score wins.
- R² – if the difference for a pair of baselines > 3%, the baseline with the lower R² gets one point; otherwise both baselines get 0.5 point
- MAPE – if the difference for a pair of baselines > 2%, the baseline with the higher MAPE gets one point; otherwise both get 0.5 point
- Bias (absolute value of mean percent difference) – if the difference for a pair of baselines > 2%, the baseline with the higher bias gets one point; otherwise both get 0.5 point
- 95th percentile percent difference as a percent of average load – if the difference for a pair of baselines > 2%, the baseline with the higher difference gets one point; otherwise both get 0.5 point
Disqualification and selection:
- If the MAPE for a baseline > 20% or its bias > 5%, the baseline is given a score of 24
- The baseline(s) with the lowest score (or tied with the lowest score) are allowed as choice(s)
- If all baseline scores are 24, only the Alternate baseline option is allowed
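The head-to-head scoring logic can be sketched as follows. Metric names, the dict layout, and the fraction-form thresholds (3% → 0.03, etc.) are assumptions for illustration; only the scoring rules come from the slide:

```python
from itertools import combinations

# Per-metric thresholds from the slide (as fractions, an assumption).
THRESHOLDS = {"r2": 0.03, "mape": 0.02, "bias": 0.02, "p95": 0.02}

def rank_baselines(metrics):
    """metrics: {baseline_name: {"r2": .., "mape": .., "bias": .., "p95": ..}}
    Returns (scores, winners); low score wins."""
    scores = {name: 0.0 for name in metrics}
    for a, b in combinations(metrics, 2):
        for m, thresh in THRESHOLDS.items():
            va, vb = metrics[a][m], metrics[b][m]
            if abs(va - vb) > thresh:
                # For R² the lower value loses; for the others the higher loses.
                loser = (a if va < vb else b) if m == "r2" else (a if va > vb else b)
                scores[loser] += 1.0
            else:
                scores[a] += 0.5
                scores[b] += 0.5
    # Disqualification: MAPE > 20% or bias > 5% forces a score of 24.
    for name, vals in metrics.items():
        if vals["mape"] > 0.20 or vals["bias"] > 0.05:
            scores[name] = 24.0
    low = min(scores.values())
    winners = [n for n, s in scores.items() if s == low]
    return scores, winners
```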

ERCOT Baseline Usage
Results from the October 2016 – January 2017 Resource Identification Process
[Charts: Site-level Baseline Qualification; Site-level Baseline Qualification History]

ERCOT Baseline Usage
Results from the October 2016 – January 2017 Resource Identification Process
[Charts: Resource-level Baseline Qualification; Resource-level Baseline Qualification History]

ERCOT Baseline Usage
Results from the June – September 2016 procurement

ERCOT Baseline Usage
Results from the October 2016 – January 2017 procurement

ERCOT Control Group Methodology
Design and selection:
- Pre-stratify resource customers on average summer-day use: 4 strata with boundaries at 55, 75, and 100 kWh/day
- Proportional allocation of control group sites to strata
- Typically select 2 separate control groups per month … one used for the first half of the month and the other for the second half
Accuracy check:
- Expand the control group to the deployment-group level using combined ratio estimation
- Compare control group estimates to deployment actuals for summer weekdays, noon – 8:00 pm
Evaluation of test/event performance:
- Post-stratify based on each customer's kWh consumption for the previous 24-hour period (noon on the day before to noon on the day of the test/event): 3 strata with boundaries at 80 and 120 kWh/day
- Use combined ratio estimation for expansion
Note: the post-stratification expansion is also checked for accuracy at the time of control group design and selection. Using both the pre- and post-stratification processes improves baseline accuracy.
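The combined ratio estimator used for expansion forms one ratio across all strata (rather than a ratio per stratum) and applies it to the deployment group's auxiliary total. A minimal sketch; the stratum data structure and field names are assumed:

```python
def combined_ratio_estimate(strata, deployment_aux_total):
    """Combined ratio estimation sketch.

    strata: list of dicts, one per stratum, with
        n_sites - stratum population count (N_h)
        y_bar   - control-group mean load in the estimation period
        x_bar   - control-group mean of the auxiliary variable
                  (e.g. the prior-24-hour kWh used for post-stratification)
    deployment_aux_total: deployment-group total of the auxiliary variable
    """
    # Stratum-weighted totals, then a single combined ratio.
    y_hat = sum(s["n_sites"] * s["y_bar"] for s in strata)
    x_hat = sum(s["n_sites"] * s["x_bar"] for s in strata)
    r = y_hat / x_hat
    return r * deployment_aux_total
```

The combined form is generally preferred over a separate per-stratum ratio when stratum control-group samples are small, since it avoids accumulating per-stratum ratio bias.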

Control Group Accuracy – Premise Load
- Control group accuracy depends on the size of both the control and deployment groups
- Graphs are based on average accuracy across 20 independently selected control groups

Control Group Accuracy – Load Reduction

Residential Aggregation – Regression Baseline
ERID Statistics:
- Initial sites: 2,564
Event Statistics:
- Residential sites: 2,273
- Average MW reduction: 1.00
- MAPE: 4.2%
- R²: 98.8%
[Chart: midnight to noon]

Commercial Aggregation – Regression Baseline
ERID Statistics:
- Initial sites: 143
Event Statistics (excluding 4 hours around the event):
- Commercial sites: 230
- Average MW reduction: 0.35
- MAPE: 2.7%
- R²: 99.9%
[Chart: midnight to noon]

Residential Aggregation – Control Group
ERID Statistics:
- Initial sites: 8,250
Event Statistics:
- Deployed sites: 10,016
- Control group: 1,000
- Average MW reduction: 17.2
- MAPE: 2.0%
- R²: 99.8%
[Chart: midnight to noon]

Questions?
craish@ercot.com
512/248-3876