LRS Update Meeting
LRS Participants and Profiling Working Group
October 27, 2005


2 Agenda
Process update
- Sample Point Reconciliation
- Interval Data Validation – TDSPs
- Interval Data File Delivery – CRs
Action Plan

3 Sample Point Reconciliation
Sample point installation summary report
- Summary of primary and replacement points installed by TDSP, profile type, weather zone, and stratum number
- Status indicates whether the cell is fully installed, the number of points remaining to install, and whether replacements are available
- CSV file sent to TDSPs

4 Sample Point Reconciliation
Sample point installation Summary Report: only TDSPs with outstanding installs were sent this report; however, any TDSP can request it.

5 LRS Sample Point Replacement Guidelines
Once an (LRS) IDR has been installed, a sample point may be replaced for the following reasons:
- Unsafe conditions / vicious animals
- Demolition / service removed / ESI ID status changed to Inactive
- De-energized for 6 or more consecutive months after IDR installed
- Competitively-owned NIDR meter is to replace (LRS) IDR at Premise
- ESI ID’s Profile Type is out of scope (e.g., BUSIDRRQ, NMLIGHT, NMFLAT)
- Customer request for (LRS) IDR removal
- Premise switches to OMR
- The sample is terminated or refreshed by ERCOT
ERCOT LRS documents can be found at:

6 Sample Point Reconciliation
Interval data cuts were compared with ‘ValidIDRCut’ events from the tracking log. Existing cuts and events have been reconciled.
Queries detect LRS database inconsistencies such as:
- Recorder ID is active, but the sample point has been removed
- Recorder install date is earlier than the Sample Point start date (excluding TXUED legacy sample points)
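The two consistency queries above can be illustrated with a small pandas sketch. The table and column names (recorders, sample_points, recorder_id, status, install_date, start_date) are assumptions made for the example, not ERCOT's actual LRS schema.

```python
import pandas as pd

def find_lrs_inconsistencies(recorders: pd.DataFrame, sample_points: pd.DataFrame) -> dict:
    """Flag the two inconsistency patterns listed above (illustrative schema only)."""
    merged = recorders.merge(sample_points, on="recorder_id", how="left",
                             suffixes=("_rec", "_sp"))

    # 1. Recorder ID is active, but its sample point has been removed.
    active_recorder_removed_point = merged[
        (merged["status_rec"] == "Active") & (merged["status_sp"] == "Removed")
    ]

    # 2. Recorder install date is earlier than the sample point start date
    #    (legacy TXUED sample points would be excluded before this check).
    install_before_start = merged[
        merged["install_date"] < merged["start_date"]   # assumes datetime columns
    ]

    return {"active_recorder_removed_point": active_recorder_removed_point,
            "install_before_start": install_before_start}
```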

7 Interval Data Validation
October 1, 2004 00:00 was chosen as the start of the analysis period because approximately 80% of the IDR installations were complete at this time. The analysis start date may move until enough validated data is available to perform the analysis. Most validation reports begin with this date/time. Data prior to this time has been retained but will not be validated and loaded into C/S LodeStar.

8 Interval Data Validation

9 The following validations are performed prior to loading into C/S LodeStar:
- Received LS file cuts with no recorder in the LRS Recorder table
- Valid recorders with no data received
- Recorder indicated as removed but still receiving data
- Cuts that are overdue or contain gaps
- Look for Service History changes in Profile Type, Weather Zone, or Status (de-energized greater than 6 months)
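A rough sketch of the first two checks using simple set membership; the cut records and recorder ID list are assumed shapes for illustration, not the actual LS file or LRS table layout.

```python
def cuts_without_recorder(cuts, recorder_ids):
    """LS file cuts whose recorder does not appear in the LRS Recorder table."""
    known = set(recorder_ids)
    return [cut for cut in cuts if cut["recorder_id"] not in known]

def recorders_without_data(cuts, recorder_ids):
    """Valid recorders for which no interval data was received."""
    seen = {cut["recorder_id"] for cut in cuts}
    return [rid for rid in recorder_ids if rid not in seen]
```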

10 LS Cuts with No Installed Recorder
Example report with columns: Rec ID, File Name, Interval Start Dt, Interval End Dt, File Transfer Date (sample rows for three recorders, with file transfer dates from October 2004 through September 2005).

11 Recorders with No Interval Data
Example report with columns: ESIID, Recorder Id, Install Date, Removal Date (sample rows with install dates ranging from June 2003 to September 2005).

12 LRS Overdue Data Report
Example report with columns: ESIID, Recorder Id, Stop Date of Last Cut, Days Overdue (sample rows with last-cut stop dates from May through September 2005).

13 LRS Gap Report
Example report with columns: ESIID, Recorder Id, Gap Begin Date, Gap End Date, Gap Days (sample rows with gaps falling between November 2004 and September 2005).

14 Sample Maintenance Report See Excel handout.

15 Interval Data Validation
C/S LodeStar is used:
- Interval Data is input (D110)
- Validation tests are run (X210)
- Outage, Zero, and NonNormal messages that are below the tolerance levels do not cause a cut-invalid status, so these messages are deleted
- The time intervals covered by the Zero messages are compared with the corresponding status in the ESI ID Service History table; the Zero messages with a corresponding Active status are retained
- Output simplified messages to Excel file
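The Zero-message screening step amounts to an overlap test against the ESI ID Service History. A minimal sketch, assuming each history record carries start, end, and status fields (illustrative names, not the LodeStar or ERCOT schema):

```python
def retain_zero_message(zero_start, zero_end, service_history):
    """Keep a ZERO message only if its span overlaps a period with Active status,
    i.e., the zero intervals are not explained by a de-energized premise."""
    for record in service_history:
        overlaps = record["start"] < zero_end and record["end"] > zero_start
        if record["status"] == "Active" and overlaps:
            return True
    return False
```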

16 C/S LodeStar Time Conventions
LS file header record date format is: "MMDDYYhhmm".
C/S LodeStar rounds the Start Time down by 1 minute and adds "00" seconds. For example, an input cut start time of 09/16/04 12:50 results in a LodeStar time of 09/16/04 12:49:00.
The cut end time is set at 1 second before the interval end time, so the end time seconds are "59". For example, an input cut end time of 11/12/04 14:00 results in a LodeStar time of 11/12/04 13:59:59.
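A small sketch of those two conventions, assuming the "MMDDYYhhmm" header format described above; the function names are illustrative helpers, not C/S LodeStar calls.

```python
from datetime import datetime, timedelta

def lodestar_start(header_time: str) -> datetime:
    """Round the cut start time down by one minute and set seconds to 00."""
    parsed = datetime.strptime(header_time, "%m%d%y%H%M")
    return parsed - timedelta(minutes=1)

def lodestar_stop(header_time: str) -> datetime:
    """Set the cut stop time to one second before the interval end time."""
    parsed = datetime.strptime(header_time, "%m%d%y%H%M")
    return parsed - timedelta(seconds=1)

# Examples from the slide:
#   lodestar_start("0916041250") -> 2004-09-16 12:49:00
#   lodestar_stop("1112041400")  -> 2004-11-12 13:59:59
```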

17 Validation Tests
Internal and External Validation
Current Tolerances for Internal Validation:
- ENERGY (Energy Discrepancy Test)
- OUTAGE 96 (Uncorrected Power Outages Test)
- NON-NORMAL 15 (Non-Normal Intervals Test)
- ZERO 288 (Zero Intervals Test)
- SPIKE 10 50% (Spike Intervals Test)
- HIGH (High Interval Demand Test)
- LOW 0 (Low Interval Demand Test)

18 Interval Data Validation
Sample of Validation Report:
Internal,TDSPNo,CBHCA10021, ,11/15/04-00:00:00,11/18/04-08:29:59, (INTERNAL) ENERGY DIFFERENCE (M - I): RATIO (M/I):
Outages,TDSPNo,CBHSA10013, ,11/18/04-09:00:00,11/30/04-13:14:59, OUTAGES: 11/18/04-09:14:59
Spike,TDSPNO,CBLSA10091, ,11/23/04-11:30:00,12/29/04-15:14:59, SPIKE: 12/27/04-08:29:59
Zeros,TDSPNo,CBLSA10141, ,10/04/04-15:30:00,11/03/04-14:44:59, ZEROS: 10/15/04-13:59:59

19 Validation Tests
Energy Discrepancy Test
- The cut’s energy usage based on its meter readings is compared with the energy usage calculated from the interval data values. If the cut fails the energy discrepancy test, the following validation message is written.
- (INTERNAL) ENERGY DIFFERENCE (M - I): RATIO (M/I): 1.604
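In code terms the test compares metered energy (M) against the interval total (I). A minimal sketch, noting that the ENERGY tolerance value itself is not listed in the slide above:

```python
def energy_discrepancy(metered_kwh, interval_values_kwh):
    """Return the (M - I) difference and (M / I) ratio reported in the message."""
    interval_total = sum(interval_values_kwh)
    difference = metered_kwh - interval_total
    ratio = metered_kwh / interval_total if interval_total else float("inf")
    return difference, ratio
```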

20 Validation Tests
Uncorrected Power Outages Test
- The number of consecutive intervals with status codes = ‘1’ must not exceed the limit specified by the OUTAGE parameter in the Validation Environment File. If the number of consecutive uncorrected power outages exceeds the limit, the test fails and the following validation message is written.
- OUTAGES: nnnn 09/22/04-14:29:59, where nnnn is the number of consecutive uncorrected power outages.
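The same consecutive-run logic underlies the OUTAGE, NON-NORMAL, and ZERO tests, which differ only in which intervals count as flagged. A minimal sketch using the OUTAGE limit of 96 from the tolerance list above:

```python
def longest_run(flags):
    """Length of the longest run of consecutive flagged intervals."""
    longest = current = 0
    for flagged in flags:
        current = current + 1 if flagged else 0
        longest = max(longest, current)
    return longest

def outage_test_passes(status_codes, outage_limit=96):
    """Fail when consecutive status-code '1' intervals exceed the OUTAGE limit."""
    return longest_run([code == "1" for code in status_codes]) <= outage_limit
```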

21 Validation Tests
Number of Intervals Test
- The actual number of data intervals in a cut must correspond to the difference between the cut’s start-time and stop-time. If the expected number of intervals does not equal the recorded number of intervals, the test fails and one of the following validation messages is written.
- (INTERNAL) COMPUTED STOP TIME: 12/27/04-12:29:59 4 INTERVAL(S) SHORT
- (INTERNAL) COMPUTED STOP TIME: 01/01/05-02:59:59 12 INTERVAL(S) LONG
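A simplified sketch of the count comparison, assuming datetime start/stop values, 15-minute (900-second) intervals, and a stop time that ends in :59; it ignores the one-minute start-time rounding described earlier.

```python
def expected_intervals(start, stop, seconds_per_interval=900):
    """Number of intervals implied by the cut's start and stop times."""
    span = (stop - start).total_seconds() + 1   # +1 because the stop time ends in :59
    return int(span // seconds_per_interval)

def interval_count_check(start, stop, recorded):
    """Return ('SHORT', n) or ('LONG', n), mirroring the messages above, else None."""
    expected = expected_intervals(start, stop)
    if recorded < expected:
        return ("SHORT", expected - recorded)
    if recorded > expected:
        return ("LONG", recorded - expected)
    return None
```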

22 Validation Tests
Non-Normal Intervals Test
- The number of consecutive intervals with non-normal status codes must not exceed the limit specified by the NON-NORMAL parameter.
- A non-normal status code is defined as a value of ‘2’ to ‘9’. If the number of consecutive non-normal status codes exceeds the limit, the test fails and the following validation message is written.
- NON-NORMAL: nn 06/28/04-00:14:59, where nn is the number of non-normal intervals.

23 Validation Tests
Spike Intervals Test
- The cut must not include any abnormally high peaks or “spikes”. Both the number of peaks to be averaged within a cut and a percentage not to be exceeded are defined by the SPIKE parameter. If the spike test fails, the following validation message is written.
- SPIKE: 10/27/04-10:14:59 10/27/04-09:59:59
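The SPIKE parameter above pairs a peak count (10) with a percentage (50%). One plausible reading, offered only as a hedged sketch and not the documented LodeStar algorithm, compares each interval against the average of the cut's highest peaks:

```python
def spike_intervals(values, peaks_to_average=10, max_excess_pct=0.50):
    """Indices of intervals exceeding the average of the top peaks by the percentage."""
    if not values:
        return []
    peaks = sorted(values, reverse=True)[:peaks_to_average]
    threshold = (sum(peaks) / len(peaks)) * (1.0 + max_excess_pct)
    return [i for i, value in enumerate(values) if value > threshold]
```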

24 Validation Tests
High Interval Demand Test
- The number of intervals with an abnormally high demand must not exceed the limit defined by the HIGH parameter in the Validation Environment File. If the HIGH test fails, the following validation message is written.
- HIGH DEMAND: 08/24/04-17:59:59 08/25/04-16:14:59 08/26/04-1

25 Validation Tests
Low Interval Demand Test
- The number of intervals with an abnormally low demand must not exceed the limit defined by the LOW parameter in the Validation Environment File. If the LOW test fails, the following validation message is written.
- LOW DEMAND: 08/24/04-17:59:59

26 Validation Tests
Zero Intervals Test
- The number of consecutive intervals in a cut whose values are 0.0 may not exceed a limit specified by the ZERO parameter. If the ZERO test fails, the following validation message is written.
- ZEROS: 08/09/04-13:14:59

27 Validation Tests
Current tolerances for External Validation:
- TIME (Recording Period Match Up Test)
- METER 2 1 (Meter Reading Match-Up Test)

28 Validation Tests
Recording Period Match Up Test
- The stop-time of the current cut and the start-time of the following cut must be within the underlap (gap) and overlap tolerances specified on the TIME parameter. If the stop-time of the current cut is less than the start-time of the following cut, an underlap exists and the following validation message is written to the validation message file.
- (EXTERNAL) TIME UNDERLAP: 3974 INTERVAL(S)
- If the stop-time of the current cut is greater than the start-time of the following cut and they do not fall within the same interval, an overlap exists and the following validation message is written.
- (EXTERNAL) TIME OVERLAP: 3974 INTERVAL(S)
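A simplified sketch of the cut-to-cut time comparison, assuming datetime stop/start fields and 15-minute intervals; the meter-reading match-up on the next slide follows the same pattern with readings in place of times.

```python
def time_match_up(current_stop, next_start, seconds_per_interval=900):
    """Return ('UNDERLAP', n_intervals), ('OVERLAP', n_intervals), or None."""
    gap_seconds = (next_start - current_stop).total_seconds()
    if gap_seconds > seconds_per_interval:      # missing intervals between the cuts
        return ("UNDERLAP", int(gap_seconds // seconds_per_interval))
    if gap_seconds < 0:                         # the cuts overlap in time
        return ("OVERLAP", int(-gap_seconds // seconds_per_interval))
    return None
```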

29 Validation Tests
Meter Reading Match-Up Test
- If meter readings exist for the cut series, the meter-stop-reading of the current cut and the meter-start-reading of the following cut must be within the underlap and overlap tolerances specified in the METER parameter. If the meter-stop-reading of the current cut is less than the meter-start-reading of the following cut, an underlap exists. If the size of the underlap is greater than the limit specified in the METER parameter, the cut fails and the following validation message is written.
- (EXTERNAL) METER UNDERLAP: x UNITS, where x = size of the underlap.
- If the meter-stop-reading of the current cut is greater than the meter-start-reading of the following cut, an overlap exists. If the size of the overlap is greater than the limit specified in the METER parameter, the cut fails and the following validation message is written.
- (EXTERNAL) METER OVERLAP: x UNITS, where x = size of the overlap.

30 Validation Tests
Merge Attribute Match-Up Test
- The unit-of-measure and seconds-per-interval fields for the current cut must be equal to the corresponding fields for the following cut. If either of these fields is not equal, the cut fails the test and one or more of the following validation messages are written.
- (EXTERNAL) UNIT OF MEASURE DISCREPANCY
- (EXTERNAL) SECONDS-PER-INTERVAL DISCREPANCY: n1 FOLLOWED BY n2

31 Validation Discussion
- Suggestions to improve the reports?
- Frequency to receive reports? (e.g., weekly, monthly, quarterly)
- Would the TDSPs prefer the validation reports in quarterly pieces or everything at once?
- Other issues?

32 ERCOT First Contacts
AEP – Bill Boswell (512)
CenterPoint – Diana Ott (512)
TXU ED – Ron Hernandez (512)
Sharyland and TNMP – Adrian Marquez (512)
Nueces – Theresa Werkheiser (512)

33 Interval Data File Delivery – CRs
- Only cuts that pass internal and external validation are delivered
- Current data covers the period from 10/01/04 to approximately 12/31/04
- No totalized cells are included in the current data
- Both CSV and LSE formatted files are delivered to CR mailboxes
- An environment file is delivered that links the encrypted Recorder_Ids (used in the interval data files) to the sample point cell specified by Profile Type, Weather Zone, Stratum, and Service Voltage
- All three files have been prepended with the CR DUNS+4 and the delivery date/time

34 Interval Data File Delivery – CRs
- As a prerequisite for receiving LRS files from ERCOT, each CR will have downloaded the current FTP Replacement client with an updated digital certificate
- Use the FTP Replacement protocol (ReceiveALL) to download files
- Call the ERCOT Help Desk if you have problems: (512)

35 Interval Data File Delivery – CRs
CSV Format
Normal Day (4 placeholder commas at end of record):
AEFG4567, ,96,0.21,0.2,0.19,0.18,0.17,0.17,0.16,0.16,~~~~~,0.19,0.31,0.3,0.29,0.28,0.27,0.26,0.25,,,,
Spring DST Day (8 placeholder commas at end of record):
ABCD12345, ,92,0.22,0.22,0.21,0.2,0.2,0.19,0.19,0.18,~~~~~,0.29,0.29,0.28,0.27,,,,,,,,
Fall DST Day (0 placeholder commas at end of record):
AKLM4567, ,100,0.22,0.21,0.2,0.2,0.19,0.19,0.18,0.18,~~~~~,0.31,0.31,0.3,0.29,0.28,0.27,0.25,0.24,0.24,0.23,0.24,0.23
First Day of Cut (‘n’ placeholder commas at beginning of record based on cut start time):
ANOP1234, ,96,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.13,0.2,0.2,0.2,~~~~~,0.37,0.36,0.36,0.35,0.33,0.32,0.31,0.3,,,,
Last Day of Cut (‘n’ placeholder commas at end of record based on cut stop time):
AQRS4567, ,96,0.21,0.21,0.2,0.2,0.19,0.19,0.18,0.18,~~~~~,0.06,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,

36 Interval Data File Delivery – CRs
Sample of Actual Data – CSV Format:
,100104,96,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,0.06,0.695,0.145,0.17,0.13,0.12,0.12,0.1,0.18,0.17,0.18,0.77,0.07,0.855,0.13,0.955,0.885,0.445,1.035,1.23,0.685,0.97,1.63,0.95,2.135,2.19,1.855,1.8,2.16,1.58,2.165,1.38,1.895,1.265,1.86,1.175,1.86,1.27,1.575,1.69,1.205,1.825,1.09,1.11,1.735,1.15,1.03,1.36,1.315,0.975,1.135,1.575,0.985,0.86,1.355,1.235,0.675,,,,
,102904,96,1.025,0.83,0.595,0.84,0.95,0.685,0.38,0.84,0.26,0.67,0.74,0.23,0.785,0.24,0.855,0.265,0.815,0.235,0.8,0.22,0.76,0.2,0.82,0.24,0.845,0.33,0.62,0.34,0.43,0.765,0.6,1.275,2.055,0.895,0.875,0.275,0.85,0.53,0.615,0.96,0.33,1.005,0.785,0.495,0.905,0.91,0.41,0.945,1.105,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
* Each row contains data for a 24-hour day (except DST days)
* Start and end interval times are positional
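A hedged sketch of reading one such row, assuming the layout shown above (record identifier, date, interval count, then positional interval values padded with placeholder commas); the helper name and returned structure are illustrative.

```python
import csv

def parse_csv_interval_row(line):
    """Split one CSV interval record into identifier, date, count, and values."""
    fields = next(csv.reader([line]))
    record_id, read_date, interval_count = fields[0], fields[1], int(fields[2])
    # Empty fields are positional placeholders (before the first interval of a cut,
    # after the last interval, and on DST days).
    values = [float(field) if field.strip() else None for field in fields[3:]]
    return {"record_id": record_id, "date": read_date,
            "interval_count": interval_count, "values": values}
```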

37 Interval Data File Delivery – CRs
.LSE file format
- The .LSE file format is also a .CSV format
- .LSE file format documentation will be sent if requested
Sample of Actual Data – LSE Format:
,154357,1, , ,Y,Y ,1517.0,3903.1,1.0,0.0,0.005,0.0,900,1,0,-1,,, , , ,M , ,,, ,,, ,,, ,,, ,,, ,,,

38 Action Plan
Based on the analysis start date of October 1, 2004, we are closing in on 1 year of data.
Immediate tasks:
- Continue to build and document processes
- Load replacement cuts
- Build LodeStar CNTL files to totalize 100% sampled cells
- Build CNTL files to move validated cuts to ELDB
- Build analysis CNTL and ENV files
Next steps:
- Current profile model assessments by Profile Type and Weather Zone
- Sample design and sample point selection for Round 2
- Build new profile models

39 Action Plan
- Early December: Schedule a conference call to discuss ongoing data validation and delivery issues
- End of 2005: Resolve all data issues covering 1 year of data
- 03/31/06: Analysis of current profile models complete
- 04/30/06: Sample Point selection for Round 2 complete
- 06/30/06: Complete profile model building
This Action Plan is contingent upon ERCOT and the TDSPs resolving data validation issues by year end.