WECC Compliance Committee September 18, 2013 Salt Lake City, UT.

2 AGENDA
I. Welcome and Introductions – Lee Beyer, Chair
   Agenda Review
   Approval of Minutes of June 26, 2013 Meeting
II. Compliance Update and Metrics – Constance White
III. Board Oversight of Compliance Post-Bifurcation – Ruben Arredondo
IV. Compliance Outreach – Laura Scholl
V. WICF Update – Matt Jastram
VI. Update and Return Disclosure Form
VII. Adjourn

Constance B. White Vice President, Compliance WECC Compliance Update Wednesday, September 18, 2013

4 WECC Compliance Report – Index of Slides
1. Index of Slides
2. Audit and Spot Check Reports
3. Violation Review and Validation – Data
4. Violation Review and Validation – Line Chart
5. Mitigation Plan Review – Data
6. Mitigation Plan Review – Line Chart
7. Completed Mitigation Plan Review – Data
8. Completed Mitigation Plan Review – Line Chart
9. Violations Received – CIP (Reporting Method)
10. Violations Received – O&P (Reporting Method)
11. Violations Received (Shown by Percent)
12. Violation Aging
13. Regional Entity Inventory
Appendix (attached): Detailed explanation for each measure

5 Non-public Audit & Spot Check Reports
Measure: Average days to produce an audit report and file it with NERC ("non-public" = violations are not yet processed to finality). Spot Check data includes only reports since 1/1/2011, when reports were first required to be sent to NERC. "Days Outstanding" measures the average days that incomplete reports have been pending.
Goal: Per CMEP, "normally" 60 days for audit reports; 90 days for spot check reports if there are no violations.
Audits (by Audit Year): Complete Reports | Average Days to Completion | Incomplete Reports | Days Outstanding
Spot Checks (by Spot Check Year): Completed Reports | Average Days to Completion | Incomplete Reports | Days Outstanding
[table values not preserved in transcript]
* Year-to-date

6 Violation Review and Validation
Measure: Average days to complete reviews for validated (enforceable) violations reviewed during the referenced period; includes technical review. Upon completion of review, validated violations are sent to Enforcement staff for further processing and final disposition. This data does not include dismissed violations.
Goal: 60 days (internal; no CMEP requirement)
# CIP pending review as of 7/31/2013: 33
# O&P pending review as of 7/31/2013: 22
Violation Review Report as of 7/31/2013, by Quarter/Year, for Critical Infrastructure Protection (CIP) and Operations and Planning (O&P): Reviewed & Validated | Average Days to Review | Pending Review > 90 Days
[quarterly table values not preserved in transcript]
* Quarter-to-date

7 Violation Review and Validation (Line Chart)
Shows information from the preceding slide (slide 3)
Goal: 60 days (internal; no CMEP requirement)
# pending review as of 7/31/2013: 55
# pending review > 90 days as of 7/31/2013: 25
* Quarter-to-date

8 Mitigation Plan Review
Measure: Average days to complete reviews for accepted Mitigation Plans reviewed during the referenced period. Does not include rejected MPs or MPs with dismissed violations.
Goal: 60 days (internal. CMEP: the Region has 30 days to accept, reject, or extend the review. If an MP is not reviewed within 30 days, WECC extends the review within the 30-day window.)
# CIP pending review as of 7/31/2013: 29
# O&P pending review as of 7/31/2013: 13
Mitigation Plan Review Report as of 7/31/2013, by Quarter/Year, for CIP and O&P: Reviewed & Accepted | Average Days to Review | Pending Review > 90 Days
[quarterly table values not preserved in transcript]
* Quarter-to-date

9 Mitigation Plan Review (Line Chart)
Shows information from the preceding slide (slide 5)
Goal: 60 days (internal. CMEP: the Region has 30 days to accept, reject, or extend the review. WECC extends the review within the 30-day window, if necessary.)
# pending review as of 7/31/2013: 42
# pending review > 90 days as of 7/31/2013: 7
* Quarter-to-date

10 Completed Mitigation Plan Review
Measure: Average days to complete reviews for accepted Completed Mitigation Plans reviewed during the referenced period. Does not include rejected CMPs or CMPs with dismissed violations.
Goal: 60 days (internal; no CMEP requirement)
# CIP pending review as of 7/31/2013: 86
# O&P pending review as of 7/31/2013: 6
Completed Mitigation Plan Review Report as of 7/31/2013, by Quarter/Year, for CIP and O&P: Reviewed & Accepted | Average Days to Review | Pending Review > 90 Days
[quarterly table values not preserved in transcript]
* Quarter-to-date

11 Completed Mitigation Plan Review (Line Chart)
Shows information from the preceding slide (slide 7)
Goal: 60 days (internal; no CMEP requirement)
# pending review as of 7/31/2013: 92
# pending review > 90 days as of 7/31/2013: 34
* Quarter-to-date

12 Violations Received – CIP (Reporting Method)
Measure: Enforceable violations (reviewed violations that are not dismissed), shown by source (CMEP monitoring method)
* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation, and Complaint
** Year-to-date

13 Violations Received – O&P (Reporting Method)
Measure: Enforceable violations (reviewed violations that are not dismissed), shown by source (CMEP monitoring method)
* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation, and Complaint
** Year-to-date

14 Violations Received (Shown by Percent)
Measure: Enforceable violations (reviewed violations that are not dismissed), with each source expressed as a percentage of enforceable violations
* "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation, and Complaint
** Year-to-date

15 Violation Aging
Excludes Federal Entity violations on hold.

16 Regional Entity Inventory
Source: NERC Enforcement Metrics as of 6/30/2013. These new metrics are based on violations processed in the first six months of 2013. Violations that are held by appeal, a regulator, or a court are excluded from the computation of these metrics.
Regions shown (FRCC, MRO, NPCC, RFC, SERC, SPP, TRE, WECC): Regional Inventory | Average Months in Regional Inventory | Regional Caseload Index
[table values not preserved in transcript]

17 Detailed Explanation of Measures Appendix to Compliance Report

18 Slide 2 Explanation: Audit and Spot Check Reports
Measure: This data chart measures the average number of days it takes the WECC audit team to produce an audit report following an audit and file it with NERC. It also measures the number of completed reports, the number of incomplete reports, and the total number of days incomplete reports have been outstanding.
Measurement Period: One year. (Four years of audit reports and three years of spot check reports are shown.)
Purpose: WECC measures this to gauge our adherence to the times suggested by the CMEP, and to gauge the efficiency of the audit and audit-support staff and tools.
Terms: "Non-public" refers to final audit reports sent to NERC. Audit reports are not posted publicly until all due process relating to disposing of any violations is complete, i.e., a Notice of Penalty is approved by FERC. "Days outstanding" measures the average number of days that the incomplete reports have been pending.
Goal: 60 days for audit reports; 90 days for spot check reports. Per the Compliance Monitoring and Enforcement Program (CMEP), "normally" audit reports would be filed with NERC 60 days following conclusion of the audit. The CMEP indicates that spot check reports should be filed within 90 days if there are no violations.
Notes: Spot Check data includes only reports since 1/1/2011, when reports were first required to be sent to NERC.
Appendix - Explanation of Measures

19 Slides 3-4 Explanation: Violation Review and Validation Report (Data and Line Charts)
Measure: These charts measure the average number of days staff (generally SMEs, Subject Matter Experts) take to complete reviews of all NPVs (New Possible Violations) that are validated and transferred to WECC Enforcement staff for processing and disposition. They also measure the number of violations reviewed and validated and the number of violations pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P violations currently pending review.
Measurement Period: Quarter (the six most recent quarters are shown)
Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and of the efficiency of both staff and processes.
Terms: "NPV" is New Possible Violation. NPVs are reported and entered into the tracking system within five days after the initial indication of a violation. "CMEP" is the FERC-approved Compliance Monitoring and Enforcement Program.
Goal: 60 days from entry of the NPV to the date it is either dismissed or referred for further processing by WECC Enforcement staff. This is an internal goal; the CMEP specifies no goals or requirements.
Notes: NPVs are reviewed by an appropriate WECC SME, who may review the available record, request further information from the entity, or issue data requests. At that point, the NPV is either (1) dismissed (meaning that upon review the SME determined that no violation in fact exists) or (2) "validated," becoming an "Alleged Violation" that is forwarded to WECC Enforcement staff for appropriate disposition (for example, through FFT (Find, Fix and Track), an Expedited Settlement Agreement, a Notice of Penalty, or a Spreadsheet NOP). This measure currently does not include violations that are dismissed. The dismissal data in WECC's new webCDMS system includes a number of out-of-process outliers, which can skew the average-days calculation. WECC is working with the webCDMS vendor to add the ability to flag these outliers for exclusion.
Appendix - Explanation of Measures

20 Slides 5-6 Explanation: Mitigation Plan Review (Data and Line Charts)
Measure: These charts measure the average number of days staff (generally SMEs, Subject Matter Experts) take to complete reviews of Mitigation Plans (MPs) accepted during the referenced period. They also measure the number of MPs reviewed and accepted and the number of MPs pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P MPs currently pending review. These measures do not include MPs that were rejected after review or that pertain to violations that ultimately were dismissed.
Measurement Period: Quarter (the six most recent quarters are shown)
Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and of the efficiency of both staff and processes.
Terms: "MP" is Mitigation Plan. Entities are required to file these at the requirement (rather than Standard) level; thus, for example, violations of two different requirements of a single Standard would result in two MPs. "CMEP" is the FERC-approved Compliance Monitoring and Enforcement Program.
Goal: 60 days from submittal of the MP to the date it is either accepted or rejected. This is an internal goal; the CMEP specifies that the Region has 30 days to accept, reject, or extend the review period. WECC extends the review period within the 30-day window, if necessary.
Notes: WECC believes that 30 days for review is not a realistic goal. Under the CMEP, entities are not required to file MPs until after issuance of the NOAV (Notice of Alleged Violation), if not contested. Yet entities are encouraged to file MPs as quickly as possible once they believe they are in violation, and it is not unusual for an entity to file a Self-Report, or certify non-compliance, simultaneously (or nearly so) with a corresponding MP. The dilemma is that WECC cannot assess the sufficiency of the MP until it understands the violation by performing its review (initial validation of the New Possible Violation, then Enforcement staff's assessment of the scope and risk of the Alleged Violation). The CMEP contains no measures for these violation-review activities, but only after performing them can staff assess whether the MP can be accepted. All of these activities easily can take longer than 30 days. Figures also include the federal cases on hold (pending legal resolution); in these cases MPs are not required, but entities have been submitting them voluntarily.
Appendix - Explanation of Measures

21 Slides 7-8 Explanation: Completed Mitigation Plan Review (Data and Line Charts)
Measure: These charts measure the average number of days staff (generally SMEs, Subject Matter Experts) take to complete reviews of Completed Mitigation Plans (CMPs) accepted during the referenced period. They also measure the number of CMPs reviewed and accepted and the number of CMPs pending review > 90 days at the end of the measurement period. Also noted are the total numbers of CIP and O&P CMPs currently pending review. These measures do not include CMPs that were rejected after review or that pertain to violations that ultimately were dismissed.
Measurement Period: Quarter (the six most recent quarters are shown)
Purpose: WECC measures this as a gauge of staffing sufficiency given workload, and of the efficiency of both staff and processes.
Terms: "CMP" is Completed Mitigation Plan, which includes a Certification of Mitigation Plan Completion and supporting evidence. "CMEP" is the FERC-approved Compliance Monitoring and Enforcement Program.
Goal: 60 days from submittal of the CMP to the date it is either accepted or rejected. This is an internal goal; the CMEP specifies no goals or requirements.
Notes: CMP reviews are generally more involved and time-consuming than MP reviews. In addition to a Certification of Mitigation Plan Completion document, an entity must submit evidence demonstrating that the mitigating activities outlined in a particular MP are complete. These reviews require WECC SMEs not only to review the evidence provided by entities but often also to issue data requests to obtain the information necessary to verify MP completion. Figures also include the federal cases on hold (pending legal resolution); in these cases MPs are not required, but entities have been submitting them voluntarily.
Appendix - Explanation of Measures

22 Slides 9-11 Explanation: Enforceable Violations and Enforceable Violations by Percent
Measure: These charts measure the number of CIP and O&P enforceable violations by reporting method (Self-Report, Self-Certification, Audit, and Other), and show the breakdown by percent as well.
Measurement Period: One year (the four most recent years are shown)
Purpose: This does not measure performance or efficiency. It is used to track the number of discovered violations relative to the various reporting methods, primarily to assess trends. This information is occasionally requested by stakeholders.
Terms: "Other" includes Spot Check, Exception Report, Periodic Data Submittal, Compliance Investigation, and Complaint. "Enforceable violations" are violations that have been reviewed and validated.
Goal: N/A
Notes: Dismissed violations are not enforceable.
Appendix - Explanation of Measures

23 Slide 12 Explanation: Aging of Violations in Caseload
Measure: This chart measures the number and age of violations that have not been filed with NERC.
Measurement Period: Snapshot as of the specified date.
Purpose: Identifies and tracks older violations. This information is used to help manage violation-processing priority and resources.
Terms: "Caseload" refers to all discovered violations in WECC's inventory that have not been filed with NERC.
Goal: N/A
Notes: Excludes Federal Entity violations on hold.
Appendix - Explanation of Measures

24 Slide 13 Explanation: Regional Entity Inventory
Measure: This is a chart created by NERC showing three separate NERC Enforcement Metrics for the eight Regional Entities:
Regional Inventory – Number of active violations that have not been submitted to NERC for processing and review.
Average Months in Regional Inventory – Average number of months violations have been in the Region's inventory, from discovery to the day the metric is computed.
Regional Caseload Index – Violations in the Regional Inventory divided by the total number of violations filed with NERC over the previous 12 months (NOCVs, SNOPs, FFTs, SAs, Dismissals), multiplied by 12.
Measurement Period: Quarterly (the most recent 12 months are shown)*
Purpose: Regional Inventory – Used in the calculation of the Regional Caseload Index. Average Months in Regional Inventory – Tracks how long violations have been in the Regional Inventory. Regional Caseload Index – Computes the number of months it would take to clear the violations in the Region's inventory at the average monthly processing rate over the preceding 12-month period.
Terms: "NOCV" = Notice of Confirmed Violation. "SNOP" = Spreadsheet Notice of Penalty. "FFT" = Find, Fix, and Track. "SA" = Settlement Agreement.
Goal: N/A
Notes: * These new metrics are based on violations processed in the first six months of 2013. Each subsequent quarter the metrics will be computed with additional months of processing data until a 12-month average is obtained; the 12-month period will roll thereafter. NERC plans to present metrics to the BOTCC with 6, 9, and 12 months of processing data at the August 2013, November 2013, and February 2014 meetings, respectively. Violations that are held by appeal, a regulator, or a court are excluded from the computation of these metrics.
Appendix - Explanation of Measures
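The Regional Caseload Index definition above is simple enough to sketch in code. The sketch below is illustrative only; the function name and the sample figures are the editor's, not actual WECC or NERC data:

```python
# Sketch of NERC's Regional Caseload Index as described in the Slide 13
# explanation: inventory divided by the average monthly processing rate
# over the preceding 12 months. Illustrative figures, not actual data.

def caseload_index(regional_inventory: int, filed_last_12_months: int) -> float:
    """Months needed to clear the current inventory at the average
    monthly rate of filings with NERC over the preceding 12 months."""
    monthly_rate = filed_last_12_months / 12  # filings per month
    return regional_inventory / monthly_rate

# Example: 120 active violations in inventory, 240 filed with NERC in the
# past year (NOCVs, SNOPs, FFTs, SAs, and dismissals combined):
print(caseload_index(120, 240))  # 6.0 -> six months of backlog
```

At the hypothetical processing rate of 20 filings per month, a 120-violation inventory would take six months to clear, which matches the metric's stated interpretation.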

Constance B. White Vice President, Compliance WECC

Ruben Arredondo Senior Legal Counsel Board Oversight of Compliance Post-Bifurcation WECC Compliance Committee Update September 18, 2013

Laura Scholl Managing Director – Stakeholder Outreach WECC Compliance Committee Update September 18, 2013

28 Participation in CUG and CIPUG

29 Attendance
CUG/CIPUG combined attendance up 14 percent for 2012 over the prior year
* Winter & Spring data only; trending increase continuing

30 Survey Feedback from Portland
CUG: 98 percent positive
CIPUG: 98 percent positive

31 CIP 101 – Two-Day Seminar
January 2012 session: over 100 participants, maxed capacity
December 2012 session: sold out, 100+
Next session scheduled September 24-25, 2013 – SOLD OUT!
* Will double capacity for 2014 with an off-site facility

32 Open Webinars
Continue to attract ports per call
Integrating webcam this week

33 Compliance 101 – Webinar
Offered three times a year, just before CUG/CIPUG meetings, as an introduction, overview, and refresher (90 minutes)
o January – Ports
o May 23 – 104 Ports
o October 17 – TBD

34 Collaboration with WICF
Participate in monthly Steering Committee calls
Attend Strategic Planning sessions
Coordinate agendas for WICF and CUG/CIPUG meetings
Provide "heads-up" information to WICF to share via its website

35 Schedule of Outreach Events
CUG/CIPUG – Jan. 2014
CIP v.5 Roadshow – Feb. 5-6, 2014
CIP v.5 Roadshow – March 19-20, 2014
CUG/CIPUG – June 3-5, 2014
CIP 101 – Sept. 24-25, 2013
CUG/CIPUG – Oct. 2013
Open Webinars – Third Thursdays
Compliance 101 – Prior to each CUG
BES Definitional Change – Spring/Summer

Laura Scholl Managing Director-Stakeholder Outreach Questions?