1 CM HOW-TO DETAILED DISCUSSION
2 OBJECTIVE
- Upon completion of this training you will demonstrate an understanding of how to perform a CM evaluation.
3 REVIEW QUESTIONS
- Why are both a macro and a micro approach taken when evaluating CM?
- Describe Phase 1 and Phase 2 work at INPO.
- What is the relationship of Appendix 2 to Appendix 1 of the How-To?
- How should Appendices 1 and 2 be used during the evaluation?
- How might the perception of margin differ between engineering and operators?
- What is the main difference in the CM delta assessment sheet?
4 REVIEW QUESTIONS (Cont'd)
- Why are Phases 1 and 2 key to a successful CM evaluation?
- Of what value is the CM questionnaire?
- Why must a CM evaluator always keep in mind the status of the enablers?
- What are some bases for limiting on-site scope?
- Explain the difference between a fundamental and a high-level attribute.
- For what two functional areas must the CM evaluator/analyst determine whether an on-site focus is needed?
5 CM HOW-TO
- CM is a broad area to evaluate.
- Many plant functional departments support CM.
6 APPROACH
- Method is based on three pilots.
- Addressing enablers/attributes:
  - General approach
  - Macro and micro approach
  - 1st and 2nd pilots addressed all attributes
  - 3rd pilot limited work on site (reduced team size); station buy-in on site scope
- Must narrow the scope of site work.
- Must identify issues early for E3 success.
- FOPs are the organizational weaknesses.
- Must make an early call on functional areas.
7 TWO PHASES OF REVIEW/ANALYSIS AT INPO: KEYS TO SUCCESS
- Phase 1
  - Determine focus areas
  - Determine additional specific information needed
- Phase 2
  - Review detailed documents
  - Prepare details for site work
8 PHASE 1
- Review normally requested information
  - Includes the CM questionnaire
  - Determine vertical reviews and potential issues
  - Request specific documents
  - Document examples: ODs, DBDs, mods, AOPs, NOPs, engineering evaluations, 50.59s, calcs, complete condition reports, specs, program documents, etc.
9 PHASE 2
- Review specific documents
- Limit on-site scope
  - Call on functional focus (RE & DE)
  - Enablers/attributes not being pursued on site
  - Focus areas
- Develop preliminary PDSs
- Determine site activities
10 PHASE 2 (Cont'd)
- Vertical slice (potential for new issues) example
- Status the enabler/attribute list
- Evaluation plan 95% complete
- Functional area focus call
11 PILOT EXPERIENCE
- Phases 1 and 2 took 10 man-days.
- The counterpart relationship is key to a successful CM evaluation.
12 OBSERVATIONS
- One observation for each enabler
- Team Manager reports to site management on an enabler basis
- Paints a better CM picture
- CM evaluator owns CM enabler observations
13 COUNTERPARTS
- Counterpart team is part of the evaluation team.
- Continually reinforce the E3 process (counterpart appreciation of CM's effect on other departments).
- Explain precursor AFIs.
- Explain the shift to less focus on examples and more on the whys.
14 HEADS UP
- Expected to provide input to the management model (late first week); key in discussing the whys.
- Mark the OEO bubble chart to indicate strong or weak areas by the 2nd Thursday (do not assess).
- Debrief as a team (focus on the whys; example).
15 MOST USEFUL UPFRONT INFO ON THE 3 PILOTS
- Event info
- System health reports
- CAP database
- DBDs
- NRC reports and response documents
- Vertical slice summary reports
- Engineering evaluations
- Specific calcs
- Specific mods
- Specific temp mods
- ODs
- Specific 50.59s
- Self-assessments
- Specific program documents
- Specific procedures
16 HOW-TO WALKTHROUGH
- Main document (key points)
  - Must narrow the scope of work on site
  - Have observations and preliminary PDSs
  - INPO must status all OEOs to the utility CEO
  - Responsible for functional area calls on RE and DE
  - Analysis at INPO is part of the evaluation
  - Fundamental and high-level attributes
  - Micro and macro approach
  - CM performance problems (example)
17 ATTACHMENT 1, EVALUATION APPROACH
- An attempt to get to specific actions:
  - Vertical slice
  - Specific activities
  - Program reviews
  - Process reviews
  - Passive component reviews
  - Power uprates
  - Operating margins
18 ATTACHMENT 2, ENABLER/ATTRIBUTE LIST
- Cross-reference to information request items and Attachment 1 items
- Suggested actions
- Attribute meaning
- Status during the evaluation
19 ATTACHMENT 3, MARGIN MODEL
- Common terminology for discussions with counterparts
- Based on NSAC/125
- Does not include safety margin terminology
20 ATTACHMENT 4, CM QUESTIONNAIRE
- Analyze for trends and insights
- Pilot results
- Share results with counterparts (opportunity to get counterparts involved)
- Support Services puts the data on an Excel spreadsheet
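Once the questionnaire data is in a spreadsheet, trending it largely means grouping ratings and comparing averages across enablers. A minimal sketch of that tabulation, assuming a simple (enabler, numeric rating) export format and hypothetical enabler names (neither is specified by the How-To):

```python
# Hypothetical sketch: tabulating CM questionnaire responses per enabler.
# The (enabler, rating) row format and the enabler names are assumptions,
# not taken from Attachment 4 itself.
from collections import defaultdict
from statistics import mean

def summarize_responses(rows):
    """Group numeric ratings by enabler and return the mean rating per enabler."""
    by_enabler = defaultdict(list)
    for enabler, rating in rows:
        by_enabler[enabler].append(rating)
    return {e: round(mean(ratings), 2) for e, ratings in by_enabler.items()}

# Example rows, e.g. parsed from the Excel/CSV export mentioned on the slide.
responses = [
    ("Design Control", 4), ("Design Control", 3),
    ("Document Control", 2), ("Document Control", 3),
]
print(summarize_responses(responses))
# → {'Design Control': 3.5, 'Document Control': 2.5}
```

Sorting the resulting means would surface the weakest enablers first, which is one way to pick discussion topics with counterparts.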
21 ATTACHMENT 5, INFORMATION REQUEST
- A lot of information to look at during Phase 1
22 ATTACHMENT 6, CM PROGRAMS
- Pilot evaluations were based on margin management.
- Going forward, the focus is on specific programs (listed in Attachment 1).
- Going forward, evaluate programmatic aspects also.
- Evaluate programs identified in Attachment 1 (depth of review should be based on indications of weaknesses).
- Other programs identified in Attachment 6 should be evaluated based on review/analysis of plant information.
23 ATTACHMENT 7, CM ASSESSMENT
- Process similar to the existing process
- Evaluator provides information to the group
- Delta sheet to contain a narrative for each enabler (see Attachment 9)
24 ATTACHMENT 8, ASSESSMENT CRITERIA
- Latent weaknesses that are not affecting plant performance (1 and 2)
- CM weaknesses that are affecting plant performance (3 through 5)
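The two bands above can be sketched as a simple lookup, assuming the parenthesized numbers are assessment ratings on a 1-to-5 scale (an assumption; Attachment 8 itself defines the scale and criteria):

```python
# Hypothetical sketch of the Attachment 8 banding: ratings 1-2 indicate
# latent weaknesses not yet affecting plant performance, ratings 3-5
# indicate CM weaknesses that are affecting it. The 1-5 scale is assumed.
def classify_rating(rating: int) -> str:
    if rating in (1, 2):
        return "latent weakness (not affecting plant performance)"
    if rating in (3, 4, 5):
        return "CM weakness affecting plant performance"
    raise ValueError(f"rating must be 1-5, got {rating}")

print(classify_rating(2))  # → latent weakness (not affecting plant performance)
```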
25 DISCUSSION EXAMPLES
- CNS EQ
- Comanche Peak AFW mod
- Seabrook equipment database
- Comanche Peak power uprate
- CNS CCW
- Comanche Peak fuel