Meteorological Program Self Assessment Presented at NUMUG San Francisco, CA 2009.


1 Meteorological Program Self Assessment Presented at NUMUG San Francisco, CA 2009

2 Columbia 10m Wind Vanes

3 Columbia 10m Temp Probes

4 75m Instruments

5 Note the unusually high frequency of 'A' stability class in 2001. This is impossible in winter months.

6 High frequency of 'A' stability class in 2005 & 2006. This is possible but unlikely.
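At most nuclear met towers, stability class is derived from the vertical temperature difference (delta-T) between two tower levels, which is why a temperature probe problem shows up as a skewed stability distribution. A minimal sketch of the delta-T/delta-z classification (category boundaries are the standard NRC RG 1.23 values in °C per 100 m; the function name is illustrative):

```python
def stability_class(delta_t_c, delta_z_m):
    """Pasquill stability class (A-G) from a tower temperature
    difference, using the NRC delta-T method (RG 1.23).
    delta_t_c: upper minus lower temperature, degrees C
    delta_z_m: vertical separation of the probes, m
    """
    # Normalize to degrees C per 100 m of height
    grad = delta_t_c / delta_z_m * 100.0
    # Standard NRC category upper bounds (degrees C / 100 m)
    bounds = [(-1.9, 'A'), (-1.7, 'B'), (-1.5, 'C'),
              (-0.5, 'D'), (1.5, 'E'), (4.0, 'F')]
    for upper, cls in bounds:
        if grad <= upper:
            return cls
    return 'G'  # strong inversion
```

With this method, a positive bias in the upper probe pushes hours toward stable classes, while a negative bias pushes hours toward 'A', which is how a probe output shift can produce physically impossible winter 'A' frequencies.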

7 Temperature Probe Output Shift – Lightning Strike

8 Delta T Shift

9 24 Hours – Sunny vs Cloudy

10 Self Assessment Initiation
History of problems with met tower data
–NRC Submittal – 0% Recovery in 2001
–Prep for Submittal – 0% Recovery in 2005
–Answers to peers' questions LTA
–List of issues growing
Initiated Self Assessment
–Team makeup
–In-depth look – 29 checklist questions
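The 0% recovery figures refer to joint data recovery: the fraction of hours in which every channel needed for dispersion analysis is simultaneously valid (RG 1.23 recommends at least 90%). A minimal sketch of that computation, assuming hourly records as dictionaries with illustrative channel keys:

```python
def joint_recovery(records, required=('ws', 'wd', 'dt')):
    """Fraction of hourly records in which every required channel
    is present and valid (None marks a bad/missing value).
    Channel keys (wind speed, wind direction, delta-T) are
    illustrative, not a site-specific naming convention.
    """
    if not records:
        return 0.0
    good = sum(1 for r in records
               if all(r.get(ch) is not None for ch in required))
    return good / len(records)
```

Because the metric is joint, a single chronically bad channel (e.g. a shifted delta-T probe invalidating stability class) drives recovery to 0% even when the other instruments performed well, which matches the pattern described above.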

11 Organizations Affected
Reactor Engineering (responsible for accident analysis using met tower data)
Licensing (met tower regulatory commitments and LBD management)
EP (interested party for real-time met data and offsite dose software applications)
Environmental Services (responsible for environmental monitoring based on met tower data and data trending)
Chemistry (responsible for a) effluent monitor setpoint calculations and b) routine and abnormal release dose calculations, both based on real-time and historic met tower data)
System Engineering (responsible for instrument performance)
Work Planning and Scheduling (responsible for repair of met tower instruments)
IT (responsible for met tower data software and computer hardware support)
I&C (responsible for surveillances (calibrations))
Maintenance (responsible for preventive & corrective maintenance)

12 Checklist Questions
Desert Sigmas, Heat Emission Rates, Reference Bases, Data Recovery, Annual Rainfall, Diffusivity, Rain Collection, Signal Trending, Data Screening, Wind Frequency Rejection, Instrument Separation, Sigma Theta, Guy Wires, As Found Calibration Distribution, Diffusivity, Instrument Cables, Solar Instrument Interference Zones, Dew Point, Solar Shields, Training, Data Trending Remediation, Tent Interference, ISFSI Heat Effect, Terrain Effects, Advantages of ANSI 3.11-2005, Corrective Action Effectiveness, Met Tower Walkdown

13 Self Assessment Findings
Accuracy Commitment
TS/LCS/ODCM/FSAR
Calibration Methodology
Trending
Training
Tower Design/Climbing
Work Request Priority
LTA Consensus of Problems

14 Self Assessment Findings – Accuracy Commitment
Where conflicts exist between recommendations of
–RG 1.97 Rev 2 and
–Safety (Regulatory) Guide 1.23 Rev 0 (1972),
CGS complies with RG 1.97
0.2°C vs 0.1°C for Delta T
Temperature sensor improvement:
–Current sensor accuracy = ±0.30°C
–Young sensor accuracy = ±0.10°C
–MetOne sensor accuracy = ±0.05°C
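The accuracy gap is starker for the delta-T channel than the single-probe numbers suggest, because two probe errors combine. One common way to see this, assuming independent probe errors combined root-sum-square (matched and co-calibrated probe pairs can do better than this bound):

```python
from math import sqrt

def delta_t_uncertainty(acc_upper, acc_lower):
    """Root-sum-square combination of two independent probe
    accuracies (degrees C) into a delta-T channel accuracy."""
    return sqrt(acc_upper**2 + acc_lower**2)
```

Under that assumption, a pair of current ±0.30°C sensors gives a delta-T accuracy of about ±0.42°C, well outside a 0.1°C delta-T goal, while a pair of ±0.05°C MetOne sensors gives about ±0.07°C.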

15 Accuracy Comparisons

16 Threshold for Concern

17 Self Assessment Findings – TS/LCS/ODCM/FSAR
No LCO or RFO in TS, LCS, or ODCM for inoperable/non-functional met tower instruments
FSAR describes program/surveillances
Channel Checks not performed for one channel

18 Self Assessment Findings – Calibration Methodology
Sections of loop not included in calibration
–Lack of Cal Lab sensor data with loop test documentation
–Lack of line integrity test of tower cables
–Lack of validation of computer signal
Lack of aspirator inspection
As Found OOT not trended
Process slow or interrupted
Calibration by WO instructions

19 Self Assessment Findings – Trending
Frequency of instrument trends LTA
Depth of trending LTA
–Current procedure LTA for untrained personnel and LTA to identify problems and ensure timely corrective action
–Lacks sufficient action threshold criteria and cross-organization approval of criteria chosen
–Does not trend all met tower signals and correct all channels, not just FSAR channels
–Does not compare signals to nearby towers
–Limited methods
–Interference zones not identified
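A basic trending check of the kind the findings call for is comparing redundant (or nearby-tower) channels against an agreed action threshold. A minimal sketch, assuming paired hourly samples and an illustrative threshold in the channel's own units:

```python
def flag_divergence(primary, backup, threshold):
    """Indices of samples where two redundant channels disagree
    by more than an action threshold. None marks a missing value
    and is skipped rather than flagged."""
    return [i for i, (p, b) in enumerate(zip(primary, backup))
            if p is not None and b is not None
            and abs(p - b) > threshold]
```

Persistent flags on one channel (rather than scattered random flags) indicate the kind of output shift shown in the earlier delta-T slides, and give a defensible basis for raising work priority before the next calibration proves the instrument OOT.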

20 Self Assessment Findings – Training
Turnover of personnel
Background of personnel
Time/schedule for training
Funding
Many groups need training
–Right hand not knowing the left hand's needs

21 Self Assessment Findings – Work Request Priority
With no LCO or RFO, work priority has been low
System Engineers responsible for both nuclear safety-related systems and met tower instruments must balance limited resources

22 Self Assessment Findings – LTA Consensus of Problems
Engineering personnel are not trained in meteorology
Belief that a system is not OOT until proven by calibration

23 Self Assessment Findings – Tower Design
Temperature probes on opposite sides of tower
Wind vane design generating wind shadow
Annual rainfall runs November to November
Torque guy wire system makes instrument elevator design difficult
–Hesitancy to increase frequency of calibrations
Tower climbing
–Windy, icy conditions
–Maintenance delays (6–7 months)

24 Self Assessment Findings – Others
Input parameters in XOQDOQ
–Heat Emission Rate
–Terrain Height
–Exit Direction
XOQDOQ
–Desert Sigma
Data Validation LTA
JFD creation software not robust enough to defend low work prioritization

25 Self Assessment Results
13 Condition Reports
–Includes Licensing Bases Document and implementing procedure revisions to ensure compliance with commitments
18 Action Request Evaluations to
–Consider design changes to improve accuracy and reduce bias of met tower signals
–Consider software changes to process and validate met tower data prior to use in LBD submittals or ODCM compliance assessments
–Ensure knowledge retention
–Validate input values to dose assessment calculations

26 End Presentation
Discussion questions:
–How do you convince Engineering or Maintenance that repair is needed?
–How is tower maintenance performed in icy, windy conditions without an elevator?
–Can delta T accuracy requirements of RG 1.23 Rev 1 be met?
–Others?

