1 INMM Nuclear Security and Physical Protection Technical Division

2
• What is Effectiveness?
  ○ Definition: producing or capable of producing a desired effect.
• How do we arrive at the desired effect?
  ○ Regulations
  ○ Vulnerability assessments
  ○ Risk assessments
  ○ Programmatic guidance
  ○ Management decisions

3
• What are the expectations of a security system?
  ○ System will work when needed
  ○ System will work to the level (effectiveness) needed
• How can we determine (with confidence) that our security systems will work as needed or required?
  ○ Assessments/inspections
  ○ Operator use and operator failure notifications
  ○ Management oversight
  ○ Testing

4
• Testing provides the most reliable information about a system's effectiveness
• When performed properly, testing:
  ○ Continually challenges the protective system to reasonable levels
  ○ Reports performance information to management and other stakeholders

5 SECURITY CIRCLE
(Diagram: the Safeguards and Security Program at the center, ringed by Audits & Inspections, Site-specific VAs, Program Management, Procedures, Training, Testing & Evaluations, Laws & Codes, and Policies & Directives.)

6 SUSTAINING SECURITY

7
• Testing and data management can be very large and complex
  ○ Test schedules
  ○ Testing priorities
  ○ Corrective action process
  ○ Interfaces to other programs and sources of data
• Performance Assurance Program Plan

8
• Performance Assurance Program
  ○ A management program that provides assurance that protective systems are performing as required
• Program Objectives:
  ○ Develop a method to sustain the safeguards and security program
  ○ Implement a consistent and recurring evaluation process
  ○ Report information to decision makers

9 What is a Performance Assurance Program?
A performance assurance program continually challenges the safeguards and security systems to determine whether those systems are functioning to established standards. The program also provides a systematic process to enhance the protection of assets.
Contrast with performance testing: a test to confirm the ability of an implemented and operating system element, or the total system, to meet an established requirement.

10
• Benefits of a Performance Assurance Program
  ○ Determines the effectiveness of the safeguards and security program elements
  ○ Determines the effectiveness of individual critical protection elements
  ○ Identifies system strengths and weaknesses
  ○ Validates vulnerability analyses
  ○ Validates procedures
  ○ Validates training effectiveness

11
• Benefits of a Performance Assurance Program (cont'd)
  ○ Promotes continuous improvement of protection systems
  ○ Produces data for lifecycle management
  ○ Provides data for financial analysis of continued support/upgrades
  ○ Promotes quality by supporting improvement initiatives
  ○ Integrates MC&A and physical protection

12 Types of Integrated Protection Program Documents
• Safeguards and Security Plan
• VA
• MC&A Plan
• Tactical Response Plan
• Performance Assurance Program Plan
• Regulatory Documents

13 Role of Performance Testing to Validate System Effectiveness
(Diagram: the VA, MC&A Plan, Tactical Response Plan, and Security Procedures implement requirements/criteria; the Performance Assurance Program Plan validates performance against those requirements/criteria.)

14
• How to decide what to test? Define those elements most critical to the protection of assets:
  ○ Components
  ○ Sub-systems
  ○ Systems
• Critical protection element defined: a selected portion of a safeguards and security system that directly affects the ability of the system to perform a required function.

15
• Critical Protection Elements typically fall into one of the following:
  ○ Information security program
  ○ Material control & accountability systems
  ○ Personnel security program and Human Reliability
  ○ Physical protection systems
  ○ Protective forces
  ○ Transportation systems
  ○ Critical infrastructure
  ○ Cyber security program

16 Scope & Critical Protection Elements
(Matrix: the program areas Information Security, Material Control & Accountability, Personnel Security, Physical Protection, Protective Forces, and Transportation Security are each examined across Policy & Procedures, Operations, Hardware & Software, Personnel, and Integration to form the CPE cross section.)

17
• The protection system can be very large and complex. How do you prioritize testing functions?
  ○ Graded safeguards
  ○ Graded protection

18 Graded Safeguards
A system designed to provide varying degrees of physical protection, accountability, and material control consistent with the risks and consequences of threat scenarios. Based upon nuclear material:
• Type
• Physical form
• Chemical or isotopic composition

19 Graded Protection
The level of effort and magnitude of resources expended for a particular safeguards and security interest are commensurate with the interest's importance or the impact of its loss, destruction, or misuse. Considerations include:
• Attractiveness of the assets being protected
• Threats against the assets
• Results of vulnerability assessments

20 Testing Categories
Four categories of performance tests are used:
• Acceptance Testing
  ○ Performed if the element is new to the protection system, or
  ○ If there have been significant changes to the protection system
• Validation
  ○ Repeated testing to derive a level of performance confidence
• Effectiveness
  ○ Reveals how well the protection system works
• Operational (functional)
  ○ Typically answers the question "Does it work or not?" without regard for how well it works.

21 Prioritizing Testing
(Matrix: consequence of failure versus probability of detection, each rated HIGH, MEDIUM, or LOW.)

22 Critical Protection Elements Prioritization: Low Priority
• Test types: Acceptance, Validation, Functional, Effectiveness
• Components: those that protect Category III or lower SNM and classified matter, property, or information

23 Critical Protection Elements Prioritization: Medium Priority
• Test types: Acceptance, Validation, Functional, Effectiveness
• Components: those for which failure reduces protection to an unacceptable level, or with an assigned Pd below 0.75

24 Critical Protection Elements Prioritization: High Priority
• Test types: Acceptance, Validation, Functional, Effectiveness
• Components: those for which failure reduces protection to an unacceptable level, or with an assigned Pd of 0.75 or greater

25 Testing Cycle of a Critical Protection Element

26 A corrective action process is essential for continual improvement:
• Root cause analyses
• Compensatory measures
• Designation of high, medium, or low priority deficiencies
• Performance tests may be required to validate a corrected deficiency

27 Typical Performance Assurance Process Flow
(Flowchart, approximate flow: conduct a VA/tabletop performance test → Deficiency found? If no, retest at the established interval; if yes, implement compensatory measures and validate findings by test → identify potential solutions and analyze them via tabletop → Upgrades needed? If yes, request resources (if the request is supported) or modify the existing system → implement and test the solution → if no deficiency remains and results are acceptable, return to retesting at the established interval.)

28 Collecting and Analyzing Performance Test Data

29 Data Collection
Are there conditions that will affect the results?
• Shift-to-shift differences
• Environmental conditions
  ○ Day/night
  ○ Rainy/dry
• Differences in security devices
Testing should encompass all conditions that may affect the test outcomes.

30 Statistical Analysis
What do I do with the data collected during performance testing? The collected data tells a story about the security component/system.

31 Statistical Analysis
Many statistical tools are available for managing and tracking performance-program data.

32 Statistical Analysis: Pareto Analysis
• Ranks the reasons that contribute to the problem
• Typically, relatively few reasons are responsible for the majority of the problems
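
A minimal Pareto sketch in Python, using invented failure causes purely for illustration: count each cause, rank by frequency, and report the cumulative share.

```python
from collections import Counter

# Hypothetical causes logged for failed performance tests (illustrative only).
causes = ["procedure not followed", "sensor misaligned", "procedure not followed",
          "battery failure", "procedure not followed", "sensor misaligned",
          "wiring fault", "procedure not followed"]

ranked = Counter(causes).most_common()      # causes ranked by frequency
total = sum(n for _, n in ranked)
cumulative = 0
for cause, n in ranked:
    cumulative += n
    print(f"{cause:25s} {n:2d}  {100 * cumulative / total:5.1f}% cumulative")
```

In this made-up log, a single cause accounts for half of all failures, which is exactly the pattern Pareto analysis is designed to expose.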

33 Statistical Analysis: Regression Analysis
• Determines the parameter values of a function that best fit a provided set of data observations
• For example: do day/night conditions and building contribute to whether a motion sensor alarms?
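
A sketch of the day/night-and-building question as a logistic regression (scikit-learn, with fabricated observations; the deck does not prescribe a particular tool):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one test: [tested at night?, tested in Building 2?];
# y = 1 if the motion sensor alarmed. All values are made up for illustration.
X = np.array([[0, 0], [0, 0], [1, 0], [1, 0], [0, 1], [0, 1],
              [1, 1], [1, 1], [0, 0], [1, 0], [0, 1], [1, 1]])
y = np.array([1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
print("fitted coefficients (night, Building 2):", model.coef_[0])
print("P(alarm | day, Building 1):", model.predict_proba([[0, 0]])[0, 1])
```

A strongly negative coefficient for a factor suggests that factor reduces the probability of an alarm and deserves investigation.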

34 Statistical Analysis: Control Charts
• Used to distinguish between normal and unusual variations in a system
• Some examples:
  ○ Physical protection: monitoring the response of a walk-through metal detector
  ○ Material control: assuring that scales used to weigh material are accurate

35 Statistical Analysis: Experimental Design / Analysis of Variance
• A planned approach to determine what factors affect a question of interest
• A well-designed experiment will provide the most information with the fewest possible tests conducted
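
A sketch of a small designed experiment under assumed factors (day/night and building, echoing the regression example): enumerate every factor combination, replicate it, and randomize the run order.

```python
import itertools
import random

conditions = ["day", "night"]
buildings = ["B1", "B2", "B3"]
replicates = 2  # each factor combination is tested twice

# Full factorial design: every condition paired with every building.
runs = list(itertools.product(conditions, buildings)) * replicates
random.shuffle(runs)  # randomization guards against order-of-test bias

for i, (condition, building) in enumerate(runs, 1):
    print(f"run {i:2d}: {condition:5s} in {building}")
```

Twelve runs cover every combination twice, illustrating the "most information from the fewest tests" idea on this slide.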

36 Statistical Analysis
These statistical tools:
• Depend upon the specific test objectives and the type of data collected
• Identify the factors that significantly affect test results
• Determine how many tests to conduct
• Assure randomized test runs
• Allow proper analysis of test results

37 Statistical Analysis Example
Requirement: test motion sensor.
(Process flow: requirement identified in the Performance Assurance Plan → agreement negotiated with Operations → Operations provided with the test procedure → Operations trained on the test procedure → Operations conducts the test → test results forwarded to the Performance Assurance Team → the team analyzes results and takes actions.)

38 Function Test of a Motion Sensor
Once a week, each shift supervisor is required to test the motion sensor in his facility to determine its operability. The results of these tests are reported to the Performance Program group monthly.

39 Motion Sensor Function Test
Procedure: OP-123 Rev. C | Building: Building 1 | Model: XYZ | Serial Number: 092003-23

Date     Time  Shift  Supervisor  Test Result
7/1/05   0800  M      Bill        Alarm
7/1/05   1600  A      John        Alarm
7/8/05   0900  M      Bill        Alarm
7/8/05   1700  A      John        No Alarm

40 The Performance Program group uses a p-chart (control chart) to monitor the results of the motion sensor tests.
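
A sketch of the p-chart limits with made-up monthly data; eight tests per month is an assumption based on the two-shift weekly schedule on slide 38, and the deck's raw counts are not shown.

```python
import math

n = 8  # tests per month: 2 shifts x ~4 weeks (assumed from slide 38)
alarms = [3, 3, 8, 7, 8, 8, 6, 8, 7, 8]  # illustrative monthly alarm counts

p_bar = sum(alarms) / (n * len(alarms))     # overall proportion alarming
sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial standard error
ucl = min(1.0, p_bar + 3 * sigma)           # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)           # lower control limit

for month, a in enumerate(alarms, 1):
    p = a / n
    flag = "  <-- below LCL, investigate" if p < lcl else ""
    print(f"month {month:2d}: p = {p:.2f}  (LCL {lcl:.2f}, UCL {ucl:.2f}){flag}")
```

The two flagged months in this fabricated series mirror the January/February investigation described on slide 42.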

41 (p-chart of the monthly motion sensor test results)

42
• The January and February 2005 data points were investigated because they were below the lower control limit
• The investigation revealed that a new supervisor was working in Building 1 and he was not following the motion sensor test procedure
• The supervisor was trained and began performing the test correctly

43 Statistical Analysis
What data collection tools will be used?
• Manual
• Manual/automated
• Automated
Are my test observers and data collectors properly trained and consistent? Is the test progressing as planned? If not, are changes being documented?

44 System Performance Confidence Levels
Performance tests are used to validate that a security component or system is performing at a desired level. How is the "desired" level selected?
• Vulnerability assessment (VA) risk
• Regulatory documents
• Management decisions

45 System Performance Confidence Levels
Ultimately, the desired levels are management decisions. Management must determine if there are adequate resources to meet the desired levels.

46 System Performance Confidence Levels
Suppose the VA risk levels are selected as the basis for conducting performance tests. An example:

VA Risk Level  Confidence Level  Margin of Error
High           95%               3%
Medium         90%               10%
Low            80%               20%

A high confidence level and a small margin of error increase the sample size.

47 System Performance Confidence Levels
For example:
• The VA requires the Pd of motion sensors in vaults to be 0.95
• Management has decided that they would like to be 90% confident in the test data that is produced
• Management has also decided that the acceptable margin of error can be no more than 5%
Given this situation, how many motion sensors should be tested?

48 System Performance Confidence Levels
To be 90% confident that the true probability of a motion sensor activating is 0.95 +/- 0.05, 51 motion sensors should be tested.
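
A worked version of this calculation in Python (standard library only; the Bernoulli-trial sample-size formula itself appears on slide 56):

```python
from statistics import NormalDist

def bernoulli_sample_size(confidence: float, margin: float, p: float) -> float:
    """n = z_{alpha/2}^2 * p * (1 - p) / E^2 for estimating a proportion."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided critical value
    return z ** 2 * p * (1 - p) / margin ** 2

n = bernoulli_sample_size(confidence=0.90, margin=0.05, p=0.95)
print(f"{n:.1f} -> test {round(n)} motion sensors")  # 51.4 -> 51, as on the slide
```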

49 System Performance Confidence Levels
One of the most challenging aspects of conducting a performance test is determining how many tests must be conducted to characterize the performance of a system with some level of confidence. Performing one or two tests is not sufficient, and conducting thousands of tests is probably not feasible.

50 Statistics and the Performance Program

Security Component   B1   B2   B3   B4   B5   B6   B7   B8   B9  B10  Total
Door Alarms          38   25   13   38   25   13   38   25   13   25    250
Motion Sensors       25    6    0   19   19   13   13    6    0   25    125
TIDs                450   75    0    0  225  300  225    0   75  150   1500
Cameras               5    5    5    5    5    5    5    5    5    5     50

51 Statistics and the Performance Program: Normally Distributed Data
• The number of tests to be conducted is determined prior to sampling
• Two possible decisions:
  ○ Reject the hypothesis
  ○ Fail to reject the hypothesis
• Parameters needed:
  ○ Standard deviation of the population of interest
  ○ Margin of error
  ○ Confidence level

52 Statistics and the Performance Program: Normal Distribution

n = \left(\frac{z_{\alpha/2}\,\sigma}{E}\right)^2

where z_{\alpha/2} is the value from the standard normal distribution table, E is the margin of error, and \sigma is the standard deviation of the population of interest.
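
A sketch of the same calculation in code, with assumed example numbers (an MC&A scale whose weighings have a known standard deviation):

```python
import math
from statistics import NormalDist

def normal_sample_size(confidence: float, margin: float, sigma: float) -> int:
    """n = (z_{alpha/2} * sigma / E)^2, rounded up."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil((z * sigma / margin) ** 2)

# Assumed example: verify a scale to +/- 0.1 g at 95% confidence,
# given weighings with a standard deviation of 0.2 g.
print(normal_sample_size(confidence=0.95, margin=0.1, sigma=0.2))  # 16 checks
```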

53 Statistics and the Performance Program: Sequential Sampling
• The number of tests to be conducted is undecided and is determined only by the sample observations as they are completed
• Three possible decisions:
  ○ Reject the hypothesis
  ○ Fail to reject the hypothesis
  ○ Keep sampling, or conclude that a decision cannot be made
• Parameters needed:
  ○ The performance value
  ○ The minimally acceptable lower value
  ○ Confidence level

54 Sequential Testing Decision Map
(Chart: a vertical axis from 0 to 8 plotted against the number of tests from 0 to 20, with the plane divided into Stop/Fix, Inconclusive, and Validate regions.)
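
One standard construction behind a map like this is Wald's sequential probability ratio test; whether the original chart used exactly this test is an assumption. A minimal sketch using the three parameters named on slide 53, with illustrative values:

```python
import math

p1 = 0.95            # the performance value (target Pd); assumed for illustration
p0 = 0.75            # the minimally acceptable lower value; assumed
alpha = beta = 0.10  # error rates matching a 90% confidence choice; assumed

upper = math.log((1 - beta) / alpha)  # crossing above -> Validate
lower = math.log(beta / (1 - alpha))  # crossing below -> Stop/Fix

def decide(outcomes):
    """outcomes: 1 = detected, 0 = missed, in the order the tests were run."""
    llr = 0.0  # cumulative log-likelihood ratio
    for n, detected in enumerate(outcomes, 1):
        llr += math.log(p1 / p0) if detected else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return f"Validate after {n} tests"
        if llr <= lower:
            return f"Stop/Fix after {n} tests"
    return "Inconclusive: keep sampling"

print(decide([1] * 10))          # ten straight detections -> Validate
print(decide([1, 0, 0, 1, 0]))   # early misses -> Stop/Fix after 3 tests
```

The appeal of the sequential approach is visible here: a clearly good or clearly bad element resolves after only a handful of tests.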

55 Statistics and the Performance Program: Bernoulli Trials
• The number of tests to be conducted is predetermined prior to sampling
• Two possible decisions:
  ○ Reject the hypothesis
  ○ Fail to reject the hypothesis
• Parameters needed:
  ○ Proportion of interest (Pd)
  ○ Margin of error
  ○ Confidence level

56 Statistics and the Performance Program: Bernoulli Trials

n = \frac{z_{\alpha/2}^{2}\, p\,(1-p)}{E^{2}}

where z_{\alpha/2} is the value from the standard normal distribution table, E is the margin of error, and p is the proportion of interest.

57 Statistics and the Performance Program

Security Component  Confidence Level  α     z_α/2  Margin of Error  Pd    Sample Size
Door Alarms         95%               0.05  1.96   0.05             0.99  15
Motion Sensors      90%               0.10  1.645  0.05             0.95  51
TIDs                95%               0.05  1.96   0.05             0.95  73
Cameras             80%               0.20  1.282  0.20             0.80  7
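
This table follows directly from the slide-56 formula; a quick check in Python reproduces every sample size. The camera row's Pd is taken here as 0.80, the value consistent with the stated sample size of 7, since that entry is unclear in the source.

```python
from statistics import NormalDist

rows = [  # (component, confidence level, margin of error, Pd)
    ("Door Alarms",    0.95, 0.05, 0.99),
    ("Motion Sensors", 0.90, 0.05, 0.95),
    ("TIDs",           0.95, 0.05, 0.95),
    ("Cameras",        0.80, 0.20, 0.80),  # Pd inferred; see note above
]
for name, conf, e, p in rows:
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    n = z ** 2 * p * (1 - p) / e ** 2
    print(f"{name:15s} n = {n:5.1f} -> {round(n)}")  # 15, 51, 73, 7
```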

58 Statistics and the Performance Program

Security Component   B1   B2   B3   B4   B5   B6   B7   B8   B9  B10  Total  # Samples
Door Alarms          38   25   13   38   25   13   38   25   13   25    250         15
Motion Sensors       25    6    0   19   19   13   13    6    0   25    125         51
TIDs                450   75    0    0  225  300  225    0   75  150   1500         73
Cameras               5    5    5    5    5    5    5    5    5    5     50          7

59 Statistics and the Performance Program

                B1   B2   B3   B4   B5   B6   B7   B8   B9  B10  Total
Door Alarms     38   25   13   38   25   13   38   25   13   25    250
% of Total     15%  10%   5%  15%  10%   5%  15%  10%   5%  10%
# Samples        2    2    1    2    2    1    2    2    1    2     15

Motion Sensors  25    6    0   19   19   13   13    6    0   25    125
% of Total     20%   5%   0%  15%  15%  10%  10%   5%   0%  20%
# Samples       10    3    0    8    8    5    5    3    0   10     51
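
A sketch of this proportional allocation for the motion sensors, using the per-building counts from slide 50. The percentages are rounded to whole numbers before allocating, matching the slide; because of that rounding, the per-building counts can sum to one more than the computed sample size of 51.

```python
inventory = {  # motion sensors per building, from slide 50
    "B1": 25, "B2": 6, "B3": 0, "B4": 19, "B5": 19,
    "B6": 13, "B7": 13, "B8": 6, "B9": 0, "B10": 25,
}
sample_size = 51  # from the Bernoulli-trial calculation on slide 57
total = sum(inventory.values())

for building, count in inventory.items():
    share = round(count / total, 2)  # round to a whole percentage first
    print(f"{building:3s}: {share:4.0%} of total -> {round(share * sample_size)} samples")
```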

60 Statistics and the Performance Program
Assumptions:
• Protection elements designed to perform a certain task, and tested to the same standard, are grouped together
• Elements or systems to test are randomly selected

61 Summary
• A well-designed Performance Assurance Program serves as the vehicle to determine the health of the safeguards and security system.
• For each critical protection element, the equipment, procedures, personnel, and system integration must be considered.
• The performance assurance program is a continuous process.

