IT Risk Management, Planning and Mitigation
TCOM 5253 / MSIS 4253 – Detailed Risk Analysis
18 October 2007 – Charles G. Gray
(c) 2007 Charles G. Gray

Presentation transcript:

1 IT Risk Management, Planning and Mitigation
TCOM 5253 / MSIS 4253
Detailed Risk Analysis
18 October 2007
Charles G. Gray

2 Review of Summary Level Risk Analysis
Communicating a well-documented risk may trigger stakeholder (business owner) action
– Must have enough detail to determine an appropriate mitigation solution
– The Risk Management Team must remain involved

3 Summary Level Risk Rating
Impact (from the impact table on slide 30) against probability value:
               Probability: Low       Medium               High
  High impact:              Moderate  High                 High
  Med impact:               Low       Moderate (or High)   High
  Low impact:               Low       Low                  Moderate
Definition of "moderate" or "high" depends on each organization's needs
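The matrix lookup above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the original deck; the function name and the choice to render "Moderate (or High)" as "Moderate" are assumptions to be tuned per organization.

```python
# Summary-level risk lookup: (impact level, probability level) -> risk level.
# Cell values mirror the slide's matrix; each organization should adjust
# the "Moderate (or High)" cell to its own definition.
SUMMARY_RISK = {
    ("High", "Low"): "Moderate",  ("High", "Medium"): "High",       ("High", "High"): "High",
    ("Medium", "Low"): "Low",     ("Medium", "Medium"): "Moderate", ("Medium", "High"): "High",
    ("Low", "Low"): "Low",        ("Low", "Medium"): "Low",         ("Low", "High"): "Moderate",
}

def summary_risk(impact: str, probability: str) -> str:
    """Return the summary risk level for an impact/probability pair."""
    return SUMMARY_RISK[(impact, probability)]

print(summary_risk("Medium", "Medium"))  # Moderate (or High, per the organization)
```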

4 Summary Risk Level List
Develop a "summary" level list including ALL identified assets (slide 13, last week)
– Based on the slide 29 matrix (from last class)
Extra columns for supporting information can be added
Tailor the process to meet the organization's individual needs
Every organization must define "high risk" for its own unique enterprise
– "Medium" impact and "medium" probability may yield "moderate" or "high" risk (slide 30)

5 Summary Level Risk Rating (worksheet columns)
Asset: Name; Class (HBI, MBI, LBI)
Exposure: Defense-in-depth layer; Threat description; Vulnerability; Exposure (H, M, L)
Impact (H, M, L)
Probability (H, M, L)
Summary Risk Level

6 Preparing for Detail Level Analysis
Become familiar with the entire detailed risk analysis process before beginning
Leverage the inputs used in the summary level analysis, but include considerably more detail
– Well-organized documentation is essential
– The Microsoft spreadsheets (from the MS Management Guide) are ideal

7 Sample Risk Statements
Summary level – "Within one year, high value servers may be moderately impacted by a worm due to unpatched configurations"
Detail level (1) – "Within one year, high value servers may be unavailable for three days due to worm propagation caused by unpatched configurations"
Detail level (2) – "Within one year, high value servers may be compromised, affecting the integrity of data, due to worm propagation caused by unpatched configurations"

8 Tasks to Produce the Detailed Level List of Risks
Task one – Determine impact and exposure
Task two – Identify current controls
Task three – Determine probability of impact
Task four – Determine detailed risk level

9 Confidentiality or Integrity Exposure Ratings
5 – Severe or complete damage to asset, e.g. externally visible and affects business profitability or success
4 – Serious but not complete damage to asset, e.g. affects business profitability or success, may be externally visible
3 – Moderate damage or loss, e.g. affects internal business practices, causes increase in operating costs or reduction of revenue
2 – Low damage or loss, e.g. affects internal business practices, cannot measure increase in costs
1 – Minor or no change to asset

10 Availability Exposure Rating
5 Work Stoppage – Substantial support costs, or business commitments canceled
4 Work Interruption – Quantifiable increase in support costs, or business commitments delayed
3 Work Delays – Noticeable impact on support costs and productivity; no measurable business impact
2 Work Distraction – No measurable impact; minor increase in support or infrastructure costs
1 Absorbed by Normal Operations – No measurable impact to support costs, productivity, or business commitments

11 Composite Exposure Rating
Collect exposure ratings for each potential impact
Choose the highest value from slide 9 or 10 as the "exposure rating"
– E.g., if the "confidentiality" rating is 3 and the "availability" rating is 4, choose 4 as the "exposure rating"
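The "highest value wins" rule above reduces to a single max. A minimal sketch, assuming one rating per property (the function name is illustrative):

```python
def composite_exposure(confidentiality: int, integrity: int, availability: int) -> int:
    """Composite exposure rating: the highest of the individual
    exposure ratings (each on the 1-5 scale from slides 9 and 10)."""
    return max(confidentiality, integrity, availability)

# The slide's example: confidentiality 3, availability 4 -> exposure rating 4.
print(composite_exposure(3, 3, 4))  # 4
```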

12 Impact Values
Typical impact values for each impact class; may be adjusted to fit each organization
Impact class                  Impact class value (V)
HBI (High business impact)    10
MBI (Medium business impact)   5
LBI (Low business impact)      2

13 Exposure Factor
Microsoft recommends a linear scale; must be tailored to each organization
Exposure rating (from slide 9 or 10)   Exposure factor (EF)
5   100%
4    80%
3    60%
2    40%
1    20%

14 Impact Rating
Impact = impact class value (V) (from slide 12) times the exposure factor (EF) (from slide 13):
Impact = V * EF
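The exposure-factor table and the Impact = V * EF formula combine into a short calculation; a sketch, with the linear EF scale from slide 13 hard-coded (an organization would substitute its own scale):

```python
# Linear exposure-factor scale (slide 13): rating 5 -> 100%, down to 1 -> 20%.
EXPOSURE_FACTOR = {5: 1.00, 4: 0.80, 3: 0.60, 2: 0.40, 1: 0.20}

def impact_rating(impact_class_value: int, exposure_rating: int) -> float:
    """Impact = V * EF, where V is 10 (HBI), 5 (MBI), or 2 (LBI)
    and EF comes from the exposure rating (1-5)."""
    return impact_class_value * EXPOSURE_FACTOR[exposure_rating]

# Slide 15's worked example: HBI asset (V = 10), exposure rating 4 (80%).
print(impact_rating(10, 4))  # 8.0
```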

15 Impact Rating (Example)
Column 1 – Asset name: Customer financial data
Column 2 – Impact class rating (2-10): 10 (HBI)
Column 3 – Defense layer: Host
Column 4 – Threat description: Unauthorized access to customer data by theft of financial advisor credentials
Column 5 – Vulnerability description: Theft of credentials due to outdated anti-virus signatures or outdated security patches
Column 6 – Exposure rating (1-5): 4 (80%, or 0.80)
Column 7 – Impact rating: 8 (column 2 * column 6)

16 Review – Output from Task One
Choose the highest exposure rating between:
– Confidentiality or integrity of an asset
– Availability of an asset
Assign an exposure factor (EF) for each exposure rating (slide 13)
Determine the "impact rating" (slide 14)
Result is an asset list sorted by impact

17 Identify Current Controls
Business owners/stakeholders should identify the various controls
– "Directed questioning" by the Risk Management Team may be needed
The controls themselves may be "objective", that is, written down (de jure), or may be only "de facto" (word of mouth)
– "Effectiveness", however, will probably be subjective (see slides 18 and 19)

18 Evaluating Effectiveness of Current Controls
Effectiveness is subjective and relies on the experience of the Security Risk Management Team in understanding the control environment
Answer each question (next slide) and total the values
A lower value means the controls are effective and MAY reduce the probability of an exploit occurring

19 How Effective Are Current Controls? (Yes = 0, No = 1)
Is accountability defined and enforced effectively? 0 or 1
Is awareness communicated and followed effectively? 0 or 1
Are processes defined and practiced effectively? 0 or 1
Do existing technology or controls reduce the threat effectively? 0 or 1
Are audit practices sufficient to detect abuse or control deficiencies? 0 or 1
Sum of control attributes (0-5) =

20 Control Effectiveness – Example
Is accountability defined and enforced? 0 (yes) – Policy creation and host compliance accountability are well defined
Is awareness communicated and followed effectively? 0 (yes) – Regular notifications are sent to users; general awareness training
Are processes defined and practiced effectively? 0 (yes) – Compliance measurement and enforcement is documented
Do existing technology or controls reduce the threat effectively? 1 (no) – Existing controls still allow a length of time between vulnerability and patch
Are current audit practices sufficient to detect abuse or control deficiencies? 0 (yes) – Measurement and compliance auditing are effective given current tools
Sum of all control attributes: 1
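The scoring in the example above is just "yes = 0, no = 1, then sum". A sketch of that tally (the question labels are abbreviations of the slide's wording):

```python
def control_effectiveness(answers: dict) -> int:
    """Sum of control attributes (0-5): each "yes" scores 0, each "no"
    scores 1, so a lower total means more effective controls."""
    return sum(0 if yes else 1 for yes in answers.values())

# The slide-20 example: four "yes" answers and one "no".
example = {
    "accountability defined and enforced": True,
    "awareness communicated and followed": True,
    "processes defined and practiced": True,
    "technology/controls reduce the threat": False,  # the single "no"
    "audit practices detect abuse/deficiencies": True,
}
print(control_effectiveness(example))  # 1
```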

21 Review – Output from Task Two
A list of controls and their effectiveness, agreed upon by the stakeholders and the Risk Management Team

22 Determining Probability of Impact
The probability rating depends on:
– Probability of the vulnerability existing in the environment, based on attributes of the vulnerability and possible exploit (1-5)
– Probability of the vulnerability existing, based on the effectiveness of current controls (1-5)
Relative risk rating = probability rating * impact rating
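Putting the two ingredients together: the vulnerability-attribute value (1, 3, or 5, slides 23-26) plus the control-effectiveness sum (0-5, slide 19) gives the probability rating, which multiplies the impact rating. A sketch (function names are illustrative):

```python
def probability_rating(vulnerability_value: int, control_sum: int) -> int:
    """Probability of impact: vulnerability attribute value (1, 3, or 5)
    plus control-effectiveness sum (0-5), giving a 1-10 scale."""
    return vulnerability_value + control_sum

def relative_risk(probability: int, impact: float) -> float:
    """Relative risk rating = probability rating * impact rating (0-100)."""
    return probability * impact

# High vulnerability (5), mostly effective controls (sum 1), impact rating 8:
print(relative_risk(probability_rating(5, 1), 8))  # 48
```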

23 Vulnerability Attributes (H)
High (assign a value of 5 if ANY apply):
– Large attacker population – script kiddie/hobbyist
– Remotely executable
– Anonymous privileges needed
– Externally published exploitation method
– Automated attack possible

24 Vulnerability Attributes (M)
Medium (assign a value of 3 if ANY apply):
– Medium-sized attacker population – expert/specialist
– Not remotely executable
– User-level privileges required
– Exploitation method not publicly published
– Non-automated

25 Vulnerability Attributes (L)
Low (assign a value of 1 if ALL apply):
– Small attacker population – insider knowledge
– Not remotely executable
– Administrator privileges required
– Exploitation method not publicly published
– Non-automated

26 Vulnerability Sum
Exposure attributes (from slide 23, 24, or 25) map to a probability value (1, 3, or 5):
High     5
Medium   3
Low      1

27 Review – Output of Task Three
Probability rating taking into account the current controls in place
Sum of the vulnerability rating (slide 26) and control effectiveness (slide 19)
– Column 9 on slide 28

28 Baseline Risk – Current Controls
Columns: 1 Name; 2 Impact class (HBI, MBI, LBI); 3 Defense-in-depth layer; 4 Threat description; 5 Vulnerability description; 6 Exposure rating (1-5); 7 Impact rating (1-10); 8 Current control description; 9 Probability rating w/ control (1-10); 10 Risk rating w/ control (0-100)
Row 1 – XYZ: 10 (HBI); Host; enter details; be specific; 4 (80%); 8; list every control; Vulnerability = 5, Control = 1, Total = 6; Col 7 * Col 9 = 48
Row 2 – ABC: 10 (HBI); Host; enter details; be specific; 4 (80%); 8; list every control; Vulnerability = 5, Control = 5, Total = 10; Col 7 * Col 9 = 80
Row 3 – Your Company: 5 (MBI); Perimeter; enter details; be specific; 2 (40%); 3; list every control; Vulnerability = 4, Control = 1, Total = 5; Col 7 * Col 9 = 15
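The three worksheet rows can be recomputed to check the arithmetic: column 10 is column 7 (impact) times column 9 (vulnerability value plus control-effectiveness sum). A sketch using the slide's example figures:

```python
# (name, impact rating, vulnerability value, control-effectiveness sum)
rows = [
    ("XYZ", 8, 5, 1),
    ("ABC", 8, 5, 5),
    ("Your Company", 3, 4, 1),
]

for name, impact, vuln, control in rows:
    probability = vuln + control          # column 9
    risk = impact * probability           # column 10 = col 7 * col 9
    print(f"{name}: probability {probability}, risk {risk}")
# XYZ: probability 6, risk 48
# ABC: probability 10, risk 80
# Your Company: probability 5, risk 15
```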

29 Summary Qualitative Ranking
Each cell is impact (rows, 10 = High at top, 1 = Low at bottom) times probability (columns, 1 = Low at left, 10 = High at right):
 H 10 | 10 20 30 40 50 60 70 80 90 100
    9 |  9 18 27 36 45 54 63 72 81  90
    8 |  8 16 24 32 40 48 56 64 72  80
    7 |  7 14 21 28 35 42 49 56 63  70
    6 |  6 12 18 24 30 36 42 48 54  60
 M  5 |  5 10 15 20 25 30 35 40 45  50
    4 |  4  8 12 16 20 24 28 32 36  40
    3 |  3  6  9 12 15 18 21 24 27  30
    2 |  2  4  6  8 10 12 14 16 18  20
 L  1 |  1  2  3  4  5  6  7  8  9  10
      +--------------------------------
         1  2  3  4  5  6  7  8  9  10
         L             M             H   (Probability)
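The ranking grid is simply a multiplication table of impact (1-10) by probability (1-10), so it can be regenerated rather than typed by hand; a sketch:

```python
# Risk rating (0-100) = impact (1-10) * probability (1-10);
# rows run from impact 10 (top) down to 1, columns from probability 1 to 10.
grid = [[impact * prob for prob in range(1, 11)] for impact in range(10, 0, -1)]

for row in grid:
    print(" ".join(f"{value:3d}" for value in row))
```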

30 Review – Output of Task Four
A detailed, prioritized risk list with a (mostly) objective "risk rating" in the range 0 to 100
A risk analysis chart to assist stakeholders in visualizing the relative risk ratings
Risk levels should be used only as a guide for decision makers, and some adjustments by stakeholders are allowed
– However, everybody must recognize that every asset cannot be "number one" on the priority list

31 Next Week – Quantifying Risk
The hard work starts
– Putting numbers ($$$) to the assets and the loss expectancy

