
1 Computer Security Foundations and Principles Nicolas T. Courtois - University College of London

2 CompSec Intro Nicolas T. Courtois, January 2009 2 Plan What is CompSec? –Computer Industry –Security = ? –Security Science 123 Life Context and Threats –Small / Big Threats –Economics –Defenders –Risk –Failures –Ethics, Hackers Principles - Defenses –Security Design Principles, Security Engineering –Security Management Principles –Weakest Link? –Secrecy / Open Source –Case Studies - Attack Trees

3 CompSec Intro Nicolas T. Courtois, January 2009 3 About Myself – Short Bio: Cryptologist. Main interests: –Cryptanalysis – how to break things… –Smart cards, 8 years @industry »(see Applied Crypto module, term 2). –Economics, business and ethical aspects of information security. This will influence the way in which this course will be taught.

4 CompSec Intro Nicolas T. Courtois, January 2009 4 Computers …

5 CompSec Intro Nicolas T. Courtois, January 2009 5 Computer Industry We are dealing here with a particular industry: the computer industry, software and hardware. This industry is arguably VERY special, unlike any other industry that ever existed… But we can and will compare it to other industries: smart cards have their own O.S., with a much stronger focus on security from the start; mobile phones developed similarly, yet in a very different way too; voting machines, door/car lock systems, etc…

6 CompSec Intro Nicolas T. Courtois, January 2009 6 A Sort of Disclaimer This whole part of my slides (152 pages) contains a lot of “philosophy”… complementary to other parts, it could be studied at any moment. I don’t intend to ask you abstract “philosophy” questions at the exam, rather more concrete technical questions: beware of vague statements and excessive generalizations… what do the principles / methodology / ideas actually do in practice… be specific and factual.

7 CompSec Intro Nicolas T. Courtois, January 2009 7 Computer Industry and Security Social-Econ Background: Science background: Tech Background: “Industry Standards” such as: Intel CPU and Chipset, RAM and hard drives, C language, UNIX / Windows TCP/IP, HTTP, TLS,

8 CompSec Intro Nicolas T. Courtois, January 2009 8 Computer Industry and Security Social-Econ Background: Science background: What technology “enablers”(computers) and “disablers” (cryptology,HWSec) can/cannot achieve? How to define / classify security problems and find “good” solutions “Industry Standards”

9 CompSec Intro Nicolas T. Courtois, January 2009 9 Computer Industry and Security Social-Econ Background: things exist for a reason. “Nice or unpleasant” facts of life: software/hardware economics (which industry dominates which); free-market triumphs and disasters; these stupid humans that cannot be bothered to obey the policy…; these bureaucratic organisations that just cannot get their best interest(?) right; nobody is buying/using the wonderful(?) technology, adoption barriers; theory vs. practice; crime, war, terrorism…; laws / regulations etc… Science background: “Industry Standards” insecure rubbish!

10 CompSec Intro Nicolas T. Courtois, January 2009 10 The Product Development cycle Social-Econ Background: Science background: “Industry Standards” everything ends up here rapid obsolescence

11 CompSec Intro Nicolas T. Courtois, January 2009 11 What is Security?

12 CompSec Intro Nicolas T. Courtois, January 2009 12 Definition 1 [Courtois] Security: protect the value(s). What value(s) ? ALL !

13 CompSec Intro Nicolas T. Courtois, January 2009 13 Security: protect the value(s). What value(s) ? Money [economical security] But NOT ONLY MONEY. Life and the quality of life. (CB, car panic button) Privacy is a value in itself. Good technology vs. bad one Freedom, Justice, etc …

14 CompSec Intro Nicolas T. Courtois, January 2009 14 Security: Definition 2. asset holder

15 CompSec Intro Nicolas T. Courtois, January 2009 15 Security ≠ Safety Difference: protect against intentional damage... Notion of an Attacker / Adversary. Notion of an Attack.

16 CompSec Intro Nicolas T. Courtois, January 2009 16 Attacker Attacker = Adversary = Threat Agent

17 CompSec Intro Nicolas T. Courtois, January 2009 17 **Dimensions of Security Physical vs. Logical Psychological / Human vs. Organizational / Business very different!

18 CompSec Intro Nicolas T. Courtois, January 2009 18 *Security on One Slide

19 CompSec Intro Nicolas T. Courtois, January 2009 19 Main Goals: Confidentiality Integrity Authenticity Availability Accountability
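
To make two of these goals concrete – integrity and authenticity – here is a minimal Python sketch (not from the slides) using the standard library hmac module; the key and message are hypothetical, confidentiality would additionally require encryption, and availability/accountability are not addressed by this mechanism at all.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret-key"        # hypothetical key, agreed out of band

def tag(message: bytes) -> bytes:
    """Authenticity/integrity tag: only holders of the key can compute it."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    """Constant-time comparison avoids leaking information through timing."""
    return hmac.compare_digest(tag(message), received_tag)

msg = b"transfer 100 GBP to account 42"
t = tag(msg)
assert verify(msg, t)                     # genuine message accepted
assert not verify(msg + b"0", t)          # any modification is detected
```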

20 CompSec Intro Nicolas T. Courtois, January 2009 20 Is There a Need for More Security? or will I have a job tomorrow?

21 CompSec Intro Nicolas T. Courtois, January 2009 21 Hegel [1770-1831], German philosopher: History is the process of broadening freedom.

22 CompSec Intro Nicolas T. Courtois, January 2009 22 Why Security, why Cryptography? Freedom, new technologies ⇒ more possibilities ⇒ more security problems … “Theorem” [Courtois]: Freedom → ∞  ⇒  Security → 0

23 CompSec Intro Nicolas T. Courtois, January 2009 23 Freedom ⇒ Security issues … Examples: invention of the train/metro ⇒ threat of pickpockets; car traffic ⇒ major cause of death (trains are much safer); invention of the internet ⇒ pirates and hackers.

24 CompSec Intro Nicolas T. Courtois, January 2009 24 Why Security, why Cryptography? Freedom of travel by car ⇒ main cause of death (e.g. 1.2 M people die every year in car accidents!) Unlimited exchange of data over the internet ⇒ unlimited security problems.

25 CompSec Intro Nicolas T. Courtois, January 2009 25 *Prof. Shigeo Tsujii, “ The society should be designed to help people to enjoy the freedoms ”… “ broadened by IT ”. Enjoy benefits, limit/remove disadvantages.

26 CompSec Intro Nicolas T. Courtois, January 2009 26 “ Adding ” Security – Def. 3. The goal of cryptology and security [Courtois]: Add security to the information technologies (IT) that are by nature insecure.

27 CompSec Intro Nicolas T. Courtois, January 2009 27 Adding Security - Criticism

28 CompSec Intro Nicolas T. Courtois, January 2009 28 Critics [Schneier] Why is security an after-thought? “Aftermarket security is actually a very inefficient way to spend our security dollars; it may compensate for insecure IT products, but doesn't help improve their security”. “Additionally, as long as IT security is a separate industry, there will be companies making money based on insecurity -- companies who will lose money if the internet becomes more secure.” Why is security not built-in? “utopian vision” Security is difficult, expensive and takes time …

29 CompSec Intro Nicolas T. Courtois, January 2009 29 ***Today [Schneier] “ I am regularly asked what average Internet users can do to ensure their security. My first answer is usually, "Nothing--you're screwed. “

30 CompSec Intro Nicolas T. Courtois, January 2009 30 ****Visions [Schneier] Future? Two visions. “ will become another utility ” “ users will simply expect it to work -- and the details of how it works won't matter ”. “ computer security for the general populace will never be achievable. ” “ let's just live with being as secure as we can. ”

31 CompSec Intro Nicolas T. Courtois, January 2009 31 Security Science 1.2.3.

32 CompSec Intro Nicolas T. Courtois, January 2009 32 Claim [Courtois, Schneier]: Computer Security and real-world security are governed by the same laws !!!

33 CompSec Intro Nicolas T. Courtois, January 2009 33 Security? A 3-point Formal Approach. What is Security? The adversary’s inability to achieve: 1. Security against what: the Adversarial Goal. 2. Against whom: the resources of the Adversary: money, human resources, computing power, memory, risk, expertise, etc. 3. Access to the system.

34 CompSec Intro Nicolas T. Courtois, January 2009 34 1. Attackers

35 CompSec Intro Nicolas T. Courtois, January 2009 35 Vocabulary Attacker / Adversary / Threat Agent

36 CompSec Intro Nicolas T. Courtois, January 2009 36 1. Adversarial Goals

37 CompSec Intro Nicolas T. Courtois, January 2009 37 2. a. Who Are the Attackers bored teenagers, petty => organized criminals, rogue states, industrial espionage, disgruntled employees, … pure legitimate use Inadvertent events, bugs, errors, ourselves (forgot the d … password!), our family / friends / co-workers,

38 CompSec Intro Nicolas T. Courtois, January 2009 38 2. b. Their Means computers (MIPS) / other hardware (antennas, liquid Nitrogen, etc … ) knowledge / expertise risk/exposure capacity

39 CompSec Intro Nicolas T. Courtois, January 2009 39 3. Access

40 CompSec Intro Nicolas T. Courtois, January 2009 40 3. Access to Computers Somewhat similar to cars:

41 CompSec Intro Nicolas T. Courtois, January 2009 41 3. Access to your car: Miles away … When you drive on the road. When parked on the street. Someone enters your garage. Someone had a key yesterday and made a copy …

42 CompSec Intro Nicolas T. Courtois, January 2009 42 3. Access to a Computer Remote location, not connected to Internet Remote location, somewhat connected … Physical proximity … Access to USB ports. Access (alone) for a few seconds … Take it home and hack it …

43 CompSec Intro Nicolas T. Courtois, January 2009 43 *3. Access

44 CompSec Intro Nicolas T. Courtois, January 2009 44 Insecurity

45 CompSec Intro Nicolas T. Courtois, January 2009 45 ***Insecurity Aspects

46 CompSec Intro Nicolas T. Courtois, January 2009 46 “ Small ” Little Things

47 CompSec Intro Nicolas T. Courtois, January 2009 47 What are Important Security Issues? Think twice.

48 CompSec Intro Nicolas T. Courtois, January 2009 48 Mildest Forms of Insecurity nuisance and irritation generated by security and updates, necessity to think about such sordid things as security, losing some pennies on it … Slight problem: the long tail theory.

49 CompSec Intro Nicolas T. Courtois, January 2009 49 the Long Tail theory [chart: frequency vs. cost]

50 CompSec Intro Nicolas T. Courtois, January 2009 50 Major Threats

51 CompSec Intro Nicolas T. Courtois, January 2009 51 Blockbuster Insecurity Viruses/Worms (e.g. Melissa 1999) DoS attacks hurting big corporations / countries [Estonia] Financial/personal data files leaked/lost [lots in the US media] Computer break-ins [little publicity?]

52 CompSec Intro Nicolas T. Courtois, January 2009 52 Explosion of Known Vulnerabilities What about the unknown ones? http://www.cert.org/stats/

53 CompSec Intro Nicolas T. Courtois, January 2009 53 Is My PC Infected? Many PCs are

54 CompSec Intro Nicolas T. Courtois, January 2009 54 Why Do Things Happen? Bugs … or people don’t care. Programming developed in the absence of security. C/C++ is unsafe (Microsoft is currently blacklisting big chunks of standard C; this could have happened 10 years ago). Security/cryptography research developed with an obsession with security. The two never met. Mystified: –security issues are probably always exaggerated and distorted, one way or another (downplayed OR exaggerated; Ross Anderson: “hypertrophy” of security) Economics/Business: –many things just don’t matter! –customers do not see => do not care about security –usability: user burden frustrates them.

55 CompSec Intro Nicolas T. Courtois, January 2009 55 CompSec and Economics

56 CompSec Intro Nicolas T. Courtois, January 2009 56 Question “[…] Why do so many vulnerabilities exist in the first place?[…]” Cf. Ross Anderson, Tyler Moore et al: 1.“The Economics of Information Security” In Science, October 2006. 2.“Security Economics and the Internal Market”: public report for ENISA (European Network and Information Security Agency), March 2008.

57 CompSec Intro Nicolas T. Courtois, January 2009 57 Why Does Commercial Security Fail? Claim: the link between “money” and security is frequently broken today: –Security is a public good; “private” incentives are weak. –Worse than a “market for lemons”: not only can the customer not see the difference between good security and bad, frequently the manufacturer cannot either. Too frequently security (and our safety) remains something that money cannot buy … (money seems to work for cars, and to fail for computers)

58 CompSec Intro Nicolas T. Courtois, January 2009 58 Security and Economics Security is about [sensible] security trade-offs. Closely related to economics: how to allocate resources efficiently. It goes much further: most security trade-offs are economic in nature. Security is shaped by the economy. Uncertainty and security problems (e.g. social security) appear in all free economies.

59 CompSec Intro Nicolas T. Courtois, January 2009 59 The Very Nature of Security: Bruce Schneier “Beyond Fear” book [2003], p.1: Critical to any security decision is the notion of [security] trade-offs, meaning the costs – in terms of money, convenience, comfort, freedoms, and so on – that inevitably attach themselves to any security system. People make security trade-offs naturally. Many costs are intangible and hard to trade for money: paying attention, loss of freedom & privacy, being subject to unforeseen risks and consequences, etc.

60 CompSec Intro Nicolas T. Courtois, January 2009 60 *Defenders relevant though boring vocabulary..

61 CompSec Intro Nicolas T. Courtois, January 2009 61 3 Actions of Defenders

62 CompSec Intro Nicolas T. Courtois, January 2009 62 Types of Prevention [+cost -freq]

63 CompSec Intro Nicolas T. Courtois, January 2009 63 Detection and Recovery Detect: monitoring, anomaly analysis, etc.; example: intrusion detection. Recover: have a recovery plan.
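
As a toy illustration of the “detect” part (not from the slides; the user names, window and threshold are made up), a minimal Python sketch that raises an alert when failed logins cluster in a short window:

```python
from collections import deque

WINDOW_SECONDS = 300        # hypothetical sliding window
THRESHOLD = 5               # hypothetical alert threshold
failures: dict[str, deque] = {}

def record_failed_login(user: str, timestamp: float) -> bool:
    """Return True if this failure should raise an intrusion alert."""
    q = failures.setdefault(user, deque())
    q.append(timestamp)
    while q and q[0] < timestamp - WINDOW_SECONDS:   # forget old failures
        q.popleft()
    return len(q) >= THRESHOLD

# six rapid failures for the same account: the 5th and 6th trigger alerts
print([record_failed_login("alice", t) for t in range(0, 60, 10)])
```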

64 CompSec Intro Nicolas T. Courtois, January 2009 64 ***IT Sec and the Military

65 CompSec Intro Nicolas T. Courtois, January 2009 65 Military vs. Commercial Military data security: focus on secrecy, prevent leaks. Commercial data security: integrity and authenticity: prevent fraud.

66 CompSec Intro Nicolas T. Courtois, January 2009 66 ****** Traditional Military Doctrine Each country has 3 frontiers: land, sea, air/space; as a consequence it has 3 armies. Now we have a new frontier, the digital frontier. Shouldn’t we have a fourth army? Would it be totally useless and a waste of money? Arguably less so than the 3 above (better technical education for young people).

67 CompSec Intro Nicolas T. Courtois, January 2009 67 *Risk

68 CompSec Intro Nicolas T. Courtois, January 2009 68 Risk Management = 1+2 1.Measuring and/or assessing risks 2.Developing strategies and solutions to manage risks: reduce/avoid and handle risks
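
A common way to make step 1 quantitative is annualized loss expectancy (ALE = expected incidents per year × average loss per incident). A minimal sketch, with purely hypothetical figures:

```python
# Annualized loss expectancy (ALE) sketch -- all figures are hypothetical.
threats = {
    # name: (expected incidents per year, average loss per incident, in GBP)
    "laptop theft":        (0.5,   2_000),
    "phishing compromise": (2.0,   5_000),
    "web-server breach":   (0.1, 250_000),
}

def ale(frequency: float, impact: float) -> float:
    return frequency * impact

for name, (freq, impact) in threats.items():
    print(f"{name:22s} ALE = {ale(freq, impact):>9,.0f} GBP/year")

# Step 2 (roughly): a control is worth buying when its yearly cost is below
# the reduction in ALE it brings -- the proportionality principle in numbers.
```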

69 CompSec Intro Nicolas T. Courtois, January 2009 69 **Risk Management contd … handle risks: risk retention; risk transfer – contract …, hedging (e.g. insurance) – minimise exposure by some transaction

70 CompSec Intro Nicolas T. Courtois, January 2009 70 Residual Risk = def what remains after defences are in place …

71 CompSec Intro Nicolas T. Courtois, January 2009 71 Failures

72 CompSec Intro Nicolas T. Courtois, January 2009 72 ****Attacks → Target [diagram: attacker – attack – vulnerabilities – defences (controls, countermeasures) – target system; expectations, goals, properties; security policy = what is allowed/not (protection goals), if there is one]

73 CompSec Intro Nicolas T. Courtois, January 2009 73 *Security Failures [diagram: vulnerability; security policy: what is allowed/not, protection goals; defences: controls, countermeasures]

74 CompSec Intro Nicolas T. Courtois, January 2009 74 Types of Failures Failure in design Failure in implementation Failure in operation

75 CompSec Intro Nicolas T. Courtois, January 2009 75 If It Can Fail … It Will

76 CompSec Intro Nicolas T. Courtois, January 2009 76

77 CompSec Intro Nicolas T. Courtois, January 2009 77 Key Question: Is actively researching serious security vulnerabilities socially desirable? - “Of course yes!” … every professional hacker and every academic code-breaker will tell you …

78 CompSec Intro Nicolas T. Courtois, January 2009 78 Bruce Schneier [14 May 2008]: Problem: A hacker who discovers one [attack] can sell it on the black market, blackmail the vendor with disclosure, or simply publish it without regard to the consequences. Q: […] is it ethical to research new vulnerabilities? A: Unequivocally, yes. [according to Schneier] Because: Vulnerability research is vital because it trains our next generation of computer security experts. http://www.schneier.com/blog/archives/2008/05/the_ethics_of_v.html

79 CompSec Intro Nicolas T. Courtois, January 2009 79 Our Answer: The question is open to debate and remains somewhat controversial. Maybe it depends… on what?

80 CompSec Intro Nicolas T. Courtois, January 2009 80 Rescorla [2004] Research and disclose? Yes, if … …these vulnerabilities are likely to be rediscovered Cf. E. Rescorla. “Is finding security holes a good idea?” In 3rd Workshop on the Economics of Information Security (2004).

81 CompSec Intro Nicolas T. Courtois, January 2009 81 Benefits: Disclosure creates incentives for fixing these vulnerabilities.

82 CompSec Intro Nicolas T. Courtois, January 2009 82 Attackers and Hackers

83 CompSec Intro Nicolas T. Courtois, January 2009 83 Motivation Enjoyment, fame, ego, role models Develop science and offensive technology: –University researchers, (good/bad) –Security professionals (defenders) –Professional hackers, pen testers, etc … Profits and other benefits –Crime business, –Promoting legal security business, –Political activism, terrorism,

84 CompSec Intro Nicolas T. Courtois, January 2009 84 Hackers Are: bored teenagers, petty => organized criminals, rogue states, industrial espionage, disgruntled employees, … pure legitimate use Inadvertent events, bugs, errors, ourselves (forgot the d … password!), our family / friends / co-workers,

85 CompSec Intro Nicolas T. Courtois, January 2009 85 2. b. Their Means computers (MIPS) / other hardware (antennas, liquid Nitrogen, etc … ) knowledge / expertise risk/exposure capacity

86 CompSec Intro Nicolas T. Courtois, January 2009 86 Hacker Movement

87 CompSec Intro Nicolas T. Courtois, January 2009 87 Hacktivism Mostly about getting attention in the press through direct and spectacular action (hacking) against a government or corporate entity. In order to achieve a political effect. Usually still a technology claim: Just improve the security! It is broken. Just hire better experts/cryptographers/programmers … Political statement unrelated to technology: Russia does not agree with Estonia.

88 CompSec Intro Nicolas T. Courtois, January 2009 88 Chaos Computer Club started in Hamburg in 1981 as a German-speaking libertarian-anarchist conference and social movement claiming freedom at any price, including freedom to pirate anything …

89 CompSec Intro Nicolas T. Courtois, January 2009 89 Chaos Computer Club Very strong relations with the German press make it much harder to put hackers in jail (fear of bad press coverage). Don’t do the same things that some people do… they will get away with it, you will not. These people are very strong defenders of some rights, such as privacy and freedom of speech, and thus they achieve a sort of legitimacy in the public eye to indulge in activities that would otherwise be considered just criminal. In the same way, trade unions frequently get away with doing illegal things, but not always; many trade union leaders have been in prison once or twice. The risk is totally and absolutely non-zero. In the 80s the CCC guys showed the insecurity of the Bildschirmtext computer network and made it debit a bank in Hamburg 134K DM in favour of the Club. Then they organized a press conference. The bank was nice and did NOT sue them. But how can you predict they will be nice?

90 CompSec Intro Nicolas T. Courtois, January 2009 90 Recent Trend: Rapid growth. The industrialization of hacking: division of labour, clear definition of roles, forming a supply chain, professional management

91 CompSec Intro Nicolas T. Courtois, January 2009 91 ***Question:

92 CompSec Intro Nicolas T. Courtois, January 2009 92 Principles of Security Engineering

93 CompSec Intro Nicolas T. Courtois, January 2009 93 Security Engineering Definition: [Ross Anderson] building systems to remain dependable in the face of malice, error or mischance.

94 CompSec Intro Nicolas T. Courtois, January 2009 94 Magic Formulas … or “Security Mantras”: repeat after me: C.I.A. C.I.A. In fact we have no silver bullet. On the contrary: Security is about trade-offs. Conflicting engineering criteria… Conflicting requirements… Overcoming human, technology and market failures. insecure rubbish!

95 CompSec Intro Nicolas T. Courtois, January 2009 95 Proportionality Principle Maximize security??? Maximize “ utility ” (the benefits) while limiting risk to an acceptable level within reasonable cost … »all about economics …

96 CompSec Intro Nicolas T. Courtois, January 2009 96 Efficiency and Effectiveness Security measures must be: Efficient and effective…

97 CompSec Intro Nicolas T. Courtois, January 2009 97

98 CompSec Intro Nicolas T. Courtois, January 2009 98 Open Design Principle Frequently incorrectly understood and confused with open source [cf. also Kerckhoffs and later slides 140-…]. Examples: cryptography such as SHA256 (used in bitcoin) is open source but was designed behind closed doors at the NSA.

99 CompSec Intro Nicolas T. Courtois, January 2009 99 Least Privilege [or Limitation] Principle Every “module” (such as a process, a user or a program) should be able to access only such information and resources that are necessary to its legitimate purpose.

100 CompSec Intro Nicolas T. Courtois, January 2009 100 Attitudes There are two basic attitudes: Default permit - "Everything, not explicitly forbidden, is permitted" Allows greater functionality by sacrificing security. Good if security threats are non-existent or negligible. Default deny - "Everything, not explicitly permitted, is forbidden" Improves security, harder to get any functionality. Believed to be a good approach if you have lots of security threats. BTW. default deny + fine granularity = the Least Privilege principle

101 CompSec Intro Nicolas T. Courtois, January 2009 101 Fail-safe Defaults Secure by default. Example: if we forget to specify access, deny it.
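
A minimal Python sketch (not from the slides; the principals, resources and actions are hypothetical) combining the last two slides: a default-deny permission table with fine granularity – which is the Least Privilege principle in miniature – where a missing entry or any lookup error fails safe to “deny”:

```python
# Default-deny access check: only what is explicitly permitted is allowed.
PERMISSIONS = {
    # (principal, resource): set of allowed actions -- everything else is denied
    ("alice",  "/payroll.db"): {"read"},
    ("backup", "/payroll.db"): {"read", "copy"},
}

def is_allowed(principal: str, resource: str, action: str) -> bool:
    try:
        # Fail-safe default: missing entry -> empty set -> deny.
        return action in PERMISSIONS.get((principal, resource), set())
    except Exception:
        return False          # any unexpected error also fails closed

assert is_allowed("alice", "/payroll.db", "read")
assert not is_allowed("alice", "/payroll.db", "write")    # not explicitly permitted
assert not is_allowed("mallory", "/payroll.db", "read")   # unknown principal -> deny
```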

102 CompSec Intro Nicolas T. Courtois, January 2009 102 Economy of Mechanism A protection mechanism should have a simple and small design. –small and simple enough to be built in a rigorous way, and fully tested and analysed

103 CompSec Intro Nicolas T. Courtois, January 2009 103 Separation of Privileges Split into pieces with limited privileges! Implementation in software engineering: Have the program fork into two processes. The main program drops privileges (e.g. dropping root under Unix). The smaller program keeps privileges in order to perform a certain task. The two halves then communicate via a socket pair. Benefits: A successful attack against the larger program will gain minimal access. –even though the pair of programs will perform privileged operations. A crash in a process run as nobody cannot be exploited to gain privileges. Additional possibilities: obfuscate individual modules and/or make them tamper resistant through software. Or burn them into a dedicated hardware module, and burn the fuse that allows the firmware to be read.
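
A rough, Unix-only Python sketch of the fork / drop-privileges / socketpair pattern described above; it must be started as root for the setuid call to succeed, and the request string and file name are purely illustrative (real privilege-separated daemons such as OpenSSH add far more hardening):

```python
import os
import pwd
import socket

priv_end, worker_end = socket.socketpair()     # the two halves talk over this pair

pid = os.fork()
if pid == 0:
    # ----- large worker process: drop privileges immediately -----
    priv_end.close()
    nobody = pwd.getpwnam("nobody")
    os.setgid(nobody.pw_gid)
    os.setuid(nobody.pw_uid)                   # no way back to root in this process
    worker_end.sendall(b"please read /etc/secret for me")   # hypothetical request
    print("worker received:", worker_end.recv(4096))
    os._exit(0)

# ----- small privileged process: performs one narrow task on request -----
worker_end.close()
request = priv_end.recv(4096)
# ... validate `request` very carefully, then do the privileged operation ...
priv_end.sendall(b"result of the privileged operation")
os.waitpid(pid, 0)
```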

104 CompSec Intro Nicolas T. Courtois, January 2009 104 Least Common Mechanism

105 CompSec Intro Nicolas T. Courtois, January 2009 105 Be Friendly!

106 CompSec Intro Nicolas T. Courtois, January 2009 106 Saltzer and Schroeder 1975: Psychologically Acceptable not enough anymore

107 CompSec Intro Nicolas T. Courtois, January 2009 107 Usability + Consent Acceptable and accepted… or not: otherwise the mechanism will be bypassed (-consent) Nice User-friendly

108 CompSec Intro Nicolas T. Courtois, January 2009 108 Be Even More Friendly: Don’t annoy people. –Minimize the number of clicks –Minimize the number of things to remember –Make security easy to understand and self-explanatory –Security should NOT impact users that obey the rules. Established defaults should be reasonable. –People should not feel trapped. It should be easy to –Restrict access –Give access –Personalize settings –Etc…

109 CompSec Intro Nicolas T. Courtois, January 2009 109 More Design Principles

110 CompSec Intro Nicolas T. Courtois, January 2009 110 Think Ahead

111 CompSec Intro Nicolas T. Courtois, January 2009 111 Trust vs. Security

112 CompSec Intro Nicolas T. Courtois, January 2009 112 Trust

113 CompSec Intro Nicolas T. Courtois, January 2009 113 Be Reluctant to Trust Cf. Least Privilege.

114 CompSec Intro Nicolas T. Courtois, January 2009 114 Be Trustworthy

115 CompSec Intro Nicolas T. Courtois, January 2009 115

116 CompSec Intro Nicolas T. Courtois, January 2009 116 Lipner 1982 These can be seen as management principles: How to manage development and production, avoid random failures and security breaches alike. Principle of Segregation of Duties = Separation of Duty.

117 CompSec Intro Nicolas T. Courtois, January 2009 117 Segregation of Duties Achieved by (closely related): Principle of Functional Separation: Several people should cooperate. Examples: one developer should not work alone on a critical application; the tester should not be the same person as the developer. If two or more steps are required to perform a critical function, at least two different people should perform them, etc. –This principle makes it very hard for one person to compromise the security, on purpose or inadvertently. Principle of Dual Control: –Example 1: in the SWIFT banking data management system there are two security officers: the left security officer and the right security officer. Both must cooperate to allow certain operations. –Example 2: nuclear devices command. –Example 3: cryptographic secret sharing
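
Dual control (Example 3 above) can be sketched with the simplest form of cryptographic secret sharing, a 2-of-2 XOR split; a minimal Python example with a hypothetical 128-bit master key:

```python
import secrets

def split(secret: bytes) -> tuple[bytes, bytes]:
    """2-of-2 split: each share alone is uniformly random, revealing nothing."""
    left = secrets.token_bytes(len(secret))
    right = bytes(a ^ b for a, b in zip(secret, left))
    return left, right

def combine(left: bytes, right: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(left, right))

master_key = secrets.token_bytes(16)              # e.g. a 128-bit key to protect
left_officer_share, right_officer_share = split(master_key)

# Both officers must cooperate to reconstruct the key:
assert combine(left_officer_share, right_officer_share) == master_key
```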

118 CompSec Intro Nicolas T. Courtois, January 2009 118 Auditing / Monitoring –Record what actions took place and who performed them –Contributes to both disaster recovery (business continuity) and accountability.
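
A minimal sketch (not from the slides) of tamper-evident record-keeping: a hash-chained audit log in Python, where each entry commits to the previous one, so later deletion or alteration is detectable on verification. Actor names and actions are hypothetical.

```python
import hashlib
import json
import time

log = []   # in practice: append-only storage, ideally on a separate machine

def append_entry(actor: str, action: str) -> None:
    prev = log[-1]["digest"] if log else "0" * 64
    record = {"ts": time.time(), "actor": actor, "action": action, "prev": prev}
    body = json.dumps(record, sort_keys=True).encode()
    record["digest"] = hashlib.sha256(body).hexdigest()
    log.append(record)

def verify_log() -> bool:
    prev = "0" * 64
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "digest"}
        ok_chain = rec["prev"] == prev
        ok_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() == rec["digest"]
        if not (ok_chain and ok_hash):
            return False
        prev = rec["digest"]
    return True

append_entry("alice", "opened payroll.db")
append_entry("bob", "exported report Q1")
assert verify_log()
```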

119 CompSec Intro Nicolas T. Courtois, January 2009 119 Lipner 1982: Requirements, focus on integrity of the business processes and of the “production” data (whatever that means): 1.Users will not write their own programs. 2.Separation of Function: Programmers will develop on test systems and test data, not on “production” systems. 3.A special procedure will allow programs to be executed. 4.Compliance w.r.t. 3 will be controlled / audited. 5.Managers and auditors should have access to the system state and to all system logs.

120 CompSec Intro Nicolas T. Courtois, January 2009 120

121 CompSec Intro Nicolas T. Courtois, January 2009 121 Weakest Link Chain metaphor: Schneier: www.schneier.com/blog/archives/2005/12/weakest_link_se.html “security is only as strong as the weakest link.”

122 CompSec Intro Nicolas T. Courtois, January 2009 122 Cryptographic Failures

123 CompSec Intro Nicolas T. Courtois, January 2009 123 Chains vs. Layers

124 CompSec Intro Nicolas T. Courtois, January 2009 124 Two Cases Security can be like a chain: or, better Security can be layered
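
The two cases can be compared numerically. A minimal sketch, assuming three independent components with hypothetical per-component break probabilities:

```python
from math import prod

p_break = [0.01, 0.02, 0.30]   # three controls; the last one is the weak link

# Chain: the attacker wins by breaking ANY single link.
p_chain = 1 - prod(1 - p for p in p_break)      # ~0.32, dominated by the weakest link

# Layers (defence in depth): the attacker must get through EVERY layer.
p_layers = prod(p_break)                        # 0.00006, each layer multiplies the work

print(f"chain : {p_chain:.4f}")
print(f"layers: {p_layers:.6f}")
```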

125 CompSec Intro Nicolas T. Courtois, January 2009 125 Military: Defence in Depth

126 CompSec Intro Nicolas T. Courtois, January 2009 126 Layers Computer systems have multiple layers, e.g. –HW components –Chipset/MB –OS –TCP/IP stack –HTTP application –Secure http layer –Java script –User/smart card interface

127 CompSec Intro Nicolas T. Courtois, January 2009 127 Example 1: assuming 1000 little details …

128 CompSec Intro Nicolas T. Courtois, January 2009 128 Example 2: assuming 1000 little details …

129 CompSec Intro Nicolas T. Courtois, January 2009 129 **Weakest Link vs. Sum of Efforts

130 CompSec Intro Nicolas T. Courtois, January 2009 130 About the Software Development Process

131 CompSec Intro Nicolas T. Courtois, January 2009 131 *Probability Distributions

132 CompSec Intro Nicolas T. Courtois, January 2009 132 *Bell Curve – Repetitive Events [chart: probability vs. cost; Worst Case – Median Case (50%) – Best Case]

133 CompSec Intro Nicolas T. Courtois, January 2009 133 *Bell Curve – Repetitive Events Average Case? No such thing … => Average(Cost) = Cost Expectancy = ∫ = total surface below the curve ≠ Average(Time) [chart: probability vs. cost]

134 CompSec Intro Nicolas T. Courtois, January 2009 134 Bell Curve – Repetitive Events Here, 99% of the time, the Average Cost is Substantial [chart: probability vs. cost]

135 CompSec Intro Nicolas T. Courtois, January 2009 135 Rare Events Here, 99% of the time, the Average Cost is VERY low! [chart: probability vs. cost]

136 CompSec Intro Nicolas T. Courtois, January 2009 136 Rare Events [chart: probability vs. cost; Worst Case – Median Case (50%) – Best Case]

137 CompSec Intro Nicolas T. Courtois, January 2009 137 Back to Schneier Quote

138 CompSec Intro Nicolas T. Courtois, January 2009 138 Worst Case Defences? Criticism Cormac Herley [Microsoft Research]: Most security systems are built to defend against the worst case. In reality, the average-case losses are insignificant or small; e.g. computer crime worldwide is actually very small … and many security technologies are maybe -- from the economics point of view -- totally useless but it depends; we cannot judge security technologies by present losses, because there are also losses that have been avoided or deterred by the technology, and losses evolve over time with a highly chaotic pattern (they are 0, then suddenly they may explode)
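
The “average vs. worst case” point can be illustrated with a tiny Monte Carlo simulation; the two loss distributions below are purely hypothetical and chosen so that their averages are comparable while their worst cases are not:

```python
import random

random.seed(1)
N = 100_000

def repetitive_losses() -> float:
    # frequent, moderate incidents: some cost is incurred essentially every year
    return sum(random.gauss(1_000, 200) for _ in range(10))

def rare_losses() -> float:
    # 98% of years nothing happens; 2% of years the loss is huge
    return 500_000.0 if random.random() < 0.02 else 0.0

for name, draw in [("repetitive", repetitive_losses), ("rare", rare_losses)]:
    samples = sorted(draw() for _ in range(N))
    mean = sum(samples) / N
    p99 = samples[int(0.99 * N)]
    print(f"{name:10s} average = {mean:>10,.0f}   99th percentile = {p99:>10,.0f}")

# Similar averages, wildly different worst cases -- which is exactly why
# "defend against the worst case" is a debatable spending rule.
```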

139 CompSec Intro Nicolas T. Courtois, January 2009 139 Worst Case Defences? Conclusion There is no general answer. Both arguments are legitimate. More detailed analysis is needed: what if, etc... Can this be done? Maybe there will always be some purely political choices that are impossible either to justify or to undermine.

140 CompSec Intro Nicolas T. Courtois, January 2009 140 Security Technologies: Security as f(penetration rate)

141 CompSec Intro Nicolas T. Courtois, January 2009 141 Deployment / Penetration Rate Two sorts of technologies: A) Those that are effective if deployed at 20%: Examples: 1.virus detection (as opposed to removal / fighting the viruses), 99 % 2.email / hard disk encryption, 20 % 3.making the entry/authentication harder, as an option for the user, 20% B)Those that are totally ineffective even at 99%: Examples: 1.virus removal, 2.buggy anti-virus: “your anti-virus has just restarted due to an internal error”… 3.we click YES for 1 % of the security alerts out of fatigue … certificates are frequently invalid … it invalidates the 99 % of the time we did prevent the intrusion … we lost our time 4.if some ATMs still accept blank mag-stripe-only cards, the whole purpose of chips on bank cards is nearly defeated …

142 CompSec Intro Nicolas T. Courtois, January 2009 142 Secrecy vs. Transparency

143 CompSec Intro Nicolas T. Courtois, January 2009 143 Open Source vs. Closed Source and Security

144 CompSec Intro Nicolas T. Courtois, January 2009 144 Secrecy: Very frequently an obvious business decision. Creates entry barriers for competitors. But also defends against hackers.

145 CompSec Intro Nicolas T. Courtois, January 2009 145 Kerckhoffs ’ principle: [1883] “ The system must remain secure should it fall in enemy hands …”

146 CompSec Intro Nicolas T. Courtois, January 2009 146 Kerckhoffs ’ principle: [1883]

147 CompSec Intro Nicolas T. Courtois, January 2009 147 Yes (1,2,3,4): 1. Military: layer the defences.

148 CompSec Intro Nicolas T. Courtois, January 2009 148 Yes (2): 2) Basic economics: these 3 extra months (and not more) are simply worth a lot of money.

149 CompSec Intro Nicolas T. Courtois, January 2009 149 Yes (3): 3) Prevent the erosion of profitability / barriers to entry for competitors / “inimitability”

150 CompSec Intro Nicolas T. Courtois, January 2009 150 Yes (4): 4) Avoid Legal Risks Companies don't know where their code is coming from; they want to release the code and they can't because it's too risky! Re-use of code can COMPROMISE your own IP rights and create unknown ROYALTY obligations (!!!) Cloned/stolen code is more stable, more reliable, easier to understand!

151 CompSec Intro Nicolas T. Courtois, January 2009 151 What ’ s Wrong with Open Source?

152 CompSec Intro Nicolas T. Courtois, January 2009 152 Kerckhoffs principle: Rather WRONG in the world of smart cards … –Reasons: side channel attacks, PayTV card sharing attacks, etc. But could be right elsewhere for many reasons … –Example: the DES and AES ciphers, open designs, never really broken; the KeeLoq cipher, closed source, broken in minutes …

153 CompSec Intro Nicolas T. Courtois, January 2009 153 *Kerckhoffs principle vs. Public Key Crypto vs. Financial Cryptography In Public Key Cryptography one key CAN be made public. In practice this means that –some group of people has it –there is NO obligation to disclose it, to make it really public (and it is almost never done in serious financial applications) Again, full disclosure of public keys is unbelievably stupid and simply BAD security engineering and BAD security management. Examples: all ATMs in the country have 6 public keys, not really public though; in Bitcoin the public key can remain a secret for years, only a hash is revealed – this is BRILLIANT key management which makes Bitcoin MUCH more secure than it would otherwise be!
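
A much-simplified sketch of the “publish only a hash of the public key” idea (real Bitcoin addresses use SHA-256 followed by RIPEMD-160 plus checksums and encoding, all omitted here; the key bytes below are a random stand-in, not a real EC key):

```python
import hashlib
import secrets

public_key = secrets.token_bytes(33)               # stand-in for a real public key
address = hashlib.sha256(public_key).hexdigest()   # only this hash is published

# Much later, when the key is finally revealed (e.g. to spend the coins),
# anyone can check that it matches the published commitment:
assert hashlib.sha256(public_key).hexdigest() == address
```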

154 CompSec Intro Nicolas T. Courtois, January 2009 154 Which Model is Better? Open and closed security are more or less equivalent … more or less as secure: opening the system helps both the attackers and the defenders. Cf. Ross Anderson: Open and Closed Systems are Equivalent (that is, in an ideal world). In Perspectives on Free and Open Source Software, MIT Press 2005, pp. 127-142.

155 CompSec Intro Nicolas T. Courtois, January 2009 155 Open Source:

156 CompSec Intro Nicolas T. Courtois, January 2009 156 Social Critique of Open Source Software

157 CompSec Intro Nicolas T. Courtois, January 2009 157

158 CompSec Intro Nicolas T. Courtois, January 2009 158 Attack Tree [Schneier 1999] but what about unknown attacks? [tree diagram with nodes: Get Admin Access – Abuse Limited Account – Get Limited Access – Abuse System Privileges (1/10) – Privilege Escalation Exploit #36; levels labelled: sub-goals, refinement, details]

159 CompSec Intro Nicolas T. Courtois, January 2009 159 Expanded Example [tree diagram with nodes: Get Admin Access – Become Admin Directly – Obtain Legit. Admin Account – Create One’s Own Account – Obtain Sb’s Credentials – Create Account – login / password – in person / remotely – Abuse Limited Account – Get Limited Access – Abuse System Privileges – 1/100 – Privilege Escalation Exploit #36 – similar – bribe £1000 – shoulder surfing – intercept – keyboard sniffer – 1/10 – guess / crack – reset]

160 CompSec Intro Nicolas T. Courtois, January 2009 160 Unix Log In

161 CompSec Intro Nicolas T. Courtois, January 2009 161 Weakest Link Security like a chain: [diagram: Get Admin Access – Become Admin Directly – Abuse Limited Account – 1/100 – 1/10 – easier!]

162 CompSec Intro Nicolas T. Courtois, January 2009 162 Defense in Depth also appears in attack trees… [diagram: Crack Password – spec secrecy – cipher secrecy – password hardness – getting the SAM file]

163 CompSec Intro Nicolas T. Courtois, January 2009 163 Conclusion: In complex systems, the principles of weakest link and defence in depth will occur simultaneously.

164 CompSec Intro Nicolas T. Courtois, January 2009 164 Accessing Password Database

165 CompSec Intro Nicolas T. Courtois, January 2009 165 Stealing Data with Costs

166 CompSec Intro Nicolas T. Courtois, January 2009 166 Opening a Safe with Costs © Bruce Schneier

167 CompSec Intro Nicolas T. Courtois, January 2009 167 Cheapest Attack without Special Equipment © Bruce Schneier
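
The “cheapest attack” computation on a costed attack tree can be sketched mechanically: OR nodes take the minimum over their children, AND nodes take the sum. The node names and costs below are hypothetical, not Schneier's actual figures:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str = "LEAF"                  # "LEAF", "OR" or "AND"
    cost: float = 0.0                   # only meaningful for leaves
    children: list = field(default_factory=list)

def cheapest(node: Node) -> float:
    """OR: attacker picks the cheapest child; AND: attacker must do them all."""
    if node.kind == "LEAF":
        return node.cost
    costs = [cheapest(child) for child in node.children]
    return min(costs) if node.kind == "OR" else sum(costs)

open_safe = Node("open the safe", "OR", children=[
    Node("pick the lock", cost=30_000),
    Node("learn the combination", "OR", children=[
        Node("find it written down", cost=75_000),
        Node("get it from a person", "AND", children=[
            Node("bribe an employee", cost=20_000),
            Node("avoid being reported", cost=5_000),
        ]),
    ]),
    Node("cut the safe open", cost=10_000),
])

print("cheapest attack costs:", cheapest(open_safe))   # -> 10000.0
```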

168 CompSec Intro Nicolas T. Courtois, January 2009 168 Attack Tree for PGP © Bruce Schneier

169 CompSec Intro Nicolas T. Courtois, January 2009 169 Reading a message sent between 2 Windows machines with countermeasures © Bruce Schneier

