Computer Security Foundations and Principles Nicolas T. Courtois - University College London.


1 Computer Security Foundations and Principles Nicolas T. Courtois - University College London

2 CompSec Intro Nicolas T. Courtois, January 2009 2 Plan
What is CompSec?
– Computer Industry
– Security = ?
– Security Science 1.2.3.
Life Context and Threats
– Small / Big Threats
– Economics
– Ethics, Hackers
Principles - Defenses
– Security Design Principles, Security Engineering
– Security Management Principles
– Weakest Link?
– Secrecy / Open Source
– Case Studies - Attack Trees

3 CompSec Intro Nicolas T. Courtois, January 2009 3 Computers …

4 CompSec Intro Nicolas T. Courtois, January 2009 4 Computer Industry. We are dealing here with a particular industry: the computer industry, software and hardware. This industry is arguably VERY special, unlike any other industry that ever existed… But we can and will compare it to other industries:
– smart cards have their own O.S., with much more focus on security, since always
– mobile phones, similarly, developed very differently too…
– voting machines, door/car lock systems, etc…

5 CompSec Intro Nicolas T. Courtois, January 2009 5 Disclaimer: I don’t intend to ask you abstract “philosophy” questions at the exam, but rather more concrete technical questions: beware of vague statements and excessive generalizations… what do the principles / methodology / ideas actually do in practice… Be specific and factual.

6 CompSec Intro Nicolas T. Courtois, January 2009 6 Computer Industry and Security. Social-Econ Background / Science Background / Tech Background: “Industry Standards” such as Intel CPUs and chipsets, RAM and hard drives, the C language, UNIX / Windows, TCP/IP, HTTP, TLS, …

7 CompSec Intro Nicolas T. Courtois, January 2009 7 Computer Industry and Security. Social-Econ Background. Science Background: what can the technology “enablers” (computers) and “disablers” (cryptology, hardware security) achieve, and what can they not? How to define / classify security problems and find “good” solutions. “Industry Standards”.

8 CompSec Intro Nicolas T. Courtois, January 2009 8 Computer Industry and Security. Social-Econ Background: things exist for a reason. “Nice or unpleasant” facts of life:
– software/hardware economics: which industry dominates; free-market triumphs and disasters
– these stupid humans that cannot be bothered to obey the policy…
– these bureaucratic organisations that just cannot get their best interest(?) right
– nobody is buying/using the wonderful(?) technology; adoption barriers
– theory vs. practice
– crime, war, terrorism…
– laws / regulations, etc…
Science background. “Industry Standards”: insecure rubbish!

9 CompSec Intro Nicolas T. Courtois, January 2009 9 “ CompSec Cycle ” : Social-Econ Background: Science background: “Industry Standards”

10 CompSec Intro Nicolas T. Courtois, January 2009 10 At The End: everything ends up here rapid obsolescence

11 CompSec Intro Nicolas T. Courtois, January 2009 11 What Happens After Death?

12 CompSec Intro Nicolas T. Courtois, January 2009 12 What is Security?

13 CompSec Intro Nicolas T. Courtois, January 2009 13 Definition 1 [Courtois] Security: protect the value(s). What value(s) ? ALL !

14 CompSec Intro Nicolas T. Courtois, January 2009 14 Security: protect the value(s). What value(s) ? Money Life and the quality of life. Privacy Good technology vs. bad one Freedom, Justice, etc …

15 CompSec Intro Nicolas T. Courtois, January 2009 15 Security: Definition 2. asset holder

16 CompSec Intro Nicolas T. Courtois, January 2009 16 Security ≠ Safety. Difference: protection against intentional damage... Notion of an Attacker / Adversary. Notion of an Attack.

17 CompSec Intro Nicolas T. Courtois, January 2009 17 Attacker Attacker = Adversary = Threat Agent

18 CompSec Intro Nicolas T. Courtois, January 2009 18 **Dimensions of Security Physical vs. Logical Psychological / Human vs. Organizational / Business very different!

19 CompSec Intro Nicolas T. Courtois, January 2009 19 *Security on One Slide

20 CompSec Intro Nicolas T. Courtois, January 2009 20 Main Goals: Confidentiality Integrity Authenticity Availability Accountability

21 CompSec Intro Nicolas T. Courtois, January 2009 21 Is There a Need for More Security? or will I have a job tomorrow?

22 CompSec Intro Nicolas T. Courtois, January 2009 22 Hegel [1770-1831], German philosopher: history is the process of broadening freedom.

23 CompSec Intro Nicolas T. Courtois, January 2009 23 Why Security, why Cryptography? Freedom, new technologies ⇒ more possibilities ⇒ more security problems… “Theorem” [Courtois]: Freedom → ∞ ⇒ Security → 0.

24 CompSec Intro Nicolas T. Courtois, January 2009 24 Freedom ⇒ Security issues… Examples:
– invention of the train/metro ⇒ threat of pickpockets
– car traffic ⇒ a major cause of death (trains are much safer)
– invention of the internet ⇒ pirates and hackers.

25 CompSec Intro Nicolas T. Courtois, January 2009 25 Why Security, why Cryptography? Freedom of travel by car ⇒ a main cause of death (e.g. 1.2 M people die every year in car accidents!). Unlimited exchange of data over the internet ⇒ unlimited security problems.

26 CompSec Intro Nicolas T. Courtois, January 2009 26 *Prof. Shigeo Tsujii, “ The society should be designed to help people to enjoy the freedoms ”… “ broadened by IT ”. Enjoy benefits, limit/remove disadvantages.

27 CompSec Intro Nicolas T. Courtois, January 2009 27 “ Adding ” Security – Def. 3. The goal of cryptology and security [Courtois]: Add security to the information technologies (IT) that are by nature insecure.

28 CompSec Intro Nicolas T. Courtois, January 2009 28 Adding Security - Criticism

29 CompSec Intro Nicolas T. Courtois, January 2009 29 Critics [Schneier]. Why is security an afterthought? “Aftermarket security is actually a very inefficient way to spend our security dollars; it may compensate for insecure IT products, but doesn't help improve their security”. “Additionally, as long as IT security is a separate industry, there will be companies making money based on insecurity -- companies who will lose money if the internet becomes more secure.” Why is security not built-in? A “utopian vision”: security is difficult, expensive and takes time…

30 CompSec Intro Nicolas T. Courtois, January 2009 30 ***Today [Schneier]: “I am regularly asked what average Internet users can do to ensure their security. My first answer is usually, 'Nothing--you're screwed.'”

31 CompSec Intro Nicolas T. Courtois, January 2009 31 Security Science 1.2.3.

32 CompSec Intro Nicolas T. Courtois, January 2009 32 Claim [Courtois, Schneier]: Computer Security and real-world security are governed by the same laws !!!

33 CompSec Intro Nicolas T. Courtois, January 2009 33 Security? A 3-Point Formal Approach. What is security? The inability of the adversary to achieve their goal, specified by:
1. Security against what: the Adversarial Goal.
2. Against whom: the resources of the Adversary: money, human resources, computing power, memory, risk, expertise, etc.
3. Access to the system.

34 CompSec Intro Nicolas T. Courtois, January 2009 34 1. Attackers

35 CompSec Intro Nicolas T. Courtois, January 2009 35 Vocabulary Attacker / Adversary / Threat Agent

36 CompSec Intro Nicolas T. Courtois, January 2009 36 1. Adversarial Goals: enjoyment, fame, ego, role models; developing science and offensive technology; $$$ profits and other benefits.

37 CompSec Intro Nicolas T. Courtois, January 2009 37 2.a. Who Are the Attackers?
– bored teenagers, petty => organized criminals, rogue states, industrial espionage, disgruntled employees, …
– pure legitimate use
– inadvertent events, bugs, errors, ourselves (forgot the d… password!), our family / friends / co-workers

38 CompSec Intro Nicolas T. Courtois, January 2009 38 2. b. Their Means computers (MIPS) / other hardware (antennas, liquid Nitrogen, etc … ) knowledge / expertise risk/exposure capacity

39 CompSec Intro Nicolas T. Courtois, January 2009 39 3. Access

40 CompSec Intro Nicolas T. Courtois, January 2009 40 3. Access to Computers. Somewhat similar to cars:

41 CompSec Intro Nicolas T. Courtois, January 2009 41 3. Access to your car:
– miles away…
– when you drive on the road
– when parked on the street
– he enters your garage
– he had a key yesterday and made a copy…

42 CompSec Intro Nicolas T. Courtois, January 2009 42 3. Access to a Computer:
– remote location, not connected to the Internet
– remote location, somewhat connected…
– physical proximity…
– access to USB ports
– access (alone) for a few seconds…
– take it home and hack it…

43 CompSec Intro Nicolas T. Courtois, January 2009 43 *3. Access Internet +wireless communications made things worse …

44 CompSec Intro Nicolas T. Courtois, January 2009 44 “ Small ” Little Things

45 CompSec Intro Nicolas T. Courtois, January 2009 45 Insecurity

46 CompSec Intro Nicolas T. Courtois, January 2009 46 What are Important Security Issues? Think twice.

47 CompSec Intro Nicolas T. Courtois, January 2009 47 Mildest Forms of Insecurity: nuisance and irritation generated by security and updates; the necessity to think about such sordid things as security; losing some pennies on it… Slight problem: the long tail theory.

48 CompSec Intro Nicolas T. Courtois, January 2009 48 The Long Tail Theory: 1 bank loses 30 M$/Y; 300 M people lose 1 minute/day = 9000 M$/Y. Cormac Herley [Microsoft Research]: “no thanks” – the rational rejection of security advice… The customer is your boss’s boss’s boss! His time is very precious! 30 M$ / minute! [Chart: long-tail distribution, axes labelled frequency and cost.]
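A quick back-of-envelope check of the figures above, as a minimal sketch (the dollar value of an hour of user time is our assumption, picked near minimum wage so the total lands in the region of the slide's 9000 M$/Y; it is not a figure taken from Herley's paper):

```python
users = 300_000_000           # users following some piece of security advice
minutes_per_day = 1           # time each of them loses per day
days_per_year = 365
value_per_hour = 4.9          # assumed $ value of one hour of user time

wasted_hours = users * minutes_per_day * days_per_year / 60
aggregate_cost = wasted_hours * value_per_hour
print(f"{aggregate_cost / 1e6:,.0f} M$/year")   # roughly 8,900-9,000 M$/year,
# i.e. two orders of magnitude above the single bank's 30 M$/Y direct loss.
```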

49 CompSec Intro Nicolas T. Courtois, January 2009 49 Major Threats

50 CompSec Intro Nicolas T. Courtois, January 2009 50 Blockbuster Insecurity:
– Viruses/Worms (e.g. Melissa 1999)
– DoS attacks hurting big corporations / countries [Estonia]
– Financial/personal data files leaked/lost [lots in the US media]
– Computer break-ins [little publicity?]

51 CompSec Intro Nicolas T. Courtois, January 2009 51 Explosion of Known Vulnerabilities What about the unknown ones? http://www.cert.org/stats/

52 CompSec Intro Nicolas T. Courtois, January 2009 52 Is My PC Infected? Many PCs are

53 CompSec Intro Nicolas T. Courtois, January 2009 53 Why Do Things Happen? Bugs… or people who don't care.
– Programming developed with an absence of security. C/C++ is unsafe (Microsoft is currently blacklisting big chunks of standard C; this could have happened 10 years ago).
– Security/cryptography research developed with an obsession with security. The two never met.
– Mystified: security issues are probably always exaggerated and distorted, one way or another (downplayed OR exaggerated; Ross Anderson: “hypertrophy” of security).
– Economics/Business: many things just don't matter! Customers do not see => do not care about security. Usability: user burden frustrates them.

54 CompSec Intro Nicolas T. Courtois, January 2009 54 CompSec and Economics

55 CompSec Intro Nicolas T. Courtois, January 2009 55 Question “[…] Why do so many vulnerabilities exist in the first place?[…]” Cf. Ross Anderson, Tyler Moore et al: 1.“The Economics of Information Security” In Science, October 2006. 2.“Security Economics and the Internal Market”: public report for ENISA (European Network and Information Security Agency), March 2008.

56 CompSec Intro Nicolas T. Courtois, January 2009 56 Why Does Commercial Security Fail? Claim: the link between “money” and security is frequently broken today:
– Security is a public good; “private” incentives are weak.
– Worse than a “market for lemons”: not only can the customer not see the difference between good security and bad, frequently the manufacturer cannot either.
Too frequently security (and our safety) remains something that money cannot buy… (money seems to work for cars, and to fail for computers)

57 CompSec Intro Nicolas T. Courtois, January 2009 57 Security and Economics Security is about [sensible] security trade-offs. Closely related to economics: How to allocate resources efficiently.

58 CompSec Intro Nicolas T. Courtois, January 2009 58 The Very Nature of Security: Bruce Schneier “ Beyond Fear ” book [2003], p.1: Critical to any security decision is the notion of [security] trade-offs, meaning the costs in terms of money, convenience, comfort, freedoms, and so on – that inevitably attach themselves to any security system.

59 CompSec Intro Nicolas T. Courtois, January 2009 59 Failures

60 CompSec Intro Nicolas T. Courtois, January 2009 60 Types of Failures Failure in design Failure in implementation Failure in operation

61 CompSec Intro Nicolas T. Courtois, January 2009 61 If It Can Fail… It Will. Schneier: http://www.schneier.com/essay-005.html “History has taught us: never underestimate the amount of money, time, and effort someone will expend to thwart a security system.”

62 CompSec Intro Nicolas T. Courtois, January 2009 62 Ethics or should Karate classes be legal?

63 CompSec Intro Nicolas T. Courtois, January 2009 63 Key Question: Is actively researching serious security vulnerabilities socially desirable? - Of course yes! …or so every professional hacker and every academic code-breaker will tell you…

64 CompSec Intro Nicolas T. Courtois, January 2009 64 Bruce Schneier [14 May 2008]: Problem: A hacker who discovers one [attack] can sell it on the black market, blackmail the vendor with disclosure, or simply publish it without regard to the consequences. Q: […] is it ethical to research new vulnerabilities? A: Unequivocally, yes. [according to Schneier] Because: Vulnerability research is vital because it trains our next generation of computer security experts. http://www.schneier.com/blog/archives/2008/05/the_ethics_of_v.html

65 CompSec Intro Nicolas T. Courtois, January 2009 65 Our Answer: the question is open to debate and remains somewhat controversial. Maybe it depends… …on what?

66 CompSec Intro Nicolas T. Courtois, January 2009 66 Rescorla [2004] Research and disclose? Yes, if … …these vulnerabilities are likely to be rediscovered Cf. E. Rescorla. “Is finding security holes a good idea?” In 3rd Workshop on the Economics of Information Security (2004).

67 CompSec Intro Nicolas T. Courtois, January 2009 67 Benefits: Disclosure creates incentives for fixing these vulnerabilities.

68 CompSec Intro Nicolas T. Courtois, January 2009 68 Attackers and Hackers

69 CompSec Intro Nicolas T. Courtois, January 2009 69 Motivation
– Enjoyment, fame, ego, role models
– Developing science and offensive technology: university researchers (good/bad), security professionals (defenders), professional hackers, pen testers, etc…
– Profits and other benefits: crime business, promoting legal security business, political activism, terrorism, …

70 CompSec Intro Nicolas T. Courtois, January 2009 70 Hackers Are:
– bored teenagers, petty => organized criminals, rogue states, industrial espionage, disgruntled employees, …
– pure legitimate use
– inadvertent events, bugs, errors, ourselves (forgot the d… password!), our family / friends / co-workers

71 CompSec Intro Nicolas T. Courtois, January 2009 71 2. b. Their Means computers (MIPS) / other hardware (antennas, liquid Nitrogen, etc … ) knowledge / expertise risk/exposure capacity

72 CompSec Intro Nicolas T. Courtois, January 2009 72 Hacker Movement

73 CompSec Intro Nicolas T. Courtois, January 2009 73 Hacktivism: mostly about getting attention in the press through direct and spectacular action (hacking) against a government or corporate entity, in order to achieve a political effect. Usually still a technology claim: just improve the security! It is broken. Just hire better experts/cryptographers/programmers… Sometimes a political statement unrelated to technology: Russia does not agree with Estonia.

74 CompSec Intro Nicolas T. Courtois, January 2009 74 Chaos Computer Club started in Hamburg in 1981 as a German-speaking libertarian-anarchist conference and social movement claiming freedom at any price, including freedom to pirate anything …

75 CompSec Intro Nicolas T. Courtois, January 2009 75 Chaos Computer Club. Very strong relations with the German press mean that it is much harder to put hackers in jail (fear of bad press coverage). Don’t do the same things that some people do… they will get away with it, you will not. These people are very strong defenders of some rights, such as privacy and freedom of speech, and thus they achieve a sort of legitimacy in the public eye to indulge in activities that would otherwise be considered just criminal. In the same way, trade unions frequently get away with doing illegal things, but not always; many trade union leaders have been in prison once or twice. The risk is totally and absolutely non-zero. In the 80s the CCC showed the insecurity of the Bildschirmtext computer network and made it debit a bank in Hamburg 134K DM in favour of the Club. Then they organized a press conference. The banks were nice and did NOT sue them. But how can you predict they will be nice?

76 CompSec Intro Nicolas T. Courtois, January 2009 76 Recent Trend: rapid growth. The industrialization of hacking: division of labour, clear definition of roles forming a supply chain, professional management.

77 CompSec Intro Nicolas T. Courtois, January 2009 77 Principles of Security Engineering

78 CompSec Intro Nicolas T. Courtois, January 2009 78 Security Engineering. Definition [Ross Anderson]: building systems to remain dependable in the face of malice, error or mischance.

79 CompSec Intro Nicolas T. Courtois, January 2009 79 Magic Formulas… or “Security Mantras”: repeat after me: C.I.A. C.I.A. In fact we have no silver bullet. On the contrary: security is about trade-offs. Conflicting engineering criteria… Conflicting requirements… Overcoming human, technology and market failures. (insecure rubbish!)

80 CompSec Intro Nicolas T. Courtois, January 2009 80 Proportionality Principle. Maximize security??? Rather: maximize “utility” (the benefits) while limiting risk to an acceptable level within reasonable cost… all about economics…

81 CompSec Intro Nicolas T. Courtois, January 2009 81 Efficiency and Effectiveness Security measures must be: Efficient and effective…

82 CompSec Intro Nicolas T. Courtois, January 2009 82

83 CompSec Intro Nicolas T. Courtois, January 2009 83 Least Privilege [or Limitation] Principle Every “module” (such as a process, a user or a program) should be able to access only such information and resources that are necessary to its legitimate purpose.

84 CompSec Intro Nicolas T. Courtois, January 2009 84 Default Deny. There are two basic attitudes:
– Default permit: “Everything not explicitly forbidden is permitted.” Allows greater functionality by sacrificing security. Good if security threats are non-existent or negligible.
– Default deny: “Everything not explicitly permitted is forbidden.” Improves security; harder to get any functionality. Believed to be a good approach if you have lots of security threats.

85 CompSec Intro Nicolas T. Courtois, January 2009 85 Fail-safe Defaults: secure by default. Example: if we forget to specify access, deny it.
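A minimal sketch combining the two preceding slides, default deny and fail-safe defaults (the ACL layout, user names and resources are hypothetical):

```python
# Anything not explicitly granted is refused; a lookup error also means "deny".
ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob",   "payroll.db"): {"read", "write"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    try:
        return action in ACL.get((user, resource), set())   # missing entry -> empty set -> deny
    except Exception:
        return False                                         # fail safe: on any error, deny

assert is_allowed("bob", "payroll.db", "write")
assert not is_allowed("alice", "payroll.db", "write")        # not explicitly permitted -> forbidden
assert not is_allowed("eve", "payroll.db", "read")           # unknown user -> forbidden
```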

86 CompSec Intro Nicolas T. Courtois, January 2009 86 Economy of Mechanism: a protection mechanism should have a simple and small design – small and simple enough to be built in a rigorous way, and fully tested and analysed.

87 CompSec Intro Nicolas T. Courtois, January 2009 87 Separation of Privileges: split into pieces with limited privileges!
Implementation in software engineering: have the program fork into two processes. The main program drops privileges (e.g. dropping root under Unix); the smaller program keeps privileges in order to perform a certain task. The two halves then communicate via a socket pair.
Benefits: a successful attack against the larger program will gain only minimal access, even though the pair of programs performs privileged operations. A crash in a process run as nobody cannot be exploited to gain privileges.
Additional possibilities: obfuscate individual modules and/or make them tamper resistant through software, or burn them into a dedicated hardware module and burn the fuse that allows reading back the firmware.
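A minimal sketch of the fork-and-socket-pair pattern described above, written in Python for brevity (illustrative only: it must be started as root for the setgid/setuid calls to succeed, and the "nobody" uid/gid values and message contents are assumptions):

```python
import os
import socket

UNPRIV_UID, UNPRIV_GID = 65534, 65534          # assumed ids of the "nobody" user

parent_sock, child_sock = socket.socketpair()  # the only channel between the two halves

if os.fork() == 0:
    # Privileged child: keeps root and performs exactly one well-defined task on request.
    parent_sock.close()
    request = child_sock.recv(1024)            # e.g. "sign this blob", "read one shadow entry"
    child_sock.sendall(b"result-of-privileged-operation")
    os._exit(0)

# Unprivileged parent: drop privileges *before* doing any real work.
child_sock.close()
os.setgid(UNPRIV_GID)                          # group first, then user (order matters)
os.setuid(UNPRIV_UID)                          # irreversible drop to "nobody"
parent_sock.sendall(b"please perform the one privileged operation")
print(parent_sock.recv(1024))                  # a compromise of this half only gains "nobody"
```

An attacker who takes over the large unprivileged half can only send requests over the socket pair; the small privileged half stays simple enough to audit.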

88 CompSec Intro Nicolas T. Courtois, January 2009 88 Least Common Mechanism Mechanisms used to access resources should not be shared. Why? Not so obvious. If everybody depends on it, failure will have a higher impact. One user can do a DOS attack. Shared service [or resource such as CPU cache] can provide side channels. A mechanism serving all users must be designed to the satisfaction of every user, harder than satisfying more specialized requirements.

89 CompSec Intro Nicolas T. Courtois, January 2009 89 Be Friendly!

90 CompSec Intro Nicolas T. Courtois, January 2009 90 Saltzer and Schroeder 1975: Psychologically Acceptable – not enough anymore.

91 CompSec Intro Nicolas T. Courtois, January 2009 91 Usability + Consent Acceptable and accepted… or not: otherwise the mechanism will be bypassed (-consent) Nice User-friendly

92 CompSec Intro Nicolas T. Courtois, January 2009 92 Be Even More Friendly: don’t annoy people.
– Minimize the number of clicks
– Minimize the number of things to remember
– Make security easy to understand and self-explanatory
– Security should NOT impact users that obey the rules. Established defaults should be reasonable.
– People should not feel trapped.
It should be easy to:
– Restrict access
– Give access
– Personalize settings
– Etc…

93 CompSec Intro Nicolas T. Courtois, January 2009 93 More Design Principles

94 CompSec Intro Nicolas T. Courtois, January 2009 94 Think Ahead

95 CompSec Intro Nicolas T. Courtois, January 2009 95 Trust vs. Security

96 CompSec Intro Nicolas T. Courtois, January 2009 96 Trust. Following Ross Anderson and US Dept of Defence definitions: a Trusted system [paradoxical definition] is one that can break the security policy (in theory, a risk). A Trustworthy system is one that won’t fail us (0 risk). An employee who is selling secrets is trusted and NOT trustworthy at the same time.

97 CompSec Intro Nicolas T. Courtois, January 2009 97 Be Reluctant to Trust Cf. Least Privilege.

98 CompSec Intro Nicolas T. Courtois, January 2009 98 Be Trustworthy Public scrutiny promotes trust. Commitment to security is visible in the long run.

99 CompSec Intro Nicolas T. Courtois, January 2009 99 *Management Principles for Production and Development, or the Integrity of a Business Process… [Lipner 1982] These slides are also relevant to, and repeated in, part 02.

100 CompSec Intro Nicolas T. Courtois, January 2009 100 Lipner 1982. These can be seen as management principles: how to manage development and production, avoiding random failures and security breaches alike. Principle of Segregation of Duties = Separation of Duty.

101 CompSec Intro Nicolas T. Courtois, January 2009 101 Segregation of Duties. Achieved by (closely related):
Principle of Functional Separation: several people should cooperate. Examples: one developer should not work alone on a critical application; the tester should not be the same person as the developer; if two or more steps are required to perform a critical function, at least two different people should perform them, etc.
– This principle makes it very hard for one person to compromise security, on purpose or inadvertently.
Principle of Dual Control:
– Example 1: in the SWIFT banking data management system there are two security officers, a left security officer and a right security officer. Both must cooperate to allow certain operations.
– Example 2: nuclear device command.
– Example 3: cryptographic secret sharing (see the sketch below).
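A minimal sketch of Example 3, dual control through 2-of-2 secret splitting with XOR (illustrative only; a real system would use a vetted secret-sharing or HSM mechanism rather than hand-rolled code):

```python
import os

def split(secret: bytes):
    left = os.urandom(len(secret))                       # share held by the "left" officer
    right = bytes(a ^ b for a, b in zip(secret, left))   # share held by the "right" officer
    return left, right

def combine(left: bytes, right: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(left, right))

key = os.urandom(16)                  # key that authorizes the sensitive operation
left, right = split(key)
assert combine(left, right) == key    # both officers together recover the key
# Each share on its own is uniformly random, so neither officer alone learns anything.
```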

102 CompSec Intro Nicolas T. Courtois, January 2009 102 Auditing / Monitoring –Record what actions took place and who performed them –Contributes to both disaster recovery (business continuity) and accountability.

103 CompSec Intro Nicolas T. Courtois, January 2009 103 Lipner 1982: requirements focused on the integrity of the business processes and of the “production” data, whatever that means:
1. Users will not write their own programs.
2. Separation of Function: programmers will develop on test systems with test data, not on “production” systems.
3. A special procedure will be required for executing (installing) programs.
4. Compliance with point 3 will be controlled / audited.
5. Managers and auditors should have access to the system state and to all system logs.

104 CompSec Intro Nicolas T. Courtois, January 2009 104 The False Principle: The Weakest Link

105 CompSec Intro Nicolas T. Courtois, January 2009 105 Weakest Link Chain metaphor: Schneier: www.schneier.com/blog/archives/2005/12/weakest_link_se.html “security is only as strong as the weakest link.”

106 CompSec Intro Nicolas T. Courtois, January 2009 106 Chains vs. Layers

107 CompSec Intro Nicolas T. Courtois, January 2009 107 Two Cases Security can be like a chain: or, better Security can be layered

108 CompSec Intro Nicolas T. Courtois, January 2009 108 Military: Defence in Depth

109 CompSec Intro Nicolas T. Courtois, January 2009 109 Layers. Computer systems have multiple layers, e.g.:
– HW components
– Chipset/MB
– OS
– TCP/IP stack
– HTTP application
– Secure HTTP layer
– JavaScript
– User / smart card interface

110 CompSec Intro Nicolas T. Courtois, January 2009 110 Example 1: assuming 1000 little details …

111 CompSec Intro Nicolas T. Courtois, January 2009 111 Example 2: assuming 1000 little details …

112 CompSec Intro Nicolas T. Courtois, January 2009 112 Another False?? Or True?? Principle: Assume the Worst

113 CompSec Intro Nicolas T. Courtois, January 2009 113 Back to Schneier Quote

114 CompSec Intro Nicolas T. Courtois, January 2009 114 Worst Case Defences? Criticism. Cormac Herley [Microsoft Research]: most security systems are built to defend against the worst case. In reality, the average-case losses are insignificant or small; actual computer crime worldwide is very small… and many security technologies are maybe -- from the economics point of view -- totally useless. But it depends: we cannot judge security technologies by present losses, because there are also losses that have been avoided or deterred by the technology, and losses evolve over time with a highly chaotic pattern (they are 0, then suddenly they may explode).

115 CompSec Intro Nicolas T. Courtois, January 2009 115 Worst Case Defences? Conclusion: there is no general answer. Both arguments are legitimate. More detailed analysis is needed (what if…, etc.). Can this be done? Maybe there will always be some purely political choices that are impossible either to justify or to undermine.

116 CompSec Intro Nicolas T. Courtois, January 2009 116 Security Technologies: Security as f(penetration rate)

117 CompSec Intro Nicolas T. Courtois, January 2009 117 Deployment / Penetration Rate. Two sorts of technologies:
A) Those that are effective if deployed at 20%. Examples:
1. virus detection (as opposed to removal / fighting the viruses), 99%
2. email / hard disk encryption, 20%
3. making the entry/authentication harder, as an option for the user, 20%
B) Those that are totally ineffective even at 99%. Examples:
1. virus removal
2. buggy anti-virus: “your anti-virus has just restarted due to an internal error”…
3. we click YES for 1% of the security alerts out of fatigue… certificates are frequently invalid… it invalidates the 99% of the time we did prevent the intrusion… we lost our time
4. if some ATMs still accept blank mag-stripe-only cards, the whole purpose of chips on bank cards is nearly defeated…

118 CompSec Intro Nicolas T. Courtois, January 2009 118 Secrecy vs. Transparency

119 CompSec Intro Nicolas T. Courtois, January 2009 119 Open Source vs. Closed Source and Security

120 CompSec Intro Nicolas T. Courtois, January 2009 120 Secrecy: Very frequently an obvious business decision. Creates entry barriers for competitors. But also defends against hackers.

121 CompSec Intro Nicolas T. Courtois, January 2009 121 Kerckhoffs ’ principle: [1883] “ The system must remain secure should it fall in enemy hands …”

122 CompSec Intro Nicolas T. Courtois, January 2009 122 Kerckhoffs’ principle [1883]: most of the time it is incorrectly understood. Utopia: who can force companies to publish their specs??? There is no obligation to disclose. The principle demands security even when the design is disclosed; security may be better when it is not disclosed.

123 CompSec Intro Nicolas T. Courtois, January 2009 123 Yes (1,2,3,4): 1. Military: layer the defences.

124 CompSec Intro Nicolas T. Courtois, January 2009 124 Yes (2): 2) Basic economics: these 3 extra months (and not more) are simply worth a lot of money.

125 CompSec Intro Nicolas T. Courtois, January 2009 125 Yes (3): 3) Prevent the erosion of profitability / barriers to entry for competitors / “inimitability”.

126 CompSec Intro Nicolas T. Courtois, January 2009 126 Yes (4): 4) Avoid Legal Risks: companies don't know where their code is coming from; they want to release the code and they can't, because it's too risky! Re-use of code can COMPROMISE one's own IP rights and create unknown ROYALTY obligations (!!!). Clone/stolen code is more stable, more reliable, easier to understand!

127 CompSec Intro Nicolas T. Courtois, January 2009 127 What’s Wrong with Open Source?

128 CompSec Intro Nicolas T. Courtois, January 2009 128 Kerckhoffs’ principle: rather WRONG in the world of smart cards…
– Reasons: side-channel attacks, PayTV card-sharing attacks, etc.
But it could be right elsewhere, for many reasons…
– Example: the DES and AES ciphers, open-source, never really broken; the KeeLoq cipher, closed source, broken in minutes…

129 CompSec Intro Nicolas T. Courtois, January 2009 129 Which Model is Better? Open and closed security are more or less equivalent … more or less as secure: opening the system helps both the attackers and the defenders. Cf. Ross Anderson: Open and Closed Systems are Equivalent (that is, in an ideal world). In Perspectives on Free and Open Source Software, MIT Press 2005, pp. 127-142.

130 CompSec Intro Nicolas T. Courtois, January 2009 130 Open Source:

131 CompSec Intro Nicolas T. Courtois, January 2009 131 Social Critique of Open Source Software. Open source is just the second market failure, as bad as Microsoft, and a part of the same problem [according to Courtois]:
A. Why would young and skilled programmers work for FREE while doing work of absolutely huge social benefit?
B. Why are incompetent programmers at ill-intentioned corporations millionaires in dollars?
Economics: not sustainable. There will be fewer and fewer people in group A… Hopefully!
Opinion: working for free is plainly and totally wrong, and can be illegal. Don’t work for free [unless you have personal wealth to support it]. Get paid!

132 CompSec Intro Nicolas T. Courtois, January 2009 132 Attack Trees

133 CompSec Intro Nicolas T. Courtois, January 2009 133 Attack Tree [Schneier 1999] Formal analysis of all known attack avenues. A tree with OR nodes and AND nodes. Nodes can be labeled with probabilities or cost estimates. But what about unknown attacks? [Diagram: the root goal “Get Admin Access” is refined into sub-goals such as “Abuse Limited Account” (itself refined into “Get Limited Access” and “Abuse System Privileges”) and “Privilege Escalation Exploit #36” (labelled 1/10); deeper levels add further refinement and details.]
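A minimal sketch of how such a tree can be represented and evaluated (node names and costs are invented for illustration; at an OR node the attacker picks the cheapest sub-goal, at an AND node all sub-goals must be achieved, so costs add up):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    kind: str = "OR"               # "OR", "AND" or "LEAF"
    cost: float = 0.0              # only meaningful for leaves
    children: List["Node"] = field(default_factory=list)

def cheapest_attack(node: Node) -> float:
    if node.kind == "LEAF":
        return node.cost
    costs = [cheapest_attack(c) for c in node.children]
    return min(costs) if node.kind == "OR" else sum(costs)   # OR = weakest link, AND = layered defence

tree = Node("Get admin access", "OR", children=[
    Node("Privilege-escalation exploit", "LEAF", cost=5000),
    Node("Obtain a legitimate admin account", "AND", children=[
        Node("Get a login name", "LEAF", cost=100),
        Node("Get the password (guess / shoulder-surf / bribe)", "LEAF", cost=1000),
    ]),
])

print(cheapest_attack(tree))       # 1100: the cheap branch (the weakest link) dominates
```

The same recursion works for probabilities (maximum over OR children, product over AND children), which is how the weakest-link and defence-in-depth pictures on the following slides fall out of one structure.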

134 CompSec Intro Nicolas T. Courtois, January 2009 134 Expanded Example [Diagram: the “Get Admin Access” tree expanded. Branch labels include “Become Admin Directly”, “Obtain Legit. Admin Account”, “Create One’s Own Account”, “Obtain Sb’s Credentials” / “Create Account”, login and password obtained in person or remotely (bribe £1000, shoulder surfing, intercept, keyboard sniffer, guess / crack, reset, “similar”), and “Abuse Limited Account” (“Get Limited Access”, “Abuse System Privileges”) alongside “Privilege Escalation Exploit #36”; probability labels 1/100 and 1/10 appear on branches.]

135 CompSec Intro Nicolas T. Courtois, January 2009 135 Unix Log In

136 CompSec Intro Nicolas T. Courtois, January 2009 136 Weakest Link. Security like a chain: [Diagram: “Get Admin Access” either via “Become Admin Directly” (1/100) or via “Abuse Limited Account” (1/10); the latter is easier!]

137 CompSec Intro Nicolas T. Courtois, January 2009 137 Defense in Depth also appears in attack trees… [Diagram: to crack a password, several layers must all be overcome: spec secrecy, cipher secrecy, password hardness, getting the SAM file.]

138 CompSec Intro Nicolas T. Courtois, January 2009 138 Conclusion: In complex systems, the principles of weakest link and defence in depth will occur simultaneously.

139 CompSec Intro Nicolas T. Courtois, January 2009 139 Accessing Password Database

140 CompSec Intro Nicolas T. Courtois, January 2009 140 Stealing Data with Costs

141 CompSec Intro Nicolas T. Courtois, January 2009 141 Opening a Safe with Costs © Bruce Schneier

142 CompSec Intro Nicolas T. Courtois, January 2009 142 Cheapest Attack without Special Equipment © Bruce Schneier

143 CompSec Intro Nicolas T. Courtois, January 2009 143 Attack Tree for PGP © Bruce Schneier

144 CompSec Intro Nicolas T. Courtois, January 2009 144 Reading a message sent between 2 Windows machines with countermeasures © Bruce Schneier

