
Slide 1: CS194-3/CS16x Introduction to Systems
Lecture 4: Security intro, threat models, security goals, access control
September 10, 2007
Prof. Anthony D. Joseph

Slide 2: Goals for Today
Security Introduction
How do we assess threats to a system?
How do we create a threat model?
What is access control and what is its role?
Note: Some slides and/or pictures in the following are adapted from slides ©2005 Silberschatz, Galvin, and Gagne. Slides courtesy of Kubiatowicz, AJ Shankar, George Necula, Alex Aiken, Eric Brewer, Ras Bodik, Ion Stoica, Doug Tygar, and David Wagner.

Slide 3: Why is Security such a problem?
Monoculture computing environment
Web, e-commerce, and distributed/collaborative applications
Internet spans national boundaries
Poor programming practices

Slide 4: Two Security Nightmares
"The Transparent Society" by David Brin
– Will Technology Force Us to Choose Between Privacy and Freedom?
"Electronic Pearl Harbor"
– Is this just scare-mongering?
– Slammer worm took down Bank of America's ATM network, Seattle 911 service, the safety monitoring system at Ohio's Davis-Besse nuclear power plant, …
– Nachi worm invaded Diebold ATMs?
– Real worries about e-voting validity
– Millions of credit card and Social Security numbers leaked

Slide 5: What is Computer Security Today?
Computing in the presence of an adversary!
– An adversary is the security field's defining characteristic
Reliability, robustness, and fault tolerance
– Dealing with Mother Nature (random failures)
Security
– Dealing with actions of a knowledgeable attacker dedicated to causing harm
– Surviving malice, and not just mischance
Wherever there is an adversary, there is a computer security problem!

Slide 6: Adversaries
Adversaries are everywhere!
Code Red worm infected 250K machines in a week
– Contained a time-bomb set to attack the White House web server on a specific date
– Fortunately, the attack on the WH was diverted
– Estimates: $2 billion in lost productivity and infected-machine cleanup
In 2003, viruses cost businesses over $50B
Today, crackers build zombie networks of 10K-1M compromised machines and sell services (hugely profitable!)
Massive spamming, ID fraud through phishing

Slide 7: Computer Security History
Early history interwoven with military apps
– First big users of computers
– First to worry seriously about the potential for misuse
Terminology has military connotations:
– Attackers who are trying to attack computer systems
– Defenders working to protect their systems from these threats

Slide 8: Analyze to Learn!
We're going to spend a lot of time studying attackers and thinking about how to break into systems
– Why spread knowledge that will help bad guys be more effective?
To protect a system, you have to learn how it can be attacked
– Civil engineers learn what makes bridges fall down so they can build bridges that last
– Software engineering is similar
Security is the same and different!
– Why?

Slide 9: Challenges in Securing Systems
Similar:
– Analyze previous successful attacks
– But, deploy a new defense, they respond, you build a better defense, they respond, you…
– Need to find ways to anticipate kinds of attacks
Different:
– Attackers are intelligent (or some of them are)
– Attacks will change and get better with time
– Have to anticipate future attacks
Security is like a game of chess
– Except the attackers often get the last move!

Slide 10: Reality: Static Systems
A deployed system is very hard to change
– Serious consequences if attackers find a security hole in a widely deployed system
Goal: Predict in advance what attackers might do and eliminate all security holes
Reality: Have to think like an attacker
Thinking like an attacker is not always easy
– Can be fun to try to outwit the system
– Or can be disconcerting to think about what could go wrong and who could get hurt
What if you don't anticipate attacks?
– Analog cellular phones in the 80's and 90's

Slide 11: Real-World Example: Analog Cellular
1970's: analog cellular had no security
– Phones transmit ID/billing info in the clear
– Assumption: attackers wouldn't bother to assemble equipment to intercept info…
Attackers built "black boxes" to intercept and clone phones for fraudulent calling
– Where's the best place to intercept?
– Cellular operators completely unprepared
Early 90's, US carriers losing >$1B/yr
– 70% of long-distance cellular calls placed from downtown Oakland on Friday nights were fraudulent
Problems: huge capital investment/debt, 5–10 yrs and huge replacement cost

Slide 12: Lesson Learned
Failing to anticipate types of attacks, or underestimating the threat, can be costly
Security design requires studying attacks
– Security experts spend a lot of time trying to come up with new attacks
– Sounds counter-productive (why help the attackers?), but it is better to learn about vulnerabilities before the system is deployed than after
If you know about the possible attacks in advance, you can design a system to resist those attacks
– But, anything else is a toss of the dice…

Slide 13: A Process for Security Evaluation
How do we think about the ways an adversary might penetrate system security or otherwise cause mischief?
We need a framework to help you think through these issues
Start with security goals, or in other words:
– What properties do we want the system to have, even when it is under attack?
– What are we trying to protect from the attacker?
– Or, to look at it the other way around, what are we trying to prevent?

Slide 14: Some Common Security Goals
Confidentiality:
– Private information that we want to keep secret from an adversary (password, bank acct balance, diary entry, …)
– Anything we want to prevent the adversary from learning
Integrity:
– Want to prevent the adversary from tampering with or modifying information
Availability:
– System should be operational when needed
– Must prevent the adversary from taking the system out of service at inconvenient times

Slide 15: Example: BearFacts Grades Database
One obvious goal is protecting its integrity
– Don't want you to be able to give yourself an A+ merely by tampering with the grade database
Federal law and university rules require us to protect its confidentiality
– No one else can learn what grade you are getting
We probably also want some level of availability
– So you can check your grades to date and we can calculate grades at the end of the semester

Slide 16: Security Goals
Can be simple or detailed
– Exercise in requirements analysis
– Specification of what it means for a system to be secure
– Want goals to be met even when the adversary tries to violate them
Which goals are security goals?
– Highly application-dependent
– If someone figures out how to violate this goal, would it be a security breach?
» If yes, you've found a security goal!

Slide 17: Security Goals Summary
"A program that has not been specified cannot be incorrect; it can only be surprising." [Young, Boebert, and Kain]
Or: a system without security goals has not been specified, and cannot be wrong; it can only be surprising

Slide 18: Administrivia
Sections
– Contact Kai with your schedule
Projects
– Time to form groups of 4-5

Slide 19: Threat Assessment
Some questions:
– What kind of threats might we face?
– What kind of capabilities might we expect adversaries to have?
– What are the limits on what the adversary might be able to do to us?
The result is a threat model, a characterization of the threats the system must deal with
– Think: Who? What? Why?

Slide 20: Developing a Threat Model
First decide how much we can predict about the kinds of adversaries we will face
– Sometimes, know very well who the adversary is, and even their capabilities, motivations, and limitations
» Cold War: US military oriented towards main enemy (Soviets) and focused on understanding USSR's military capabilities/effectiveness/responsiveness
If we know the potential adversary, we can craft a threat model that reflects the adversary's abilities and options and nothing more
– However, often the adversary is unknown

Slide 21: Thinking Generically
Must reason more generically about unavoidable limitations of the adversary
– Silly ex: physics means the adversary can't exceed the speed of light
Can usually look at the system design and identify what an adversary might do
– Ex: If the system never sends secret info over wireless nets, then we don't need to worry about the threat of wireless eavesdropping
– Ex: If the system design means people might discuss secrets by phone, then the threat model needs to include possible phone co. insider threats: eavesdrop/re-route/impersonate

Slide 22: What to Ignore?
A good threat model also specifies threats we don't care to defend against
– Ex: home security – I don't worry about a team of burglars flying a helicopter over my house and rappelling down my chimney
– Why not? Many easier ways to break into my house…
Can classify adversaries by their motivation
– Ex: a financial gain motivation means they won't spend more money on an attack than they'll gain
– Burglar won't spend $1,000's to steal a car radio
Motives are as varied as human nature
– Have to prepare for all eventualities…

Slide 23: Helpful to Examine Incentives
Ex: Do fast food places make more profit on soft drinks than on food?
– Would expect some places to try to boost drink sales (e.g., salting french fries heavily)
Ex: Do customer service reps earn bonuses for handling more than X calls per hour?
– Would expect some reps to cut long calls short, or to transfer troublesome customers to other departments when possible
Ex: Do spammers make money from those who respond, while losing nothing from those who don't?
– Would expect spammers to send their e-mails as widely as possible, no matter how unpopular it makes them

Slide 24: Incentives
As a rule of thumb, organizations tend not to act against their own self-interest (at least not often…)
Incentives (frequently) influence behavior
– Exposes the motivations of potential adversaries
Incentives are particularly relevant when two parties have opposing interests
– When incentives clash, conflict often follows
– In this case it is worth looking deeply at the potential for attacks by one such party against the other

Slide 25: Threat Assessment Summary
Remember the three W's:
– Who are the adversaries we might face?
– How might they try to attack us, and what are their capabilities?
– Why might they be motivated to attack us, and what are their incentives?
Given security goals and a threat model, the last step is performing a security analysis

Slide 26: Security Analysis
Seeing whether there are attacks (within the threat model) that successfully violate the security goals
– Often highly technical and dependent on system details
– We'll show you many security analysis methods
One analogy:
– The threat model defines the set of moves an adversary is allowed to make
– The system design defines how the defender plays the game
– The security goals define the success condition: if the adversary violates any goal, he wins; otherwise, the defender wins
– Security analysis is examining all moves and counter-moves to see who has a winning strategy

Slide 27: Another Analogy
Mystery writers like to talk about means, motive, and opportunity
– Security evaluation is a similar way of thinking
Threat assessment examines the means and motive
Security analysis examines what opportunity the adversary might have to do harm

Slide 28: Security Evaluation Summary
Identify the security goals
– What are we trying to protect?
Perform a threat assessment
– What threats does the system need to protect against?
Do a security analysis
– Can we envision any feasible attack that would violate the security goals?
– May be very technical
Use the same process for new system design
– Easier to ensure security when you know the security goals and threats
– Security analysis helps refine the system design

Slide 29: Home Security Analysis Example
What are my security goals?
– Protecting assets from theft or tampering (integrity)
– Protecting my personal safety
» Ex: if someone does break in to steal money, I'd much prefer to know, so that I don't surprise them and get shot
– Ensuring my house and contents remain in full working order whenever I want them (availability)
– Providing a certain measure of privacy (confidentiality)
– …

Slide 30: Threat Assessment: Motivation
Who am I trying to protect against, and what's their motivation?
– Burglar motivated by financial gain
– Peeping Tom motivated by curiosity
– Grudge holder motivated by revenge
– …

Slide 31: Threat Assessment: Capabilities/Threats
What are their capabilities (tools, skills, knowledge, access, etc.)? What threats might I face?
– Burglar has lockpicks and know-how, a crowbar, or can cut my phone lines
– Repairman might have unaccompanied access
– Peepers might have binoculars or a telescope
– One neighbor might have line-of-sight to my living room window, while another might be blocked by trees
Threats to ignore?
– Navy ships here for Fleet Day aren't likely to start shelling my house, …

Slide 32: Security Analysis: Possible Attacks
All sorts of crazy scenarios!
A burglar breaks a window, grabs stuff, leaves
I secure the windows, but a determined burglar takes a chainsaw to the walls and breaks in
A slightly smarter burglar might look under the flowerpot and find the spare house key, …
A sneaky burglar throws a pebble against a window at 3am each morning, setting off the alarm and bringing the police, each morning until the police decide to ignore the obviously unreliable burglar alarm; then the burglar is free to break in…

Slide 33: More Possible Attacks
A neighbor with line-of-sight to my house uses a telescope to peer in the window
Someone intent on revenge might leave unpleasant items on my lawn, or – depending on my home's security system – might be able to smash my windows
An unscrupulous competitor knows I have an important early morning business meeting, so they cut my house's power at night to make my alarm clock fail

Slide 34: Security Example Summary
This was a trivial example, mostly because you're already familiar with home security
However, it helps to have a framework when dealing with a complex computer system
– Structures the security evaluation
– Defines the process

Slide 35: BREAK

Slide 36: Role of Access Control
Before closing "back doors" we need to close "front doors"
Access control determines access to files and processes in the OS
We will return to these themes throughout the course

Slide 37: Classic Models of Security
Computer security has its origin in military models of security
Different levels of secrecy
– e.g., FOUO/SSI/Secret/Top Secret
Compartmentalized security
– e.g., CNWDI (Critical Nuclear Weapons Design Information), COMSEC (Communications Security), …
– TS/SCI (Top Secret/Sensitive Compartmented Information)
– CRYPTO

Slide 38: Corresponding Access Control
Classic model → Mandatory Access Control (MAC)
» We also use the abbreviation MAC for "message authentication code"
» Prevents an authenticated user at a specific classification or trust level from accessing information at a different level
» Characteristics: non-bypassable, tamper-proof; evaluatable (to determine the usefulness and effectiveness of a rule); always-invoked (precludes bypassing)
User-controlled security → Discretionary Access Control (DAC)
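As a rough illustration of the MAC idea on this slide, here is a minimal sketch assuming a simple linear ordering of the classification levels from the previous slide and Bell-LaPadula-style "no read up / no write down" rules; real MAC policies and label structures vary, so treat the level names and rules as assumptions.

```python
# Minimal sketch of a mandatory access control (MAC) check.
# Levels come from the earlier slide; the linear ordering and the
# "no read up / no write down" rules are illustrative assumptions.
LEVELS = {"FOUO": 0, "SSI": 1, "Secret": 2, "Top Secret": 3}

def mac_permits(subject_level: str, object_level: str, access: str) -> bool:
    s, o = LEVELS[subject_level], LEVELS[object_level]
    if access == "read":    # no read up: clearance must dominate the data's level
        return s >= o
    if access == "write":   # no write down: can't leak data to a lower level
        return s <= o
    return False

# A Secret-cleared user may read Secret data but not Top Secret data
assert mac_permits("Secret", "Secret", "read")
assert not mac_permits("Secret", "Top Secret", "read")
```

The key point, matching the slide, is that the user does not get to change this policy: the check is applied system-wide and cannot be waived by the data's owner, unlike DAC.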

Slide 39: Subjects & Objects
Subjects do things
– Ex: users, processes, …
Objects have things done to them
– Ex: files, processes, …
Access types are the things that are done
– Ex: read, write, append, list, detect, remove, execute, …

Slide 40: Read and Write are Different
Access types can be distinguished by whether they pass information
Generally, "write" passes information
Generally, "read" does not pass information

Slide 41: Access Control Matrix
Example matrix: subjects (rows) are Alice, Bob, and Charlie; objects (columns) are File 1, File 2, and File 3; each cell lists the permitted access types:
– Alice: read, read/write
– Bob: execute
– Charlie: read
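A minimal sketch of how such a matrix might be held as a data structure follows; the subject and file names come from the slide, but the exact assignment of rights to files is an illustrative assumption.

```python
# Minimal sketch of an access control matrix as a sparse nested dict.
# Missing entries correspond to the matrix's blank cells ("no access").
matrix = {
    "Alice":   {"File 1": {"read"}, "File 2": {"read", "write"}},
    "Bob":     {"File 3": {"execute"}},
    "Charlie": {"File 1": {"read"}},
}

def allowed(subject: str, obj: str, right: str) -> bool:
    return right in matrix.get(subject, {}).get(obj, set())

print(allowed("Alice", "File 2", "write"))   # True
print(allowed("Bob", "File 1", "read"))      # False (blank cell)
```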

Slide 42: Problems with an Access Control Matrix
Sparse matrix – many blank entries
Hard to manage
Who can manage different entries?
What if we need to give "temporary rights"?
Common entries?

Slide 43: Sparse Matrix Representations
Access Control Lists (ACLs)
– Objects list subjects and access types
– Ex: This file can be modified by Alice and read by Charlie
Capabilities
– Subjects have particular "permissions"
– Ex: Bob is allowed to modify files
Hybrid models also exist
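The contrast between the two representations can be made concrete with a short sketch: the same rights are stored either with the object (ACL) or with the subject (capabilities). The specific rights below are illustrative assumptions, not the slide's exact example.

```python
# Minimal sketch: two sparse representations of the same access matrix.

# ACL: each object lists the subjects and their access types.
acls = {
    "File 1": {"Alice": {"read"}, "Charlie": {"read"}},
    "File 2": {"Alice": {"read", "write"}},
}

# Capabilities: each subject carries its own list of objects and rights.
capabilities = {
    "Alice":   {"File 1": {"read"}, "File 2": {"read", "write"}},
    "Charlie": {"File 1": {"read"}},
}

def acl_check(subject, obj, right):
    return right in acls.get(obj, {}).get(subject, set())

def cap_check(subject, obj, right):
    return right in capabilities.get(subject, {}).get(obj, set())

# Both answer the same question; they differ in where the rights live.
assert acl_check("Alice", "File 2", "write") == cap_check("Alice", "File 2", "write")
```

This is why the next slide can say the two are equivalent in representative power while differing in philosophy: revoking a right means editing the object's list in one scheme and tracking down the subject's capability in the other.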

Slide 44: Are ACLs & Capabilities Equivalent?
In representative power, yes
– Both are sparse-matrix representations of the access matrix
In philosophy, no
– Often come with particular features and OS philosophy
– Capabilities often appeal to researchers
– But capability systems often work poorly
– Perennial claim: capability lists are coming back!

Slide 45: Where is an ACL Applied?
In some systems: on the file
In some systems: on the directory
In some systems: a combination

Slide 46: Who Determines Identity?
In (non-distributed) multi-user systems, usually the OS
» Login
In distributed systems:
– Sometimes a central authority (trusted third party, e.g., Kerberos)
» Single login
– Sometimes knowledge of a password (e.g., ssh or "guest" file sharing in Windows)
» Remote login

Slide 47: Who is Allowed to Modify an ACL?
In some systems, the "owner" of the file/process/directory
Example: chmod command in UNIX
– World access: read/write/execute
» For directories: read = list items, execute = "enter" the directory
– Owner access: read/write/execute

Slide 48: Fine-Grained Control
But we need options other than "world access" or "owner-only access"
General ACLs allow arbitrary access, but are hard to manage
Solution: groups

Slide 49: Groups
A group is a single ID such as:
– "Berkeley-undergrads"
– "friends of Alice"
– "administrative access"
A group administrator maintains the group membership list
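A minimal sketch of how a group ID can stand in for many subjects in an ACL check follows; the group names, file name, and membership lists are illustrative assumptions.

```python
# Minimal sketch of group-based access control: an ACL entry may name a
# group, and a separate membership list (kept by the group admin) expands it.
groups = {
    "friends-of-alice": {"Bob", "Charlie"},
    "berkeley-undergrads": {"Dana"},
}

acl = {"photo.jpg": {"Alice": {"read", "write"}, "friends-of-alice": {"read"}}}

def check(subject, obj, right):
    entries = acl.get(obj, {})
    if right in entries.get(subject, set()):          # direct ACL entry
        return True
    return any(right in rights                        # entry via group membership
               for group, rights in entries.items()
               if subject in groups.get(group, set()))

print(check("Bob", "photo.jpg", "read"))    # True, via friends-of-alice
print(check("Dana", "photo.jpg", "read"))   # False
```

The payoff is manageability: granting the whole group access takes one ACL entry, and membership changes are handled in one place by the group administrator.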

Slide 50: More on UNIX chmod
World: read/write/execute
Group: read/write/execute
Owner: read/write/execute
Can change the owner using the chown command
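The owner/group/world permission bits can be set programmatically as well as from the shell. Below is a minimal sketch using Python's standard os and stat modules; the file name is a placeholder.

```python
# Minimal sketch of setting owner/group/world bits on a Unix-like system.
import os
import stat

path = "example.txt"            # placeholder path (assumption)
open(path, "a").close()         # ensure the file exists for the demo

# Owner: read/write; group: read; world (others): read  -> octal 644
mode = stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH
os.chmod(path, mode)            # same effect as: chmod 644 example.txt

# Changing the owner (the chown command) corresponds to
# os.chown(path, uid, gid), which requires appropriate privileges.
```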

Slide 51: Temporary Access
This is an area where capability systems excel
– "Transferring a capability"
– Sometimes like giving a reference
ACL systems need a special mechanism
– UNIX: "setuid" bit
– Windows NT/XP: "Run as"

Slide 52: Procedure-Oriented Access Control
Run a program to determine access
Ex: Web server access
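To make the idea concrete: instead of looking up a static matrix entry, a procedure computes the decision for each request. The sketch below is illustrative only; the request fields and the policy are assumptions, loosely modeled on what a web server's access hook might see.

```python
# Minimal sketch of procedure-oriented access control: a program, not a
# table lookup, decides each request.
def access_decision(request: dict) -> bool:
    # Public area: anyone may read.
    if request["path"].startswith("/public/") and request["method"] == "GET":
        return True
    # Admin area: only authenticated admins, and only from the local network.
    if request["path"].startswith("/admin/"):
        return (request.get("user_role") == "admin"
                and request["client_ip"].startswith("10."))
    return False

print(access_decision({"path": "/public/index.html", "method": "GET",
                       "client_ip": "203.0.113.5"}))   # True
print(access_decision({"path": "/admin/panel", "method": "GET",
                       "client_ip": "203.0.113.5"}))   # False
```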

Slide 53: Monotonic vs. Non-Monotonic
Classic access control was monotonic
As we acquire more capabilities, or identities, we get more powerful
– "root", "super-user", "Administrator"
But this often causes problems
– What if the "root" password is discovered?

Slide 54: Non-Monotonic Access Control
With non-monotonic access control, as we gain identities or capabilities, we may lose access
Ex: Windows file sharing (administrators have crippled access)

Slide 55: Distributed Access Control
Distributed access control is an active research area
Ex: who can access an encrypted satellite broadcast?
– Users join and leave all the time
– Millions or tens of millions of users
Solution: "distributed key distribution"
– Must be efficient…

Slide 56: Compatibility of Access Control
ACLs predominate, but each system implements them in its own way
Systems must "translate" access control
– SAMBA supports Windows and Unix-like systems
A continual source of serious errors

Slide 57: Autonomous Access Control
Each system manages its own access control
Requires remote login
Problem: people often access hundreds or thousands of systems, and necessarily reuse login info (passwords)
– The common password problem
We will revisit these issues in the course

Slide 58: Summary
Access control is central to security
– We'll return to access control repeatedly in the course
– Old area of security, but not well understood
– Often poorly implemented
– And we haven't even begun to look at "backdoors"!
A security analysis framework for a complex computer system
– Structures the security evaluation
– Defines the process

Slide 59: Computer Security Today
Computing in the presence of an adversary!
– An adversary is the security field's defining characteristic
– They're everywhere: a fact of life, plan for them!
– Security is dealing with/surviving the actions of a knowledgeable attacker
Systems become static after deployment
Analyze the past to prepare for the future
– Determine security goals, try to anticipate future attacks
– We need a process to drive the analysis…
Wherever there is an adversary, there is a computer security problem!

Slide 60: Background: Terminology
An attack is an attempt to breach system security
– Not all attacks are successful
A threat is a circumstance or scenario with the potential to cause harm to a system
– An attack usually refers to a specific stratagem, whereas a threat refers to a broader class of ways that things could go wrong
A vulnerability is an aspect of the system that permits someone to mount a successful attack; sometimes called a security hole
– A security weakness is like a vulnerability, only it is less clear whether it could actually lead to any direct violation of the security goals
– A weakness might represent a potential vulnerability whose risk is unclear; or, several weaknesses might combine to yield a full-fledged vulnerability

Slide 61: Background: Terminology (cont'd)
A security goal is a goal that is supposed to be achieved by the system; if it fails, the system will be considered insecure
A threat assessment is an attempt to assess the set of all possible threats
A threat model is a characterization of the possible threats, usually produced during a threat assessment
"24 by 7" refers to the window of time in which systems are most vulnerable to attack
– (Ok, this one is a joke from …)

