
1 OS II: Dependability & Trust: Threat Modeling & Security Metrics
Dependable Embedded Systems & SW Group, www.deeds.informatik.tu-darmstadt.de
Prof. Neeraj Suri, Abdelmajid Khelil, Daniel Germanus
Dept. of Computer Science, TU Darmstadt, Germany

2 Terminology
- Threat: the adversary's goals.
- Threat Profile: the collection of all threats against a system.
- Threat Model: a document that provides background information on a system, its threat profile, and an analysis of the current system against that threat profile. Threat modeling results in a living threat model.
- Vulnerability: a security flaw in the system.
- Risk: a characterization of the danger posed by a vulnerability or condition.
- Security Weakness: an insufficient mitigation of a threat (usually resulting in a vulnerability).
- Asset: an abstract/concrete resource that a system must protect from misuse by an adversary.

3 Motivation
- A threat model is a master plan for securing software systems
- Assess applications and technologies with respect to their attackability
- Adopt the attacker's way of thinking
- Minimize the impact of a successful attack
- Prioritize the development of fixes for discovered weaknesses

4 Outline of Today's Lecture
- Discuss the fundamental components of a threat model
- Example of security metrics: the attack surface measure
© DEEDS Group SWFT WS '07

5 Threat Model Components
D. Germanus, A. Johansson, N. Suri, "Threat Modeling and Dynamic Profiling", book chapter in Annals of Emerging Research in Information Assurance, Security and Privacy Services, Elsevier Press, 2008.

6 Roles and Phases
Threat modeling involves different roles:
- Analysts
- System architects
- Software engineers
- Security engineers
- Software testers
It is performed in three phases:
- Inception
- Object identification
- Reaction
© DEEDS Group SWFT WS '07-08

7 The Three Phases of Threat Modeling
DFD: Data Flow Diagram
Threat effects: STRIDE
- Spoofing
- Tampering
- Repudiation
- Information disclosure
- Denial of service
- Elevation of privilege
Ranking of risk: DREAD
- Damage potential
- Reproducibility
- Exploitability
- Affected users
- Discoverability
S/O: Subject/Object
© DEEDS Group SWFT WS '07-08

8 Threat Modeling: Inception Phase
- Sketch data flow diagrams (DFDs)
- Gain an understanding of the system's composition & interactions
- Find system entry points
  - How to interact: UI, resources (local, remote), 3rd-party SW
- Determine assets
  - Locality can be derived from the DFDs
- Output: DFDs, entry points, assets
© DEEDS Group SWFT WS '07-08

9 Threat Modeling: Object Identification Phase
- Evaluate which user actions are allowed and the objects involved
- Different methods exist (use scenarios, feature scenarios, etc.); we will focus on the Subject/Object (S/O) Matrix
  - Logical, systematic
- Output:
  - External dependencies
  - Unresolved questions
  - Deployment constraints
  - Possible vulnerabilities
© DEEDS Group SWFT WS '07-08

10 Object Identification Phase: Subjects & Objects
- Identify the system's subjects and objects
- Subjects:
  - Active entities that carry out operations on objects
  - Processes, users
  - Use DFDs as a basis
- Objects:
  - Subjects are also objects: processes, users, ...
  - Data stores
  - Data flows
© DEEDS Group SWFT WS '07-08

11 Object Identification Phase: S/O Matrix
- Subject/Object Matrix generation:
  - Subjects are represented as rows, objects as columns
  - Assign each subject-object relation an operation
- Operations:
  - Authenticate, Authorize, No Access (users, processes)
  - Load, Read, Execute, Absolute control (data stores)
- Once the matrix is set up, first the columns and then the rows are contracted, yielding a compacted matrix (see the sketch after the example on the next slide)
- If necessary, expand the respective rows/columns again
  - e.g., to distinguish between disparate roles
© DEEDS Group SWFT WS '07-08

12 Object Identification Phase: Example
- Example: airline quick check-in terminal
- Terminal capabilities:
  - Operations: choose seat, print boarding pass
  - Users: anonymous (pre-authentication), clients (post-authentication)
[Figure: S/O matrix before and after object contraction; legend: An: authenticate, Az: authorize, S: send, R: receive]
© DEEDS Group SWFT WS '07-08
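A minimal sketch of the contraction step in Python, using illustrative subjects, objects, and operations for this terminal (the slide's actual matrix exists only in the figure, so these assignments are assumptions):

```python
from itertools import groupby

# Hypothetical S/O matrix for the check-in terminal (illustrative values).
# An: authenticate, Az: authorize, R: read, X: execute, -: no access
objects = ["auth service", "seat map", "flight info", "printer"]
matrix = {                        # subject -> one operation per object
    "anonymous": ["An", "R", "R", "-"],
    "client":    ["Az", "R", "R", "X"],
    "kiosk app": ["Az", "R", "R", "X"],
}

def contract_columns(objs, m):
    """Merge objects whose operation column is identical for every subject."""
    col = lambda i: tuple(row[i] for row in m.values())
    names, cols = [], []
    for sig, grp in groupby(sorted(range(len(objs)), key=col), key=col):
        names.append("/".join(objs[i] for i in grp))
        cols.append(sig)
    return names, {s: [c[k] for c in cols] for k, s in enumerate(m)}

def contract_rows(m):
    """Merge subjects whose operation row is identical."""
    sig = lambda s: tuple(m[s])
    return {"/".join(grp): list(key)
            for key, grp in groupby(sorted(m, key=sig), key=sig)}

objs2, m2 = contract_columns(objects, matrix)  # columns first...
compact = contract_rows(m2)                    # ...then rows, as on slide 11
print(objs2)    # "seat map/flight info" merged into one column
print(compact)  # "client" and "kiosk app" merged into one row
```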

13 Object Identification Phase: Attacker Subject
- Finally, append an Attacker subject to the matrix
- The attacker subject may initially perform *every* operation
- In a null-hypothesis discussion, decide which operations can be removed from the attacker's row because they are infeasible
  - New unresolved questions, deployment constraints, or external dependencies come up during this discussion
- The attacker's remaining operations are regarded as possible vulnerabilities
© DEEDS Group SWFT WS '07-08
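Continuing the sketch from the previous slide, the attacker row can be seeded with a wildcard for every object and then pruned during the null-hypothesis discussion (again an illustration, not the slides' notation):

```python
# Append the attacker subject: initially every operation on every object
# is assumed possible ("*"); infeasible entries are then pruned.
matrix["attacker"] = ["*"] * len(objects)
matrix["attacker"][0] = "-"   # e.g. ruled out as infeasible in the discussion
# Whatever remains non-"-" in this row is a possible vulnerability.
```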

14 Object Identification Phase: Survey
- Next, a survey is developed using a catalog of questions for each object class
- The question catalogs are grouped and refer to:
  - OS specifics
  - Hardware/driver-related issues
  - High-level software technology
  - Experience from the past
  - Reported security flaws (BugTraq, ...)
- They should be answered by system architects, software engineers, and security engineers
- More external dependencies, unresolved questions, deployment constraints, and possible vulnerabilities will arise; these are input for the next phase
© DEEDS Group SWFT WS '07-08

15 The Three Phases of Threat Modeling (Recall)
DFD: Data Flow Diagram
Threat effects: STRIDE (Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege)
Ranking of risk: DREAD (Damage potential, Reproducibility, Exploitability, Affected users, Discoverability)
S/O: Subject/Object
© DEEDS Group SWFT WS '07-08

16 Threat Modeling: Reaction Phase
- The previously generated lists and expert knowledge are required to distill potential threats
- Threats:
  - are directed against assets,
  - put assets at risk,
  - reflect an attacker's intentions
- Next: STRIDE & DREAD ratings, threat trees, ...

17 Reaction Phase: STRIDE
- The STRIDE scheme is used to classify the expected impact
- Acronym for:
  - Spoofing: allows attackers to act as another user or component (violates authentication)
  - Tampering: illegal modification of data (integrity)
  - Repudiation: inability to trace operations back to a specific user (non-repudiation)
  - Information disclosure: gaining access to data in transit or in a data store (confidentiality)
  - Denial of service: denial-of-service attack (availability)
  - Elevation of privilege: illegal raising of access privileges (authorization)
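As a small illustration, the STRIDE classes can be encoded together with the security property each one violates; the mapping follows the slide, while the Python names are our own:

```python
from enum import Enum

class Stride(Enum):
    """STRIDE class -> violated security property (per the slide)."""
    SPOOFING = "authentication"
    TAMPERING = "integrity"
    REPUDIATION = "non-repudiation"
    INFORMATION_DISCLOSURE = "confidentiality"
    DENIAL_OF_SERVICE = "availability"
    ELEVATION_OF_PRIVILEGE = "authorization"

# Tag an identified threat with its expected impact class:
threat = {"name": "forge check-in request", "impact": Stride.SPOOFING}
print(f"{threat['name']} violates {threat['impact'].value}")
```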

18 Reaction Phase: Threat Tree
- Threat trees help in understanding the dependencies among a threat's partial requirements
- The semantics of threat trees are similar to those of fault trees in fault tree analysis (FTA):
  - The root node represents a threat,
  - Leaves represent entry points to be used for an attack,
  - Inner nodes represent partial goals during an attack.
- By default, nodes on the same level are in an OR relationship, i.e., it is sufficient to fulfill one condition on level n to proceed to level n-1
- Very important node attribute: whether the condition is mitigated or not

19 Threat Tree Example
- Below: a threat tree for information leakage of a precious document
- The right subtree is mitigated (as leaves 2.1 and 2.2 are mitigated)
- The left subtree is unmitigated; potential entry point: condition 1.2
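A minimal sketch of these tree semantics in Python. The structure and labels below loosely reconstruct the example above from its description (mitigated leaves 2.1 and 2.2, open entry point 1.2); the figure's actual wording is an assumption:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    mitigated: bool = False
    children: list["Node"] = field(default_factory=list)

    def effectively_mitigated(self) -> bool:
        # OR semantics: one unmitigated child keeps the parent reachable,
        # so a node is safe only if it (or every child) is mitigated.
        if self.mitigated:
            return True
        return bool(self.children) and all(
            c.effectively_mitigated() for c in self.children)

tree = Node("information leakage of precious document", children=[
    Node("1 obtain document in transit", children=[
        Node("1.1 sniff network link", mitigated=True),
        Node("1.2 compromise intermediate host")]),   # open entry point
    Node("2 obtain document at rest", children=[
        Node("2.1 steal storage medium", mitigated=True),
        Node("2.2 access file share", mitigated=True)]),
])
print(tree.effectively_mitigated())  # False: the path via 1.2 remains open
```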

20 Reaction Phase: DREAD
- DREAD is used to classify each node in a threat tree
- Acronym for:
  - Damage potential: rates the affected assets and the expected impact
  - Reproducibility: rates the effort needed to bring the attack about
  - Exploitability: estimates the threat's value and an attacker's objectives
  - Affected users: estimates the fraction of installations that are subject to the attack
  - Discoverability: a measure of the likelihood of discovering the attack
- Ratings are measured on a discrete scale; for simplicity in further assessments it should not be too large, e.g., 1: low, 2: medium, 3: high.
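Sketched as code, a node's DREAD rating on the 1-3 scale from the slide might look as follows; aggregating the five ratings into a single rank by averaging is our assumption, not something the slides prescribe:

```python
# DREAD ratings for one threat-tree node (1: low, 2: medium, 3: high).
dread = {"damage_potential": 3, "reproducibility": 2, "exploitability": 2,
         "affected_users": 3, "discoverability": 1}

# Assumed aggregation: the mean of the five ratings as an overall rank.
rank = sum(dread.values()) / len(dread)
print(f"DREAD rank: {rank:.1f}")   # 2.2 -> prioritize above rank-1 threats
```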

21 Reaction Phase: Mitigation?
- Based on the threat trees and the DREAD and STRIDE ratings, mitigations are planned
- Multiple selection criteria may be of interest for prioritization, e.g.:
  - The most easily reproducible vulnerabilities,
  - Conditions occurring in more than one threat tree,
  - Strictly damage-potential oriented.
- After having mitigated one or more conditions, rerun the threat modeling process on the respective component(s)
© DEEDS Group SWFT WS '07-08

22 Microsoft Threat Modeling Tool
Download: http://www.microsoft.com/downloads/details.aspx?familyid=62830f95-0e61-4f87-88a6-e7c663444ac1&displaylang=en

23 [No text content on this slide]

24 Attack Surface Measure
P. Manadhata and J. Wing, "An Attack Surface Metric", CMU-CS-05-155, July 2005.
P. Manadhata, J. Wing, M. Flynn, M. McQueen, "Measuring the Attack Surfaces of Two FTP Daemons", QoP '06: Proceedings of the 2nd ACM Workshop on Quality of Protection, 2006.

25 Attack Surface Measure
- Reasoning: applications should expose a minimum of accessible services
  - e.g., API methods, resources, etc.
- The attack surface is a three-dimensional vector
- Required input for the computation:
  - Entry points: methods that receive data from the environment
  - Exit points: methods that send data to the environment
  - Channels: communication media, e.g., sockets, pipes, etc.
  - Untrusted data: e.g., DBs or FSs (or single elements like key/value pairs, data rows/columns in a DB, files)
© DEEDS Group SWFT WS '07-08

26 Attack Surface Measure
- The computation yields a vector with:
  - M(ethods): weighted sum of entry and exit points
  - C(hannels): weighted channel sum
  - D(ata): weighted sum of untrusted data items
- How to assign weights?
  - Damage potential-effort ratio
- The attack surface vector allows comparison, but:
  - Only systems of a similar nature are comparable, e.g., two different versions of one system
  - Text processors cannot be compared with database server applications. A disadvantage?
© DEEDS Group SWFT WS '07-08
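A sketch of the vector computation with damage potential-effort ratios as weights; the resource lists and numeric values below are illustrative, not taken from the paper:

```python
def der(damage_potential: float, effort: float) -> float:
    """Damage potential-effort ratio: the weight of one resource."""
    return damage_potential / effort

# Illustrative resources of a hypothetical system:
methods  = [der(5, 1), der(5, 1), der(3, 3)]   # entry/exit points
channels = [der(3, 1)]                          # e.g. one TCP socket
data     = [der(4, 1), der(1, 3)]               # e.g. config file, log file

# The attack surface is the triple (M, C, D) of weighted sums:
attack_surface = (sum(methods), sum(channels), sum(data))
print(attack_surface)   # compare component-wise against a similar system
```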

27 Attack Surface Measure: Computation (1)
- Automate the computation
  - Imagine systems with several MLoC
  - But: many concepts are implemented differently across disparate technologies
- Static analysis is well suited to the task of finding entry/exit points
  - Call graphs are needed to distinguish between internal methods (not directly callable from the environment) and API methods, which constitute entry/exit points
- Channels and untrusted data items are evaluated at runtime
© DEEDS Group SWFT WS '07-08
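As a toy illustration of the static-analysis step, methods without any incoming call edge inside the codebase can be flagged as API methods, i.e., entry/exit point candidates (the call graph below is invented):

```python
# Toy call graph: caller -> callees, as a static analyzer might emit it.
calls = {
    "ftp_login": ["check_password", "log_event"],
    "ftp_retr":  ["read_file", "log_event"],
    "check_password": [],
    "read_file": [],
    "log_event": [],
}

# Methods never called internally are reachable only from the
# environment, so they are candidate entry/exit points.
internal_callees = {c for callees in calls.values() for c in callees}
api_methods = set(calls) - internal_callees
print(api_methods)   # {'ftp_login', 'ftp_retr'}
```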

28 Attack Surface Measure: Computation (2)
[Figure: computation workflow; the only legible annotation is "manually"]
© DEEDS Group SWFT WS '07-08

29 Attack Surface Measure – FTP Daemons (1)
- Attack surface for two FTP daemons [5]:
  - Wu-FTPD
  - ProFTPD
[Table: the number of channels opened by both daemons]
[Table: the number of direct entry points (DEP) and direct exit points (DExP) in both codebases; AR: access rights, RU: remote unauthorized]
[Table: data (in the default executable): files such as config, log, ...]
© DEEDS Group SWFT WS '07-08

30 Attack Surface Measure – FTP Daemons (2)
Damage potential estimation:
- Define an ordering in each resource class
- Assign numeric values
[Table: numeric values assigned to the values of the attributes]
- A channel's damage potential is estimated in terms of the channel's protocol
- A data item's damage potential is estimated in terms of the data item's type
- A method's damage potential is estimated in terms of the method's privilege
- The effort the attacker needs to spend to use a resource in an attack is estimated in terms of the resource's access rights
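For example, the ordering-then-valuation step could be coded as below; the concrete numbers are placeholders (the actual assignments are in the paper's table, which is not reproduced here):

```python
# Placeholder orderings: higher value = higher damage potential,
# higher access-rights value = more effort for the attacker.
method_privilege = {"root": 5, "authenticated": 3, "anonymous": 1}
channel_protocol = {"TCP": 3, "UDP": 2, "local pipe": 1}
data_item_type   = {"config file": 4, "log file": 2}
access_rights    = {"unauthenticated": 1, "authenticated": 3, "admin": 5}

# A method running as root, reachable without authentication, gets a
# large damage potential-effort ratio and thus a large contribution:
weight = method_privilege["root"] / access_rights["unauthenticated"]
print(weight)   # 5.0
```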

31 Attack Surface Measure – FTP Daemons (3)
The attack surface is the triple of weighted sums along the three dimensions:
AS = ( Σ_{m ∈ M} der(m), Σ_{c ∈ C} der(c), Σ_{d ∈ D} der(d) )
[Figure: the ProFTPD attack surface computed this way]

32 Attack Surface Measure – FTP Daemons (4)
[Figure: ProFTPD attack surface vs. Wu-FTPD attack surface (numeric results)]
- Wu-FTPD has a higher measure along the method dimension, as it has a larger number of methods running with root privilege and accessible with unauthenticated user access rights.
- ProFTPD has a higher measure along the data dimension, as it has a larger number of files accessible with world access rights.

33 Other Comparisons of FTP Daemons
- There are more vulnerability reports for Wu-FTPD 2.6.2 than for ProFTPD 1.2.10.
[Table: vulnerability counts from vulnerability databases]

34 Literature
[1] D. Germanus, A. Johansson, N. Suri, "Threat Modeling and Dynamic Profiling", in Annals of Emerging Research in Information Assurance, Security and Privacy Services, Elsevier Press, 2008.
[2] F. Swiderski and W. Snyder, "Threat Modeling", Microsoft Press, 2004.
[3] S. Lipner and M. Howard, "The Trustworthy Computing Security Development Lifecycle", Microsoft, 2005. http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsecure/html/sdl.asp
[4] P. Manadhata and J. Wing, "An Attack Surface Metric", CMU-CS-05-155, July 2005.
[5] P. Manadhata, J. Wing, M. Flynn, M. McQueen, "Measuring the Attack Surfaces of Two FTP Daemons", QoP '06: Proceedings of the 2nd ACM Workshop on Quality of Protection, 2006.
[6] B. Schneier, "Attack Trees: Modeling Security Threats", Dr. Dobb's Journal, Dec. 1999.
[7] Boström et al., "Extending XP Practices to Support Security Requirements Engineering", SESS '06: Proceedings of the 2006 International Workshop on Software Engineering for Secure Systems, 2006.

