
1 Software Fault Tolerance (SWFT) Threat Modeling Dependable Embedded Systems & SW Group Prof. Neeraj Suri, Daniel Germanus, Abdelmajid Khelil, Dept. of Computer Science, TU Darmstadt, Germany

2 Terminology  Threat: The adversary's goals  Threat Profile: The collection of all threats of a system  Threat Model: A document that provides background information on a system, its threat profile, and an analysis of the current system against that threat profile. Threat modeling results in a living threat model.  Vulnerability: A security flaw in the system.  Risk: A characterization of the danger of a vulnerability or condition.  Security Weakness: An insufficient mitigation of a threat (usually resulting in a vulnerability).  Asset: An abstract/concrete resource that a system must protect from misuse by an adversary.  Trust Level: A characterization of an external entity, often based on how it is authenticated and what privileges it has.

3 Motivation  A Threat Model is a master plan for securing software systems  Assess applications and technologies w.r.t. their attackability  Adopt the attacker's way of thinking  Minimize the impact of a successful attack  Prioritize the development of fixes for discovered weaknesses

4 Outline of Today's Lecture  Discuss components of a (good) Threat Model  Integration of Threat Modeling into software development processes  Attack surface measure

5 Threat Model Components D. Germanus, A. Johansson, N. Suri, "Threat Modeling and Dynamic Profiling", book chapter in Annals of Emerging Research in Information Assurance, Security and Privacy Services, Elsevier Press, 2008.

6 Components of a Threat Model  Threat Modeling involves different roles:  Analysts  System architects  Software engineers  Security engineers  Software testers  and is performed in three phases:  Inception  Object identification  Reaction

7 The Three Phases of Threat Modeling (figure legend) DFD: Data Flow Diagram; Threat effects: STRIDE – Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege; Ranking of risk: DREAD – Damage potential, Reproducibility, Exploitability, Affected users, Discoverability; S/O: Subject/Object

8 Threat Modeling: Inception Phase  Sketch data flow diagrams (DFDs)  Gain an understanding of the system's composition & interactions  Find system entry points  How to interact – UI, resources (local, remote), 3rd-party SW  Determine assets  Locality can be derived from the DFDs  Output:  DFDs, entry points, assets

9 Threat Modeling: Object Identification Phase  Evaluate which user actions are allowed and the objects involved  Different methods exist; we will focus on the Subject/Object (S/O) Matrix  Thorough, systematic  Output:  External dependencies  Unresolved questions  Deployment constraints  Possible vulnerabilities

10 Object Identification Phase: Subjects & Objects  Identify the system's subjects and objects  Subjects:  Active entities carrying out operations on objects  Processes, users  Use DFDs as a basis  Objects:  Subjects are also objects: processes, users, …  Data stores  Data flows

11 Object Identification Phase: S/O Matrix  Subject/Object Matrix generation:  Subjects are represented as rows, objects as columns  Assign an operation to each subject-object relation  Operations:  Users, processes: Authenticate, Authorize, No Access  Data stores: Load, Read, Execute, Absolute control  Once the matrix is set up, first the columns and then the rows are contracted, yielding a compacted matrix  If necessary, e.g., to distinguish between disparate roles, expand the respective rows/columns again

12 Object Identification Phase: Example  Example: Airline quick check-in terminal  Terminal's capabilities:  Operations: Choose seat, Print boarding pass  Users: Anonymous (pre-auth.), Clients (post-auth.) (Figure: S/O matrix with subjects as rows and objects as columns, before and after object contraction)
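A minimal Python sketch (not part of the lecture material) of an S/O matrix for the check-in terminal and of the column/row contraction step described on the previous slide; the concrete objects and operations below are illustrative assumptions.

```python
# Hypothetical S/O matrix for the airline quick check-in terminal:
# subject -> object -> operation (all names are illustrative assumptions).
so_matrix = {
    "Anonymous user (pre auth.)": {
        "Seat map (data store)":      "Read",
        "Flight info (data store)":   "Read",
        "Boarding pass (data store)": "No Access",
        "Check-in backend (process)": "Authenticate",
    },
    "Client (post auth.)": {
        "Seat map (data store)":      "Read",
        "Flight info (data store)":   "Read",
        "Boarding pass (data store)": "Load",
        "Check-in backend (process)": "Authorize",
    },
}

def contract(matrix):
    """Merge objects (columns) that every subject treats identically,
    then merge subjects (rows) with identical operation profiles."""
    subjects = list(matrix)
    objects = list(matrix[subjects[0]])

    # 1. Contract columns: group objects by their per-subject operation tuple.
    col_groups = {}
    for obj in objects:
        key = tuple(matrix[s][obj] for s in subjects)
        col_groups.setdefault(key, []).append(obj)
    merged_objects = {" / ".join(objs): key for key, objs in col_groups.items()}

    # 2. Contract rows: group subjects by their operations on the merged objects.
    row_groups = {}
    for i, subj in enumerate(subjects):
        key = tuple(ops[i] for ops in merged_objects.values())
        row_groups.setdefault(key, []).append(subj)

    return {
        " / ".join(subjs): dict(zip(merged_objects, key))
        for key, subjs in row_groups.items()
    }

# "Seat map" and "Flight info" carry identical operations for every subject,
# so the contracted matrix merges them into a single column.
for subject, row in contract(so_matrix).items():
    print(subject, row)
```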

13 Object Identification Phase: Attacker Subject  Finally, append an Attacker subject to the matrix  The Attacker subject may perform *every* operation  In a null-hypothesis discussion, decide which operations can be removed from the attacker's row because they are infeasible  New unresolved questions, deployment constraints, or external dependencies come up during this discussion  The attacker's remaining operations are regarded as possible vulnerabilities
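Building on the sketch above, a hypothetical helper (assumed names, not from the lecture) that appends the Attacker subject with every operation, removes operations ruled out in the null-hypothesis discussion, and reports what remains as possible vulnerabilities.

```python
# Full operation set taken from slide 11; the infeasibility judgement below is
# an invented example of a null-hypothesis decision.
ALL_OPERATIONS = ["Authenticate", "Authorize", "No Access",
                  "Load", "Read", "Execute", "Absolute control"]

def add_attacker(matrix, infeasible=()):
    """Return the matrix extended by an 'Attacker' row (holding a *list* of
    operations per object) plus the remaining (object, operation) pairs,
    i.e., the possible vulnerabilities."""
    objects = list(next(iter(matrix.values())))
    attacker_row = {
        obj: [op for op in ALL_OPERATIONS if (obj, op) not in infeasible]
        for obj in objects
    }
    extended = dict(matrix, Attacker=attacker_row)
    possible_vulnerabilities = [
        (obj, op) for obj, ops in attacker_row.items() for op in ops
    ]
    return extended, possible_vulnerabilities

# Example: the discussion ruled out direct code execution on the backend process.
_, vulns = add_attacker(so_matrix,
                        infeasible={("Check-in backend (process)", "Execute")})
print(len(vulns), "operations remain as possible vulnerabilities")
```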

14 Object Identification Phase: Survey  Next, a survey is developed using a catalog of questions for each object class  Question catalogs are grouped and refer to:  OS specifics  Hardware/driver-related issues  High-level software technology  Experiences from the past  Reported security flaws (BugTraq, …)  Should be answered by system architects, software engineers, and security engineers  More external dependencies, unresolved questions, deployment constraints, and possible vulnerabilities will arise; these are input for the next phase

15 Threat Modeling: Reaction Phase  The previously generated lists and expert knowledge are required to distill potential threats  Threats:  are directed against assets,  put assets at risk,  reflect an attacker's intentions.  Next: STRIDE & DREAD ratings, threat trees …

16 Reaction Phase: STRIDE  The STRIDE scheme is used to classify the expected impact  Acronym for:  Spoofing – allows attackers to act as another user or component  Tampering – illegal modification of data  Repudiation – inability to trace operations back to a specific user  Information disclosure – gaining access to data in transit or in a data store  Denial of service (DoS) – attack on the availability of a service  Elevation of privilege – illegal raising of access privileges

17 Reaction Phase: Threat Tree  Threat trees are helpful to understand the dependencies among a threat's partial requirements  The semantics of threat trees are similar to those of fault trees in fault tree analysis (FTA)  The root node represents a threat,  leaves represent entry points to be used for an attack,  inner nodes represent partial goals during an attack.  By default, nodes on the same level are in an OR relationship, i.e., it is sufficient to fulfill one condition on level n to proceed to level n-1  Very important node attribute: whether the condition is mitigated or not

18 Threat Tree Example  Example: a threat tree on information leakage of a precious document (figure not reproduced)  The right subtree is mitigated (as leaves 2.1 and 2.2 are mitigated)  The left subtree is unmitigated; potential entry point: condition 1.2
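A minimal sketch of these threat-tree semantics, assuming a simple node class. The exact shape of the example tree is an assumption (the figure is not reproduced here), but the mitigation states of conditions 1.2, 2.1, and 2.2 follow the slide: with OR semantics on each level, a node is blocked only if all of its children are blocked.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    label: str
    mitigated: bool = False            # only meaningful for leaves
    children: List["Node"] = field(default_factory=list)

    def is_mitigated(self) -> bool:
        if not self.children:          # leaf = condition / entry point
            return self.mitigated
        # OR relationship among children: the attacker needs only one open
        # path, so this node is mitigated only if every child is mitigated.
        return all(child.is_mitigated() for child in self.children)

tree = Node("Information leakage of precious document", children=[
    Node("1 Left subtree goal", children=[
        Node("1.1 Condition", mitigated=True),
        Node("1.2 Condition", mitigated=False),   # potential entry point
    ]),
    Node("2 Right subtree goal", children=[
        Node("2.1 Condition", mitigated=True),
        Node("2.2 Condition", mitigated=True),
    ]),
])

print("Threat mitigated?", tree.is_mitigated())   # False, via condition 1.2
```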

19 Reaction Phase: DREAD  DREAD is used to classify each node in threat trees  Acronym for:  Damage potential – rates the affected assets and the expected impact  Reproducibility – rates the effort to bring the attack about  Exploitability – estimates the threat's value and an attacker's objectives  Affected users – estimates the fraction of installations which are subject to the attack  Discoverability – a measure of the likelihood of discovering the attack  Ratings use a discrete scale that, for simplicity in further assessments, should not be too large, e.g., 1: low; 2: medium; 3: high.
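A small sketch of a DREAD rating record on the 1–3 scale mentioned on the slide. Combining the five ratings by averaging is a common convention and only an assumption here, not something the lecture prescribes.

```python
from dataclasses import dataclass

SCALE = {1: "low", 2: "medium", 3: "high"}

@dataclass
class DreadRating:
    damage_potential: int
    reproducibility: int
    exploitability: int
    affected_users: int
    discoverability: int

    def score(self) -> float:
        """Average of the five ratings (an assumed aggregation rule)."""
        values = (self.damage_potential, self.reproducibility,
                  self.exploitability, self.affected_users,
                  self.discoverability)
        assert all(v in SCALE for v in values), "ratings must be 1, 2 or 3"
        return sum(values) / len(values)

# Hypothetical rating for one threat-tree node:
node_rating = DreadRating(damage_potential=3, reproducibility=2,
                          exploitability=2, affected_users=3,
                          discoverability=1)
print(node_rating.score())   # 2.2 -> between medium and high
```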

20 Reaction Phase: Mitigation?  Based on the threat trees and the DREAD and STRIDE ratings, mitigations are planned  Multiple selection criteria may be of interest for prioritization, e.g.,  Most easily reproducible vulnerabilities,  Conditions occurring in more than one threat tree,  Strictly damage-potential oriented.  After having mitigated one or more conditions, rerun the Threat Modeling process on the respective component(s)
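A short illustration, with invented condition data, of how the selection criteria listed above translate into different orderings of unmitigated conditions.

```python
# Invented example data: (label, damage potential, reproducibility,
# number of threat trees the condition appears in).
conditions = [
    ("1.2 Guessable backup file name", 2, 3, 3),
    ("4.1 Unauthenticated admin port", 3, 1, 1),
    ("3.3 Verbose error messages",     1, 3, 2),
]

# Most easily reproducible vulnerabilities first:
by_reproducibility = sorted(conditions, key=lambda c: c[2], reverse=True)
# Conditions occurring in more than one threat tree first:
by_tree_count = sorted(conditions, key=lambda c: c[3], reverse=True)
# Strictly damage-potential oriented:
by_damage = sorted(conditions, key=lambda c: c[1], reverse=True)

for label, dmg, repro, trees in by_damage:
    print(f"{label}: damage {dmg}, reproducibility {repro}, {trees} tree(s)")
```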

21 Threat Modeling: Process Integration Boström et al., "Extending XP practices to support security requirements engineering", SESS, 2006.

22 Threat Modeling: Process Integration  TM may simply be added as an extra stage of an existing development process – no big conceptual win  Other processes exist which include TM or comparable concepts:  Microsoft Security Development Lifecycle (SDL): not a complete development lifecycle, focuses on security and reliability, 12 iterative stages  Secure Extreme Programming: agile method, derived from eXtreme Programming (XP), 7 stages.

23 Secure Extreme Programming – Overview  Stages: 1. Identification of security-sensitive assets 2. Formulation of abuser stories 3. Abuser story risk assessment 4. Abuser story and user story negotiation 5. Definition of security-related user stories 6. Definition of security-related coding standards 7. Abuser story countermeasure cross-checking

24 Secure Extreme Programming – Stage 1  Identification of security-sensitive assets  High-level assets which need protection are identified  Corresponds to TM's Inception phase  The paper does not actually specify how to achieve this goal

25 Secure Extreme Programming – Stage 2  Formulation of abuser stories  Analogously to the concept of "user stories", a security engineer phrases an attacker's potential intentions  Abuser stories should be asset-centric to provide the customer with a uniform view of the criticality of their business processes  Example: "All communication between the user terminal and backend systems needs to be encrypted to prevent man-in-the-middle attacks and guarantee user data integrity"

26 Secure Extreme Programming – Stage 3  Abuser story risk assessment  Corresponds to TM's STRIDE and DREAD ratings  Besides security-related measures, the estimated complexity and cost required to counter the abuser story are taken into account

27 Secure Extreme Programming – Stage 4  Abuser story and user story negotiation  Planning of the next development iteration  Short iterations of 5-10 days preferred  User stories (functionality) and abuser stories (threat mitigation) are considered

28 Secure Extreme Programming – Stage 5  Definition of security-related user stories  Transcription of abuser stories into security-related user stories  Abuser stories reflect the requirements of a secure system  Security-related user stories offer software engineers precise information on how to achieve, i.e., implement, a secure system

29 Secure Extreme Programming – Stage 6  Definition of security-related coding standards  Roughly corresponds to TM's question catalog survey  Static catalogs can be interpreted as coding conventions

30 Secure Extreme Programming – Stage 7  Abuser story countermeasure cross-checking  This stage keeps track of which threats have been mitigated  Each abuser story needs to map to one or more security-related user stories of any iteration, or has to be documented under deployment constraints or unresolved questions  Otherwise, a threat has not been mitigated and represents a possible vulnerability
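A sketch of this cross-check under assumed data shapes: every abuser story must map to at least one security-related user story, or be documented as a deployment constraint or unresolved question; anything left over is flagged as a possible vulnerability. The story identifiers below are invented.

```python
abuser_stories = ["AS1 man-in-the-middle on terminal link",
                  "AS2 forged boarding pass",
                  "AS3 insider data export"]

# Abuser-story ID -> implementing security-related user stories.
security_user_stories = {"AS1": ["US7 encrypt terminal/backend channel"],
                         "AS2": ["US9 sign boarding pass payload"]}
deployment_constraints = set()
unresolved_questions = {"AS3"}

def unmitigated(stories, mapped, constraints, open_questions):
    """Abuser stories with neither a countermeasure nor documentation."""
    return [s for s in stories
            if s.split()[0] not in mapped
            and s.split()[0] not in constraints
            and s.split()[0] not in open_questions]

print(unmitigated(abuser_stories, security_user_stories,
                  deployment_constraints, unresolved_questions))
# -> [] here; any remaining story would represent a possible vulnerability
```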

31 Attack Surface Measure P. Manadhata and J. Wing, "An Attack Surface Metric", CMU-CS technical report, July. P. Manadhata, J. Wing, M. Flynn, M. McQueen, "Measuring the Attack Surfaces of Two FTP Daemons", QoP '06: Proceedings of the 2nd ACM Workshop on Quality of Protection, 2006.

32 Attack Surface Measure  Idea: Applications should expose a minimum of accessible services  Services are, e.g., API methods, resources, etc.  The Attack Surface is a three-dimensional vector  Required input for its computation:  Entry points – methods that receive data from the environment  Exit points – methods that send data to the environment  Channels – communication media, e.g., sockets, pipes, etc.  Untrusted data – e.g., DBs or file systems, single elements like key/value pairs, data rows/columns in a DB, files.

33 Attack Surface Measure  The computation yields a vector with  M: weighted sum of entry and exit points  C: weighted sum of channels  D: weighted sum of untrusted data items  How to assign weights?  The Attack Surface vector allows comparison, but:  Only systems of a similar nature are comparable, e.g., two different versions of one system  Cannot compare text processors with database server applications – a disadvantage?
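A minimal sketch of the three-component vector described above. The item names and weight values are illustrative placeholders; slide 37 discusses how numeric values can be assigned to resource attributes.

```python
# Invented example inputs: (name, weight) pairs for each resource class.
entry_exit_points = [("login", 1.0), ("retrieve_file", 0.75), ("list_dir", 0.25)]
channels          = [("TCP socket", 1.0), ("unix pipe", 0.5)]
untrusted_data    = [("config file", 0.5), ("ftpusers file", 0.75)]

def weighted_sum(items):
    """Sum the weights of all items in one resource class."""
    return sum(weight for _name, weight in items)

attack_surface = (weighted_sum(entry_exit_points),   # M
                  weighted_sum(channels),            # C
                  weighted_sum(untrusted_data))      # D
print(attack_surface)   # -> (2.0, 1.5, 1.25) for this toy input
```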

34 Attack Surface Measure: Computation (1)  Automate the computation – imagine systems with several MLoC  But: many concepts are implemented differently across disparate technologies  Static analysis is well suited for identifying entry/exit points  Call graphs are needed to distinguish between internal methods (not directly callable from the environment) and API methods, which constitute entry/exit points  Channels and untrusted data items are evaluated at runtime
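A toy sketch of the call-graph idea: given a caller-to-callee map from static analysis (the map below is invented), methods that no other method calls are candidates for direct entry/exit points, while the remaining methods are internal. A real tool would additionally track data flow to decide whether a method receives (entry) or sends (exit) data.

```python
# Hypothetical call graph: method -> methods it calls.
call_graph = {
    "ftp_login":      ["check_password", "log_event"],
    "retrieve_file":  ["open_file", "send_bytes", "log_event"],
    "check_password": [],
    "log_event":      [],
    "open_file":      [],
    "send_bytes":     [],
}

# Methods invoked by some other method are reachable internally.
called_internally = {callee for callees in call_graph.values()
                     for callee in callees}
# Methods nobody calls internally are candidates for being invokable
# directly from the environment, i.e., direct entry/exit points.
entry_exit_candidates = set(call_graph) - called_internally

print("direct entry/exit point candidates:", sorted(entry_exit_candidates))
# -> ['ftp_login', 'retrieve_file']
print("internal methods:", sorted(called_internally))
```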

35 Attack Surface Measure: Computation (2) (figure not reproduced)

36 Attack Surface Measure – Example  Attack Surface for two FTP daemons [5]  Wu-FTPD  ProFTPD (Tables, not reproduced here: the number of channels opened by both daemons; the number of direct entry points and direct exit points in both codebases)

37 Attack Surface Measure – Example Damage potential estimation  Define an ordering in each resource class  Assign values (Table, not reproduced here: numeric values assigned to the attribute values)
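An illustrative sketch only of one way to turn an ordering within a resource class into numeric values; the attribute names and orderings below are assumptions, not the ones used in the cited FTP study.

```python
# Assumed orderings within two resource-attribute classes, from low to high.
privilege_order     = ["anonymous", "authenticated", "root"]
access_rights_order = ["unauthenticated", "authenticated"]

def numeric(value, ordering, base=1):
    """Map an attribute value to a number according to its rank
    in the class ordering (higher rank -> larger value)."""
    return base + ordering.index(value)

# Damage potential of a method can be tied to the privilege it runs under;
# the effort to reach it can be tied to the access rights it requires.
print(numeric("root", privilege_order))               # 3
print(numeric("anonymous", privilege_order))          # 1
print(numeric("authenticated", access_rights_order))  # 2
```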

38 Attack Surface Measure – Example (Figure, not reproduced here, showing the three vector components M, C, and D for both daemons)

39 Attack Surface Measure – Example (Figure, not reproduced here: the resulting attack surface vectors of ProFTPD and Wu-FTPD)

40 Microsoft Threat Modeling Tool Download: familyid=62830f95-0e61-4f87-88a6-e7c663444ac1&displaylang=en


42 Literature [1] D. Germanus, A. Johansson, N. Suri, "Threat Modeling and Dynamic Profiling", in Annals of Emerging Research in Information Assurance, Security and Privacy Services, Elsevier Press, 2008. [2] F. Swiderski and W. Snyder, "Threat Modeling", Microsoft Press. [3] S. Lipner and M. Howard, "The Trustworthy Computing Security Development Lifecycle", us/dnsecure/html/sdl.asp, Microsoft. [4] P. Manadhata and J. Wing, "An Attack Surface Metric", CMU-CS technical report, July. [5] P. Manadhata, J. Wing, M. Flynn, M. McQueen, "Measuring the Attack Surfaces of Two FTP Daemons", QoP '06: Proceedings of the 2nd ACM Workshop on Quality of Protection, 2006. [6] B. Schneier, "Attack Trees: Modeling Security Threats", Dr. Dobb's Journal, Dec. [7] Boström et al., "Extending XP practices to support security requirements engineering", SESS '06: Proceedings of the 2006 International Workshop on Software Engineering for Secure Systems, 2006.

