Proposed Privacy Taxonomy for IoT
Scott Shorter, Electrosoft, 2015-10-23

Proposed Privacy Taxonomy for IoT

These slides are based on work contributed to the IDESG Use Case AHG in January. At this time they are written for the human-centric identity ecosystem envisioned by NSTIC and IDESG goals, but they are a basis for understanding privacy in the IoT world as well.

Common Criteria Has a Requirements Syntax for Privacy-Enhancing Functions Such as Pseudonymity and Anonymity

Background:
- Common Criteria is an ISO standard security evaluation methodology.
- It provides a formal language for expressing security requirements in various domains.
- It defines a process for gathering security requirements into sets that describe the intended behavior, called "security targets".
- It defines an evaluation methodology for confirming the correct implementation of the claimed requirements.
- It includes a class of requirements for representing different forms of privacy, including anonymity, pseudonymity and unlinkability.

Pros and Cons of Common Criteria

Pros:
- Provides a formal way to express a wide range of security and privacy requirements.
- An international ISO standard with years of use around the world in security evaluations.

Cons:
- Not directly applicable to the IDESG world of service providers: CC evaluations apply to tangible products (hardware, firmware, software and documentation) rather than services.
- Evaluations are very expensive and time consuming.

Summary:
- I'm trying to learn from how CC models privacy requirements, NOT to recommend that CC evaluation methods be adopted by IDESG.

Types of Common Criteria Requirements

Common Criteria is a source of requirements in two categories:
- Security Functional Requirements (SFRs) are things the target of evaluation should do. Classes of functional requirements include Security Audit, Communications Protection, Cryptography, User Data Protection, Identification and Authentication, Security Management, Privacy, Self-Protection, Resource Management and Access Control.
- Security Assurance Requirements (SARs) are methods for confirming that the target of evaluation does what it claims. Classes of assurance requirements include Evaluation of the Security Target, Development/Design Analysis, Guidance Documents, Lifecycle Support, Testing and Vulnerability Analysis.

Some Common Criteria Terminology

These terms are provided ONLY to understand the upcoming slides, not for the IDESG taxonomy discussion. TOE means "Target of Evaluation".

Term: identity
Definition: representation uniquely identifying entities (e.g. a user, a process or a disk) within the context of the TOE. An example of such a representation is a string. For a human user, the representation can be the full or abbreviated name or a (still unique) pseudonym.

Term: subject
Definition: active entity in the TOE that performs operations on objects.

Term: user
Definition: see external entity, which is defined as "human or IT entity possibly interacting with the TOE from outside of the TOE boundary".

Common Criteria Privacy Class

A Class is a high-level category of requirements, broken down into Families. The Privacy class (FPR) contains:
- FPR_ANO  Anonymity
- FPR_PSE  Pseudonymity
- FPR_UNL  Unlinkability
- FPR_UNO  Unobservability

Common Criteria Privacy Components

Those Families are further decomposed into Components:
- FPR_ANO.1  Anonymity
- FPR_ANO.2  Anonymity without soliciting information
- FPR_PSE.1  Pseudonymity
- FPR_PSE.2  Reversible Pseudonymity
- FPR_PSE.3  Alias Pseudonymity
- FPR_UNL.1  Unlinkability
- FPR_UNO.1-4  Unobservability (four components)
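To make the structure concrete, the sketch below represents the Privacy class as a simple lookup table. The family and component identifiers and names come from the slide above (CC Part 2); the dataclass shape and variable names are purely illustrative.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass(frozen=True)
class Family:
    identifier: str             # e.g. "FPR_ANO"
    name: str                   # e.g. "Anonymity"
    components: Dict[str, str]  # component id -> component name

FPR_PRIVACY_CLASS = {
    "FPR_ANO": Family("FPR_ANO", "Anonymity", {
        "FPR_ANO.1": "Anonymity",
        "FPR_ANO.2": "Anonymity without soliciting information",
    }),
    "FPR_PSE": Family("FPR_PSE", "Pseudonymity", {
        "FPR_PSE.1": "Pseudonymity",
        "FPR_PSE.2": "Reversible pseudonymity",
        "FPR_PSE.3": "Alias pseudonymity",
    }),
    "FPR_UNL": Family("FPR_UNL", "Unlinkability", {
        "FPR_UNL.1": "Unlinkability",
    }),
    "FPR_UNO": Family("FPR_UNO", "Unobservability", {
        # The slides do not list the four FPR_UNO components by name,
        # so they are kept as generic placeholders here.
        "FPR_UNO.1": "Unobservability (variant 1)",
        "FPR_UNO.2": "Unobservability (variant 2)",
        "FPR_UNO.3": "Unobservability (variant 3)",
        "FPR_UNO.4": "Unobservability (variant 4)",
    }),
}

# Example: list every component identifier in the Privacy class.
for family in FPR_PRIVACY_CLASS.values():
    for component_id, component_name in family.components.items():
        print(f"{component_id}: {component_name}")
```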

FPR_ANO Anonymity

So there are two ways to provide anonymity:
- FPR_ANO.1 protects anonymity because the identity is not disclosed by the system.
- FPR_ANO.2 protects anonymity because the identity is never known.

Note that this is NOT unlinkability. Anonymous users can still have their actions linked without violating this requirement.
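The distinction can be illustrated with a toy "post a comment" service; this is a minimal sketch, and the record shapes and function names are hypothetical rather than taken from Common Criteria or IDESG material.

```python
def post_comment_fpr_ano_1(user_id: str, text: str, store: list) -> None:
    """FPR_ANO.1 style: the system knows the user's identity but does not
    disclose it alongside the published action."""
    store.append({"text": text, "author": None})  # identity withheld on output
    # user_id may still be recorded internally, so actions remain linkable.

def post_comment_fpr_ano_2(text: str, store: list) -> None:
    """FPR_ANO.2 style: the identity is never solicited, so the system has
    nothing to disclose in the first place."""
    store.append({"text": text})

comments: list = []
post_comment_fpr_ano_1("alice", "hello", comments)
post_comment_fpr_ano_2("hi there", comments)
```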

FPR_PSE Pseudonymity

In each of these cases, the verified identity is known to the system, but not made generally available.
- In the case of FPR_PSE.2, a function is provided to obtain the verified identity of a pseudonym.
- This case may be useful in domains that require a balance of privacy and "real-world" accountability.
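Here is a minimal sketch of how reversible pseudonymity (FPR_PSE.2) might look in practice, assuming a trusted registrar that issues pseudonyms and reveals the verified identity only to authorized parties. The class and method names are illustrative, not taken from Common Criteria.

```python
import secrets

class PseudonymRegistrar:
    def __init__(self) -> None:
        self._identities = {}  # pseudonym -> verified identity

    def issue(self, verified_identity: str) -> str:
        """Issue a fresh pseudonym bound to a verified identity."""
        pseudonym = "pseu-" + secrets.token_hex(8)
        self._identities[pseudonym] = verified_identity
        return pseudonym

    def reveal(self, pseudonym: str, requester_authorized: bool) -> str:
        """The reversal function: only an authorized party (e.g. under a
        defined legal or policy process) may recover the verified identity."""
        if not requester_authorized:
            raise PermissionError("not authorized to de-pseudonymize")
        return self._identities[pseudonym]

registrar = PseudonymRegistrar()
alias = registrar.issue("alice@example.org")
assert registrar.reveal(alias, requester_authorized=True) == "alice@example.org"
```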

FPR_UNL Unlinkability

At the level of Identity Ecosystems, this concept relates to Do Not Track and similar initiatives. Online behavior is very distinctive: each individual has affiliations, interests and patterns that can be used to identify them. Unlinkability in Identity Ecosystems is a feature that regulates the privacy impact of behavioral identification.
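One widely used way to reduce linkability of identifiers is for the identity provider to give each relying party a different, pairwise identifier for the same user. The sketch below is an assumption-laden illustration (the key handling and names are invented); it addresses identifier linkability only, not the behavioral linkability described above, and is not a mechanism mandated by FPR_UNL.1.

```python
import hashlib
import hmac

def pairwise_identifier(user_id: str, relying_party: str, idp_key: bytes) -> str:
    """Derive an identifier that is stable for a (user, relying party) pair
    but cannot be correlated across relying parties without the IdP's key."""
    message = f"{user_id}|{relying_party}".encode("utf-8")
    return hmac.new(idp_key, message, hashlib.sha256).hexdigest()

key = b"idp-secret-key"  # assumption: known only to the identity provider
id_at_rp1 = pairwise_identifier("alice", "rp1.example", key)
id_at_rp2 = pairwise_identifier("alice", "rp2.example", key)
assert id_at_rp1 != id_at_rp2  # the same user looks different to each RP
```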

FPR_UNO Unobservability

Four variations, not listed here. Beyond the scope of IDESG.
- Affects the resources and services delivered by Relying Parties generally, not the Identity Ecosystem particularly.

Conclusions

Privacy relates to Identifiers and Transactions:
- Anonymity and Pseudonymity are about Identifiers.
- Unlinkability and Unobservability are about Transactions.

Unlinkability is different from Anonymity or Pseudonymity. There are variations in types of Privacy and ways of implementing it. Different grades of privacy are useful for different purposes:
- Anonymity can support free expression, but it can also enable abusive behavior, disclosure of secrets or data piracy.
- Reversible Pseudonymity provides accountability.

References

Common Criteria text is from version 3.1 release 4.
- Part 2 contains the Security Functional Requirements, including the Privacy class.
- Part 3 contains the Security Assurance Requirements.