WP3: D3.1 status, pending comments and next steps


WP3: D3.1 status, pending comments and next steps Leonardo Montecchi lmontecchi@unifi.it http://rcl.dsi.unifi.it Resilient Computing Lab

Outline
- Status overview
- ToC
- Major comments
- Next steps

D3.1: status overview
Title: Modeling and Evaluation: State-of-the-art
Three main chapters:
- Model-based approaches
- Experimental measurement approaches
- Works combining different evaluation approaches
Partners involved: all but UniRM.
All the expected contributions have been provided. NOTE: some of them appear to be out of the scope of the deliverable (see the major comments).
Still to be completed (by UniFI): executive summary + conclusions.

ToC (version v06) – 1/3
Executive Summary 6
1. Introduction (UniFI) 7
2. Model-based approaches (UniFI) 8
  2.1 Formalisms for modelling dependability 8
    2.1.1 Combinatorial models 8
    2.1.2 State-based models 9
  2.2 Model construction and solution approaches 10
    2.2.1 Compositional approaches 10
    2.2.2 Decomposition and aggregation approaches 12
  2.3 Dependability modeling and solution tools 14
  2.4 Deriving dependability models from engineering models 15

ToC (version v06) – 2/3
3. Experimental measurement approaches (UniMORE) 18
  3.1 SCADA-based LCCIs (UniPARTHENOPE) 18
    3.1.1 Metrics (UniPARTHENOPE+UniMORE) 18
      3.1.1.1 Security Metrics 19
      3.1.1.2 Quality of a Metric 19
      3.1.1.3 Security Metric Domains 21
      3.1.1.4 Dependability Metrics 21
    3.1.2 Dependability and Security Benchmarking of SCADA-based LCCIs (UniPARTHENOPE) 25
    3.1.3 Data filtering and analysis (UniMORE) 28
    3.1.4 Anomaly detection (UniMORE) 34
  3.2 Systems for LCCI security (UniMORE) 44
    3.2.1 Protocol vulnerability (UniPARTHENOPE) 45
    3.2.2 Intrusion Detection Systems for LCCIs (UniMORE) 55
  3.3 Field Failure Data Analysis (FFDA) (UniNA) 59
  3.4 On-line Monitoring (UniNA) 62
  3.5 Fault Injection (UniNA) 65

ToC (version v06) – 3/3
4. Works combining different evaluation approaches (UniFI) 70
  4.1 Relationships between modeling and experimentation 70
  4.2 Works combining modeling and simulation 71
  4.3 A holistic evaluation framework 72
5. Conclusions (UniFI) 75

Major comments to be discussed – 1
Positioning of Section 3.1.1 ("Metrics") – UniPARTHENOPE+UniMORE.
Comments:
- Section 3.1.1 defines the possible objectives of the analyses, i.e., the dependability and security metrics to be evaluated. These metrics are shared among all the evaluation approaches, not only the experimental measurement ones.
- The last page of subsection 3.1.1.4 ("Dependability Metrics") describes one specific work ([Romano 1999]): why only this work and not a survey? And why has this discussion been placed in the dependability metrics section?
Proposed actions:
- Move Section 3.1.1 to the beginning of the deliverable, either as a stand-alone chapter or as a section within Chapter 1.
- Find the right place where [Romano 1999] can be discussed, possibly adding other works as required for a state-of-the-art survey.

Major comments to be discussed – 2
Content of Section 3.1.2 ("Dependability and Security Benchmarking of SCADA-based LCCIs") – UniPARTHENOPE.
Comments:
- The section provides an overview of DBench, but it is not clear what the specificities of DBench are with respect to SCADA-based LCCIs. The section's title suggests a state-of-the-art survey on dependability and security benchmarking, not just a description of DBench.
- It is not clear how the subsections "Static code review", "Dynamic vulnerability analysis" and "Vulnerability scanners" are related to the main benchmarking section they belong to. The content of the subsections "Static code review" and "Vulnerability scanners" seems to be out of the scope of the deliverable.
Proposed actions:
- Complete the section by adding works dealing with dependability and security benchmarking for LCCIs.
- Clarify the links between the subsections and the main section.
- Move the two subsections that seem to be outside the scope of D3.1 to "D2.1 Diagnosis and reconfiguration: state-of-the-art", or revise them.

Major comments to be discussed – 3
Content of Section 3.2.1 ("Protocol vulnerability") – UniPARTHENOPE.
Comment:
- The whole section appears to be outside the scope of the deliverable. D3.1 should be a survey on modeling and evaluation aspects; instead, the section first presents a set of protocols for SCADA systems and then lists their vulnerabilities and weaknesses.
Proposed action:
- Move the section to "D2.1 Diagnosis and reconfiguration: state-of-the-art", or rewrite it, e.g. describing the modeling and evaluation techniques used to identify the protocol vulnerabilities and weaknesses.

Major comments to be discussed – 4
Content of Section 3.2.2 ("Intrusion detection systems for LCCIs") – UniMORE.
Comment:
- Some works surveyed in this section deal with evaluation techniques used for intrusion detection, while others focus on the systems that use such techniques. This second set of works appears to be outside the scope of the deliverable.
Proposed actions:
- Remove from the section the works focusing on system definition (possibly moving them to "D1.2 Architectures, algorithms and middleware").
- In general, review the section to better fit the scope of the deliverable.
- Change the section's title, e.g. to "Evaluation methods for intrusion detection in LCCIs".

Major comments to be discussed – 5
Works combining different modeling formalisms.
Comment:
- A section that explicitly surveys the works combining different modeling formalisms is currently missing, although it is an important topic.
Proposed action:
- Insert a new section within Chapter 4, e.g. titled "Approaches combining different modeling formalisms", which should also include the contribution by UniNA currently placed at the end of Section 4.3 (holistic evaluation approaches). The title of Chapter 4 should change accordingly, e.g. to "Works combining different modeling formalisms and evaluation approaches".

Next steps (to be agreed)
- Feb 28: ALL partners send the agreed modifications.
- Mar 2: UniFI distributes a new version of D3.1 for a last check and (minor) modifications:
  - finalizing the executive summary and conclusions;
  - addressing minor (typo and editorial) comments;
  - highlighting in the document the (minor) pending issues still to be addressed by the partners.
- Mar 9: partners send the final (minor) modifications.
- Mar 11: UniFI distributes the final version of D3.1.

Thank you for your attention!