Evaluation of Abstraction Techniques

Evaluation of Abstraction Techniques

Uses for the complexity metrics in our framework
– Comparing the complexity of the reference model with the abstracted models
– Comparing the complexity of the abstracted models at the same level of abstraction

proView framework
"A flexible approach for abstracting and personalizing large business process models", Jens Kolb and Manfred Reichert, ACM SIGAPP Applied Computing Review, vol. 13, no. 1, 2013.
A tool for
– Creating personalized process views.
– Changing views and propagating the changes.
Differences to our approach
– Each abstraction results in one process view.
– An abstraction cannot be performed automatically based on an abstraction criterion.
– Although support for task attributes is provided, data flow consistency is not supported.
– Behaviour consistency is not considered while creating a view.
– A demo of the tool: project.html

proView framework
Evaluation criteria:
– # Activities = number of executable statements in a software program
– # Gateways = number of choices
– McCabe's Cyclomatic Complexity: a complexity (comprehensibility and maintainability) metric for business processes. It counts the number of linearly independent control-flow paths.
McCabe = e − n + 2, where e is the number of control-flow edges and n is the number of nodes.
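The McCabe formula above can be sketched directly from a process graph's edge list. The graph below is a hypothetical example, not one of the models evaluated here:

```python
# McCabe's cyclomatic complexity for a connected process graph: e - n + 2,
# where e is the number of control-flow edges and n the number of nodes.
def cyclomatic_complexity(edges):
    nodes = {node for edge in edges for node in edge}
    return len(edges) - len(nodes) + 2

# A small model: start -> split -> (a | b) -> join -> end
edges = [("start", "split"), ("split", "a"), ("split", "b"),
         ("a", "join"), ("b", "join"), ("join", "end")]
print(cyclomatic_complexity(edges))  # 6 edges, 6 nodes -> 2
```

The two branches of the XOR-split produce the single extra independent path, hence the value 2.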

Slider Approach
Polyvyanyy et al.
An abstraction technique for creating views. A business process model can be abstracted based on a numeric task attribute.
BPMA function
– Input: a detailed model and an abstraction criterion. The abstraction criterion is a task attribute and must be numeric.
– Output: an abstracted model.

Slider Approach Evaluation
Strategy 1: Basic sequential abstraction
Strategy 2: Sequential and then block abstraction
*Strategy 3: Sequential, dead end, and then loop abstraction
Comparison coefficient
– Shows the relation between the number of nodes in the abstracted and detailed models.
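The slide only says the comparison coefficient relates the node counts of the two models; a plausible reading, assumed here for illustration (the exact definition in Polyvyanyy et al. may differ), is the plain ratio:

```python
# Comparison coefficient, assumed here to be the ratio of node counts
# between the abstracted and the detailed model. This is an illustrative
# assumption, not necessarily the definition used by Polyvyanyy et al.
def comparison_coefficient(nodes_abstracted, nodes_detailed):
    return nodes_abstracted / nodes_detailed

# An abstraction that reduced 48 nodes to 12
print(comparison_coefficient(12, 48))  # 0.25
```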

Complexity metrics for BPMs by Volker Gruhn and Ralf Laue, BIS 2006
CFC (control-flow complexity of processes): (Cardoso 2005, "How to measure the control-flow complexity of web-processes and workflows")
– A generalization of McCabe's cyclomatic number.
– In a BPM:
An AND-split adds 1.
An XOR-split with n outgoing transitions adds n.
An OR-split with n outgoing transitions adds 2^n − 1.
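The per-split contributions above sum directly into CFC. A minimal sketch, with the model represented simply as a list of (gateway kind, outgoing-transition count) pairs:

```python
# Control-flow complexity (CFC, Cardoso 2005): AND-split adds 1,
# XOR-split with n outgoing transitions adds n, OR-split adds 2**n - 1.
def cfc(splits):
    """splits: iterable of (kind, n_outgoing) with kind in {'AND','XOR','OR'}."""
    total = 0
    for kind, n in splits:
        if kind == "AND":
            total += 1
        elif kind == "XOR":
            total += n
        elif kind == "OR":
            total += 2 ** n - 1
        else:
            raise ValueError(f"unknown gateway kind: {kind}")
    return total

# One AND-split, one 3-way XOR-split, one 2-way OR-split
print(cfc([("AND", 2), ("XOR", 3), ("OR", 2)]))  # 1 + 3 + 3 = 7
```

The exponential OR-split term reflects that any non-empty subset of its branches may be taken.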

Complexity metrics for BPMs by Volker Gruhn and Ralf Laue, BIS 2006
Nesting depth of a control structure
– Maximum nesting depth (MND): the nesting depth of an action is the number of decisions in the control flow that are necessary to perform this action.
(Slide example: two models with the same CFC = 8, but MND = 1 in one and MND = 3 in the other.)
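MND can be sketched as a traversal of a block-structured model. The nested-tuple representation below ("decision", "seq", "action" blocks) is an illustrative assumption, not the notation used in the paper:

```python
# Maximum nesting depth (MND): an action's nesting depth is the number of
# decisions enclosing it; MND is the maximum over all actions.
# Model blocks are (kind, children) tuples; the representation is assumed.
def mnd(block, depth=0):
    kind, children = block
    if kind == "action":
        return depth
    # 'decision' blocks add one nesting level; 'seq' blocks do not
    child_depth = depth + 1 if kind == "decision" else depth
    return max(mnd(child, child_depth) for child in children)

# if A then (if B then act1 else act2) else act3
model = ("decision", [("decision", [("action", []), ("action", [])]),
                      ("action", [])])
print(mnd(model))  # deepest actions sit inside two decisions -> 2
```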

Other Factors
Well-nested, well-structured (van der Aalst 1998, "The application of Petri nets to workflow management")
– Splits and joins must be pairwise.
Knot count (number of handles) (Woodward and Hedley 1979, "A measure of control-flow complexity in program text")
– A control graph of a program has a knot whenever the paths associated with transfers of control intersect.

Other Factors
Cognitive weight (Shao and Wang 2003, "A new measure of software complexity based on cognitive weight", Journal of IIS)
– Refers to the effort required to understand a piece of software.
Sequence: W_i = 1
Call of a user-defined function: W_i = 2
Branching with if-then or if-then-else: W_i = 2
Branching with an arbitrary number of selectable cases: W_i = 3
Iteration: W_i = 3
*Recursive function call: W_i = 3
Execution of control flows in parallel: W_i = 4
Interrupt: W_i = 4
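The weight table can be applied as a simple lookup. Note this sketch is a flat sum over the structures present; the full Shao–Wang metric also combines weights of nested structures, which is omitted here for brevity:

```python
# Cognitive weights per basic control structure (Shao and Wang 2003),
# taken from the table above. The flat summation is a simplification:
# the full metric multiplies weights of nested structures.
COGNITIVE_WEIGHT = {
    "sequence": 1, "call": 2, "if": 2, "case": 3,
    "iteration": 3, "recursion": 3, "parallel": 4, "interrupt": 4,
}

def cognitive_weight(structures):
    return sum(COGNITIVE_WEIGHT[s] for s in structures)

# A model with a loop, an if-then-else, and a user-defined function call
print(cognitive_weight(["iteration", "if", "call"]))  # 3 + 2 + 2 = 7
```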

Complexity Metrics by Cardoso, Mendling, Neumann, Reijers, BPM 2006
LOC (Lines of Code) metric, in BPM terms:
– M1: NOA = number of activities
– M2: NOAC = number of activities and control flows
– M3: NOAJS = number of activities, joins, and splits
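The three size counts above are straightforward to compute from a model's element lists. The element names below are hypothetical placeholders:

```python
# LOC-style size metrics for a process model (Cardoso et al., BPM 2006):
# NOA, NOAC, and NOAJS, counted from the model's element lists.
def size_metrics(activities, joins, splits, control_flows):
    noa = len(activities)                                  # M1
    noac = len(activities) + len(control_flows)            # M2
    noajs = len(activities) + len(joins) + len(splits)     # M3
    return noa, noac, noajs

noa, noac, noajs = size_metrics(
    activities=["a", "b", "c"], joins=["j1"], splits=["s1"],
    control_flows=["f1", "f2", "f3", "f4", "f5"])
print(noa, noac, noajs)  # 3 8 5
```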

Halstead Complexity metric
In software (Halstead 1977, "Elements of Software Science") [out of print!]:
– n1 = number of unique operators (if, while, etc.)
– n2 = number of unique operands (variables or constants)
– N1 = total number of operator occurrences
– N2 = total number of operand occurrences
In BPM (Cardoso, Mendling, Neumann, Reijers, BPM 2006):
– n1 = number of unique activities, splits, joins, and control-flow elements
– n2 = number of unique data variables that are manipulated by the process and its activities

Halstead-based Process Complexity (HPC)
HPC estimates
– Process length: N = n1*log2(n1) + n2*log2(n2)
– Volume: V = (N1 + N2)*log2(n1 + n2)
– Difficulty: D = (n1/2) * (N2/n2)
Advantages:
– No in-depth analysis of process structure is required.
– Error rates and maintenance effort can be predicted.
– Simple to calculate.
– Can be used for most process modelling languages.
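The three HPC estimates follow directly from the formulas above. The counts in the example are made up for illustration:

```python
import math

# Halstead-based Process Complexity (HPC): n1/n2 are unique counts of
# activities-plus-gateways and data variables; N1/N2 are their total
# occurrence counts, as defined on the previous slide.
def hpc(n1, n2, N1, N2):
    length = n1 * math.log2(n1) + n2 * math.log2(n2)       # process length N
    volume = (N1 + N2) * math.log2(n1 + n2)                # volume V
    difficulty = (n1 / 2) * (N2 / n2)                      # difficulty D
    return length, volume, difficulty

# A process with 4 unique activity/gateway elements and 8 data variables
length, volume, difficulty = hpc(n1=4, n2=8, N1=10, N2=16)
print(length, round(volume, 2), difficulty)  # 32.0 93.21 4.0
```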

Fan-in/Fan-out (Henry and Kafura 1981, "Software structure metrics based on information flow")
Measures the impact of the information flow in a program structure.
– Fan-in: a count of all other modules that call a given module (how many times a sub-process/activity is called).
– Fan-out: a count of all other modules that are called from the module under investigation (how many sub-processes/activities are called from a model).
Complexity of a procedure (PC): PC = Length * ((Fan-in) * (Fan-out))^2
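The PC formula above is a one-liner; the numbers in the example are hypothetical:

```python
# Henry-Kafura structural complexity: PC = length * (fan_in * fan_out)**2.
# 'length' can be any size measure of the module, e.g. its activity count.
def procedure_complexity(length, fan_in, fan_out):
    return length * (fan_in * fan_out) ** 2

# A sub-process of length 10, called by 2 modules, calling 3 others
print(procedure_complexity(10, 2, 3))  # 10 * (2*3)**2 = 360
```

The squared term makes highly connected modules dominate the score even when they are short.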

Interface Complexity (Cardoso et al.)
Activities are invoked when their inputs (fan-in) are available and the activities are scheduled for execution. When an activity completes, its output data is transferred to the activities connected to it.
IC = Length * (number of inputs * number of outputs)^2
Advantages:
– Considers data-driven processes.
– Can be calculated at design time.

Complexity of a graph
Coefficient of Network Complexity (CNC)
– Estimates the complexity of a graph: CNC = number of arcs / (number of activities, joins, and splits)
Complexity Index (CI)
– The minimal number of node reductions that reduces a graph to a single node (the number of structured activities).
Restrictiveness estimator (RT)
– Estimates the number of feasible sequences in a graph; requires the reachability matrix r_ij.
RT = (2 ∑ r_ij − 6(N − 1)) / ((N − 2)(N − 3))
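CNC and RT can be sketched over an adjacency matrix, with the reachability matrix obtained by Warshall's transitive closure. The diagonal is set to 1 here (reflexive reachability), an assumption under which a pure sequence scores RT = 1; the example graph is hypothetical:

```python
# CNC and the restrictiveness estimator (RT) for a process graph given as
# a 0/1 adjacency matrix. Reachability r includes the diagonal (assumed),
# so that a strict sequence yields RT = 1.
def reachability(adj):
    n = len(adj)
    r = [[1 if i == j else adj[i][j] for j in range(n)] for i in range(n)]
    for k in range(n):                     # Warshall's transitive closure
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

def cnc(num_arcs, num_nodes):
    # num_nodes counts activities, joins, and splits
    return num_arcs / num_nodes

def rt(adj):
    n = len(adj)
    total = sum(sum(row) for row in reachability(adj))
    return (2 * total - 6 * (n - 1)) / ((n - 2) * (n - 3))

# Simple chain 0 -> 1 -> 2 -> 3 -> 4: 4 arcs, 5 nodes
adj = [[0, 1, 0, 0, 0],
       [0, 0, 1, 0, 0],
       [0, 0, 0, 1, 0],
       [0, 0, 0, 0, 1],
       [0, 0, 0, 0, 0]]
print(cnc(4, 5), rt(adj))  # 0.8 1.0
```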