
1 Program / STX seminar 1.11.2012
 12:30 – 12:45 Opening, Erkki Savioja, FiSMA
 12:45 – 13:00 STX project, Ossi Taipale, Lappeenranta University of Technology
 13:00 – 13:45 PTP process model, Heikki Uusitalo, Knowit Oy
 13:45 – 14:15 Coffee break
 14:15 – 15:00 Testing in the Cloud, Leah Riungu-Kalliosaari, Lappeenranta University of Technology
 15:00 – 15:30 Standards 29119 and 33063, Ossi Taipale, Lappeenranta University of Technology
 15:30 – 16:00 Top-10 recommended measures, Erkki Savioja, FiSMA
 16:00 – 16:30 Safety: a new process quality characteristic? Timo Varkoi, FiSMA
 16:30 Closing, Erkki Savioja, FiSMA

2 10 metrics for improving the level of management
Pekka Forselius, Senior Advisor, FiSMA ry
Risto Nevalainen, Senior Advisor, FiSMA ry
Erkki Savioja, Managing Director, FiSMA ry

3 Contents
 About FiSMA
 Chosen approach to selecting measures
 Classification of metrics
 Suggested 10 measures
 Examples of recommended sets of measures for different organisations
 Discussion

4 FiSMA – ”For better Management”
 FiSMA helps its member organisations improve their fact-based management through measurement
 History
  Started in 1992, renamed FiSMA in 1997
 Current activities
  Standardisation in ISO/IEC JTC1 SC7; domain is ”SSS” (Software, Systems, Services)
  Research projects and method development
  Knowledge and experience sharing
  Workgroups, forums, seminars etc. for members
  Training services

5 FiSMA and external stakeholders

6 Organisation 2012
 Scope Manager Forum
 FiSMA SPIN
 Testing Models and Standards
 SW Process Standards and Models
 IT Service Management
 Annual meeting
 Board, 6 members
 Managing Director Erkki Savioja
 Senior Advisor Pekka Forselius
 Senior Advisor Risto Nevalainen
 Senior Advisor Timo Varkoi

7 Chosen approach to selecting measures
 The aim is to select generic measures that fit
  Different types of systems
  Different types of organisations
 Generality and significance
  Each measure is introduced in a widely approved standard
  Each measure is based on a reference model
  Focus on process improvement, change management and business needs
  All measures are validated in use in at least one FiSMA member organisation
 A measure should align with the current state of the organisation
  Beneficial right now (immediately)
  Roughly level 3 maturity is assumed, i.e. measurement needs have been identified quite well

8 Classification of metrics
 Metric = a measure and its value. The value can be a goal or an achieved value.
  Example: the system’s current functional size is 600 function points
 Metrics are classified in ISO/IEC 15939 as follows:
  Base measure = measure defined in terms of an attribute and the method for quantifying it
  Derived measure = measure that is defined as a function of two or more values of base measures
  Indicator = measure that provides an estimate or evaluation of specified attributes derived from a model with respect to defined information needs
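As a concrete illustration of the three ISO/IEC 15939 categories, a minimal Python sketch follows; the figures, attribute names and the 0.10 FP/hour baseline are invented for illustration and are not taken from the presentation.

    # Illustrative sketch of the ISO/IEC 15939 measure categories.
    # All figures below are assumed example values.

    # Base measures: one attribute each, quantified with a defined method.
    functional_size_fp = 600        # counted with an FSM method (e.g. function points)
    project_effort_hours = 4800     # collected from time reporting

    # Derived measure: a function of two or more base-measure values.
    productivity_fp_per_hour = functional_size_fp / project_effort_hours

    # Indicator: the derived measure interpreted against a defined information need,
    # here a hypothetical organisational baseline of 0.10 FP/hour.
    baseline_fp_per_hour = 0.10
    meets_baseline = productivity_fp_per_hour >= baseline_fp_per_hour

    print(f"Productivity: {productivity_fp_per_hour:.3f} FP/hour, meets baseline: {meets_baseline}")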

9 Condition-based classification
 Typical areas of software metrics are:
  The software product (itself)
  The process used to produce the software
  The management of software development
  Leading and managing the software business
 Other classifiers:
  The software product standard: internal, external, in use
  The software process standard: capability, maturity
  Software supplier / customer organisation / end-user
  Project model: cost, time, quality, resources, workload, benefit/profit
  Critical systems: stability, integrity, method conformance
  Life cycle model: specification, technical planning, development, verification, validation, production (in all, e.g. V&V findings, coverage, traceability)
  Maturity-wise: initial, managed, defined sets

10 Classifications in this presentation

11 Motivation items for measurement
 Creation of value
 Immediate benefits
 Strategic benefits
 Competitiveness, efficiency
 Changeability, adaptability
 Cost savings
 Creation of growth
 Image, position
 Capability to deliver
 Competence
 Management

12 Topic A: Quality measures of the software product, 2 measures
Recommended metrics:
 A.1 Improvement of efficiency of end-user’s work
  Type: Derived measure
  Main content: The proportion of user tasks supported by the software, compared with all user tasks. A recommended method is a case study.
  What the measure explains: How well and comprehensively the software fulfils user needs.
 A.2 End-user satisfaction
  Type: Base measure
  Main content: User experience. Can be divided into sub-measures. A recommended method is Net Promoter.
  What the measure explains: How successfully the software serves the end-user, e.g. usability and accessibility.
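A small Python sketch of how the two product measures could be computed; the task list and survey scores are hypothetical, and A.2 uses the usual Net Promoter scoring (9–10 promoters, 0–6 detractors) as the slide recommends.

    # Hypothetical example data for measures A.1 and A.2.

    # A.1: proportion of user tasks supported by the software.
    user_tasks_supported = {"enter order": True, "approve invoice": True, "archive contract": False}
    support_rate = sum(user_tasks_supported.values()) / len(user_tasks_supported)

    # A.2: Net Promoter Score from 0-10 "would you recommend" answers.
    scores = [9, 10, 7, 6, 8, 10, 9, 3]
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    nps = (promoters - detractors) / len(scores) * 100   # ranges from -100 to +100

    print(f"A.1 task support rate: {support_rate:.0%}, A.2 NPS: {nps:+.0f}")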

13 Topic B: Software process, 3 measures
Recommended metrics:
 B.1 Maturity of the software process
  Type: Indicator, indirect measure
  Main content: An operational level derived from a summary of selected processes. Well-known and widely applied methods are CMMI and SPICE.
  What the measure explains: The process capability of the supplier organisation to deliver products or services.
 B.2 Agility of the software process
  Type: Indicator, indirect measure
  Main content: The level of agile adoption across the whole software organisation. A recommended method is a survey or an employee inquiry.
  What the measure explains: Ability to react to external changes or requests.
 B.3 Improvability of the software process
  Type: Indicator, indirect measure
  Main content: The proportion of planned and approved improvement efforts that are completed as intended. A recommended method is an audit.
  What the measure explains: Capability to execute change and development activities when the need arises.
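Of these three indicators, B.1 and B.2 come from assessment methods (CMMI/SPICE levels, surveys), but B.3 is a simple completion ratio that can be sketched directly; the action list below is hypothetical.

    # Hypothetical audit data for B.3: share of decided improvement actions
    # completed as planned within the audit period.
    improvement_actions = [
        {"id": "IA-01", "completed": True},
        {"id": "IA-02", "completed": False},
        {"id": "IA-03", "completed": True},
        {"id": "IA-04", "completed": True},
    ]
    completion_rate = sum(a["completed"] for a in improvement_actions) / len(improvement_actions)
    print(f"B.3 improvement completion rate: {completion_rate:.0%}")   # 75%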

14 Topic C: Management of a software project, 2 measures
Recommended metrics:
 C.1 Functional size of the software
  Type: Derived measure
  Main content: The size of the software to be developed, acquired, maintained or subjected to other activity. A recommended method is FiSMA 1.1 or any other ISO/IEC-standard FSM method (e.g. function points).
  What the measure explains: Functional size enables comparisons of quality, efficiency and price data across systems of different sizes. It also indicates the value of the software’s functionality to the end-user.
 C.2 Workload of the software project
  Type: Base measure
  Main content: The complete workload of a defined development team in assigned activities during the life cycle of the system. A recommended unit of workload is an hour.
  What the measure explains: Important source data for timetables, pricing and productivity comparisons.
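A sketch of how C.1 and C.2 might be recorded for one project; the figures are invented, and in practice the functional size would come from an FSM count such as FiSMA 1.1.

    # Hypothetical project record for measures C.1 and C.2.
    functional_size_fp = 450                 # C.1: functional size (FP), from an FSM count

    effort_hours_by_activity = {             # C.2: team workload in hours, per assigned activity
        "specification": 600,
        "design": 500,
        "implementation": 1400,
        "testing": 700,
    }
    total_workload_hours = sum(effort_hours_by_activity.values())

    print(f"C.1 size: {functional_size_fp} FP, C.2 workload: {total_workload_hours} h")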

15 Topic D: Management of the software business, 3 measures
Recommended metrics:
 D.1 Delivery speed of the software
  Type: Indicator, indirect measure
  Main content: Functional size of the software divided by development time, FP/month.
  What the measure explains: Delivery speed relative to comparable projects; indicates the competitiveness of both acquiring and supplying organisations.
 D.2 Cost efficiency of the software purchase
  Type: Indicator, indirect measure
  Main content: Total cost of the purchased software divided by its functional size, €/FP.
  What the measure explains: The cost efficiency of the purchase compared to similar ones; indicates the competitiveness of both acquiring and supplying organisations.
 D.3 Efficiency of the development portfolio
  Type: Derived measure, partly indicator, indirect measure
  Main content: Benefits of the development portfolio compared to the investment. A recommended method is ROI or benefit/cost ratio.
  What the measure explains: The competence to allocate IT effort in accordance with business goals and value creation.
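A worked example of the three business-level measures with invented figures, reusing the hypothetical 450 FP project from the previous sketch.

    # Hypothetical figures for measures D.1-D.3.
    functional_size_fp = 450
    development_time_months = 9
    total_purchase_cost_eur = 540_000
    portfolio_benefit_eur = 1_500_000
    portfolio_investment_eur = 1_000_000

    delivery_speed = functional_size_fp / development_time_months      # D.1: FP/month
    cost_efficiency = total_purchase_cost_eur / functional_size_fp     # D.2: EUR/FP
    portfolio_roi = (portfolio_benefit_eur - portfolio_investment_eur) / portfolio_investment_eur  # D.3

    print(f"D.1: {delivery_speed:.0f} FP/month, D.2: {cost_efficiency:.0f} EUR/FP, D.3 ROI: {portfolio_roi:.0%}")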

16 Example 1: Mid-size software company
 The organisation is a mid-size software company. It has reached ISO 9001 level and now aims to develop further. Some measures are in use for management purposes and as part of project-specific customer reporting. Customer feedback is collected from completed projects on a monthly basis.
 The main driver for measurement is improvement of productivity
 Proposed measures:
  A.2 End-user satisfaction
  B.2 Agility of the software process
  C.1 Functional size of the software
  D.1 Delivery speed of the software
  D.3 Efficiency of the development portfolio

17 Example 2: Large acquiring organisation
 The organisation acquires a large volume of project-type software development services from different suppliers. A frame agreement has been made with key suppliers. The software skills of its own personnel are at a fairly solid level, especially concerning the most central systems.
 Proposed measures:
  A.1 Improvement of efficiency of end-users’ work
  C.1 Functional size of the software
  D.1 Delivery speed of the software
  D.2 Cost efficiency of the software purchase
  D.3 Efficiency of the development portfolio

18 Example 3: Relatively large software product company
 The company is international, and R&D also takes place in many countries. The company has stable products with a wide market share, especially among large corporate customers. The company has achieved several certificates in line with market demands.
 Proposed measures:
  A.2 End-user satisfaction
  B.1 Maturity of the software process
  B.2 Agility of the software process
  B.3 Improvability of the software process
  C.1 Functional size of the software
  D.1 Delivery speed of the software
  D.3 Efficiency of the development portfolio

19 Example 4: IT project house
 The organisation is small and earns its income mainly by selling development projects. During the bidding phase there is often severe competition. Profit margins are low.
 Proposed measures:
  B.2 Agility of the software process
  B.3 Improvability of the software process
  C.1 Functional size of the software
  C.2 Workload of the software project
  D.1 Delivery speed of the software
  D.2 Cost efficiency of the software purchase

