
1 ISACA’s COBIT® Assessment Programme
Presented by:

2 Session Objectives
- An understanding of the new COBIT assessment programme
- An understanding of the relationship to ISO/IEC 15504 and why ISACA selected this standard
- A walk-through of one of the key COBIT 4.1 processes: DS1 Define and manage service levels

3 What is a Process Assessment?
ISO/IEC 15504 identifies process assessment as an activity that can be performed either as part of a process improvement initiative or as part of a capability determination approach.
- The purpose of process improvement is to continually improve the enterprise’s effectiveness and efficiency.
- The purpose of process capability determination is to identify the strengths, weaknesses and risks of selected processes with respect to a particular specified requirement, through the processes used and their alignment with the business need.
Process assessment, as defined in ISO/IEC 15504, provides an understandable, logical, repeatable, reliable and robust methodology for assessing the capability of IT processes.

4 What is the new COBIT Assessment Programme?
The COBIT Assessment Programme includes:
- COBIT Process Assessment Model (PAM): Using COBIT 4.1
- COBIT Assessor Guide: Using COBIT 4.1
- COBIT Self-assessment Guide: Using COBIT 4.1
The COBIT PAM brings together two proven heavyweights in the IT arena, ISO and ISACA: it adapts the existing COBIT 4.1 content into an ISO/IEC 15504-compliant process assessment model.
Speaker notes: ISACA’s new COBIT assessment process brings COBIT together with ISO/IEC 15504, a reference model for assessing process capability (consisting of capability levels, which in turn consist of process attributes, which further consist of generic practices). ISACA publications supporting the COBIT Assessment Programme include the Process Assessment Model (PAM); a guide for certified assessors (more on the concept of “certified assessors” a little later); and a self-assessment guide for enterprises that would like a less formal assessment using the same basic approach. The PAM, the key reference source for an assessment, restates much of the COBIT 4.1 content as an ISO/IEC 15504-compliant process assessment model for use in assessing IT process capability. Note that the PAM supports process improvement through an analysis of gaps; it is not a risk-based methodology.

5 What’s different?
But don’t we already have maturity models for COBIT 4.1 processes? The new COBIT assessment programme is:
- A robust assessment process based on ISO/IEC 15504
- An alignment of COBIT’s maturity model scale with the international standard
- A new capability-based assessment model which includes: specific process requirements derived from COBIT 4.1; the ability to achieve process attributes based on ISO/IEC 15504; evidence requirements; and assessor qualifications and experiential requirements
The result is a more robust, objective and repeatable assessment. Assessment results will likely vary from existing COBIT maturity models!
Speaker notes: Lead into the next slide with differences and say: ‘There is no direct relationship between the existing COBIT 4.1 CMM and the new approach based on ISO/IEC 15504’.

6 Differences to COBIT Maturity Model
The COBIT 4.1 PAM uses a measurement framework that is similar in terminology to the existing maturity models in COBIT 4.1. While the words are similar, the scales are NOT the same:
- The COBIT PAM uses the capability scale from ISO/IEC 15504, whereas the existing COBIT maturity models use a scale derived from the SEI CMM.
- A PAM level 3 is NOT the same as a CMM level 3.
- Assessments done under the PAM are likely to result in ‘lower’ scores.
- PAM assessments are based on more fully defined and defensible attributes.

COBIT 4.1 Process Maturity Level | ISO/IEC 15504 Process Capability Level | Attributes
5 Optimised | 5 Optimizing | PA 5.1 Process innovation; PA 5.2 Process optimization
4 Managed and measurable | 4 Predictable | PA 4.1 Process measurement; PA 4.2 Process control
3 Defined | 3 Established | PA 3.1 Process definition; PA 3.2 Process deployment
2 Repeatable but intuitive | 2 Managed | PA 2.1 Performance management; PA 2.2 Work product management
1 Initial/ad hoc | 1 Performed | PA 1.1 Process performance
0 Non-existent | 0 Incomplete | (none)

7 Process Assessment Model
[Figure: assessment overview showing the relationship between the Process Assessment Model and the assessment process. Reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]

8 Process Reference Model
Process purpose: the high-level measurable objectives of performing the process and the likely outcomes of effective implementation of the process.

9 Process Reference Model
The Process Reference Model consists of three primary elements:
- Process outcomes: an observable result of a process (an artefact, a significant change of state or the meeting of specified constraints)
- Base practices: the activities that, when consistently performed, contribute to achieving the process purpose
- Work products: the artefacts associated with the execution of a process, defined in terms of process ‘inputs’ and process ‘outputs’

10 PRM Based on COBIT 4.1
Process ID: DS1
Process Name: Define and Manage Service Levels
Purpose: Satisfy the business requirement of ensuring the alignment of key IT services with the business needs.
Outcomes (Os):
- DS1-O1: A service management framework is in place to define the organisational structure for service level management, covering the base definitions of services, roles, tasks and responsibilities of internal and external service providers and customers.
- DS1-O2: Internal and external SLAs are formalised in line with customer requirements and delivery capabilities.
- DS1-O3: Operating level agreements (OLAs) are developed to specify the technical processes required to support SLAs.
- DS1-O4: Processes are in place to monitor (and periodically review) SLAs and achievements.
Base Practices (BPs), each supporting one or more of the outcomes above (e.g., DS1-BP2 supports DS1-O1 and DS1-O2):
- DS1-BP1: Create a framework for defining IT services.
- DS1-BP2: Build an IT service catalogue.
- DS1-BP3: Define SLAs for critical IT services.
- DS1-BP4: Define OLAs for meeting SLAs.
- DS1-BP5: Monitor and report end-to-end service level performance.
- DS1-BP6: Review SLAs and underpinning contracts.
- DS1-BP7: Review and update the IT service catalogue.
- DS1-BP8: Create a service improvement plan.
Work Products (WPs), inputs:
- PO1-WP1: Strategic IT plan
- PO1-WP4: IT service portfolio
- PO2-WP5: Assigned data classifications
- PO5-WP3: Updated IT service portfolio
- AI2-WP4: Initial planned SLAs
- AI3-WP7: Initial planned OLAs
- DS4-WP5: Disaster service requirements, including roles and responsibilities
- ME1-WP1: Performance input to IT planning
Work Products (WPs), outputs (with the processes they are input to):
- DS1-WP1: Contract review report (to DS2)
- DS1-WP2: Process performance reports (to ME1)
- DS1-WP3: New/updated service requirements (to PO1)
- DS1-WP4: SLAs (to AI1, DS2, DS3, DS4, DS6, DS8, DS13)
- DS1-WP5: OLAs (to DS4 to DS8, DS11, DS13)
- DS1-WP6
Speaker notes: Let’s look at where the PRM information comes from. Process purpose comes from the business requirement the process satisfies in the COBIT waterfall (left side of the figure). Process outcomes are derived from the ‘objectives’ identified in COBIT 4.1 (right side of the figure). The Process Reference Model has been defined for each of the 34 processes identified in COBIT 4.1.
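For tooling purposes, a PRM entry like the one above can be captured as plain data. Below is a minimal sketch in Python; the dataclass and its field names are illustrative assumptions, and the purpose, outcome and base-practice text is abridged from the slide.

```python
# Illustrative sketch only: a data structure for one PRM entry.
from dataclasses import dataclass, field

@dataclass
class ProcessReferenceEntry:
    process_id: str
    name: str
    purpose: str
    outcomes: dict = field(default_factory=dict)        # id -> description
    base_practices: dict = field(default_factory=dict)  # id -> description

ds1 = ProcessReferenceEntry(
    process_id="DS1",
    name="Define and Manage Service Levels",
    purpose=("Satisfy the business requirement of ensuring the "
             "alignment of key IT services with the business needs."),
    outcomes={
        "DS1-O1": "A service management framework is in place.",
        "DS1-O2": "Internal and external SLAs are formalised.",
        "DS1-O3": "OLAs are developed to support SLAs.",
        "DS1-O4": "Processes are in place to monitor and review SLAs.",
    },
    base_practices={
        "DS1-BP1": "Create a framework for defining IT services.",
        "DS1-BP2": "Build an IT service catalogue.",
        "DS1-BP3": "Define SLAs for critical IT services.",
    },
)
print(f"{ds1.process_id}: {len(ds1.outcomes)} outcomes defined")
```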

11 PRM Based on COBIT 4.1 (continued)
This slide repeats the DS1 Process Reference Model shown on slide 10.
Speaker notes: Let’s look at where the PRM information comes from. Base practices come from the key activities identified in the RACI chart in COBIT 4.1. Work products are obtained from the inputs/outputs tables in COBIT 4.1. The Process Reference Model has been defined for each of the 34 processes identified in COBIT 4.1.

12 Assessment Overview
[Figure: assessment overview. Reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]

13 Process Capability Levels
- Level 5 Optimizing process (PA 5.1 Process innovation; PA 5.2 Process optimization): The process is continuously improved to meet relevant current and projected business goals.
- Level 4 Predictable process (PA 4.1 Process measurement; PA 4.2 Process control): The process is enacted consistently within defined limits.
- Level 3 Established process (PA 3.1 Process definition; PA 3.2 Process deployment): A defined process is used, based on a standard process.
- Level 2 Managed process (PA 2.1 Performance management; PA 2.2 Work product management): The process is managed, and work products are established, controlled and maintained.
- Level 1 Performed process (PA 1.1 Process performance): The process is implemented and achieves its process purpose.
- Level 0 Incomplete process: The process is not implemented or fails to achieve its purpose.

14 Measurement Framework
The COBIT assessment process measures the extent to which a given process achieves specific attributes relative to that process, known as ‘process attributes’. It defines nine process attributes (based on ISO/IEC 15504):
- PA 1.1 Process performance
- PA 2.1 Performance management
- PA 2.2 Work product management
- PA 3.1 Process definition
- PA 3.2 Process deployment
- PA 4.1 Process measurement
- PA 4.2 Process control
- PA 5.1 Process innovation
- PA 5.2 Continuous optimisation
Note that the Process Reference Model (PRM) in the COBIT PAM refers ONLY to level 1 (PA 1.1). All other attributes, PA 2.1 to PA 5.2, deal with generic outcomes.
Speaker notes: The COBIT Assessment Programme approach measures the extent to which a given process achieves specific attributes of a process: process results or performance; management of work products of the process; management of the process performance; definition of the process; deployment of the process; measurement and control of the process; and innovation and optimisation of the process. Let’s take a look at a couple of these in a little more detail so you can get a sense of what they mean.

15 Process Attributes (example)
PA 1.1 Process performance: the process performance attribute is a measure of the extent to which the process purpose is achieved. As a result of full achievement of this attribute, the process achieves its defined outcomes.
Speaker notes: The first process attribute relates to the results or performance of the SPECIFIC PRM process. It is a measure of the extent to which the process achieves its purpose, i.e., what it is designed to do. This attribute is fully achieved when the process achieves its defined outcomes. On this slide and the next one, walk through an example of process attributes PA1 and PA2.

16 Process Attributes (example)
PA 2.1 Performance management: a measure of the extent to which the performance of the process is managed. As a result of full achievement of this attribute:
- Objectives for the performance of the process are identified.
- Performance of the process is planned and monitored.
- Performance of the process is adjusted to meet plans.
- Responsibilities and authorities for performing the process are defined, assigned and communicated.
- Resources and information necessary for performing the process are identified, made available, allocated and used.
- Interfaces between the involved parties are managed to ensure effective communication and clear assignment of responsibility.
PA 2.2 Work product management: a measure of the extent to which the work products produced by the process are appropriately managed. As a result of full achievement of this attribute:
- Requirements for the work products of the process are defined.
- Requirements for documentation and control of the work products are defined.
- Work products are appropriately identified, documented and controlled.
- Work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.
Speaker notes: The level 2 attributes relate to management of the process and its associated work products, and are fully achieved when the conditions listed above hold. We will walk through an example of these shortly.

17 Process Attribute Rating Scale
The COBIT assessment process measures the extent to which a given process achieves the ‘process attributes’ using a four-point scale:
- N (Not achieved, 0 to 15% achievement): There is little or no evidence of achievement of the defined attribute in the assessed process.
- P (Partially achieved, >15% to 50% achievement): There is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable.
- L (Largely achieved, >50% to 85% achievement): There is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weakness related to this attribute may exist in the assessed process.
- F (Fully achieved, >85% to 100% achievement): There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process.
Speaker notes: But first, let us have a look at the ‘rating scale’, which measures the extent to which a given process achieves each of the process attributes. The PAM (consistent with ISO/IEC 15504) defines four levels of ‘achievement’: Not achieved, where there is little or no evidence of achievement of the attribute in the process; Partially achieved, where some of the elements have been achieved; Largely achieved, where there is evidence of a systematic approach to, and achievement of, the attribute elements, although some weaknesses/improvement opportunities may still exist; and Fully achieved, where there is evidence of full achievement of the attribute elements and no significant weaknesses exist.
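The scale boundaries lend themselves to a simple lookup. Here is a minimal sketch in Python (not part of the official PAM materials): the function name and the idea of feeding it a single numeric achievement percentage are illustrative assumptions, since in practice a rating is an assessor’s evidence-based judgement rather than a computed number.

```python
# Illustrative sketch of the ISO/IEC 15504 N/P/L/F rating scale.
def attribute_rating(achievement_pct: float) -> str:
    """Map an achievement percentage to an N/P/L/F rating."""
    if not 0 <= achievement_pct <= 100:
        raise ValueError("achievement must be between 0 and 100")
    if achievement_pct <= 15:
        return "N"   # Not achieved: 0 to 15%
    if achievement_pct <= 50:
        return "P"   # Partially achieved: >15% to 50%
    if achievement_pct <= 85:
        return "L"   # Largely achieved: >50% to 85%
    return "F"       # Fully achieved: >85% to 100%

print(attribute_rating(60))  # -> "L"
```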

18 Process Attribute Ratings and Capability Levels
1 L / F 2 L / F F 3 L / 4 L / F L / F 5 PA 5.1 Innovation PA 5.2 Optimization Level 5 - Optimizing PA 4.2 Control PA 4.1 Measurement Level 4 - Predictable PA 3.2 Deployment PA 3.1 Definition Level 3 - Established PA 2.2 Work product management PA 2.1 Performance management Level 2 - Managed The Process Attributes are organized into logical ‘levels’ representing the various process capability levels – REVEAL and refer briefly to each of the process attributes and the capability levels. Achievement of a given ‘Process Capability level’ requires the attributes for that level to have been ‘Fully’ or ‘Largely’ achieved – and the attributes for all lower levels to be ‘Fully’ achieved. REVEAL - For example – achieving level 1 capability requires Attribute PA 1.1 to be fully or largely achieved. REVEAL – achieving level 2 requires both PA2.1 and PA2.2 to be fully or largely achieved and PA1.1 to be fully achieved. REVEAL – achieving level 3 requires both PA 3.1 and PA3.2 to be fully or largely achieved and PA1.1, 2.1 and 2.2 to be fully achieved. REVEAL/REVEAL – and so on for capability levels 4 and 5. PA 1.1 Process performance Level 1 - Performed Level 0 - Incomplete L/F = Largely or Fully F= Fully This figure is reproduced from ISO :2003 with the permission of ISO at Copyright remains with ISO. Copyright ISACA All rights reserved Slide 18
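The level rule above is mechanical enough to express in code. A minimal sketch, assuming attribute ratings are supplied as a dict keyed by attribute id; the function and constant names are illustrative, not taken from the PAM.

```python
# Illustrative sketch of the slide-18 level-determination rule.
LEVEL_ATTRIBUTES = {
    1: ["PA1.1"],
    2: ["PA2.1", "PA2.2"],
    3: ["PA3.1", "PA3.2"],
    4: ["PA4.1", "PA4.2"],
    5: ["PA5.1", "PA5.2"],
}

def capability_level(ratings):
    """Highest level whose attributes are L/F with all lower levels F."""
    achieved = 0
    for level in range(1, 6):
        this_level = LEVEL_ATTRIBUTES[level]
        lower = [pa for l in range(1, level) for pa in LEVEL_ATTRIBUTES[l]]
        if (all(ratings.get(pa) in ("L", "F") for pa in this_level)
                and all(ratings.get(pa) == "F" for pa in lower)):
            achieved = level
        else:
            break
    return achieved

# PA 1.1 fully achieved, level-2 attributes largely achieved -> level 2
print(capability_level({"PA1.1": "F", "PA2.1": "L", "PA2.2": "L"}))  # -> 2
```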

19 COBIT Assessment Process Overview
[Figure: overview of the COBIT assessment process. Reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]

20 Process Attributes and Capability Levels
[Figure: the nine process attributes and their process attribute indicators (PAIs), arranged by capability level from Incomplete through Performed, Managed, Established and Predictable to Optimizing; based on ISO/IEC 15504 and COBIT. Reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]
Speaker notes: The COBIT Process Assessment Model (PAM) uses the PRM and the measurement framework to define an assessment model for each of the COBIT 4.1 processes. The assessment model defines ‘indicators’ that support achievement of the nine process attributes. The PAM defines two types of indicator:
- Process performance indicators: the base practices and work products, which are specific to each of the 34 COBIT processes
- Process capability indicators: practices, resources and work products that are ‘generic’ and generally apply to all of the 34 COBIT processes

21 Process Attributes and Capability Levels (continued)
[Figure: the nine process attributes by capability level, as on slide 20. Reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]
Speaker notes: as for slide 20.

22 Process Attribute Rating
Assessment indicators in the PAM are used to support the assessors’ judgement in rating process attributes:
- They provide the basis for repeatability across assessments.
- A rating is assigned based on objective, validated evidence for each process attribute.
- Traceability needs to be maintained between an attribute rating and the objective evidence used in determining that rating.
Speaker notes: As implied by their name, indicators do not represent requirements of a process. They represent a common starting point for assessment, which increases the consistency of assessor judgement and enhances the repeatability of the results. The indicators provide a framework for assessment that helps to ensure that: assessors can interpret the organisational unit’s instantiation of a process consistently against the process assessment model(s); the information is captured for subsequent analysis; the information needed for the organisational unit to plan and perform process improvement is captured; and assessment results are representative, reliable and repeatable. The assignment of a rating for a given process attribute needs to be supported by objective, validated evidence, and the traceability of the rating and the supporting evidence needs to be maintained.

23 Example from COBIT 4.1: DS1 Define and manage service levels
We will now look at one of the COBIT 4.1 processes.

24 Process Reference Model Example: DS1
This slide repeats the DS1 Process Reference Model shown on slide 10, identifying the process purpose, outcomes, base practices and principal inputs and outputs.
Speaker notes: For the purposes of this walkthrough we are going to use DS1, Define and Manage Service Levels. For those of you familiar with COBIT 4.1, this should look pretty familiar.

25 Figure 6—PA1.1 Process Performance
Assessing process capability: does the process achieve its defined outcomes (PA 1.1)? This is evidenced by:
- Production of an object
- A significant change of state
- Meeting of specified constraints, e.g., requirements, goals
The rating uses the standard scale: N Not achieved (0 to 15%); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%).
Figure 6—PA1.1 Process Performance:
- Result of full achievement of the attribute: the process achieves its defined outcomes.
- Base practices (BPs): achieve the process outcomes; there is evidence that the intent of the base practices is being performed.
- Work products (WPs): work products are produced that provide evidence of process outcomes, as outlined in section 3.
Speaker notes: The assessor needs to assess whether there is sufficient evidence that PA 1.1 is achieved and whether that achievement is Not, Partially, Largely or Fully satisfied. Note that this is the level where the detailed and specific process requirements from the Process Reference Model are used. The assessor then reaches a conclusion as to the extent to which the attribute has been achieved.

26 Assessing Process Capability
PA 2.1 Performance management:
- Have objectives for the performance of the process been identified?
- Is performance of the process planned and monitored?
- Is performance of the process adjusted to meet plans?
- Are responsibilities and authorities for performing the process defined, assigned and communicated?
- Are resources and information necessary for performing the process identified, made available, allocated and used?
- Are interfaces between the involved parties managed to ensure effective communication and clear assignment of responsibility?
Rating scale: N Not achieved (0 to 15%); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%).
Speaker notes: The same kind of approach is followed to determine whether the other process attributes are achieved. In this case, the assessor would be trying to determine the extent to which the elements of PA 2.1 are achieved. Note: from level 2 onwards you are no longer using the PRM; you are looking primarily at the attribute goals or objectives, called generic outcomes, generic practices and generic work products in the PAM, section 4.

27 Assessing Process Capability (continued)
This slide repeats the PA 2.1 questions and rating scale from slide 26.
Speaker notes: The assessor then reaches a conclusion as to the extent to which the attribute has been achieved.

28 Assessing Process Capability
PA 2.2 Work product management:
- Have requirements for the work products of the process been defined?
- Have requirements for documentation and control of the work products been defined?
- Are work products appropriately identified, documented and controlled?
- Are work products reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements?
Rating scale: N Not achieved (0 to 15%); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%).
Speaker notes: A similar process is applied to each of the other process attributes.

29 Assessing Attribute Achievement
[Figure: summary matrix rating each process attribute, PA 1.1 through PA 5.2, as Not, Partially, Largely or Fully achieved.]
Speaker notes: The extent to which the attributes have been achieved can be summarised.

30 Assessing Process Capability Levels
[Table: attribute ratings required for each capability level, as on slide 18]
- Level 1 Performed: PA 1.1 rated L/F
- Level 2 Managed: PA 1.1 F; PA 2.1 and PA 2.2 L/F
- Level 3 Established: PA 1.1 to PA 2.2 F; PA 3.1 and PA 3.2 L/F
- Level 4 Predictable: PA 1.1 to PA 3.2 F; PA 4.1 and PA 4.2 L/F
- Level 5 Optimising: PA 1.1 to PA 4.2 F; PA 5.1 and PA 5.2 L/F
(L/F = Largely or Fully; F = Fully)
Speaker notes: ...and used as a basis for reaching a conclusion on the capability level of the process assessed.

31 Overview
That was a quick walkthrough of the Process Assessment Model as it is applied to one of the COBIT processes. Obviously the complete assessment would need to repeat those activities for each of the remaining COBIT processes identified as being in scope/relevant for the assessment. I now want to turn our attention to the assessment process itself. There is only enough time today to walk through the assessment process at a very high level. Detailed discussion of the process for a compliant assessment is provided in the Assessor Guide. In addition, simplified guidance has been developed in the Self-assessment Guide for those wanting to perform a simple, judgement-based self-assessment as a precursor to a more formal compliant assessment.
Speaker notes: This is a transition slide to indicate we have completed discussion of the PAM and will now move on to the principal activities in the COBIT assessment process.
[Figure reproduced from ISO/IEC 15504:2003 with the permission of ISO; copyright remains with ISO.]

32 Assessment Process Activities
1. Initiation
2. Planning the assessment
3. Briefing
4. Data collection
5. Data validation
6. Process attribute rating
7. Reporting the results
Speaker notes: The activities associated with performing a compliant COBIT assessment are the seven identified on the slide. We will quickly review the key elements of each.

33 1. Initiation
- Identify the sponsor and define the purpose of the assessment: why is it being carried out?
- Define the scope of the assessment: which processes are being assessed? What constraints, if any, apply to the assessment?
- Identify any additional information that needs to be gathered.
- Select the assessment participants and the assessment team, and define the roles of team members.
- Define assessment inputs and outputs, and have them approved by the sponsor.
Speaker notes: The objective of the initiation phase is to ensure that there is a common understanding with the sponsor on the purpose and scope of the assessment, and to identify the individuals with the appropriate competencies to ensure a successful assessment. For example, is the purpose of the assessment to benchmark current performance and identify improvement opportunities, or to demonstrate contractual/regulatory compliance? Recall that it is highly unlikely an enterprise would assess all 34 COBIT processes, so a scoping tool kit has been provided; see the next slides for an outline and scoping example.

34 Process Assessment Model Walkthrough
Scoping guidance: let’s walk through the Process Assessment Model scoping exercise.

35 Scoping (1)
The aim of scoping, as part of assessment initiation, is to focus the assessment on the business needs of the enterprise. This reduces the overall effort involved in the assessment. One of the benefits of using COBIT 4.1 as the process reference model is that it has extensive validated mappings from business goals to IT goals and IT processes [COBIT 4.1 Appendix 1]; these are available in the tool kit.
There is a six-step selection process (a prioritisation sketch follows below):
- Step 1: Identify relevant business drivers for the IT process assessment.
- Step 2: Prioritise the enterprise’s IT processes that may be included within the scope of the assessment.
- Step 3: Perform a preliminary selection of target processes for inclusion in the assessment, based on the above prioritisation.
- Step 4: Confirm the preliminary selection of target processes with the project sponsor and key stakeholders of the process assessment.
- Step 5: Finalise the processes to be included in the assessment.
- Step 6: Document the scoping methodology in the assessment records.
Speaker notes: Recall that it is highly unlikely an enterprise would assess all 34 COBIT processes.
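To make steps 2 and 3 concrete, here is a minimal sketch of driver-based prioritisation. The driver-to-process mapping below is invented purely for illustration; a real exercise would use the validated COBIT 4.1 goal and process mappings from the tool kit.

```python
# Illustrative sketch of scoping steps 2-3: rank processes by how
# many of the sponsor's business drivers map to them.
from collections import Counter

# Hypothetical driver -> COBIT-process mapping (illustrative only)
DRIVER_TO_PROCESSES = {
    "improve service quality": ["DS1", "DS8", "ME1"],
    "manage IT costs": ["PO5", "DS6", "DS1"],
    "regulatory compliance": ["ME2", "ME3", "DS5"],
}

def preselect(drivers, top_n=5):
    """Return the top-N processes touched by the most relevant drivers."""
    counts = Counter(
        proc for d in drivers for proc in DRIVER_TO_PROCESSES.get(d, [])
    )
    return [proc for proc, _ in counts.most_common(top_n)]

print(preselect(["improve service quality", "manage IT costs"]))
# e.g. -> ['DS1', 'DS8', 'ME1', 'PO5', 'DS6']  (DS1 scores highest)
```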

36 Scoping (2): Available Mappings
- Linking business goals to IT goals
- Linking IT goals to IT processes
- Mapping IT processes to IT governance focus areas and COSO
- US Sarbanes-Oxley Act
- Cloud computing
- Self-diagnostic

37 Scoping (3): Available Mappings (example)
Linking business goals to IT goals, and IT goals to IT processes, e.g., PO5 Manage the IT investment and DS6 Identify and allocate costs.
Speaker notes: Business goals are grouped by the financial, customer, internal, and learning and growth perspectives.

38 2. Planning the Assessment
An assessment plan describing all activities performed in conducting the assessment is developed and documented, together with an assessment schedule:
- Identify the project scope.
- Secure the necessary resources to perform the assessment.
- Determine the method of collating, reviewing, validating and documenting the information required for the assessment.
- Co-ordinate assessment activities with the organisational unit being assessed.
Speaker notes: The assessment planning phase includes such things as: determining the assessment activities, which may be tailored as necessary; determining the necessary resources and schedule for the assessment; defining how the assessment data will be collected, recorded, stored, analysed and presented with reference to the assessment tool; defining the planned outputs of the assessment (assessment outputs desired by the sponsor, in addition to those required as part of the assessment record, are identified and described); verifying conformance to requirements (detailing how the assessment will meet all the requirements in the standard); and managing risks. Potential risk factors and mitigation strategies are documented, prioritised and tracked through assessment planning, and all identified risks are monitored throughout the assessment. Potential risks may include changes to the assessment team, organisational changes, changes to the assessment purpose/scope, lack of resources for the assessment, confidentiality, priority of the data, base practices and criticality of indicators, and availability of key work products such as documents. Planning also covers co-ordinating assessment logistics with the local assessment co-ordinator (compatibility and availability of technical equipment, identified workspace and scheduling requirements), reviewing and obtaining acceptance of the plan (the sponsor identifies who will approve the assessment plan; the plan, including the assessment schedule and logistics for site visits, is reviewed and approved), and confirming the sponsor’s commitment to proceed with the assessment.

39 3. Briefing
The assessment team leader ensures that the assessment team understands the assessment input, process and output. Brief the organisational unit on the performance of the assessment: the PAM, assessment scope, scheduling, constraints, roles and responsibilities, resource requirements, etc.
Speaker notes: The Assessor Guide draws on ISO/IEC 15504 Part 3, Annex A (A.4 Briefing, A.4.1 Overview). Brief the assessment team: ensure that the team understands the approach defined in the documented process and the assessment inputs and outputs, and is proficient in using the assessment tool. Brief the organisational unit: explain the assessment purpose, scope, constraints and model; stress the confidentiality policy and the benefit of assessment outputs; present the assessment schedule; ensure that staff members understand what is being undertaken and their role in the process; and answer any questions or concerns they may have. Potential participants and anyone who will see the presentation of the final results should be present at the briefing session.

40 4. Data Collection
- The assessor obtains (and documents) an understanding of the process(es), including process purpose, inputs, outputs and work products, sufficient to enable and support the assessment.
- Data required for evaluating the processes within the scope of the assessment are collected in a systematic manner.
- The strategy and techniques for the selection, collection and analysis of data, and justification of the ratings, are explicitly identified and demonstrable.
- Each process identified in the assessment scope is assessed on the basis of objective evidence; the objective evidence gathered for each attribute of each process assessed must be sufficient to meet the assessment purpose and scope.
- Objective evidence that supports the assessors’ judgement of process attribute ratings is recorded and maintained in the assessment record. This record provides evidence to substantiate the ratings and to verify compliance with the requirements.
Speaker notes: See the Assessor Guide. Collect evidence of process performance for each process within the scope; evidence includes observation of work products and their characteristics, testimony from the process performers, and observation of the infrastructure established for the performance of the process. Collect evidence of process capability for each process within the scope; evidence of process capability may be more abstract than evidence of process performance, and in some cases the evidence of process performance may be used as evidence of process capability. Record and maintain the references to the evidence that supports the assessors’ judgement of process attribute ratings. Verify the completeness of the data: ensure that for each process assessed, sufficient evidence exists to meet the assessment purpose and scope.

41 5. Data Validation
Actions are taken to ensure that the data are accurate and sufficiently cover the assessment scope, including:
- Seeking information from first-hand, independent sources
- Using past assessment results
- Holding feedback sessions to validate the information collected
Some data validation may occur as the data are being collected.
Speaker notes: Assemble and consolidate the data: for each process, relate the evidence to the defined process indicators. Validate the data: ensure that the data collected are correct and objective, and that the validated data provide complete coverage of the assessment scope.

42 6. Process Attribute Rating
- For each process assessed, a rating is assigned for each process attribute up to and including the highest capability level defined in the assessment scope.
- The rating is based on data validated in the previous activity.
- Traceability must be maintained between the objective evidence collected and the process attribute ratings assigned.
- For each process attribute rated, the relationship between the indicators and the objective evidence is recorded.
Speaker notes: Establish and document the decision-making process used to reach agreement on the ratings (e.g., consensus of the assessment team or majority vote). For each process assessed, assign a rating to each process attribute, using the defined set of assessment indicators in the Process Assessment Model to support the assessors’ judgement. Record the set of process attribute ratings as the process profile, and calculate the capability level rating for each process using the capability level rating criteria.
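The traceability requirement suggests a simple record structure: each attribute rating carries references into the assessment record. A minimal sketch follows, with illustrative class and field names that are assumptions, not taken from the Assessor Guide.

```python
# Illustrative sketch: an attribute rating traceable to its evidence.
from dataclasses import dataclass

@dataclass(frozen=True)
class AttributeRating:
    process_id: str        # e.g. "DS1"
    attribute: str         # e.g. "PA2.1"
    rating: str            # "N", "P", "L" or "F"
    evidence_refs: tuple   # references into the assessment record

    def __post_init__(self):
        if self.rating not in ("N", "P", "L", "F"):
            raise ValueError(f"invalid rating: {self.rating}")
        if not self.evidence_refs:
            raise ValueError("every rating must cite objective evidence")

r = AttributeRating("DS1", "PA2.1", "L",
                    ("interview-004", "SLA-register-2011"))
print(f"{r.process_id} {r.attribute} = {r.rating}, "
      f"evidence: {', '.join(r.evidence_refs)}")
```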

43 7. Reporting the Results
The results of the assessment are analysed and presented in a report. The report also covers any key issues raised during the assessment, such as:
- Observed areas of strength and weakness
- Findings of high risk, i.e., the magnitude of the gap between assessed capability and desired/required capability
Speaker notes: See the Assessor Guide. (The guide’s task list first restates the data validation and attribute rating tasks covered on slides 41 and 42.) During this phase, the results of the assessment are analysed and presented in a report; the report also covers any key issues raised during the assessment, such as observed areas of strength and weakness and findings of high risk. Prepare the assessment report: summarise the findings of the assessment, highlighting the process profiles, key results, observed strengths and weaknesses, identified risk factors, and potential improvement actions (if within the scope of the assessment). Present the assessment results to the participants, focusing the presentation on the capability of the processes assessed. Present the assessment results to the sponsor; the results are also shared with any parties (e.g., organisational unit management, practitioners) specified by the sponsor. Finalise the assessment report and distribute it to the relevant parties. Verify and document that the assessment was performed according to requirements. Assemble the assessment record and provide it to the sponsor for retention and storage. Prepare and approve assessor records: for each assessor, records proving participation in the assessment are produced, and the sponsor or the sponsor’s delegated authority approves them. Provide feedback from the assessment as a means to improve the assessment process.

44 Target Process Capabilities (example)
[Table: example comparing target and assessed ratings for PA 1.1 through PA 3.2 (levels 1 to 3) across processes A, B and C, with assessed ratings of L and F falling short of target for some attributes.]
Depending on the ‘purpose’ of the assessment, it may be appropriate to compare ‘assessed’ capabilities with a ‘target’ or ‘desired’ capability.
Speaker notes: This slide can be used to review the nature of ‘required’ or desired process capabilities by process, and as a way of comparing actual assessed capability against the ‘target’ capability.
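A minimal sketch of that target-versus-assessed comparison; the process names and capability levels below are invented purely for illustration.

```python
# Illustrative sketch: report gaps between target and assessed levels.
targets  = {"Process A": 3, "Process B": 2, "Process C": 3}
assessed = {"Process A": 1, "Process B": 2, "Process C": 2}

for proc, target in targets.items():
    actual = assessed.get(proc, 0)
    gap = target - actual
    status = "meets target" if gap <= 0 else f"gap of {gap} level(s)"
    print(f"{proc}: target {target}, assessed {actual} -> {status}")
```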

45 Consequence of Capability Gaps
Figure A.3—Consequence of Gaps at Various Capability Levels
[Figure reproduced from ISO/IEC 15504 with the permission of ISO; copyright remains with ISO.]
Speaker notes: This slide summarises the nature of any gaps that may exist within a given process capability level. Perhaps the easiest way to think about this is: what is the consequence of NOT being able to achieve the capability level denoted in the first column?

46 Capability Gaps and Risk
Figure A.4—Risk Associated With Each Capability Level
[Figure reproduced from ISO/IEC 15504 with the permission of ISO; copyright remains with ISO.]
Speaker notes: This presents similar information and may be best interpreted as: what is the relative risk if the gap in assessed capability at each capability level is Substantial, Significant or Slight? For example, if the gap between your assessed capability and the requirements to achieve level 2 capability is Substantial, the result would represent a ‘High’ risk to the enterprise. My interpretation would be that: a ‘Slight’ gap at level 1 would be a High risk (still not achieving level 1 capability); a ‘Significant’ gap at level 2 would be a High risk; and a ‘Slight’ gap at level 3 would be a Medium risk.
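Those interpretations can be captured as a small lookup. The sketch below encodes only the four (level, gap) pairings stated in the notes above; anything else is deliberately left unmapped rather than invented, so this is illustrative and not the official ISO/IEC 15504 figure.

```python
# Illustrative sketch: relative risk by capability level and gap size.
RISK = {
    # (level aimed for, gap size): relative risk
    (1, "Slight"):      "High",    # still not achieving level 1
    (2, "Substantial"): "High",
    (2, "Significant"): "High",
    (3, "Slight"):      "Medium",
}

def risk_for(level, gap):
    return RISK.get((level, gap), "not stated in the source")

print(risk_for(2, "Significant"))  # -> "High"
```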

47 Assessor Certification
COBIT process assessment roles:
- Lead assessor: a ‘competent’ assessor responsible for overseeing the assessment activities
- Assessor: an individual, developing assessor competencies, who performs the assessment activities
Assessor competencies:
- Knowledge, skills and experience with the process reference model; the process assessment model, methods and tools; and rating processes
- Knowledge, skills and experience with the processes/domains being assessed
- Personal attributes that contribute to effective performance
A training and certification scheme is being developed for COBIT 4.1 and COBIT 5.

48 Questions? And so Goodbye . . .
COBIT Assessment Programme: Contact information: Questions?

