
1 ISACA's COBIT® Assessment Programme Presented by:

2 Session Objectives
- An understanding of the new COBIT assessment programme
- An understanding of the relationship to ISO/IEC 15504 and why ISACA selected this standard
- A walkthrough of one of the key COBIT 4.1 processes: DS1 Define and manage service levels
Copyright ISACA All rights reserved Slide 2

3 What Is a Process Assessment?
- ISO/IEC 15504 identifies process assessment as an activity that can be performed either as part of a process improvement initiative or as part of a capability determination approach
- The purpose of process improvement is to continually improve the enterprise's effectiveness and efficiency
- The purpose of process capability determination is to identify the strengths, weaknesses and risks of selected processes with respect to a particular specified requirement, through the processes used and their alignment with the business need
- It provides an understandable, logical, repeatable, reliable and robust methodology for assessing the capability of IT processes

4 What Is the New COBIT Assessment Programme?
- The COBIT Assessment Programme includes:
  - COBIT Process Assessment Model (PAM): Using COBIT 4.1
  - COBIT Assessor Guide: Using COBIT 4.1
  - COBIT Self-Assessment Guide: Using COBIT 4.1
- The COBIT PAM brings together two proven heavyweights in the IT arena, ISO and ISACA
- The COBIT PAM adapts the existing COBIT 4.1 content into an ISO/IEC 15504-compliant process assessment model

5 What's Different?
- But don't we already have maturity models for COBIT 4.1 processes?
- The new COBIT assessment programme is:
  - A robust assessment process based on ISO/IEC 15504
  - An alignment of COBIT's maturity model scale with the international standard
  - A new capability-based assessment model which includes:
    - Specific process requirements derived from COBIT 4.1
    - Ability to achieve process attributes based on ISO/IEC 15504
    - Evidence requirements
    - Assessor qualifications and experiential requirements
- The result is a more robust, objective and repeatable assessment
- Assessment results will likely vary from existing COBIT maturity models!

6 Differences From the COBIT Maturity Model
- The COBIT 4.1 PAM uses a measurement framework that is similar in terminology to the existing maturity models in COBIT 4.1
- While the words are similar, the scales are NOT the same:
  - The COBIT PAM uses the capability scale from ISO/IEC 15504, whereas the existing COBIT maturity models use a scale derived from the SEI CMM
  - A PAM level 3 is NOT the same as a CMM level 3
  - Assessments done under the PAM are likely to result in 'lower' scores
  - PAM assessments are based on more fully defined and defensible attributes
- COBIT 4.1 process maturity level / ISO/IEC 15504 process capability level (attributes):
  - 5 Optimised / 5 Optimizing (PA 5.1 Process innovation; PA 5.2 Process optimization)
  - 4 Managed and measurable / 4 Predictable (PA 4.1 Process measurement; PA 4.2 Process control)
  - 3 Defined / 3 Established (PA 3.1 Process definition; PA 3.2 Process deployment)
  - 2 Repeatable but intuitive / 2 Managed (PA 2.1 Performance management; PA 2.2 Work product management)
  - 1 Initial/ad hoc / 1 Performed (PA 1.1 Process performance)
  - 0 Non-existent / 0 Incomplete
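The alignment above can be transcribed as a small lookup table, sketched here in Python. The labels are taken from the slide; the structure (tuple of COBIT label, ISO label, attribute list) and the names are illustrative assumptions, not part of any ISACA toolkit.

```python
# Hypothetical transcription of the slide's maturity-to-capability mapping.
# Keys are the numeric levels, which happen to coincide on both scales
# (even though, as the slide stresses, the scales themselves differ).
MATURITY_TO_CAPABILITY = {
    0: ("Non-existent", "Incomplete", []),
    1: ("Initial/ad hoc", "Performed", ["PA 1.1 Process performance"]),
    2: ("Repeatable but intuitive", "Managed",
        ["PA 2.1 Performance management", "PA 2.2 Work product management"]),
    3: ("Defined", "Established",
        ["PA 3.1 Process definition", "PA 3.2 Process deployment"]),
    4: ("Managed and measurable", "Predictable",
        ["PA 4.1 Process measurement", "PA 4.2 Process control"]),
    5: ("Optimised", "Optimizing",
        ["PA 5.1 Process innovation", "PA 5.2 Process optimization"]),
}

def describe(level):
    """Return a one-line summary for a given numeric level."""
    cobit, iso, attrs = MATURITY_TO_CAPABILITY[level]
    return f"COBIT {level} {cobit} ~ ISO/IEC 15504 {level} {iso}"
```

Note that the numeric labels match but the meanings do not: the slide's point is precisely that a PAM level 3 is not a CMM level 3.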

7 Assessment Overview
[Figure: Process Assessment Model and Assessment Process. Reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

8 Process Reference Model
- Process purpose: the high-level measurable objectives of performing the process and the likely outcomes of effective implementation of the process

9 Process Reference Model
- Base practices: the activities that, when consistently performed, contribute to achieving the process purpose
- Work products: the artefacts associated with the execution of a process, defined in terms of process 'inputs' and process 'outputs'
- Outcome: an observable result of a process - an artefact, a significant change of state or the meeting of specified constraints

10 PRM Based on COBIT 4.1
Process ID: DS1
Process Name: Define and Manage Service Levels
Purpose: Satisfy the business requirement of ensuring the alignment of key IT services with the business needs.
Outcomes (Os):
- DS1-O1: A service management framework is in place to define the organisational structure for service level management, covering the base definitions of services, roles, tasks and responsibilities of internal and external service providers and customers.
- DS1-O2: Internal and external SLAs are formalised in line with customer requirements and delivery capabilities.
- DS1-O3: Operating level agreements (OLAs) are developed to specify the technical processes required to support SLAs.
- DS1-O4: Processes are in place to monitor (and periodically review) SLAs and achievements.
Base Practices (BPs), with the outcomes they support:
- DS1-BP1: Create a framework for defining IT services. (DS1-O1)
- DS1-BP2: Build an IT service catalogue. (DS1-O1, O2)
- DS1-BP3: Define SLAs for critical IT services. (DS1-O2)
- DS1-BP4: Define OLAs for meeting SLAs. (DS1-O3)
- DS1-BP5: Monitor and report end-to-end service level performance. (DS1-O4)
- DS1-BP6: Review SLAs and underpinning contracts. (DS1-O4)
- DS1-BP7: Review and update the IT service catalogue. (DS1-O1)
- DS1-BP8: Create a service improvement plan. (DS1-O1)
Work Products (WPs):
Inputs, with the outcomes they support:
- PO1-WP1: Strategic IT plan (DS1-O1, O2, O3, O4)
- PO1-WP4: IT service portfolio (DS1-O1, O2, O3, O4)
- PO2-WP5: Assigned data classifications (DS1-O1)
- PO5-WP3: Updated IT service portfolio (DS1-O4)
- AI2-WP4: Initial planned SLAs (DS1-O3)
- AI3-WP7: Initial planned OLAs (DS1-O3)
- DS4-WP5: Disaster service requirements, including roles and responsibilities (DS1-O1)
- ME1-WP1: Performance input to IT planning (DS1-O1, O2)
Outputs, with the processes they are input to and the outcomes they support:
- DS1-WP1: Contract review report (to DS2; DS1-O1, O4)
- DS1-WP2: Process performance reports (to ME1; DS1-O4)
- DS1-WP3: New/updated service requirements (to PO1; DS1-O2, O3)
- DS1-WP4: SLAs (to AI1, DS2, DS3, DS4, DS6, DS8, DS13; DS1-O2)
- DS1-WP5: OLAs (to DS4 to DS8, DS11, DS13; DS1-O3)
- DS1-WP6: Updated IT service portfolio (to PO1; DS1-O1, O4)

11 PRM Based on COBIT 4.1 (continued: repeats the DS1 process reference model detail shown on slide 10)

12 Assessment Overview
[Figure reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

13 Process Capability Levels
- Level 0 Incomplete process: the process is not implemented or fails to achieve its purpose
- Level 1 Performed process (PA 1.1 Process performance attribute): the process is implemented and achieves its process purpose
- Level 2 Managed process (PA 2.1 Performance management attribute; PA 2.2 Work product management attribute): the process is managed and work products are established, controlled and maintained
- Level 3 Established process (PA 3.1 Process definition attribute; PA 3.2 Process deployment attribute): a defined process is used based on a standard process
- Level 4 Predictable process (PA 4.1 Process measurement attribute; PA 4.2 Process control attribute): the process is enacted consistently within defined limits
- Level 5 Optimizing process (PA 5.1 Process innovation attribute; PA 5.2 Process optimization attribute): the process is continuously improved to meet relevant current and projected business goals

14 Measurement Framework
- The COBIT assessment process measures the extent to which a given process achieves specific attributes relative to that process ('process attributes')
- The COBIT assessment process defines 9 process attributes (based on ISO/IEC 15504-2):
  - PA 1.1 Process performance
  - PA 2.1 Performance management
  - PA 2.2 Work product management
  - PA 3.1 Process definition
  - PA 3.2 Process deployment
  - PA 4.1 Process measurement
  - PA 4.2 Process control
  - PA 5.1 Process innovation
  - PA 5.2 Continuous optimisation

15 Process Attributes (example)
- PA 1.1 Process performance: a measure of the extent to which the process purpose is achieved. As a result of full achievement of this attribute, the process achieves its defined outcomes.

16 Process Attributes (example)
- PA 2.1 Performance management: a measure of the extent to which the performance of the process is managed. As a result of full achievement of this attribute:
  a. Objectives for the performance of the process are identified.
  b. Performance of the process is planned and monitored.
  c. Performance of the process is adjusted to meet plans.
  d. Responsibilities and authorities for performing the process are defined, assigned and communicated.
  e. Resources and information necessary for performing the process are identified, made available, allocated and used.
  f. Interfaces between the involved parties are managed to ensure effective communication and clear assignment of responsibility.
- PA 2.2 Work product management: a measure of the extent to which the work products produced by the process are appropriately managed. As a result of full achievement of this attribute:
  a. Requirements for the work products of the process are defined.
  b. Requirements for documentation and control of the work products are defined.
  c. Work products are appropriately identified, documented and controlled.
  d. Work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.

17 Process Attribute Rating Scale
- The COBIT assessment process measures the extent to which a given process achieves the 'process attributes'
- N Not achieved (0 to 15% achievement): there is little or no evidence of achievement of the defined attribute in the assessed process
- P Partially achieved (>15% to 50% achievement): there is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable
- L Largely achieved (>50% to 85% achievement): there is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weakness related to this attribute may exist in the assessed process
- F Fully achieved (>85% to 100% achievement): there is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process
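The percentage bands above are a simple partition of 0-100%, which can be sketched as a tiny lookup. The function name and the choice to treat each upper bound as inclusive (matching "0 to 15%", ">15% to 50%", etc.) are illustrative assumptions.

```python
def attribute_rating(pct):
    """Map a percentage achievement (0-100) to the N-P-L-F scale on this slide."""
    if pct <= 15:
        return "N"   # Not achieved: 0 to 15%
    if pct <= 50:
        return "P"   # Partially achieved: >15% to 50%
    if pct <= 85:
        return "L"   # Largely achieved: >50% to 85%
    return "F"       # Fully achieved: >85% to 100%
```

In practice the percentage itself is a matter of assessor judgement backed by evidence; the bands only standardise how that judgement is reported.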

18 Process Attribute Ratings and Capability Levels (L/F = Largely or Fully; F = Fully)
- Level 1 Performed: PA 1.1 Process performance rated L/F
- Level 2 Managed: PA 1.1 rated F; PA 2.1 Performance management and PA 2.2 Work product management rated L/F
- Level 3 Established: PA 1.1, PA 2.1 and PA 2.2 rated F; PA 3.1 Definition and PA 3.2 Deployment rated L/F
- Level 4 Predictable: all level 1-3 attributes rated F; PA 4.1 Measurement and PA 4.2 Control rated L/F
- Level 5 Optimizing: all level 1-4 attributes rated F; PA 5.1 Innovation and PA 5.2 Optimization rated L/F
[Table reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]
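The level-determination rule implied by this table can be sketched as follows: a capability level counts as achieved when its own attributes are rated at least Largely and every attribute of the levels below is rated Fully. This is a minimal illustration of that rule; the function and dictionary names are hypothetical, not part of the assessment toolkit.

```python
# Process attributes grouped by the capability level they belong to.
LEVEL_ATTRS = {
    1: ["PA 1.1"],
    2: ["PA 2.1", "PA 2.2"],
    3: ["PA 3.1", "PA 3.2"],
    4: ["PA 4.1", "PA 4.2"],
    5: ["PA 5.1", "PA 5.2"],
}

def capability_level(ratings):
    """Return the highest capability level achieved.

    ratings maps an attribute id (e.g. "PA 2.1") to one of 'N', 'P', 'L', 'F';
    missing attributes are treated as Not achieved.
    """
    achieved = 0
    for level in range(1, 6):
        # This level's own attributes must be Largely or Fully achieved...
        if not all(ratings.get(a, "N") in ("L", "F") for a in LEVEL_ATTRS[level]):
            break
        # ...and every attribute of the levels below must be Fully achieved.
        if not all(ratings.get(a, "N") == "F"
                   for lower in range(1, level)
                   for a in LEVEL_ATTRS[lower]):
            break
        achieved = level
    return achieved
```

For example, a process with PA 1.1 rated F and PA 2.1/PA 2.2 rated L and F reaches level 2, but a process with PA 1.1 rated only L stops at level 1 regardless of higher ratings.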

19 COBIT Assessment Process Overview
[Figure reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

20 Process Attributes and Capability Levels
[Figure: the six capability levels (Incomplete, Performed, Managed, Established, Predictable, Optimizing), the 9 process attributes and their process attribute indicators (PAIs), mapped between COBIT and ISO. Reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

21 Process Attributes and Capability Levels
[Figure: the six capability levels (Incomplete, Performed, Managed, Established, Predictable, Optimizing). Reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

22 Process Attribute Rating
- Assessment indicators in the PAM are used to support the assessors' judgement in rating process attributes: they provide the basis for repeatability across assessments
- A rating is assigned based on objective, validated evidence for each process attribute
- Traceability needs to be maintained between an attribute rating and the objective evidence used in determining that rating

23 Example from COBIT 4.1: DS1 Define and manage service levels

24 Process Reference Model Example: DS1 (repeats the DS1 process reference model detail shown on slide 10)

25 Assessing Process Capability
- Does the process achieve its defined outcomes (PA 1.1)? As evidenced by:
  - Production of an object
  - A significant change of state
  - Meeting of specified constraints, e.g., requirements, goals
- Rating: N Not achieved (0 to 15% achievement); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%)
- Figure 6, PA 1.1 Process performance:
  - Result of full achievement of the attribute: the process achieves its defined outcomes
  - Base practices (BPs): BP Achieve the process outcomes; there is evidence that the intent of the base practice is being performed
  - Work products (WPs): work products are produced that provide evidence of process outcomes, as outlined in section 3

26 Assessing Process Capability
- PA 2.1 Performance management:
  a. Have objectives for the performance of the process been identified?
  b. Is performance of the process planned and monitored?
  c. Is performance of the process adjusted to meet plans?
  d. Are responsibilities and authorities for performing the process defined, assigned and communicated?
  e. Are resources and information necessary for performing the process identified, made available, allocated and used?
  f. Are interfaces between the involved parties managed to ensure effective communication and clear assignment of responsibility?
- Rating: N Not achieved (0 to 15% achievement); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%)

27 Assessing Process Capability (continued: repeats the PA 2.1 Performance management questions from slide 26)

28 Assessing Process Capability
- PA 2.2 Work product management:
  a. Have requirements for the work products of the process been defined?
  b. Have requirements for documentation and control of the work products been defined?
  c. Are work products appropriately identified, documented and controlled?
  d. Are work products reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements?
- Rating: N Not achieved (0 to 15% achievement); P Partially achieved (>15% to 50%); L Largely achieved (>50% to 85%); F Fully achieved (>85% to 100%)

29 Assessing Attribute Achievement
[Chart: each of the nine process attributes (PA 1.1 Process performance; PA 2.1 Performance management; PA 2.2 Work product management; PA 3.1 Definition; PA 3.2 Deployment; PA 4.1 Measurement; PA 4.2 Control; PA 5.1 Innovation; PA 5.2 Optimisation) rated on the Not / Partially / Largely / Fully achievement scale.]

30 Assessing Process Capability Levels
[Chart: the nine process attributes against capability levels 0 (Incomplete) to 5 (Optimising); L/F = Largely or Fully, F = Fully. A level requires its own attributes to be rated at least Largely and all lower-level attributes to be rated Fully.]

31 Overview
[Figure reproduced from ISO/IEC 15504-2:2003 with the permission of ISO. Copyright remains with ISO.]

32 Assessment Process Activities
1. Initiation
2. Planning the assessment
3. Briefing
4. Data collection
5. Data validation
6. Process attribute rating
7. Reporting the results

33 1. Initiation
- Identify the sponsor and define the purpose of the assessment: why is it being carried out?
- Define the scope of the assessment: which processes are being assessed? What constraints, if any, apply to the assessment?
- Identify any additional information that needs to be gathered
- Select the assessment participants and the assessment team, and define the roles of team members
- Define assessment inputs and outputs, and have them approved by the sponsor

34 Scoping Guidance: Process Assessment Model Walkthrough

35 Scoping (1)
- The aim of scoping, as part of assessment initiation, is to focus the assessment on the business needs of the enterprise. This reduces the overall effort involved in the assessment
- One of the benefits of using COBIT 4.1 as the process reference model is that it has extensive validated mappings from business goals to IT goals and IT processes [COBIT 4.1 Appendix 1]. These are available in the toolkit
- There is a six-step selection process:
  - Step 1: Identify relevant business drivers for the IT process assessment
  - Step 2: Prioritise the enterprise's IT processes that may be included within the scope of the assessment
  - Step 3: Perform a preliminary selection of target processes for inclusion in the assessment, based on the above prioritisation
  - Step 4: Confirm the preliminary selection of target processes with the project sponsor and key stakeholders of the process assessment
  - Step 5: Finalise the processes to be included in the assessment
  - Step 6: Document the scoping methodology in the assessment records

36 Scoping (2)
- Available mappings:
  - Linking business goals to IT goals
  - Linking IT goals to IT processes
  - Mapping IT processes to IT governance focus areas and COSO
  - US Sarbanes-Oxley Act
  - Cloud computing
  - Self-diagnostic

37 Scoping (3)
- Available mappings:
  - Linking business goals to IT goals
  - Linking IT goals to IT processes

38 2. Planning the Assessment
- An assessment plan describing all activities performed in conducting the assessment is developed and documented, together with an assessment schedule
- Identify the project scope
- Secure the necessary resources to perform the assessment
- Determine the method of collating, reviewing, validating and documenting the information required for the assessment
- Co-ordinate assessment activities with the organisational unit being assessed

39 3. Briefing
- The assessment team leader ensures that the assessment team understands the assessment input, process and output
- Brief the organisational unit on the performance of the assessment: PAM, assessment scope, scheduling, constraints, roles and responsibilities, resource requirements, etc.

40 4. Data Collection
- The assessor obtains (and documents) an understanding of the process(es), including process purpose, inputs, outputs and work products, sufficient to enable and support the assessment
- Data required for evaluating the processes within the scope of the assessment are collected in a systematic manner
- The strategy and techniques for the selection, collection and analysis of data and the justification of the ratings are explicitly identified and demonstrable
- Each process identified in the assessment scope is assessed on the basis of objective evidence:
  - The objective evidence gathered for each attribute of each process assessed must be sufficient to meet the assessment purpose and scope
  - Objective evidence that supports the assessors' judgement of process attribute ratings is recorded and maintained in the assessment record. This record provides evidence to substantiate the ratings and to verify compliance with the requirements

41 5. Data Validation
- Actions are taken to ensure that the data are accurate and sufficiently cover the assessment scope, including:
  - Seeking information from firsthand, independent sources
  - Using past assessment results
  - Holding feedback sessions to validate the information collected
- Some data validation may occur as the data are being collected

42 6. Process Attribute Rating
- For each process assessed, a rating is assigned for each process attribute up to and including the highest capability level defined in the assessment scope
- The rating is based on data validated in the previous activity
- Traceability must be maintained between the objective evidence collected and the process attribute ratings assigned
- For each process attribute rated, the relationship between the indicators and the objective evidence is recorded
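One way to keep the traceability this step requires is to store each attribute rating together with references to the evidence that substantiates it. The record type below is a minimal sketch under that assumption; the class and field names are hypothetical, not part of the assessment toolkit.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeRating:
    """One process attribute rating plus the objective evidence behind it."""
    process_id: str   # e.g. "DS1"
    attribute: str    # e.g. "PA 2.1"
    rating: str       # one of 'N', 'P', 'L', 'F'
    evidence: list = field(default_factory=list)  # work product refs, interview notes, etc.

    def is_traceable(self):
        # Any rating above 'Not achieved' should cite at least one piece of evidence.
        return self.rating == "N" or bool(self.evidence)
```

A collection of such records forms the assessment record: each rating can then be checked for supporting evidence before the results are reported.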

43 7. Reporting the Results
- The results of the assessment are analysed and presented in a report
- The report also covers any key issues raised during the assessment, such as:
  - Observed areas of strength and weakness
  - Findings of high risk, i.e., the magnitude of the gap between assessed capability and desired/required capability

44 Target Process Capabilities (example)
[Table: processes A, B and C with target capability vs. assessed attribute ratings across PA 1.1, PA 2.1, PA 2.2, PA 3.1 and PA 3.2 (capability levels 1 to 3); example ratings of L and F show where assessed capability falls short of the target.]
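Once target and assessed capability levels are known for each in-scope process, the gap analysis behind a slide like this is a simple subtraction. This sketch uses hypothetical figures loosely modelled on the example; nothing here comes from the ISACA toolkit.

```python
def capability_gaps(target, assessed):
    """Gap (target level minus assessed level) for each process in scope."""
    return {name: target[name] - assessed.get(name, 0) for name in target}

# Illustrative figures: Process B falls two levels short of its target.
gaps = capability_gaps(
    target={"Process A": 2, "Process B": 3, "Process C": 3},
    assessed={"Process A": 2, "Process B": 1, "Process C": 3},
)
```

The size of each gap feeds directly into the risk discussion on the next two slides: the larger the gap at a low capability level, the more serious the consequence.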

45 Consequence of Capability Gaps
Figure A.3: Consequence of gaps at various capability levels
[Figure reproduced from ISO/IEC 15504 with the permission of ISO. Copyright remains with ISO.]

46 Capability Gaps and Risk
Figure A.4: Risk associated with each capability level
[Figure reproduced from ISO/IEC 15504 with the permission of ISO. Copyright remains with ISO.]

47 Assessor Certification
- COBIT process assessment roles:
  - Lead assessor: a 'competent' assessor responsible for overseeing the assessment activities
  - Assessor: an individual, developing assessor competencies, who performs the assessment activities
- Assessor competencies:
  - Knowledge, skills and experience with the process reference model; the process assessment model, methods and tools; and rating processes
  - Knowledge, skills and experience with the processes/domains being assessed
  - Personal attributes that contribute to effective performance
- A training and certification scheme is being developed for COBIT 4.1 and COBIT 5

48 And so Goodbye...
- COBIT Assessment Programme contact information:

