Auditor’s Guide to IT Auditing by Richard Cascarino.


1 Auditor’s Guide to IT Auditing by Richard Cascarino

2 Part I: IT Audit Process Technology and Audit IS Audit Function Knowledge IS Risk and Fundamental Auditing Concepts Standards and Guidelines for IT Auditing Internal Control Concepts Knowledge Risk Management of the IT Function Audit Planning Process Audit Management Audit Evidence Process Audit Reporting and Follow-up

3 Technology and Audit
- Key Concepts: "Reasonable Assurance" and "Acceptable Levels"
- Control Environments
- Primary Element of Internal Control
- Establishes conditions under which Internal Controls will operate: organization structure, control framework, organizational policies and procedures, external influences

4 Control Environment
- Organizational Structure
  - Defines individual managers' responsibilities
  - Sets limits of authority
  - Ensures appropriate segregation of duties

5 Control Framework (1)
- May be complex or simple
  - Large organizations tend to have highly structured control frameworks
  - Small organizations frequently use personal contact between employees
- Elements: segregation of duties, competence and integrity of people, appropriate levels of authority, accountability, adequate resources, supervision and review

6 Control Framework (2)
- Policies and Procedures describe:
  - Scope of the function
  - Activities
  - Interrelationships with other departments
  - External influences: laws and regulations, customers, suppliers, union agreements, competitive environments

7 Manual and Automated Systems
- Systems Software
  - Computer programs and routines controlling computer hardware, processing, and non-user functions
  - Includes operating systems, telecommunications software, and data management
- Applications Software
  - Computer programs written to support business functions
  - Includes general ledger, payroll, stock systems, and order processing

8 Manual and Automated Systems
- End-User Systems
  - Generated outside the IT organization to meet specific user needs
  - Includes micro-based systems and user-developed systems

9 Control Procedures
- General IT Controls
  - Computer Operations: physical security, logical security, program change control
  - Systems Development
- Application Controls: business-systems oriented; accuracy, completeness, authorization
- Compensating Controls: weak controls may be compensated for by other controls

10 Data and Transactions Objectives
- Input objectives
  - All transactions are initially and completely recorded
  - All transactions are completely and accurately entered into the system
  - All transactions are entered once only
- Controls may include pre-numbered documents, control total reconciliation, data validation, activity logging, document scanning, access authorization, and document cancellation (see the sketch below)
- Input methods: on-line input, batch input, system interfaces, EDI
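
The control-total and one-time-entry objectives above lend themselves to simple automated tests. The Python sketch below is a minimal illustration only, not a prescribed CAAT; it assumes a hypothetical batch file with columns doc_no and amount and header control figures supplied separately.

import csv

def reconcile_batch(path, expected_count, expected_total):
    """Return a list of exceptions found while re-totalling a batch file."""
    exceptions = []
    seen_docs = set()
    count, total = 0, 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            total += float(row["amount"])
            if row["doc_no"] in seen_docs:           # entered once only
                exceptions.append(f"duplicate document {row['doc_no']}")
            seen_docs.add(row["doc_no"])
    if count != expected_count:                       # completeness check
        exceptions.append(f"record count {count} != control count {expected_count}")
    if round(total, 2) != round(expected_total, 2):   # accuracy check against control total
        exceptions.append(f"batch total {total:.2f} != control total {expected_total:.2f}")
    return exceptions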

11 Data and Transaction Objectives
- Processing objectives
  - Approved transactions are accepted by the system and processed
  - All rejected transactions are reported, corrected, and re-input
  - All accepted transactions are processed once only
  - All transactions are accurately processed
  - All transactions are completely processed

12 Controls May Include
- Control totals, programmed balancing, segregation of duties, restricted access, file labels, exception reports, error logs, reasonableness tests, concurrent update control
- Processing types: batch processing, interactive update, on-line batch

13 Data and Transaction Objectives
- Output forms: hard copy, file output, on-line enquiry files
- Primary objectives
  - Assurance that the results of input and processing are output
  - Output is available only to authorized personnel
- Typical controls: complete audit trail, output distribution logs

14 Program Control Objectives
- Integrity of programs and processing
- Change control
  - Prevention of unwanted changes
  - Ensuring adequate design and development control
  - Ensuring adequate testing
  - Controlled program transfer
  - Ongoing maintainability of systems

15 Typical Controls
- Use of a formal Systems Development Life Cycle
- User involvement
- Adequate documentation
- Formalized testing plan
- Planned conversion
- Use of post-implementation reviews
- Establishment of a QA function
- Involvement of Internal Auditors

16 Batch vs On-line
- Early days - batch only
  - All inputs collected centrally and input together in "batches"
  - Normally punched cards
  - May be entered via terminal with update taking place in batch mode
  - Primary control objectives: accuracy, completeness

17 Nowadays
- On-line, real-time input with a small batch component
- Input via a terminal with instantaneous update
- Overnight report production
- Terminals may be local or remote, dial-up or dedicated, and of differing types
- Primary control objectives: availability, security, confidentiality, accuracy

18 Communications
- Microwave
- Satellite
- Cables: dedicated or dial-up
- Line operations
  - Digital to analogue
  - Simplex - one way only
  - Half-duplex - one way at a time
  - Duplex - two-way communications

19 Other Concepts
- Synchronous communications: high-speed transmission and reception of long groups of characters
- Asynchronous communications: slow, irregular transmissions, one character at a time with start and stop bits
- Encryption: scrambling of data into unreadable forms such that it can be unscrambled
- Protocol: a set of rules for message transmission in the network

20 Networks
- Private Public Switched (PSNs)
- Value Added (VANs)
- Local Area (LANs)
- Wide Area (WANs)
- The Internet

21 Network Configurations
- Point-to-Point: separate, direct links
- Multidrop: multiple terminals sharing a single line
- Ring networks: no central computer, each machine is a "node"
- Star networks: single central computer coordinating all communications

22 On-line Systems Capabilities
- On-line enquiry: allows a remote user to retrieve data directly
  - Primary concern: confidentiality
- On-line data entry: remote entry of data, allowing concurrent processing of data
  - Primary concerns: transaction authenticity, accuracy, completeness
- On-line update: as per on-line data entry but with immediate effect
  - Primary concerns: concurrency control, availability

23 Basic On-line Concerns
- Availability
- Security
  - Unauthorized access
  - Accidental or intentional changes
- Security-threatened areas: operating system, management features, inter-computer communication, dial-up access, gateways, poor performance

24 Availability (1)
- Availability of hardware components, software, data, networking capability, and human resources

25 Availability (2)
- Ensured by an adequate physical environment, adequate backups, multiple redundancies, peer-to-peer networking, adequate Disaster Recovery Planning, and training

26 Security (1)
- A factor of hardware, software, and the human element
- Hardware: theft, sabotage, penetration
- Operating system software: theft, corruption, bypassing

27 Security (2)
- Applications software: theft, corruption, bypassing, substitution
- Data: theft, corruption, substitution, manipulation

28 Sources of Security Threats
- Insiders - Users
- Insiders - Specialists
- Outsiders - Legitimate
- Outsiders - Hackers

29 IS Audit Function Knowledge

30 IS Internal Audit Definition  Internal Audit is:  An independent, objective assurance and consulting activity designed to:  Add value and  Improve an organization’s operations  It helps an organization accomplish its objectives by:  Bringing a systematic, disciplined approach to  Evaluate and improve the effectiveness of:  Risk management  Control and  Governance processes.

31 Main Objectives  Express an opinion  Interpret factual evidence  Make constructive and cost-effective suggestions  Presented in a report  Additional objectives  Discovery of errors  Discovery of fraud

32 Scope of Internal Audit  Review reliability and integrity of financial and operating information  Review means used to identify, measure, classify, and report such information  Review systems to determine compliance with policies, plans, procedures, laws, and regulations  Review the means of safeguarding assets  Appraise economy and efficiency of resource utilization  Review operations or programmes to ascertain whether results are consistent with published objectives and are being carried out

33 Target Areas (1)
- Examine and appraise management aspects of the organization
- Independence required
- Within the normal organizational structure
- Examines management's goals, policies, decisions, standards, procedures, and controls
- Perform special assignments as requested
- Report to management

34 Target Areas (2)
- Examine and appraise the administrative and financial aspects
- Strengthening systems and controls: adequacy, application
- Review reliability of records
- Assist directly in uncovering fraud and errors
- Assist indirectly in preventing fraud and errors
- Ensure compliance with policies
- Ensure compliance with statute
- Ensure adequate reporting takes place

35 Why have Internal Audit?
- Provides management with an independent opinion on the state of Internal Control
- Assures management that information presented to them is consistent, uniform, and standardized
- The chances of detecting fraud and errors are increased
- Enables management's evaluation of Internal Audit itself
- Assists the auditee to improve

36 Internal Audit is  Separate from normal operations  A staff (personnel) function  No line authority  Recommend not instruct  Objective due to distance from operations  Reporting to a high enough level to maintain independence

37 What and When to Audit?
- Depends on risk: financial loss, public embarrassment, industrial action, fraud
- Risk measurement: cost of an event times likelihood of occurrence
  - May be mitigated by good internal control
  - May be exacerbated by poor internal control
- Risk may be accepted, reduced, or transferred - but NOT ignored

38 Control Responsibility  Management's Job  Planning  Establishing objectives and goals  Choosing preferred methods of utilizing resources  Organizing  Gathering the required resources  Arranging them in such a way that the objectives may be attained  Directing  Authorizing, instructing, and monitoring performance  Periodically comparing actual to planned performance  Leading  Control

39 Audit Responsibility  Evaluation of controls  Testing compliance with controls  Not  Designing controls  Implementing controls

40 Internal Audit Place, Role, and Function  Place of the Internal Audit Function  Independence  Role and function of Internal Audit  Role of Audit Committees  Internal Audit Definitions

41 Place of the Internal Audit Function
- Organizational status influenced by:
  - Level of responsibility of work undertaken
  - Importance of work undertaken
  - Value attached by management
- Level of Internal Audit reporting should:
  - Give accessibility to top management
  - Be sufficient to promote independence
  - Ensure full-scope auditing
  - Ensure adequate attention is given to audit reporting
  - Ensure appropriate action is taken on findings

42 Place of the Internal Audit Function  Organizational Plan  Grouping together by management of resources to achieve a logical flow of action  To achieve audit independence  Audit grouped separately  Outside of the chain of command  Reporting independently

43 Audit Reporting  May be to  Top Executive Management  May lead to distrust by others  May lead to a lot of non-audit activities  Chief Executive Officer  Less threatening but good independence  Problems of access  Financial Director  Traditional reporting structure - can work well  May be a problem with other departments  Audit Committee

44 Audit Committee  Committee of persons with specialized knowledge  Link executive management / external audit / internal audit  Should:  Consist of a majority of non-executive directors  Meet regularly - minimum four times a year  Not be chaired by the chief executive  Approves audit plans and receives audit reports  Recommended by all Corporate Governance studies  Not present in all companies

45 Dual Reporting  Common solution  Functional reporting to the Audit Committee  Administrative reporting to the Chief Executive  Possible problems  Being pulled in two directions  Possibly open to manipulation  Audit undertaking line functions

46 IS Risk and Fundamental Auditing Concepts  Objectives of the Risk-based Approach  Nature of Systems Risk  Key Risk Characteristics  Assessing the Risk  Scoring System Risk Characteristics  Ranking Systems by Risk

47 Objectives of the Risk-based Approach  To allow the allocation of scarce IT audit resources to systems containing the greatest corporate risk  To consistently measure, on an agreed basis, the relative risk of diverse systems  To objectively rank systems in order of their vulnerabilities  To identify the elements making up systems risk for a given system

48 Nature of Systems Risk
- Risks vary in emphasis from one system to another
  - Batch processing of minor value - low risk
  - On-line, real-time processing of FOREX - high risk
- Why?
  - Relative value of the assets controlled
  - Impact on the business of the system failing
  - Technical complexity involved
  - Sensitivity of the transactions involved

49 Key Risk Characteristics  Primary Risk Characteristics include: – Monetary values handled by the system – Value of information used by the system – Confidentiality of information used by the system – Extent of influence on the system of statutes and regulatory bodies – The technical complexity of the processing of the system – The stability of the system – The impact on the business of system failure or interruption  Plus any others appropriate to your specific company or industry

50 Assessing the Risk
- Evaluation must be kept simple
- Risk assessment may be done with or without "weightings"
- Overall approach:
  - Obtain a brief understanding of each system: scope, coverage, volumes, values
  - Score each characteristic
  - Add up the scores and rank the systems

51 Scoring Systems Risk Characteristics (1)
- Monetary Values: the higher the monetary value handled, the greater the potential loss. Score:
  - 10 if the system controls over 49% of the total assets, revenues, etc.
  - 9 if the system controls 30-49%
  - 8 if the system controls 20-29%
  - 7 if the system controls 10-19%
  - 6 if the system controls 5-9%
  - 5 if the system controls less than 5% of the total assets, revenues, etc.

52 Scoring Systems Risk Characteristics (2)
- Value of Information: information may be vulnerable to loss or disclosure; here we look only at the effect of loss on the enterprise. More subjective; score:
  - 10 extremely valuable, loss of which endangers the entire enterprise
  - 7 valuable, loss is likely to have a major impact
  - 3 valuable, loss is likely to have some impact
  - 1 small value, loss causing negligible impact

53 Scoring Systems Risk Characteristics (3)
- Confidentiality of Information: disclosure of data to another party may detrimentally impact the organisation. Data handled and stored by the system rated as:
  - 8 highly damaging, immediate effect on share price, profitability, or image
  - 5 seriously damaging, external impact, legal impact, and financial or operational damage
  - 3 embarrassing, largely internal impact, small financial or operational damage
  - 1 mildly embarrassing, negligible impact

54 Scoring Systems Risk Characteristics (4)
- Statute and Regulatory Implications: many systems support business functions which are regulated by laws or regulatory bodies. Score:
  - 7 the system controls the principal regulated business function
  - 5 the system controls a major business activity which is subject to regulatory control
  - 3 the system controls a subsidiary activity subject to regulatory control
  - 2 the system controls a subsidiary activity and produces information which is sent to regulatory bodies
  - 0 no regulations apply

55 Scoring Systems Risk Characteristics (5)
- Technical Complexity: the higher the complexity, the greater the chance of getting it wrong. Complexities include use of new, untested hardware, distributed systems, complex database systems, extensive networking, little available vendor support, and extensive tailoring of the operating environment. Score:
  - 6 very highly complex
  - 5 highly complex
  - 3 average complexity
  - 0 simple systems

56 Scoring Systems Risk Characteristics (6)
- Systems Stability: the higher the frequency of change, the higher the probability of error. Aspects of stability include frequency of program changes, frequency of system failure, frequency of system enhancements, number of systems with which it interacts, and skill of the maintenance programmers. Score:
  - 5 very highly unstable
  - 4 highly unstable
  - 2 some instability
  - 0 highly stable

57 Scoring Systems Risk Characteristics (7)
- Impact of System Failure: short-term impact of total system failure and major disruption to processing. Score:
  - 10 endangerment of the organisation's existence
  - 8 very serious disruption caused
  - 6 significant disruption caused
  - 4 some disruption would result
  - 2 minor disruption
  - 0 no disruption

58 Overall Risk Assessment
  Key Risk Characteristic            Score
  1 Asset values controlled          ___
  2 Value of information             ___
  3 Confidentiality of information   ___
  4 Regulatory body influence        ___
  5 Technical complexity             ___
  6 Stability of system              ___
  7 Impact of system disruption      ___
  Total Application Score            ___
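
As an illustration of the worksheet above, the Python sketch below totals the unweighted characteristic scores and ranks systems by total risk. The systems and the individual scores are hypothetical and are assumed to have been assigned using the scales on slides 51-57.

CHARACTERISTICS = [
    "monetary_value", "information_value", "confidentiality",
    "regulatory_influence", "technical_complexity", "stability", "failure_impact",
]

# Hypothetical systems and scores for illustration only.
systems = {
    "Payroll":          {"monetary_value": 7, "information_value": 7, "confidentiality": 5,
                         "regulatory_influence": 5, "technical_complexity": 3, "stability": 2, "failure_impact": 6},
    "FOREX dealing":    {"monetary_value": 10, "information_value": 10, "confidentiality": 8,
                         "regulatory_influence": 7, "technical_complexity": 6, "stability": 4, "failure_impact": 10},
    "Stationery stock": {"monetary_value": 5, "information_value": 1, "confidentiality": 1,
                         "regulatory_influence": 0, "technical_complexity": 0, "stability": 0, "failure_impact": 2},
}

def total_score(scores):
    """Sum the characteristic scores to give the overall application score."""
    return sum(scores[c] for c in CHARACTERISTICS)

# Rank systems from highest to lowest total risk score.
for name in sorted(systems, key=lambda s: total_score(systems[s]), reverse=True):
    print(f"{name}: {total_score(systems[name])}")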

59 Other Areas for Computer Risk Assessment  Physical Security  Personnel Security  Data Security  Systems Software Security  Telecommunications Security  Computer Operations Security

60 Standards and Guidelines for IT Auditing IIA Standards and Ethics ISACA Standards and Ethics COSO Internal Control Standards NIST Standards BSI Standards

61 Standards for the Professional Practice of Internal Auditing  Mandatory  Attribute  Performance  Implementation  Advisory  Aids

62 IIA Standards 1000 Purpose, Authority, and Responsibility (Charter) 1100 - Independence and Objectivity 1200 - Proficiency and Due Professional Care 1300 - Quality Assurance and Improvement Program 2000 - Managing the Internal Audit Activity 2100 - Nature and Scope of Work 2200 - Engagement Planning 2300 - Performing the Engagement 2400 - Communicating Results 2500 - Monitoring Progress 2600 - Acceptance of Risk

63 IIA Code of Ethics: Principles Integrity –The integrity of internal auditors establishes trust and thus provides the basis for reliance on their judgment. Objectivity –Internal auditors exhibit the highest level of professional objectivity in gathering, evaluating, and communicating information about the activity or process being examined. Internal auditors make a balanced assessment of all the relevant circumstances and are not unduly influenced by their own interests or by others in forming judgments Confidentiality –Internal auditors respect the value and ownership of information they receive and do not disclose information without appropriate authority unless there is a legal or professional obligation to do so. Competency –Internal auditors apply the knowledge, skills, and experience needed in the performance of internal auditing services.

64 ISACA Standards Standards define mandatory requirements for IS auditing and reporting Guidelines provide guidance in applying IS Auditing Standards Procedures provide examples of procedures an IS auditor might follow in an audit

65 COBIT Control Objectives for Information and related Technology (COBIT®) –Control objectives High-level and detailed generic statements of minimum good control –Control practices Practical rationales and how-to-implement guidance for the control objectives –Audit guidelines Guidance for each control area on how to obtain an understanding, evaluate each control, assess compliance, and substantiate the risk of controls not being met –Management guidelines Guidance on how to assess and improve IT process performance, using maturity models, metrics, and critical success factors

66 ISACA Code of Ethics Support the implementation of, and encourage compliance with, appropriate standards, procedures, and controls for information systems. Perform their duties with due diligence and professional care, in accordance with professional standards and best practices. Serve in the interest of stakeholders in a lawful and honest manner, while maintaining high standards of conduct and character, and not engage in acts discreditable to the profession. Maintain the privacy and confidentiality of information obtained in the course of their duties unless disclosure is required by legal authority. Such information shall not be used for personal benefit or released to inappropriate parties. Maintain competency in their respective fields and agree to undertake only those activities that they can reasonably expect to complete with professional competence. Inform appropriate parties of the results of work performed; revealing all significant facts known to them. Support the professional education of stakeholders in enhancing their understanding of information systems security and control.

67 COSO Internal Control Standards Three Objectives 1. Economy and efficiency of operations, including achievement of performance goals and safeguarding of assets against loss; 2. Reliable financial and operational data and reports; and 3. Compliance with laws and regulations Five Components 1. Sound Control Environment 2. Sound Risk Assessment Process 3. Sound Operational Control Activities 4. Sound Information and Communications Systems 5. Effective Monitoring

68 NIST Standards Covers: –Elements of computer security –Roles and responsibilities –Common threats With standards for: –Management Controls –Operational Controls –Technical Controls

69 Internal Control Concepts Must take into account: –Internal Control Objectives –Cost Benefit issues Types of Internal Control include: –Preventative –Detective –Corrective –Directive –Compensating

70 Elements of Internal Control Segregation of duties Competence and integrity of people Appropriate levels of authority Accountability Adequacy of resources Appropriate supervision and review

71 Control Procedures General IT controls Computer operations Physical security Logical security Program change control Systems development

72 Application Controls Control types include: –Preventative controls –Discretionary controls –Voluntary controls –Manual controls –General controls

73 General Control Objectives Data and transaction objectives –Input –Processing –Output Program control objectives –Development of systems –Testing of systems –Running of systems –Ensuring quality of systems

74 COSO Control Components Control environment Risk assessment Control activities Information and communication Monitoring

75 Risk Management of the IT Function All entities encounter risk Risk analysis software –Cost of Risk Analysis (CORA) –World Modeler –Primavera –@Risk –Etc.

76 Elements of Risk Analysis
- Estimating the significance of a given risk
- Assessing the likelihood or frequency of occurrence
- Process analysis, including identification of key dependencies and control nodes
- Normally evaluated before the mitigating effects of controls are considered (inherent risk)

77 Defining the IT Risk Universe Computer risk –Probability that an undesirable event turns into a loss Computer exposure –Results from a threat from an undesirable event with the potential to become a risk Vulnerability –A flaw or weakness in the system which could result in a threat or risk

78 Sources of Threats Users Management IT staff IT auditors Outsiders Other systems

79 RISK-based Audit Approach Risk profile including –Physical security –Personnel security –Data security –Applications software security –Systems software security –Telecommunications security –Operations security

80 Audit Planning Process
- Typical audit scope issues:
  - Audit frequency: fixed frequency, random frequencies, or a conditional approach based on analytical review or risk analysis
  - Audit intensity: not always more time in the riskier areas
  - Audit timing: involves a variety of objectives and constraints

81 Engagement Planning
- Cornerstone of successful auditing
- Involves
  - Identifying those tasks to be performed in the course of an audit
  - Allocating the tasks to individual auditors
  - Deciding when a task should take place
  - Quantifying how long it should take to execute
- Helps establish the objectives and scope of the audit
- Helps anticipate problems and achieve flexibility in identifying the control objectives and risks
- The plan should always be looked on as provisional and subject to amendment, depending on what is found

82 Annual Audit Plan Normally based on the overall risk assessment of the organization Also the available audit resources Can be simplified into: –Mandatory audit activities Must be carried out within the time span of the audit plan –Discretionary audit activities Subject to availability of resources Allocated on a risk basis

83 Planning the Assignment Tasks scheduled as part of the audit process to include: –Participants –Timescales –Requirements for auditee participation –Areas to be covered –Areas to be excluded Unplanned work will always occur –Allowances based on past performance not wishful thinking Budget for non-audit work –Leave –Training –Sickness –Administration

84 Audit Management

85 Establishing an IT Audit Function What? Why? How? Where? How much?

86 The Audit Mission (1) To review, appraise, and report on:  Soundness, adequacy, and application of controls  Compliance with established policies, plans, and procedures  Accounting for and safeguarding corporate assets  Application of proper authority levels

87 The Audit Mission (2) To review, appraise, and report on:  The reliability of accounting and other data  The quality of performance of assigned duties  The extent of coordinated effort between departments  Safeguarding of corporate interests in general

88 The IS Audit Mission (1) To review, appraise, and report on:  Soundness, adequacy, and application of IS operational standards  Soundness, adequacy, and application of systems development standards  The extent of compliance with corporate standards  Security of the corporate IS investment

89 The IS Audit Mission (2) To review, appraise, and report on:  The adequacy of contingency arrangements  The completeness and accuracy of computer- processed information  Whether optimum use is being made of all computing resources  Soundness of application systems developed

90 Scope of Work Undertaken Installation reviews Systems reviews Audit of systems under development Audit of the development process Audit of the contingency planning arrangements

91 Other IS Audit Tasks To provide in-department expertise To train non-IS auditors To computerize the internal audit function To review logical security To liaise with external IS audit functions

92 Installation Review Points (1) Hardware / software acquisition policies Organization, staffing, and reporting structure of the IS department Development of new application systems Maintenance of existing systems Change control and problem management

93 Installation Review Points (2) Control over the implementation of systems software Control over the operations function Physical security over the corporate IS assets The IS role in the corporate survival plan Compliance with legal requirements

94 System Reviews Four phases 1. Gaining an understanding 2. Evaluating controls 3. Testing key controls 4. Reporting

95 Audit of Systems Under Development To ensure:  Appropriate controls are designed into systems  Designed controls are, in fact, implemented  Implemented controls work  Control retrofit is rarely done Audit of the Development Process To ensure:  Delivery of a quality application system –On time –Within budget –Under controlled conditions

96 Provision of In-department Expertise To answer IS-related questions To assist in building a professional reputation To assist credibility with IS To give assurance on IS related concerns To be part of the team

97 Training non-IS Auditors In IS-related control concerns In IS controls In the audit use of computers In the provision of IS consultancy to users In their role in the audit of computer systems

98 Computerization of the Internal Audit Function  Productivity tools  Audit planners  Word processors  Spreadsheets  Graphics systems  Computer Assisted Audit Techniques  Retrieval packages  Data downloading  Data manipulation  Verification processing

99 Specialist Tasks Review of logical security Review of IS strategic planning Review of IS efficiency / effectiveness Review of communications systems Review of IS technical support functions

100 Computer Systems Exposures  Erroneous management decisions  Unacceptable accounting policies  Inaccurate record keeping  Business interruption  Built-in fraud  Violation of legal statutes  Excessive operating cost  Inflexibility  Overrun budgets  Unfulfilled objectives

101 Primary Control Over an Audit: Documentation and Review
- An iterative process
  - Review after each step
  - Go / no-go decisions
- Do not over-review
- Check objectives constantly
- Document continuously

102 Organization of the Function Within the corporation Reporting independently Adequate authority for access IS audit a part of internal audit Reporting within the structure

103 Structure of IS Audit  A factor of size  Specialists vs generalists  Complexity of systems  Uniqueness of systems  Use of packaged systems  Computer audit manager  Application auditors  Trainee auditors  Audit application development staff  Technical support

104 Skill Levels Required Manager  Specialized skills in auditing  Specialized skills in computer auditing  Managerial skills  Knowledge of the corporation

105 IS Managerial Tasks Planning the strategic direction of the section Priority setting Liaison internally and externally Review and approval of all I.S. audit work Controlling and monitoring workflow Staffing of the department  Defining roles  Sourcing staff  Training  Motivating  Career planning

106 IS Audit as a Support Function Development of CAATs Assistance to non-IS auditors Internal training of non-IS auditors Development of control procedures for internal computer usage Research in advanced IS and IS audit techniques

107 Organizational Structures  Centralized –Independence from local management –Close ties with corporate management and flexible availability –But - seen as an outsider  Decentralized –Each division with its own IS auditors –Close ties at local level and enhanced perception of benefits –Better understanding of business functions –But - possible loss of objectivity and standard audit approach  Hybrid –Generalist groups in the field / Technical support at head office with rotation of staff –Best of both worlds? –But - may fragment audit effort and result in a loss of cohesion

108 Audit Control Group
- Head office based
- "Auditing the auditors"
- A monitoring and quality control function
- Researching future IS audit tools and techniques
- Integrating internal / external / IS audit

109 Computer Requirements Depends on the type of IS auditing to be undertaken  Access to mainframe required  Running of CAATs  Use of computers as a productivity tool Security requirements for audit computers  Hardware  Terminals  Personal computers / laptops / networks  Modems / data concentrators  Communication lines Printer requirements

110 Software Selection of:  Standard word processor & spreadsheet  Planning software  Communications software  Interrogation software  Training –Computer training –Audit training –Management training  Local / overseas –Level of training required

111 General Rules
- Use the lowest level staff possible
- Do not over-audit
- Do not use CAATs for the sake of using them
- Do not buy equipment without justification
- Allow for training and familiarisation in software costs
- Benefits
  - Business benefits: quantifiable, measured, demonstrable
  - Requires clear audit objectives

112 Role of the Specialist Conduct technical reviews Technical support Development of CAATs Develop an IS training program for the audit group Coordinate with external IS audit

113 Specialist as a Consultant Systems control reviews IS audit assistance Implementation reviews Corporate survival reviews IS management technique reviews

114 Problems Encountered with Specialists Career paths Remuneration Availability Prima donnas Motivation

115 Relations with IS
- Isolated existence
- A "closed shop" tradition
- Poor communications
- Iron curtain to the rest of the organisation
- Different reporting lines
- Must win respect and confidence
- An antagonistic IS is deadly
  - Ineffectual, superficial audits
  - A withered IS audit function

116 IS Concerns - Auditors Don't:
 1 Find fraud
 2 Enhance the IS image
 3 Reduce IS costs
 4 Work independently
 5 Improve IS value for money
 6 Come fully trained
 7 Help with project control
 8 Stick to their own business
 9 Express their feelings
 10 Equate real risks with history or practicality

117 What Is Evidence?  Something intended to prove or support a belief  Each piece may be flawed  Personal bias  Potential error of measurement  Less competent than desirable  In total the "body of evidence"  Should provide a factual basis for audit opinions

118 Standards of Audit Evidence  IIA Standards state that auditors  "should collect, analyze, interpret and document information to support audit results"  Information should be  Related to the audit objectives  Pertinent to the scope of work  Systematically gathered

119 Rules of Evidence  Primarily designed for legal evidence  May have to be complied with in legal cases  For example, evidence whose value as proof is offset by a prejudicial effect may be excluded  The auditor is not normally so restricted  Any evidence  Professional judgment  Until the auditor is satisfied

120 Types of Audit Evidence  Obtained by observing conditions, interviewing people, examining records  Physical  Testimonial  Documentary  Analytical

121 Physical Evidence  Obtained by observation  People  Property  Events  May be in the form of photographs, maps, etc.  Observations should be supported by documented examples  If not possible, by corroborating observation

122 Testimonial Evidence  May take the form of  Letters  Statements in response to enquiries  Interviews  Not conclusive in themselves  Should be supported by documentation

123 Documentary Evidence  Most common form of audit evidence  Includes  Letters  Memoranda  Business documents  Sources affect reliability  Internal control procedures affect reliability

124 Analytical Evidence  Derived from  Computations  Comparisons to standards  Past operations  Similar operations  Regulations  Reasoning

125 Statistics
- Statistics is concerned with scientific methods for collecting, organizing, summarizing, presenting, and analyzing data
- "You might prove anything by figures" (Carlyle)
- "There are three kinds of lies: lies, damn lies, and statistics" (Disraeli)
- "Don't be a novelist - be a statistician, much more scope for the imagination" (Mel Calmen)
- "He uses statistics as a drunken man uses a lamp post - for support rather than illumination" (Andrew Lang)

126 Statistical and Nonstatistical Concepts Why Sample? – Speed – Cost – To ensure data is "Substantially or Materially Correct" – Not mentioned in IIA Standards but "Information should be Sufficient, Competent, Relevant, and Useful to provide a sound basis for audit findings and recommendations" "Sufficient information is Factual, Adequate, and convincing so that a prudent, informed person would reach the same conclusion as the auditor"

127 Statistical vs Nonstatistical Sampling Similarities  Both require Auditor Judgement  Audit Procedures performed will not differ  Both permitted in Audit practice Differences  Statistical Plans – Control and Measure Sampling Risk  Require Technical Training and Expertise  Normally Require Computer Facilities

128 Statistical Sampling: Advantages
- Provides the opportunity to select the minimum sample size required to satisfy the objectives
- Provides a quantitative measure of the sampling risk
- Permits the auditor to explicitly specify a level of Reliability (Confidence) and a desired degree of Precision (Materiality)
- Provides a measure of sufficiency of the evidence gathered
- Provides more objective results for management
- Provides a more defensible expression of test results
- Is simple to apply with computer software

129 Statistical Sampling: Disadvantages Requires random sample selection which may be more costly and time consuming May lead to problems in establishing a correlation between the sample and the population if not appropriately organised May require specific staff training May require the acquisition of specialized software

130 Nonstat Sampling: Advantages and Disadvantages Advantages  Allows the auditor to utilize his subjective judgment to influence the sample toward items of greatest value and highest risk  May be equally effective and efficient as statistical sampling but may cost less Disadvantages  Statistical inferences may not be objectively valid  Cannot quantitatively determine sampling risk  Risks over or under auditing depending on the experience and judgment of the auditor

131 Terminologies (1)
- Uncertainty - Audit Risk: a combination of Inherent Risk, Control Risk, and Detection Risk
- Inherent / Control Risks: assessed by auditor's judgment
- Detection Risk
  - Sampling Risk: the risk of the sample being non-representative
  - Nonsampling Risk: all other aspects of Audit Risk (e.g., audit procedures were not appropriate)

132 Terminologies (2) Sampling Risk – Risk of incorrect acceptance (less chance of successful Audit)  Risk of incorrect rejection (greater Audit effort)  Risk of assessing Control Risk too high (less chance of a successful Audit)  Risk of assessing Control Risk too low (greater Audit effort)

133 Terminologies (3) Confidence Level (Reliability) – Percentage of times one would expect the sample to adequately represent the full population – The higher the percentage, the more representative the sample – The higher the confidence percentage required, the larger the sample Precision  How close the sample estimate is to the true population value

134 Terminologies (4)  Population (total collection of items about which an opinion will be expressed)  Sampling Unit (the individual items making up the population)  Frame (sample frame is a listing of the Sampling Units making up the Population)  Sample (collection of sampling units drawn from the frame that will be subject to Audit Procedures)  Measures of Location / Central Tendency (Mean, Median, Mode) – Mean (value of the population divided by the number of items) – Median (middle value in a population) – Mode (most frequently occurring value)

135 Sample Design Decide on the objectives of the survey Assess the resources available Define the sample population and the sample unit Select a sample frame Decide on a survey method if data not readily available Choose a sampling method

136 Bias in Sampling
- May arise if:
  - The sampling frame does not cover the population adequately or accurately
  - The sample is non-random (e.g., "convenient")
  - Subjective judgment enters into the selection criteria
- These can lead to systematic, non-compensating errors in a sample

137 Primary Types of Sampling
- Attribute Sampling
  - Two-way (dichotomous) scale, primarily yes / no type answers
  - Looks at the likelihood of errors in populations
- Variables Sampling
  - Samples a population based on some specific variable (e.g., value) - quantitative information
  - Used to obtain estimates of values, etc.

138 Primary Types of Sampling
- Probability Proportionate to Size (PPS)
  - A new approach to Variables Sampling
  - Uses Attribute Sampling methods to estimate monetary amounts
  - Uses the "dollar" as the sampling unit to select items for audit
  - Also called Dollar Unit Sampling (DUS), Cumulative Monetary Amount (CMA), and Combined Attribute Variables (CAV) sampling

139 Sampling Methods Sampling Approaches (Sampling Plans) Attributes Sampling Discovery Sampling Stop-or-Go Sampling (Sequential Sampling) Variables Sampling PPS Sampling Judgmental

140 Non-Statistical Selection Methods (1) Haphazard Selection  Auditor's best guess of a representative sample  Often used where no extrapolation will be done Block Selection  Used on blocks of transactions (e.g., all transactions within a time scale)  Use with caution since inferences beyond the block may be invalid

141 Non-Statistical Selection Methods (2) Judgment Method  Basic Issues – Value of items – Relative risk – Representativeness

142 Sampling Methods (Techniques of Selection) Random Number Sampling Interval Sampling (Systematic Selection) Stratified Sampling Block Sampling (Cluster Sampling) Probability Proportionate to Size (PPS) Mechanised
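
Of the selection techniques listed above, interval (systematic) selection is easily automated. The Python sketch below is a minimal, hypothetical illustration: it draws every k-th item from an invented frame of invoice numbers after a random start within the first interval.

import random

def systematic_sample(frame, sample_size):
    """Select every k-th item from the frame after a random start."""
    interval = len(frame) // sample_size
    start = random.randint(0, interval - 1)
    return [frame[i] for i in range(start, len(frame), interval)][:sample_size]

# Hypothetical frame of 1,000 invoice numbers; select 50 of them.
invoices = [f"INV{n:05d}" for n in range(1, 1001)]
print(systematic_sample(invoices, 50))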

143 Components of a Sampling Plan
  Component                          Sampling plan (example)
  Population definition              All invoices
  Sampling unit specification        One invoice
  Nature of error to be identified   Invoices in error
  Method of selection                Random
  Sampling risk                      10%
  Sample size                        As per calculation
  Evaluation phase                   No erroneous invoices found
  Interpretation of sample result    Invoices are correct at a 90% confidence level

144 Sample Sizing and Selection
- Two paths to selection
  - Directed selection
    - Used when serious error or manipulation is suspected
    - Not scientific sampling; used purely to detect a suspected condition
    - May not be relied on to draw conclusions about the population
  - Random sampling
    - Seeks to represent the population - taking a snapshot in miniature
    - The larger the sample, the more closely it depicts the population and the more it can be relied upon

145 Sample Sizing Factors affecting sample size  Population size  Population variability  Expected error rate  Desired precision  Confidence level  Tolerable error

146 Population Size
- Sample size increases as population increases, but the increase is not proportional
- Populations of over 5,000 require very little increase in sample size
  - Population 50: sample 33
  - Population 500: sample 78
  - Population 1,000: sample 85
  - Population 5,000-100,000: sample 93

147 Population Variability (Variables Sampling)
- Has a substantial effect on sample size
- Variability is the standard deviation of the population
- Standard deviation is computed by:
  - Taking the difference of each item from the mean
  - Squaring the differences
  - Adding the squares and averaging
  - Taking the square root of the average

148 Effect of Standard Deviation on Sample Size As the Standard Deviation increases the sample size increases Rule of thumb  Changes in a population's variability affects the Sample size by the square of the relative change Where there is a large deviation, Stratification may be required Generally the more widespread the values, the larger the sample

149 Expected Error Rate (Attribute Sampling) Initial Auditor assessment of expected population error rate (Deviation Rate or Rate of Occurrence) The higher the expected error rate, the larger the sample  If expected error rate of 1% gave a sample size of 93  Then an expected error rate of 3% would give a sample size of 361  All other factors being equal If the sample shows a higher than expected error rate?

150 Desired Precision Also called Desired Allowance for Sampling Risk For example, Inventory is estimated at $1,000,000 plus/minus $200,000 The tighter the desired precision, the larger the sample size required Sample size changes by the square of the relative change in precision  eg +/- $50,000 is a change in desired precision by a factor of 4  Sample size would increase by 16

151 Confidence Level (Reliability) Percentage of time that the sample adequately represents the population (i.e., that the estimation of value can be x% relied upon)  95% confidence level states that for a given sample size, if the sample was taken 100 times, 95 times the sample selected would adequately represent the population  The higher the confidence required, the larger the sample  In Variables Sampling, primary concern is Risk of incorrect acceptance  In Attribute Sampling, primary concern is Risk of assessing the control risk too high

152 Confidence Interval (Precision) Tolerable misstatement of a value For example, $100,000 +/- $10,000 gives Confidence Interval of $90,000 to $110,000 Primary concern is the risk of Incorrect Rejection Relates to efficiency of the audit

153 Confidence Interval Found by multiplying a Reliability Factor by the Standard Deviation of the Sample Then adding and subtracting from the Sample Estimate Assuming a Normal Distribution For example, a 95% confidence level results in a 1.96 reliability factor Confidence Interval therefore equals Estimated Value +/- (Reliability Factor x (Standard Deviation / Square Root of Sample Size))
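
The calculation above can be expressed directly in code. The Python sketch below assumes a normal distribution and uses hypothetical sample figures; it simply applies estimate +/- reliability factor x (standard deviation / sqrt(n)).

import math

def confidence_interval(estimate, std_dev, sample_size, reliability_factor=1.96):
    """Return (lower, upper) = estimate +/- RF * (standard deviation / sqrt(n))."""
    precision = reliability_factor * (std_dev / math.sqrt(sample_size))
    return estimate - precision, estimate + precision

# Hypothetical example: estimated value $100,000, sample standard deviation
# $5,000, sample of 100 items, 95% confidence (reliability factor 1.96).
low, high = confidence_interval(100_000, 5_000, 100)
print(f"${low:,.0f} to ${high:,.0f}")   # roughly $99,020 to $100,980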

154 Tolerable Error The maximum rate of deviations the Auditor will accept The closer the expected error rate is to the Tolerable Error, the larger the sample The larger the Tolerable Error, the smaller the sample

155 Calculating the Sample Size (Attribute Sampling)
- Where
  - C is the confidence coefficient
  - p is the maximum expected error rate
  - q is 100% - p
  - P is the desired precision
  - n is the sample size
- then n = (C² x p x q) / P²

156 For Example
- Where the population is 1000, desired precision is +/- 2%, desired confidence level is 95%, and the estimated error rate is not to exceed 5%, then
  - C = 1.96 (confidence coefficient at 95%)
  - p = 0.05
  - q = 0.95
  - P = 0.02
  - n = (1.96² x 0.05 x 0.95) / 0.02² ≈ 456 (see the sketch below)
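
A minimal Python sketch of this calculation, reproducing the worked example above (no finite-population correction is applied, matching the slide):

def attribute_sample_size(confidence_coefficient, expected_error_rate, precision):
    """n = C^2 * p * q / P^2 for attribute sampling."""
    p = expected_error_rate
    q = 1.0 - p
    return (confidence_coefficient ** 2) * p * q / (precision ** 2)

# 95% confidence (C = 1.96), expected error rate 5%, desired precision +/- 2%.
print(round(attribute_sample_size(1.96, 0.05, 0.02)))   # about 456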

157 Calculating the Sample Size (Variables Sampling)
- Where
  - n1 is the preliminary sample size
  - C is the confidence coefficient
  - S is the standard deviation of the population
  - P is the desired precision
- then n1 = (C² x S²) / P²
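
The same formula as a minimal Python sketch, with hypothetical inputs; the per-item precision of $10 and the estimated population standard deviation of $50 are assumptions for illustration, not figures from the slides.

def variables_sample_size(confidence_coefficient, population_std_dev, precision):
    """Preliminary sample size n1 = C^2 * S^2 / P^2 for variables sampling."""
    return (confidence_coefficient ** 2) * (population_std_dev ** 2) / (precision ** 2)

# 95% confidence (C = 1.96), estimated standard deviation $50, precision +/- $10 per item.
print(round(variables_sample_size(1.96, 50, 10)))   # about 96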

158 Attribute vs Variables Sampling
- Attribute Sampling (How Many?)
  - Allows the auditor to estimate the occurrence rate of deviance from internal control policies and / or whether the estimated rates are acceptable
- Variables Sampling (How Much?)
  - Used to estimate the value of a population
  - Normally expressed as a value plus or minus an amount (the range of precision at the desired level of confidence)

159 Standard Deviation
- Where
  - s = standard deviation of the sample
  - Σ = sum of
  - x = value of each sample item
  - n = sample size
- then s = sqrt( (Σx² - (Σx)²/n) / (n - 1) )
- For example, in a population where the mean is 20, three samples were drawn with values 11, 20, 29:
  - s² = (1362 - 3600/3) / 2 = 81
  - s = 9
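
A minimal Python sketch of the formula above, reproducing the slide's example:

import math

def sample_std_dev(values):
    """s = sqrt((sum(x^2) - (sum(x))^2 / n) / (n - 1))"""
    n = len(values)
    sum_x = sum(values)
    sum_x2 = sum(x * x for x in values)
    return math.sqrt((sum_x2 - sum_x ** 2 / n) / (n - 1))

print(sample_std_dev([11, 20, 29]))   # 9.0, as in the example above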

160 In a Normal Distribution Mean of the Distribution +/- one standard deviation includes 68% of the area under the curve Mean +/- two standard deviations includes 95.5% of the area Mean +/- three standard deviations includes 99.7% of the area

161 Standard Deviation Assists Any given item selected at random would fall 68% of the time within one standard deviation from the mean For example  With a mean of $100  Standard Deviation of $10  68% of sampling units would be within $90 to $110 at a 68% confidence level  To increase our confidence level to 95.5% we must increase our tolerance to two standard deviations  $80 to $120 at 95.5% confidence level

162 Normal Distributions
  Confidence coefficient (standard deviations)   Confidence level (area under the curve)
  1.0                                            68%
  1.64                                           90%
  1.96                                           95%
  2.0                                            95.5%
  2.7                                            99%

163 Sample Size Largely dependent on Confidence level Precision desired Variability of the population Audit Objectives

164 Probability Proportional to Size Sampling (PPS) Also known as Dollar Unit Sampling (DUS) Cumulative Monetary Amount (CMA) Combined Attribute Variables (CAV) Commonly used to assess whether values are overstated Uses a different formula to determine Sample Size

165 Sample Size
- Where
  - n = sample size
  - BV = book value of the account (e.g., Accounts Receivable)
  - RF = risk factor (multiplier; see below)
  - TE = tolerable error (auditor's judgement)
- then n = (BV x RF) / TE

166 Risk Factors Determined by effectiveness of Auditee Internal Control structure Relevant Audit Procedures Links to Confidence Levels

167 Reliability Factors
  Reliability required   Reliability factor
  99%                    4.605
  95%                    2.996
  90%                    2.300
- For example
  - Value of inventory (BV) = $500,000
  - Auditor-specified material error (TE) = $10,000
  - Auditor determined there is little effective control (i.e., RF = 2.6)
  - n = (BV x RF) / TE = ($500,000 x 2.6) / $10,000 = 130
  - Sampling interval is therefore $500,000 / 130 ≈ $3,846

168 Application
  Stock (units)   Unit value   Amount    Cumulative amount
  30              $16          $480      $480
  90              $100         $9,000    $9,480
  92              $111         $10,212   $19,692
  70              $40          $2,800    $22,492
  20              $15          $300      $22,792
  Sampling interval = $3,846
- If no errors are found, the auditor concludes that Finished Goods Inventory has a maximum overstatement of $10,000 with 95% reliability
- If errors did occur, the average error amount must be projected to the whole population (tainting percentage) - see the sketch below
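
The interval calculation and the cumulative selection above can be sketched in Python as follows. This is an illustration only: the selection routine picks the stock lines whose cumulative amount first exceeds each multiple of the sampling interval (a zero start is assumed here; in practice a random start within the first interval would be used).

def pps_interval(book_value, risk_factor, tolerable_error):
    """Sample size n = BV * RF / TE; sampling interval = BV / n."""
    n = book_value * risk_factor / tolerable_error
    return n, book_value / n

def pps_select(lines, interval, start=0.0):
    """Return the lines hit by monetary units at start, start + interval, ..."""
    selected, cumulative, next_hit = [], 0.0, start
    for line in lines:
        cumulative += line["amount"]
        while cumulative > next_hit:
            if not selected or selected[-1] is not line:   # count each line once
                selected.append(line)
            next_hit += interval
    return selected

stock = [  # the application table above
    {"stock": 30, "amount": 480}, {"stock": 90, "amount": 9000},
    {"stock": 92, "amount": 10212}, {"stock": 70, "amount": 2800},
    {"stock": 20, "amount": 300},
]
n, interval = pps_interval(500_000, 2.6, 10_000)
print(round(n), round(interval))                           # 130 3846
print([line["stock"] for line in pps_select(stock, 3846)]) # lines hit by the interval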

169 PPS Advantages  PPS tends to select high value items  PPS unaffected by population item variability  PPS may result in a smaller sample size  PPS easy to implement  PPS does not require the normal approximations required by variables sampling  PPS Permits a statistically valid sample selection which includes more high value items

170 PPS Disadvantages  PPS requires that the population be cumulatively totaled  As errors increase the sample size may be larger than with other sampling methods  PPS primarily designed to detect overstatements  Zero or negative items are presumed not to occur  PPS not intuitively as appealing to auditors

171 Other Sampling Types
- Difference Estimation
  - Determine the difference between audit and book values
  - Calculate the mean difference
  - Multiply the mean difference by the number of items in the population
  - Allow for sampling risk
  - Usable where small errors predominate and there is no skew
- Mean-per-unit (MPU) Sampling
  - Average the audit value of the sample and multiply it by the population size (not very accurate)
- Ratio Estimation
  - Multiply the book value of the population by the ratio of audit value to book value of the sample
  - Usable where small errors predominate and there is no skew
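
A minimal Python sketch of difference and ratio estimation as described above, using hypothetical audited and book values; the sample, the population size, and the population book value are invented for illustration.

def difference_estimate(sample_audit, sample_book, population_size):
    """Project the mean audit-minus-book difference across the whole population."""
    mean_diff = sum(a - b for a, b in zip(sample_audit, sample_book)) / len(sample_book)
    return mean_diff * population_size

def ratio_estimate(sample_audit, sample_book, population_book_value):
    """Scale the population book value by the sample's audit/book ratio."""
    return population_book_value * (sum(sample_audit) / sum(sample_book))

# Hypothetical sample of 4 items from a population of 1,000 items with a
# total book value of $250,000.
audit = [100, 95, 120, 80]
book = [100, 100, 118, 82]
print(difference_estimate(audit, book, 1_000))   # projected total misstatement
print(ratio_estimate(audit, book, 250_000))      # estimated audited value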

172 Regression Analysis  Used to show relationships between two or more variable quantities  Also known as Least Squares  Measures the extent to which a change in one variable (the independent variable) causes changes in another or others (dependent variables)  May be shown graphically on scatter charts  More accurately calculated using the "Least Squares Method"  "The value which best fits a set of quantities is one which minimises the sum of the squared differences between itself and these quantities" - Sawyer
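
A minimal Python sketch of simple least-squares regression as described above, relating one independent variable to a dependent variable; the hours/cost data are hypothetical.

def least_squares(x, y):
    """Return (slope, intercept) minimising the sum of squared differences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    slope = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
            sum((xi - mean_x) ** 2 for xi in x)
    return slope, mean_y - slope * mean_x

# Hypothetical data: machine hours (independent) vs maintenance cost (dependent).
hours = [10, 20, 30, 40, 50]
cost = [260, 480, 760, 980, 1270]
slope, intercept = least_squares(hours, cost)
print(f"cost ~ {intercept:.1f} + {slope:.1f} * hours")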

173 CAAT Types and their Usage – Application audit tools are not always CAATs – "Any tangible aid that assists an auditor"  Tools to obtain information  Tools to evaluate controls  Tools to verify controls  Automated tools

174 Automated Tools (CAATS)  Test Data Generators  Flowcharting Packages  Specialized Audit Software  Generalized Audit Software  Utility Programs

175 Specialized Audit Software  Can accomplish any audit task but – High development and maintenance cost – Require specific IS skills – Must be "verified" if not written by the auditor – High degree of obsolescence

176 Generalized Audit Software  "Prefabricated" audit tests  Each use is a one-off  Auditor has direct control  Lower development cost  Fast to implement

177 Applications of Generalized Audit Software  Detective examination of files  Verification of processing controls  File interrogations  Management inquiries

178 Types of Audit Software  Program generators  Macrolanguages  Audit-specific tools  Data downloaders  Micro-based software

179 Audit Software Functions  File access  Arithmetic operations  Logic operations  Record handling  Update  Output  Statistical  File comparison  Graphics

180 Determining the Appropriate CAAT  Depends on the Audit Objective and selected technique  Application Audit Techniques  Purposes – 1 To verify processing operation – 2 To verify the results of processing

181 Source code review – Requires programming skill – Slow – Expensive – Boring – Proves little – May be useful for specialized review

182 Confirmation of Results  For example, Debtors certification – Slow – Uncertain – Only shows up errors in your favor – Very labor intensive

183 Test Data
- Selected to test both correct data and errors
- Requires little technical background, but lacks objectivity
  - Influenced by what is expected
  - Assumes the program tested is the "LIVE" program

184 Integrated Test Facility – Establishes a "dummy" entity – Process data together with live data – Excluded from live results – Under the auditor's control but – May result in system catastrophe

185 ITF Advantages – Little technical training required – Low processing cost – Tests system as it routinely operates – Understood by all involved – Tests manual function as well as computer

186 ITF Disadvantages – ITF transactions must be removed before they interfere with live totals – High cost if live systems require modification to implement – Test data affects live files - danger of destruction – Difficult to identify all exception conditions – Quantity of test data will be limited

187 Snapshot Technique – A form of transaction trail – Identifiable inputs "tagged" – Trail produced for all processing logic – Useful in high-volume systems – Used extensively by IS staff in testing systems

188 SAMPLING – "Liars, Damned Liars and Statistics" – A tool for audit quality control – May be the only tool possible in a high-volume system – Not well understood by auditors – At computer speeds 100% sampling may be practicable  May not be desirable

189 Parallel Simulation  Uses same input data  Uses same files  Uses different programs  From a different source  To produce the same results?

190 Common Problems – Getting the wrong files – Getting the wrong layout – Documentation is out of date – Prejudging results  Never believe what the first printout tells you

191 Audit Reporting – Results of the audit usually reported  Orally  Interim reports  Closing conference  In writing  As minimum at end of audit – Reports should be  Objective  Clear  Complete  Concise  Constructive  Timely

192 Reports Should – Include audit  Purpose  Scope  Results  Auditor's opinion  Recommendations for potential improvements  Acknowledgement of satisfactory performance  Auditee's reply to the auditor's opinions and recommendations – Be reviewed and approved by the head of internal auditing

193 Written Reports (1) – Audit report is a reflection of the competence and professional image of the whole internal audit department – Not only the technical soundness but also  Clarity  Tone  Style – Message must be unambiguous – Questions must be anticipated and answered – Desired mood must be created by words

195 Clear Writing Techniques (1)  Objectives of writing - to inform and influence  Gather the necessary information –Before starting –Avoids reorganizations –Avoids rewriting  Use a conversational style –Tends to be more retainable –Requires anticipation of feedback –Builds mental images –Avoid personal references when deficiencies are reported –Criticize practices, not people

196 Clear Writing Techniques (2)  Keep sentences short and simple –Try to average 15 to 20 words –One idea per sentence –Long sentences tend to be  Foggy  Awkward  Dull  Boring

197 Clear Writing Techniques (3)
- Use active voice verbs
  - Active voice is usually shorter, more lively, and more conversational
  - "The manager asked for..." instead of "... were asked for by the manager"
  - Passive tends to be dull, formal, unclear, less emphatic, and vague

198 Clear Writing Techniques (4)  Use clear, familiar words –Be specific and precise –Never sacrifice clarity for brevity –Avoid jargon where possible –Explain it where it is essential –The burden of communication is on the writer, not the reader  Use appropriate headings –Break up the monotony of long sections –Assist  Location of specific information  Speeding up the reading process  Facilitate scanning

199 Preparing to Write
- Starts at the beginning of the audit
- A mental picture of the report
- Free writing loosens up the mental muscles
- Coordination of several writers' efforts
- Read the report aloud

200 Basic Audit Report (1)  Cover –Almost always desirable –Sets a professional tone –Should include  Report title  Name and location of auditee  Date of audit coverage

201 Basic Audit Report (2)  Formalities section –Normally constitutes an introduction –Typically one to three pages and includes  Date of report  Addressee (get it right)  Background  Audit scope and objectives  Brief opinion and nature of findings  Reply expectations  Signature  Names of participating auditors  Distribution  Contents

202 Basic Audit Report (3)
- Executive Summary
  - Important issues and findings
  - Provides a preliminary perspective to the whole report
  - Focuses on risks to the organisation and the specific effect of control weaknesses
  - May be all that is read
  - "Condense and eliminate" approach: abbreviated explanations of major audit findings, ordered by importance and cross-referenced
  - "Briefings" approach: informing, advising, interpreting

203 Basic Audit Report (4)  Detailed Findings –Usually the body of the report –Condition, criteria, cause, and effect –Should include enough information for the reader to understand the findings  Exhibits and Attachments –Normally placed within the report –Place in an appendix if lengthy –Clearly label all graphics, charts, financial tabulations –If in an appendix, cross-reference to the report

204 Internal Audit Opinion  Frequently required by management  Provides an overall perspective to the rest of the report  Forces auditors to commit themselves  But –May cause a management overreaction –Important parts of the report may be ignored –Audit results are normally mixed in nature

205 Auditee Responses
- At the discretion of the auditor
- Provide a balanced report
- Must be reviewed with, and agreed to by, the auditee
- Lend credibility to the report
- Less "sniping" from the sidelines

206 Polishing the Report  Rigorous review prior to issue  Normally in a peer group  One person with no knowledge of the specific audit area  May be done by checklist  Computerized grammar and style checkers tend to be American English  Ultimate signoff by the head of Internal Audit  Do not build in delays to report issuance

207 Distributing the Report  To the first level able to take appropriate action  Distribution list determined early in the audit process  Auditee chains of command can cause political ramifications  Delivery method should take into account –Confidentiality –Remoteness  If contents highly confidential, steps can be taken to trace individual copies –Copy numbering –Misspellings –Rewordings

208 Interim Reporting  Reports prepared and issued while the audit is in progress –To report progress on an extended audit –Where a finding warrants immediate attention  May be written or verbal –Advantages of interim reports  Timely feedback to the auditee  Higher probability of immediate action  A more favorable final report if appropriate action taken  A follow-up opportunity during the audit itself

209 Management Reaction
- May be to:
  - Accept the advice or recommendations: the auditor must follow up to ensure promised action is taken
  - Reject the advice or recommendations: the auditor must ascertain that top management has assumed the risk of non-action; no further follow-up is required
- It is management's decision, not audit's
- Can be a major cause of auditor frustration
- The audit committee can help

210 Follow-up
- Investigate, evaluate, and report the effect of the audit
- May be done by executive management in conjunction with auditees, by another auditor, or by the original team
- MUST be done

211 The Audit Report  The most important and demanding part of the job  Value of audit comes not from gathering information but from assessing and presenting it  Management acceptance  Management awareness  Prompt response to auditor recommendations

