
1 NSAA Information Technology Conference Hartford, Connecticut September 24, 2015 Presented by: Mike Billo and Anne Skorija PA Department of the Auditor General 1

2 Introduction Objectives To communicate the audit standards pertaining to data reliability as they apply to each type of audit engagement, with emphasis on performance audits. To provide a framework/approach for evaluating and concluding on data reliability. 2

3 Introduction Key Concepts The assessment of the reliability of computer-processed data must be documented, whether the data is presented in an electronic file or in paper reports. Computer-processed “data” and computer-processed “information” are used interchangeably. 3

4 Why is this important? Comply with standards; legal implications. Applicable to ALL auditors: performance audits; financial audits; attestation engagements; non-Yellow Book audits and reviews 4

5 Why is this important? Certain controls may affect a determination about the reliability of computer-processed information/data. This is a change in way of thinking as the standards have evolved. Internal reviews have identified a need for consideration and training. 5

6 Computer-Processed Data What it is and how to assess it 6

7 What is Computer-Processed Data? Computer-processed data can be presented as audit evidence in electronic data files and also in paper reports. If a computer was used to produce a paper report, the contents are computer-processed data. 7

8 Examples of Computer-Processed Data Data extracts from databases, data warehouses, or data repositories; Data maintained in Microsoft Excel or Access or similar commercial products; Data extracts from enterprise software applications supported by information technology departments or contractors; Data collected from forms and surveys on web portals; Data summarized in a paper report or copied from a table in a document. 8

9 What is computer-processed information / data? Any data that has some form of electronic process applied to it. 9

10 Factors Affecting Data Reliability Audit objectives; reasonableness; interviews with auditees; mathematical accuracy; comparison to published information; comparison to source documents; internal controls (including IT controls); CAAT results 10
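Some of these factors, mathematical accuracy in particular, can be tested directly. A minimal Python sketch, assuming a hypothetical invoice extract whose field names and figures are illustrative rather than from the deck, recomputes a total and compares it to the figure the auditee published:

```python
# Hypothetical sketch of a "mathematical accuracy" test: recompute a total
# from the raw extract and agree it to the auditee's published figure.
# Field names (invoice_id, amount) and totals are illustrative assumptions.
import csv
import io

extract = io.StringIO(
    "invoice_id,amount\n"
    "1001,250.00\n"
    "1002,125.50\n"
    "1003,624.50\n"
)

rows = list(csv.DictReader(extract))
recomputed_total = sum(float(r["amount"]) for r in rows)

# Total as reported in the auditee's published document.
published_total = 1000.00
assert abs(recomputed_total - published_total) < 0.01, "totals do not agree"
print(f"{len(rows)} records; recomputed total {recomputed_total:.2f} agrees")
```

In practice the same recomputation would be documented on the Data Reliability Assessment Form as one step taken to assess accuracy.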

11 Applicability Financial audits; attestation engagements; performance audits; non-Yellow Book audits/reviews 11

12 Computer-Processed Data in Performance Audits Paper report: assess reliability (Data Reliability Assessment Form). Electronic file: assess reliability initially (Data Reliability Assessment Form); apply computer-assisted audit techniques (CAAT Checklist); conclude on data reliability (Data Reliability Assessment Form). 12

13 Data Assessment Basics Handout – Data Reliability Assessment Form Background Data Source Audit and Audit Period How will the data be used? Steps taken to assess data reliability For data files and data on reports For data files only Data limitations, if any 13

14 Data Assessment Basics Data assessment conclusion Sufficiently reliable Not sufficiently reliable Impact on audit procedures Impact on audit report Completed by Approved by 14

15 What do the standards say? 15

16 What standards are we required to follow? Performance Audits – Yellow Book (GAGAS) Financial Audits – Yellow Book and Generally Accepted Auditing Standards (GAAS) from AICPA Attestation Engagements – Yellow Book and Statements on Standards for Attestation Engagement (SSAE) from AICPA Other Audits / Reviews – State Audit Office Policy and Procedures 16 What do the standards say?

17 Performance Audits Generally Accepted Government Auditing Standards (GAGAS) Financial Audits AICPA Statements on Auditing Standards for Financial Audits (AU-C) GAGAS Attestation Engagements AICPA’s Statements on Standards for Attestation Engagements (SSAE) GAGAS Non-Yellow Book Audits and Reviews State Audit Office Policies and Procedures 17

18 Specific Standard Citations Performance Audits Specific references to follow Financial Audits AU-C 500 – Audit Evidence; AU-C 520.A25 – Analytical Procedures; GAAP and Single Audits Attestation Engagements AT 101 – Sufficient and Appropriate Evidence Non-Yellow Book Audits and Reviews State Audit Office Policies and Procedures 18

19 What Does the Yellow Book Say about Data Reliability? 19

20 High Level Overview Auditors should assess the sufficiency and appropriateness of computer-processed information regardless of whether this information is provided to auditors or auditors independently extract it. (GAGAS paragraph 6.66) 20

21 Fieldwork Standards for Performance Audits – Chapter 6 Obtaining sufficient, appropriate evidence 6.56 - 6.68 21

22 Fieldwork Standards for Performance Audits – Chapter 6 Assessment of Evidence 6.69 – 6.72 Internal Controls 6.15 – 6.22 Information Systems Controls 6.23 – 6.27 22

23 Reporting Standards for Performance Audits – Chapter 7 Report Contents: Objectives, Scope, and Methodology Limitations 7.11 – 7.13 23

24 Sufficient, Appropriate Evidence: Integral to Audits (GAGAS 6.57) Sufficient: Quantity Enough evidence to persuade a knowledgeable person that the findings are reasonable Appropriate: Quality Relevant Valid Reliable 24

25 Verify Information (GAGAS 6.65) Determine what the auditee did to obtain assurance over reliable information; test the auditee’s procedures to obtain assurance; perform direct tests of the data 25
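The third option, direct tests of the data, can be as simple as scanning the extract for duplicate keys, out-of-period records, and invalid codes. A minimal sketch, assuming a hypothetical case-file extract with illustrative field names:

```python
# Hypothetical sketch of "direct tests of the data": duplicate-key,
# audit-period, and validity checks on a received extract.
# Field names and values (case_id, opened, status) are illustrative.
from datetime import date

records = [
    {"case_id": "A-1", "opened": date(2015, 1, 5), "status": "open"},
    {"case_id": "A-2", "opened": date(2015, 2, 9), "status": "closed"},
    {"case_id": "A-3", "opened": date(2015, 3, 1), "status": "closed"},
]

audit_start, audit_end = date(2015, 1, 1), date(2015, 12, 31)
valid_statuses = {"open", "closed"}

issues = []
seen_ids = set()
for r in records:
    if r["case_id"] in seen_ids:                       # duplicate-key test
        issues.append(f"duplicate id {r['case_id']}")
    seen_ids.add(r["case_id"])
    if not (audit_start <= r["opened"] <= audit_end):  # audit-period test
        issues.append(f"{r['case_id']} outside audit period")
    if r["status"] not in valid_statuses:              # validity test
        issues.append(f"{r['case_id']} has invalid status")

print(f"{len(records)} records tested; {len(issues)} issues found")
```

Any issues found would feed the data reliability conclusion and, if substantial, the report’s description of limitations.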

26 Computer-Processed Data Used as Evidence (GAGAS 6.66) Auditors should assess sufficiency and appropriateness Provided to the auditors Auditors independently extract it 26

27 The nature, timing, and extent of procedures to assess sufficiency and appropriateness are affected by the auditee’s internal controls. (GAGAS 6.66) Better controls: less data testing. Weaker controls: more data testing. 27

28 The level of procedures is also affected by the significance of the information and the level of detail presented in findings and conclusions (GAGAS 6.66). Less significance/less detail: less data testing. Greater significance/greater detail: more data testing. 28

29 Computer-Processed Data (GAGAS 6.66) Assessment Sufficiency Appropriateness Includes Completeness Accuracy For intended purpose 29

30 Sufficiency and Appropriateness (GAGAS 6.71) Evidence is sufficient and appropriate When it provides a reasonable basis for findings and conclusions within the context of the audit objectives Evidence is not sufficient and appropriate When using evidence carries unacceptably high risk Evidence has significant limitations Evidence does not support findings and conclusions 30

31 Evidence has limitations when Reliability has not been assessed Reliability cannot be assessed Errors identified in testing 31 GAGAS 6.72

32 Handling Limitations (GAGAS 6.72) Corroborative evidence Redefine audit objectives or limit scope to eliminate need to use the evidence Change presentation of findings and describe the limitations in the report to avoid misleading the users Reporting limitations or uncertainties as findings (including internal control deficiencies) 32

33 Objectives, Scope and Methodology 33

34 Include Limitations in Report (GAGAS 7.11) Describe the scope of work and limitations, including issues that would be relevant to likely users, so that users can reasonably interpret the findings, conclusions, and recommendations of the report. Describe any significant constraints: information limitations; scope impairments; denials of, or excessive delays in, access to certain records or individuals 34

35 Explain Significant Limitations (GAGAS 7.12) Auditors should … explain any significant limitations or uncertainties based on the auditors’ overall assessment of the sufficiency and appropriateness of the evidence in the aggregate. 35

36 Audit Methodology (GAGAS 7.13) Auditors may include a description of procedures performed to assess sufficiency and appropriateness of information used as audit evidence. 36

37 What does the Gray Book say about Performance Audits? 37

38 What does the Gray Book Say about Performance Audits? Assessing the Reliability of Computer-Processed Data (Gray Book) – Supplemental Guidance Supersedes GAO-03-273G GAO-09-680G: Published July 1, 2009 http://www.gao.gov/products/GAO-09-680G 38

39 To determine whether you can use the data for your intended purposes: effective communication; maximizing professional judgment; prior work; efficient workload 39

40 What does reliability mean? Reliability means that data are: Reasonably complete Reasonably accurate Meet your intended purposes Not subject to inappropriate alteration. 40
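The last criterion, that the data are not subject to inappropriate alteration, can be supported by recording a cryptographic hash of the file at receipt and recomputing it before use. A minimal sketch with illustrative, hypothetical file contents:

```python
# Hypothetical sketch: gaining assurance that a received extract was not
# altered between receipt and analysis by comparing SHA-256 digests.
# In practice the first digest would be recorded at delivery and kept in
# the workpapers; the file contents here are illustrative.
import hashlib

file_bytes = b"invoice_id,amount\n1001,250.00\n"

# Digest recorded when the auditee delivered the file.
digest_at_receipt = hashlib.sha256(file_bytes).hexdigest()

# Digest recomputed immediately before analysis; any edit changes it.
digest_before_use = hashlib.sha256(file_bytes).hexdigest()

assert digest_at_receipt == digest_before_use, "file changed since receipt"
print("digest unchanged:", digest_at_receipt[:16], "...")
```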

41 When is a Data Reliability Assessment Required? If data is intended to materially support findings, conclusions, or recommendations (agrees with and supports Yellow Book paragraph 6.66). 41

42 When is a Data Reliability Assessment NOT Required? If use of the data does NOT materially affect findings, conclusions, or recommendations. In most circumstances, when information is presented as background. No assessment required; however, ensure the data is obtained from the best available source. 42

43 Factors Affecting the Extent of an Assessment Expected importance of the data to the final report (i.e., primary support for findings) More important; more testing. Anticipated level of risk in using the data Single or Multiple Engagements? Strength or weakness of any corroborating evidence Less evidence; more testing. 43

44 What if the data has errors? Data errors are considered acceptable if… You have assessed the associated risk and conclude that the errors are not substantial enough to cause a reasonable person, aware of the errors, to doubt a finding, conclusion, or recommendation based on the data. 44

45 Use of Unreliable Data For example, even though the auditors may have some limitations or uncertainties about the sufficiency or appropriateness of some of the evidence, they may nonetheless determine that in total there is sufficient, appropriate evidence to support the findings and conclusions. (GAGAS 6.70) It is important to note … that information presented in our findings … based on the best information (data) available at the time of our audit procedures. Placement of the qualifier in the audit report is important, e.g., in a footnote. 45


47 Data Reliability Assessment Form 47

48 Data Reliability Assessment Form 48

49 Data Reliability Assessment Form Steps taken to assess the completeness and accuracy of the data 49

50 Data Reliability Assessment Form Steps taken to assess the completeness and accuracy of the data (cont.) 50

51 Data Reliability Assessment Form Document data limitations. Indicate your assessment of the sufficiency and appropriateness of the data for purposes of the engagement. 51

52 Data Reliability Assessment Form Document the impact of the assessment. 52


54 Understanding IT Controls Documenting Baseline IT Controls 54

55 Understanding the IT Environment 55

56 Understanding the IT Environment 56


58 CAAT Checklists Relating the CAAT Checklist to the Yellow Book Independent control totals: an excellent test for completeness, appropriateness, quantity, and validity. Did the auditee provide the entire population? Auditee’s record layout: an excellent test for completeness, appropriateness, quality, and relevance. Did the auditee provide data including all fields needed/requested? 58
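The two tests named above can be sketched in a few lines. This is a hypothetical illustration: the control figures and field names are assumptions, not values from the checklist itself.

```python
# Hypothetical sketch of two CAAT checklist tests:
# (1) independent control totals (record count and monetary sum), and
# (2) a record-layout check that all requested fields were provided.
# Field names and control figures are illustrative assumptions.
received = [
    {"emp_id": 1, "gross_pay": 4200.00},
    {"emp_id": 2, "gross_pay": 3900.00},
    {"emp_id": 3, "gross_pay": 5100.00},
]

# Control totals supplied independently by the auditee.
control_count, control_sum = 3, 13200.00

# Did the auditee provide the entire population?
assert len(received) == control_count, "record count does not match control"
assert abs(sum(r["gross_pay"] for r in received) - control_sum) < 0.01, \
    "monetary control total does not agree"

# Did the auditee provide all fields needed/requested?
requested_fields = {"emp_id", "gross_pay"}
for r in received:
    assert requested_fields <= r.keys(), "requested field missing from record"

print("control totals agree; all requested fields present")
```

On a performance audit, results such as these would be documented on the CAAT Checklist, with the reliability conclusion itself recorded on the Data Reliability Assessment Form.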

59 CAAT Checklists CAAT Checklist Performance Audits Used to document CAAT objectives ONLY. (Data reliability is concluded on the Data Reliability Assessment Form.) CAAT and Data Reliability Assessment Checklist Financial Audits, Attestation Engagements, and Other Reviews and Examinations Used to document CAAT objectives AND data reliability conclusions. 59

60 CAAT Checklist Performance Audits 60

61–80 CAAT Checklist Performance Audits (Continued) [checklist form screenshots]

81 Differences Between Performance and Financial Audits/Attestation Engagements Checklists CAAT Checklist Performance Audits: final audit-level conclusion on CAAT objectives. CAAT and Data Reliability Assessment Checklist Financial Audits, Attestation Engagements, and Other Reviews: final audit-level conclusion on data reliability. 81

82 Conclusions 82

83 Reminder Computer-processed data may be in electronic data files or on paper reports. We are required to assess the reliability of computer-processed data in either case. 83

84 Key Takeaways Computer-Processed Data Definition Electronic data files or data on paper reports Data Reliability Assessment Form CAATs Checklists Performance Audits Financial Audits, Attestation Engagements, and Other Reviews and Examinations Understanding IT Environment Form 84

85 Final Comments and Questions 85

86 Thank you for your attention! 86

