1 ICASA and USAASA Predetermined Objectives – 2013/14. March 2013. Portfolio Committee.

2 Reputation promise/mission
The Auditor-General of South Africa has a constitutional mandate and, as the Supreme Audit Institution (SAI) of South Africa, exists to strengthen our country's democracy by enabling oversight, accountability and governance in the public sector through auditing, thereby building public confidence.

3 Audit of predetermined objectives defined as:
– Annual audit of reported actual performance against predetermined objectives, indicators and targets as contained in the annual performance report.
– An integral part of the annual regularity audit process, confirming the:
  – compliance with laws and regulations
  – usefulness of performance reporting
  – reliability of performance reporting

4 Legislative requirements for planning, budgeting & reporting of performance info
– Public Finance Management Act (PFMA), 1999 (Act No. 1 of 1999)
– Treasury Regulations issued in terms of the PFMA, 2002
– Public Service Regulations (PSR), Part III B (only applicable to departments)
– Guidelines, instruction notes and practice notes issued by National Treasury
– Framework for managing programme performance information (issued by National Treasury in May 2007)
– Framework for strategic and annual performance plans (issued by National Treasury in August 2010)

5 Audit criteria
Main criteria and their sub-criteria:
– Compliance with regulatory requirements: existence, timeliness
– Usefulness: presentation, measurability, relevance, consistency
– Reliability: validity, accuracy, completeness
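Purely as an illustrative sketch (not part of the presentation), the two-level criteria hierarchy above can be written out as a small Python mapping; the names follow the slide.

    # Illustrative sketch only: the audit criteria hierarchy from the slide above,
    # written as a plain mapping of main criteria to their sub-criteria.
    AUDIT_CRITERIA = {
        "Compliance with regulatory requirements": ["existence", "timeliness"],
        "Usefulness": ["presentation", "measurability", "relevance", "consistency"],
        "Reliability": ["validity", "accuracy", "completeness"],
    }

    for main_criterion, sub_criteria in AUDIT_CRITERIA.items():
        print(f"{main_criterion}: {', '.join(sub_criteria)}")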

6 Audit approach
1. Understand and test the design and implementation of the performance management systems, processes and relevant controls.
2. Test the measurability, relevance, presentation & consistency of planned and reported performance information.
3. Conclude on the usefulness of the report on predetermined objectives.
4. Test the reported performance information against relevant source documentation to verify the validity, accuracy & completeness of reported performance information.
5. Conclude on the reliability of the reported performance for selected programmes or objectives.

7 Audit reporting – Management report
An audit conclusion will be prepared and included in the management reports of all departments, constitutional institutions, trading entities, public entities, Parliament and the provincial legislatures.

8 Audit reporting – Auditor's report
Report on other legal and regulatory requirements:
– Predetermined objectives:
  – Usefulness of information: audit findings focus on the consistency, relevance, measurability & presentation of reported performance information.
  – Reliability of information: audit findings focus on the validity, accuracy & completeness of reported performance information.
– Compliance with laws and regulations: report non-compliance matters in relation to the performance management and reporting processes.

9 Usefulness
– Performance indicators: well defined, verifiable
– Targets: SMART criteria

10 TECHNICAL INDICATOR DESCRIPTIONS continued… [slide image not transcribed]

11 Reliability
– Reported performance: valid, accurate, complete
– Explanations for major variances

12 Additional matters
– Achievement of planned targets: greater than 20% of planned targets not achieved
– Material adjustments to the annual performance report

13 Independent Communications Authority of South Africa (ICASA)

14 OVERVIEW
A review of the draft 2013/14 Annual Performance Plan and the related draft Strategic Plan was performed (audits are only performed on departments). Our focus was to assess the usefulness of the information contained in the plans in terms of the:
– measurability and relevance of indicators (well-defined, verifiable, relevant)
– measurability of targets (specific, measurable, time bound, relevant)
– Findings and discussions
– Conclusion in management report

15 Findings – Measurability of indicators
WELL-DEFINED
Definition: The indicator needs to have a clear, unambiguous definition so that data will be collected consistently and be easy to understand and use (supported by Appendix E).
Error rate: 30% of indicators were not well-defined.

16 Findings – Measurability of indicators (continued)
VERIFIABLE
Definition: It must be possible to validate the processes and systems that produce the indicator (supported by Appendix E).
Error rate: 30% of indicators were not verifiable.

17 Findings – Measurability of targets
SPECIFIC
Definition: The nature and the required level of performance can be clearly identified.
Error rate: 12% of targets were not specific.

18 Findings – Measurability of targets (continued)
MEASURABLE
Definition: The required performance can be measured.
Error rate: 12% of targets were not measurable.

19 Findings – Measurability of targets (continued)
TIME BOUND
Definition: The time period or deadline for delivery is specified.
Error rate: 0% of targets were not time bound.

20 Findings – Relevance
RELEVANCE
Definition:
– Indicators: The indicator must relate logically and directly to an aspect of the institution's mandate and the realisation of strategic goals and objectives.
– Targets: The required performance is linked to the achievement of a goal.
Error rate: 0% of indicators and related targets were not relevant.
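For clarity, the error rates quoted in the findings above are simple proportions: the number of indicators (or targets) failing a criterion divided by the total assessed. A minimal sketch, with hypothetical counts chosen only to reproduce the 30% figure from slide 15:

    # Minimal sketch of how an error rate such as "30% of indicators were not
    # well-defined" is derived. The counts are hypothetical, for illustration only.
    def error_rate(failing: int, total: int) -> float:
        """Percentage of assessed items that failed a criterion."""
        return 100.0 * failing / total

    print(f"{error_rate(failing=9, total=30):.0f}% of indicators were not well-defined")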

21 TECHNICAL INDICATOR DESCRIPTIONS
The Framework for Strategic Plans and Annual Performance Plans, as issued by National Treasury and enforced by Instruction Note 33, requires all departments, constitutional institutions and schedule 3A & 3C public entities to compile technical indicator descriptions (see the Annexure E extract in the slides) for all performance indicators included in their plans, effective from the 2012/13 reporting period. These technical indicator descriptions must be published on the website of the department/constitutional institution/public entity.
ICASA has not compiled any technical indicator descriptions for either the 2012/13 or the 2013/14 year (at the time of our review).

22 CONCLUSION
The errors identified in the 2013/14 plan:
– Targets not specific: below the threshold for qualification
– Targets not measurable: below the threshold for qualification
– Indicators not well-defined: above the threshold for qualification
– Indicators not verifiable: above the threshold for qualification
Thresholds on errors identified:
– 0% to 19%: no qualification
– 20% to 50%: qualified
– Above 50%: adverse or disclaimer
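The thresholds above amount to a simple decision rule on the measured error rate. A minimal sketch of that rule (illustration only; the actual audit conclusion also rests on professional judgement):

    # Sketch of the qualification thresholds listed above:
    # 0-19%: no qualification; 20-50%: qualified; above 50%: adverse or disclaimer.
    def audit_conclusion(error_rate_pct: float) -> str:
        if error_rate_pct < 20:
            return "no qualification"
        if error_rate_pct <= 50:
            return "qualified"
        return "adverse or disclaimer"

    assert audit_conclusion(12) == "no qualification"  # ICASA: targets not specific
    assert audit_conclusion(30) == "qualified"         # ICASA: indicators not well-defined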

23 RECOMMENDATION
– Develop and implement standard operating procedures.
– Clearly define roles and responsibilities and link them to individual performance contracts.
– Establish a forum for the portfolio to share insights and ensure a consistent approach.
– Continue to involve the AGSA in the planning process, allowing sufficient time for the AGSA to review the draft plans.

24 Universal Service and Access Agency of South Africa (USAASA)

25 Findings – Measurability of indicators
WELL-DEFINED
Definition: The indicator needs to have a clear, unambiguous definition so that data will be collected consistently and be easy to understand and use (supported by Appendix E).
Error rate: 32% of indicators were not well-defined.

26 Findings – Measurability of indicators (continued)
VERIFIABLE
Definition: It must be possible to validate the processes and systems that produce the indicator (supported by Appendix E).
Error rate: 0% of indicators were not verifiable.

27 Findings – Measurability of targets
SPECIFIC
Definition: The nature and the required level of performance can be clearly identified.
Error rate: 24% of targets were not specific.

28 Findings – Measurability of targets (continued)
MEASURABLE
Definition: The required performance can be measured.
Error rate: 24% of targets were not measurable.

29 Findings – Measurability of targets (continued)
TIME BOUND
Definition: The time period or deadline for delivery is specified.
Error rate: 0% of targets were not time bound.

30 Findings – Relevance
RELEVANCE
Definition:
– Indicators: The indicator must relate logically and directly to an aspect of the institution's mandate and the realisation of strategic goals and objectives.
– Targets: The required performance is linked to the achievement of a goal.
Error rate: 0% of indicators and related targets were not relevant.

31 CONCLUSION
The errors identified are still under discussion with management at USAASA. Adjustments to the quality of the data included in the planning documents are likely to be made by the entity.

32 RECOMMENDATION
– Planning documents should be independently reviewed within the entity, and the reviews should ensure adherence to the frameworks issued by National Treasury.
– Oversight committees (such as the audit committee) should continuously monitor the compliance and quality of planning documents.
– The entity should develop technical indicator descriptions to ensure a consistent understanding of the indicators and of the process to reach the objectives.

33 THANK YOU

