PROGRAM 2.1—PERFORMANCE AUDIT SERVICES
Table 4 provides a summary of the performance of Program 2.1 in 2010–11. The delivery of Performance Audit Services is described in more detail in the following sections.
Table 4 Program 2.1: Performance against 2010–11 Portfolio Budget Statements program deliverables and performance indicators
Outcome 2: Improvement in public administration
To improve the efficiency and effectiveness of the administration of Australian Government programs and entities by undertaking a program of independent performance audits and assurance reviews for the information of Parliament and the executive.

|Deliverables|2010–11 target|2010–11 result|
|---|---|---|
|Number of performance audits to be produced|56|54|
|Number of better practice guides produced|4|3|
|Review of DMO Major Projects Report|1|1|
|Other audit and related reports|1|1|

|Key performance indicators|2010–11 result|
|---|---|
|Delivery of a work program which is balanced in coverage and nature and which recognises the audit priorities of the Parliament||
|– The value of the ANAO contribution is recognised by the Parliament|Not assessed in 2010–11. Survey conducted every two years. (a)|
|– The value added by ANAO products and services is recognised by public sector entities|86% of public sector entities acknowledged the value added by Performance Audit Services Group products and services. (b)|
|– The JCPAA’s general satisfaction with the overall quality, timeliness and coverage of ANAO products and services|Not assessed in 2010–11. Survey conducted every two years. (a)|
|Program cost|$28.170 million ($26.107 million in 2009–10)|
JCPAA = Joint Committee of Public Accounts and Audit
(a) The parliamentary survey from January 2009 found that 94% of parliamentarians surveyed expressed satisfaction with ANAO products and services, and the JCPAA reported a high level of satisfaction with the overall quality, timeliness and coverage of ANAO products and services. The next parliamentary survey will be conducted in 2011–12, under the two-year survey cycle.
Performance audits cover a wide range of topics and commonly examine related governance arrangements, information systems, performance measures, monitoring systems and legal compliance. Audits are conducted in accordance with ANAO auditing standards. All performance audit reports are tabled in Parliament.
The Auditor-General Act 1997 authorises the Auditor-General to conduct, at any time, a performance audit of an entity, a Commonwealth authority or company, other than a government business enterprise (GBE) or any of its subsidiaries. The Auditor-General may conduct a performance audit of a fully owned GBE or its subsidiaries if the responsible minister, the Minister for Finance or the Joint Committee of Public Accounts and Audit (JCPAA) requests the audit.
A performance audit is an independent, objective and systematic examination of the operations of a body to form an opinion on whether:
- management of the operations is economical, efficient and effective
- internal procedures for promoting and monitoring economy, efficiency and effectiveness are adequate
- improvements might be made to management practices (including procedures for promoting and monitoring performance).
In seeking to improve public administration, performance audits also identify better practices, which may then be incorporated into better practice guides produced by the ANAO for dissemination throughout the Australian Government sector. Better practice guides are discussed later in this section.
Table 5 summarises the program’s performance in terms of the number, timeliness and cost of performance audit reports.
Table 5 Program 2.1: Performance, quantitative measures, 2008–09, 2009–10 and 2010–11
|Number of performance audit reports|Time taken to complete reports (months)|Average cost per report ($m)|
|---|---|---|
Figure 7 shows the numbers of performance audit reports completed over the past five financial years.
Figure 7 Number of performance audit reports, 2006–07 to 2010–11
A major outcome from our performance audit work is improvement in the management and administration of Australian Government programs. Performance audit reports also provide assurance to the Parliament about the way an area of public administration is being conducted.
Our performance audit work program is developed annually in consultation with the JCPAA and public sector entities. This ensures that our audit products and outputs meet the needs of the Parliament and public sector entities.
The program is based on an assessment of the changing Australian Public Service environment and the ANAO’s understanding of the expectations of government and the operations of agencies. Audit activity planning takes into account risks, financial materiality, program significance, audit impact, program visibility, the extent of recent audit and evaluation coverage, and broad themes derived from the audit planning process.
Key environmental factors influencing the 2010–11 program included:
- the ongoing implementation of economic stimulus measures
- the Council of Australian Governments reform agenda
- whole-of-government policy and program design and delivery
- areas identified by past performance audits and reviews, including governance, risk management, and effective use of human and financial resources.
Particular areas identified for audit focus in 2010–11 included:
- program implementation
- procurement and capability acquisition processes
- grants administration
- monitoring program progress and outcomes
- Indigenous programs.
The performance audits conducted in 2010–11 are summarised by portfolio in Appendix 4. A published audit work program is one of the ANAO’s key products, and is discussed in more detail later in this report.
Program implementation
The development of new policies and programs, particularly in response to the recent stimulus measures, requires agencies to respond rapidly to changed priorities and to implement new initiatives quickly. Significant reports in this area included Audit Report No.9 2010–11, Green Loans Program, and Audit Report No.12 2010–11, Home Insulation Program.
The audit of the Green Loans program provided a timely reminder of the challenges in program implementation and the importance of executive management engagement. In 2006, the ANAO and the Department of the Prime Minister and Cabinet jointly produced a better practice guide, Implementation of Programme and Policy Initiatives, that stated in the foreword:
Too often the challenges involved in turning a policy idea into effective outcomes, and the skills and effort required to do so, are not fully appreciated. Too often the results fall short of expectations. Yet we know that defects in implementation rob the community of the full benefits of a new policy and waste community resources.
The audit reinforced the view that the guide is a useful framework to assist agencies in managing program implementation, and reflects the collective experience and wisdom of senior managers and executives in the Australian Public Service.
The audit of the Home Insulation Program identified a number of challenges to program implementation that arise during the policy development stage, including maintaining compatibility between program design and policy objectives, developing appropriate assumptions, and identifying critical program components early. It also highlighted the particular challenges of planning demand-driven programs. Agencies need to establish effective governance arrangements that provide proactive and effective oversight of, and response to, emerging problems, and to implement effective and timely risk treatments. The audit also showed that effective program implementation requires the identification of the resources and systems needed to support implementation, and the timely operation of appropriate compliance and audit programs.
These and other audits of program implementation have highlighted the fact that delivery of a government’s policy agenda relies not only on the provision of sound policy advice, where the Australian Public Service has traditionally performed well, but on the effective implementation of new programs, where performance has been mixed.
More broadly, new policy implementation also requires effective organisational arrangements to support the management of a suite of new policy initiatives, as identified in Audit Report No.29 2010–11, Management of the Implementation of New Policy Initiatives. This audit assessed the effectiveness of the approach taken by the Australian Federal Police (AFP) in managing the implementation of new policy initiatives. The audit found that, while the broad strategy developed in 2008 to improve project management in the AFP was sound, its implementation had not been effective. As a consequence, the measures taken to improve organisational project management capability had achieved little and, at the time of the audit, the AFP still lacked the processes, controls and structures necessary to provide the commissioner and the Government with assurance that new policy initiatives were being delivered in accordance with the Government’s time, quality and cost expectations. The audit made recommendations aimed at enhancing governance structures, improving the early identification of implementation issues and building a stronger organisational capability to plan, support and monitor the implementation of new policy initiatives.
Procurement and capability acquisition processes
Performance audits that review agencies’ procurement activities provide assurance that agencies are making appropriate decisions about the use of government resources to develop capability and deliver outcomes. A number of performance audits in 2010–11 identified potential for agencies to further improve their procurement processes.
Audit Report No.11 2010–11, Direct Source Procurement, assessed how well agencies had implemented the Commonwealth Procurement Guidelines and relevant financial management and accountability legislation when undertaking direct source procurement. The guidelines promote ‘value for money’ as the core principle in all procurements. The other key principles (encouraging competition; efficient, effective and ethical use of resources; and accountability and transparency in decision making) underpin the achievement of value for money. Agencies are required to have regard to all these considerations in their procurement activities.

The audit found that, overall, the four audited agencies were reasonably familiar with the Government’s procurement framework and the guidelines. However, in practice, they did not consistently follow key elements of the guidelines when choosing and conducting direct source procurements. For the majority of direct source procurements examined, it was not evident from the circumstances of the procurement and/or the procurement documentation that one or more obligations, requirements or specified sound practices had been met, including for higher value procurements.
Audit Report No.40 2010–11, Management of the Explosive Ordnance Services Contract, focused on the Department of Defence’s contractual arrangements for storing and distributing its $3.1 billion inventory of explosive ordnance, representing some 60 per cent of Defence’s reported total inventory at 30 June 2010. The audit found that Defence had established mechanisms to support the effective management of the Explosive Ordnance Services Contract, but should seek to incorporate a firm contract expiry date to allow the services provided under the contract to be market tested. There is currently no limit on the number of performance-based contract extensions available to the incumbent contractor: provided the contractor continues to meet contractually defined performance standards, the contract may be extended indefinitely. Longstanding advice from the Department of Finance and Deregulation is that such ‘evergreen’ provisions are likely to limit competition, and do not provide the necessary assurance that the value-for-money requirements of the policy framework in the Commonwealth Procurement Guidelines will be met.
The ANAO also identifies areas of good practice in its performance audits. Audit Report No.26 2010–11, Management of the Tender Process for a Replacement BasicsCard, found that the Department of Human Services (DHS) demonstrated sound procurement and management practice, and acted in a manner consistent with Finance’s operational guidance to agencies contained in the Guidance on the Mandatory Procurement Procedures. In planning and managing the procurement, including approaching the market, evaluating tender submissions and conducting contract negotiations, DHS also complied with the requirements of the Commonwealth Procurement Guidelines.
The procurement outcomes for Defence capability are contingent on effective acceptance arrangements. Audit Report No.57 2010–11, Acceptance into Service of Navy Capability, found that Defence is still some way from achieving its decade-old objective of seamless, well-developed processes and systems for the effective and efficient delivery of Navy capability. The overall picture is of a capability development system that has not consistently identified and responded, in a timely and comprehensive way, to conditions that adversely affected Navy capability acquisition and support. Opportunities to identify and mitigate cost, schedule and technical risks have been missed, resulting in chronic delays in Navy mission systems achieving final operational capability. At the highest level, acquisition plans have not clearly set out the Government’s agreed scope, cost and schedule for each project at the time of each project’s approval. Consequently, compliance with government requirements, which is a fundamental responsibility of Defence, could not be confirmed by Defence.
The pathway to better capability outcomes is reliant on clear up-front agreements on capability requirements definition, verification and validation procedures, and configuration management. In all cases, the Capability Development Group, Defence Materiel Organisation and Navy would benefit from working more closely together during important phases of the development of major systems. At key stages of each project, all parties would benefit from a definite agreed view on the risks that must be managed in order to achieve a successful outcome. Experience in the United Kingdom and the United States underscores the importance of the acquisition organisation and the navy working together to ensure that hand-offs do not become ‘voyages of discovery’ in the final stages of the project.
Without the application of greater discipline by Defence in the implementation of its own policies and procedures, the necessary improvements in acquisition outcomes will not be achieved. In some essential systems engineering, technical regulatory elements and capability integration management areas, there are insufficient numbers of qualified staff; this needs to be addressed as a priority. The ANAO made eight recommendations designed to improve Defence’s management of the acquisition and transition into service of Navy capability, including reducing delays in achieving operational release.
Grants administration
Audits continue to identify issues in the administration of grants and the application of the enhanced grants administration framework. For example, the ANAO regularly finds that key requirements for transparency in grants management, such as the publication and consistent application of selection criteria, are not being met. Agencies should provide clear advice on the merits of each proposed grant, including a recommendation to the decision maker on whether funding should be approved under the relevant program guidelines, having regard to the statutory obligations governing the approval of spending proposals.
Two audits in particular found incomplete documentation or inconsistent application of criteria for selecting grant recipients. Audit Report No.3 2010–11, The Establishment, Implementation and Administration of the Strategic Projects Component of the Regional and Local Community Infrastructure Program, found that no version of the program guidelines outlined the assessment criteria that would be used to select successful applicants, and other opportunities to publish the selection criteria were not taken. The audit highlighted problems in the assessment process, with the eligibility and compliance checking process abandoned part way through its implementation, resulting in all applications received being considered. The reasons for the selection of some of the short-listed applications and the non-selection of others were not apparent from the program document or subsequent advice provided to the ANAO.
Audit Report No.24 2010–11, The Design and Administration of the Better Regions Program, found that, overall, the program was effectively designed and has been well administered by the Department of Infrastructure, Transport, Regional Development and Local Government (and later the Department of Regional Australia, Regional Development and Local Government). Because the program was established solely to fund various regional election commitments, neither department had a role in the selection of projects. There remains, however, an obligation to assess the efficient and effective use of public money for a proposed grant, and the published program guidelines and departmental administration of the program recognised that there was a requirement for both an assessment of whether each project would make efficient and effective use of public money and an assessment of whether any risk mitigation measures should be imposed. While results of risk analyses and mitigation proposals were clearly and effectively communicated to the relevant parliamentary secretary, assessment briefings provided to the parliamentary secretary did not similarly outline the basis upon which the department had assessed each project as representing an efficient and effective use of public money.
Monitoring program progress and outcomes
Several performance audits have identified scope for improvement in agencies’ arrangements for monitoring and reporting on program progress, including annual reporting to the Parliament under the outcome and programs framework.
Audit Report No.25 2010–11, Administration of the Trade Training Centres in Schools Program, found that the planning approach and administrative framework for the program established by the Department of Education, Employment and Workplace Relations (DEEWR) were generally sound. Nevertheless, the audit identified issues relating to DEEWR’s implementation of the program’s framework, and scope to further strengthen the program’s Portfolio Budget Statements (PBS) performance information framework.
The audit found that the program’s PBS performance information framework had improved over time through the adoption of a broader set of performance measures, such as the trend for students participating in vocational and technical education. Nevertheless, it was difficult for stakeholders to assess the extent to which the program had been successful in developing the trade training centre infrastructure which underpins the delivery of program outcomes, as there were no specific targets for the construction of trade training centres. In this context, there was an opportunity to heighten transparency for the Parliament and public by establishing expectations for overall program infrastructure development and subsequently reporting on progress.
Audit Report No.30 2010–11, Digital Education Revolution Program—National Secondary Schools Computer Fund, found that, overall, the administration of the Digital Education Revolution (DER) program had been effective in supporting progress through a partnership approach towards the National Secondary Schools Computer Fund’s objective of increasing the computer-to-student ratio for students in years 9 to 12.
The audit identified limitations in the program’s progress measures, including annual reporting to the Parliament. Administering agencies need to strike an appropriate balance between accountability and devolved responsibility, and remain accountable to responsible ministers and the Parliament for the use of Australian Government funding. Opportunities for enhancement identified by the audit, and reflected in its recommendations, build on the DER program’s partnership approach by seeking well-timed assurance over aspects of the program’s delivery by education authorities, and strengthening of reporting on performance to stakeholders.
Indigenous programs
During 2010–11, the ANAO increased its focus on the administration of Australian Government Indigenous programs, tabling six reports. The performance of government programs in addressing disadvantage experienced by Aboriginal and Torres Strait Islander people is an important and topical aspect of public administration. Program delivery arrangements are relatively decentralised and characterised by ‘whole-of-government’ approaches to policy coordination, program administration and service delivery. Further, a large number of Indigenous-specific programs are administered by Australian Government bodies, most of them of relatively small annual value. In these circumstances, effective coordination of services and programs is important. Audit Report No.18 2010–11, Government Business Managers in Aboriginal Communities under the Northern Territory Emergency Response, highlighted the importance of maintaining a clear understanding between departments of the government coordination arrangements that had been put in place in communities under the Northern Territory Emergency Response, and of maintaining a stable presence in those communities.
Grant funding provided to not-for-profit, local government and community-based organisations is a key model of service delivery; the amount of funding delivered this way indicates a strong reliance on these organisations to contribute to the broader outcomes sought by government. Given the often limited capacity of these funded organisations, purchasing agencies have traditionally adopted a cautious approach to funding in order to minimise risks, characterised by short-term grant funding and an annual funding cycle. In some circumstances this may be an appropriate approach. However, audits have highlighted that, despite recognition of the long-term nature of the investment needed to address Indigenous disadvantage, many programs continue to operate on the basis of small, short-term investments. This leads to multiple transactions, creating a high level of administrative burden for both grantees and agencies. These issues were identified in, for example, Audit Report No.8 2010–11, Multifunctional Aboriginal Children’s Services (MACS) and Crèches, and Audit Report No.32 2010–11, Northern Territory Night Patrols.
Audits undertaken by formal request of Parliament, ministers or parliamentarians
Three audits were tabled during 2010–11 as a result of formal requests from the Parliament, ministers and parliamentarians:
- Audit Report No.9 2010–11, Green Loans Program. This work was undertaken in response to a request from Senator Christine Milne, dated 3 February 2010, and to other concerns raised in relation to the administration of the program.
- Audit Report No.12 2010–11, Home Insulation Program. Following the commencement of the Home Insulation Program in February 2009, the Auditor-General received representations from various stakeholders raising concerns about the program’s delivery. On 3 March 2010, the then Minister Assisting the Minister for Climate Change and Energy Efficiency, the Hon. Greg Combet AM, MP, requested that the Auditor-General conduct an audit of the Home Insulation Program. This followed a number of requests from the Shadow Minister for Climate Action, Environment and Heritage, the Hon. Greg Hunt MP, to conduct an audit of the program in response to safety concerns and allegations of rorting and noncompliance by installers.
- Audit Report No.13 2010–11, Implementation and Administration of the Civil Aviation Safety Authority’s Safety Management System Approach for Aircraft Operators. This resulted from a recommendation in the Senate Standing Committee on Rural and Regional Affairs and Transport’s report Administration of the Civil Aviation Safety Authority and Related Matters, which requested that the ANAO review the Civil Aviation Safety Authority’s implementation and administration of the regulation of aircraft operators’ safety management systems.
Better practice guides
Better practice guides contribute to improved public administration by providing a mechanism for recognising better practices in organisations and promulgating them to all Australian Government entities. This can involve examining practices in the public or private sectors, in Australia or overseas.
Depending on the subject, better practice guides can be developed from information collected during an audit or prepared to meet an identified need for guidance in a particular area of public administration. Our emphasis is on identifying, assessing and articulating good practice from our knowledge and understanding of the public sector, particularly by providing guidance in areas where improvements are warranted.
During 2010–11, we took part in a number of Australian public sector forums, seminars and conferences to increase awareness of better practice guides and other audit reports. We continued to encourage entities to use better practice guides to review their own practices, and our better practice guides continued to be well received by public sector entities, other audit offices and professional organisations.
The better practice guides published in 2010–11 are described in Table 6.
Table 6 Better practice guides published in 2010–11
Strategic and Operational Management of Assets by Public Sector Entities (published 13 September 2010)

In both the public and the private sectors, asset management is an essential component of good governance, and should be aligned to, and integrated with, an entity’s strategic, corporate and financial planning. Entities should have a disciplined approach to matching their investment in assets to program requirements, and to planning for asset replacement in a strategic way that accords with the Government’s capital budgeting framework where applicable.

The aim of this guide was to update the previous publication, Better Practice Guide on Asset Management (June 1996), and to provide a practical asset management framework that Australian Government entities can adopt to assist in the effective management, maintenance and use of assets to achieve their goals and agreed program delivery outcomes.

Fraud Control in Government Entities (published 28 March 2011)

Fraud is an ever-present threat to the Australian community, and its prevention and detection pose significant challenges, to which government programs are not immune. This better practice guide explains measures entities can take to establish an effective fraud control environment. It updates the publication Fraud Control in Australian Government Agencies (2004) and reflects the changing fraud risk landscape.

The release of this guide coincided with the issue of an updated version of the Commonwealth Fraud Control Guidelines. These new guidelines establish the fraud control policy framework within which entities determine their own specific practices, plans and procedures to manage the prevention and detection of fraudulent activities.

The guide is an important tool for senior management and those with direct responsibilities for fraud control. Elements of the guide will also be useful to a wider audience, including employees, contractors, service providers and others with an interest in sound public administration.

Human Resource Information Management Systems (published 4 April 2011)

This better practice guide provides an overview of significant risks and controls relevant to key human resource (HR) functions.

The guide discusses risks and controls associated with the design, implementation and maintenance of a human resource information system. It will assist HR system managers and practitioners to implement better practices to improve the effectiveness and efficiency of HR and payroll processes; strengthen system controls; and appropriately manage and segregate user access to key system functions. It will also increase awareness of system controls within the PeopleSoft and SAP HR systems used by a large number of Australian Government entities.
Figure 8 shows the numbers of better practice guides published over the past five financial years.
Figure 8 Publication of better practice guides, 2006–07 to 2010–11
Other audit and related services
In 2010–11, we tabled our annual review of Defence Materiel Organisation major projects. We also published our annual Audit Work Program in July 2010.
Defence Materiel Organisation major projects report
The ANAO’s third annual review of the status of selected Defence equipment acquisition projects built on the work undertaken by the Defence Materiel Organisation (DMO) and the ANAO to improve the transparency and public accountability of major Defence acquisitions. The 2009–10 Major Projects Report continued the longitudinal analysis of project performance begun in the previous major projects reports. To meet stakeholder requirements, the report provided additional information in the project data summary sheets, including enhancements proposed by the JCPAA.
The review covered the cost, schedule and capability progress achieved by 22 DMO projects, which had an approved budget totalling $40.8 billion as at 30 June 2010. The ANAO reviewed the project data provided in the DMO project data summary sheets and provided an independent review report to Parliament in 2010–11. The non-inclusion of base date figures for expenditure and contract price for a number of projects represented a departure from the project data summary sheet guidelines, and was the basis for the Auditor-General’s qualified conclusion. The DMO advised that it had not included this information because, in its view, the provision of figures was a time-consuming and costly exercise, offering limited value for project management outcomes. Project information on risks, issues and other future events, such as capability performance, was considered by the ANAO to be outside the scope of the review.
The ANAO’s analysis again indicated that keeping major acquisition projects on schedule remains the major challenge for the DMO and industry contractors, and affects the availability of capability for the end user, the Australian Defence Force. Across the 22 projects reviewed, on average there was a schedule slippage of slightly under three years against the original target dates for achieving final operational capability.
The review was conducted in accordance with ASAE 3000, Assurance Engagements other than Audits or Reviews of Historical Financial Information. By its nature, a review does not provide the level of assurance of an audit.
Audit Work Program July 2010
We publish a comprehensive guide to our performance audit work program in July each year. It provides a portfolio-level view of our audit strategies and audits in progress as at 1 July 2010, and a rolling program of potential audit topics. While not all audits listed will be commenced, the publication assists the Parliament and entities by providing a clear indication of our areas of interest. The Audit Work Program July 2010 was developed during the second half of 2009–10 and is widely distributed, including to all parliamentarians, the JCPAA and agency heads.
The following sections describe in more detail the ANAO’s performance in providing performance audit services during 2010–11. Our primary performance measures relate to the number of reports and guides produced (see figures 7 and 8), and the level of satisfaction of the Parliament, the JCPAA and audited entities with our reports and services. We also participate in regular quality assurance (QA) and peer review processes to monitor our performance and identify areas for improvement.
Feedback from Parliament
The contribution of performance audit services to the work of Parliament is measured, in part, by a review of comments made in parliamentary committee reports and at committee hearings. Parliamentary committee reviews of audit reports give entities an impetus to implement audit recommendations and contribute to the overall improvement of public administration. In 2010–11, parliamentary committees continued to support audit conclusions and recommendations.
We also formally survey our parliamentary stakeholders every two years, to assess how well we are meeting their expectations. The next survey will be conducted in the second half of 2011.
The most recent survey, conducted in 2009, found that a very high proportion of members of parliament and parliamentary committee secretaries had positive perceptions of the ANAO and valued its work highly. We received positive feedback on our reports and publications, our engagement with parliamentarians and the focus of our program.
We also received various suggestions on ways to improve our interactions with Parliament. We are acting on those comments, which centred on four key areas applicable to performance auditing:
- Focus of our audits—The ANAO endeavours to maintain a balance in its performance reporting: between addressing the key risks and challenges facing the Australian Government public sector and individual entities, and being responsive to a changing environment and to stakeholder requests covering matters of public interest.
- Accessibility of our reports—In 2010–11 we launched our new website, which features a new look and feel, and provides more options for searching and browsing our reports.
- Format of our reports—The new website also allows us to progressively publish our reports in both HTML and PDF formats, providing greater flexibility for users to select sections of reports appropriate to their needs. We have also provided additional training for staff to provide them with skills in clear and effective writing.
- Engagement with members of parliament—In 2010–11 we engaged extensively with various committees and parliamentarians. This included briefing individual parliamentarians and committees on specific performance audit topics.
Review by the Joint Committee of Public Accounts and Audit
The JCPAA is required by the Public Accounts and Audit Committee Act 1951 to examine all reports of the Auditor-General, and report the results of its deliberations to both houses of Parliament. Its primary purpose in reviewing audit reports is to assess whether audited agencies have responded appropriately to the Auditor-General’s findings.
The JCPAA’s Report 418, Review of Auditor-General’s Reports tabled between September 2009 and May 2010, was tabled on 22 December 2010, and commented on nine performance audit reports:
- Audit Report No.7 2009–10, Administration of Grants by the National Health and Medical Research Council—The JCPAA urged the National Health and Medical Research Council to implement the ANAO recommendations aimed at strengthening accountability and transparency throughout the peer review process. Whilst commending the council’s ongoing improvement in post-award grant management, the committee urged the council to implement the ANAO’s recommendation to introduce risk-based arrangements to ensure better management of Commonwealth monies.
- Audit Report No.8 2009–10, The Australian Taxation Office’s Implementation of the Change Program: a Strategic Overview—The JCPAA acknowledged the difficulties inherent in implementing such a diverse and complex project. The committee further recommended that the Australian Taxation Office monitor and evaluate customers’ satisfaction with the new system.
- Audit Report No.10 2009–10, Processing of Incoming International Air Passengers—The JCPAA was concerned about the effectiveness of the primary line system to ensure the referral of incoming air passengers and crew who pose a risk to the Australian community. The committee urged the Customs and Border Protection Service to implement the ANAO’s recommendation No. 3, relating to IT systems issues and problems, as soon as possible, to mitigate the risk posed to Australia’s border protection.
- Audit Report No.15 2009–10, AusAID’s Management of the Expanding Australian Aid Program—The JCPAA stressed the importance of regular public reporting on performance to build public and parliamentary confidence in AusAID. The committee made further recommendations to AusAID, in particular supporting the ANAO’s recommendation regarding the classification of administered and departmental expenses.
- Audit Report No.20 2009–10, The National Broadband Network Request for Proposal Process—The JCPAA shared the ANAO’s concerns regarding risk management for the project and further encouraged all agencies and departments to identify risks early in the tender process and, where possible, quantify them.
- Audit Report No.26 2009–10, Administration of Climate Change Programs—The JCPAA concluded that there would be an ongoing need for climate change programs to combat the potential effects of climate change on the Australian people and economy. It noted that the department had implemented the ANAO’s recommendation to establish a grants policy unit, but was concerned that the programs implemented by successive governments had experienced a range of risk management and reporting problems and that the relevant departments had not been able to successfully address these issues.
- Audit Report No.27 2009–10, Coordination and Reporting of Australia’s Climate Change Measures—The JCPAA was satisfied that Australia’s Greenhouse Gas Emissions Inventory meets international requirements, and acknowledged the steps being taken by the Department of Climate Change and Energy Efficiency to implement the United Nations Framework Convention on Climate Change recommendations with regard to the inventory. It urged the department to fully implement the ANAO’s recommendations to address inconsistencies in the reporting of abatement measures across agencies.
- Audit Report No.31 2009–10, Management of the AusLink Roads to Recovery Program—The JCPAA was concerned about the inconsistencies the audit identified in the quality of data used to measure the efficiency and effectiveness of the program. The committee noted that the Department of Infrastructure, Transport, Regional Development and Local Government had agreed to implement all of the ANAO’s recommendations.
- Audit Report No.33 2009–10, Building the Education Revolution—Primary Schools for the 21st Century—The JCPAA was concerned about the data integrity issues identified by the ANAO; it urged the Department of Education, Employment and Workplace Relations (DEEWR) to ensure that those issues are addressed and that the relevant data is collected to enable the program to be monitored and evaluated.
In addition, the JCPAA tabled Report 422, Review of the 2009–10 Defence Materiel Organisation Major Projects Report, in April 2011. The report revisited ongoing issues raised during the review of the pilot major projects report in 2007–08, and also discussed the Auditor-General’s major findings in relation to the 2008–09 and 2009–10 major projects reports. The committee requested that the DMO address the base date dollar issue associated with the qualified audit opinions given in the ANAO’s reports, with the matter expected to be resolved for the 2011–12 major projects report.
The JCPAA also held hearings into a further five ANAO performance audits during the year. The reports relating to these reviews had not been released by 30 June 2011.
Responses from audited entities
Entities are not required to implement the recommendations made in audit reports, and may consider each recommendation on its merits. Improvements in administration and accountability and better service delivery are more likely if the recommendations in performance audit reports are accepted by the audited entity at the time of the audit, and we make genuine efforts to achieve that result. However, disagreement sometimes occurs. Entity comments are included in full in the final audit report.
In 2010–11, we made 143 recommendations in our audit reports to improve entity performance and accountability. Our recommendations are sometimes presented in parts for clarity, so it is possible for an entity to agree with some parts of a recommendation and disagree with others. Of the 143 recommendations, 136 (95 per cent) were agreed in all parts, six (4 per cent) were agreed with some qualification, and one was not agreed.
After each performance audit report is tabled, the ANAO seeks feedback on the audit process by means of a survey and an interview with the responsible manager. The survey is conducted by a firm of consultants engaged by the ANAO but independent of the performance audit teams.
The response rate from auditees surveyed for the 2010–11 reporting period was 87 per cent, an increase from 70 per cent in 2009–10. Auditees’ acknowledgement of the value added by ANAO products and services was consistent with 2009–10, at 86 per cent. The percentage of auditees who considered that the auditors had demonstrated the professional knowledge and audit skills required to conduct the audit remained stable at 91 per cent in 2010–11 (90 per cent in 2009–10).
The results of the survey are an important guide to the effectiveness of current practice and inform the development of new audit practices and approaches. In this way, the survey serves as a valuable business tool for improving the quality and effectiveness of performance audit services.
Quality assurance and peer reviews
The 2010–11 QA program comprised reviews of five performance audits. The reviews found that the audits were substantially compliant with the auditing standards and ANAO policies. Areas identified as warranting improvement included the presentation of audit objectives and audit criteria in audit reports, and the completion of documentation in the audit working papers relating to independence and quality control requirements.
The results of the QA reviews were reported to the ANAO Executive and disseminated to all performance audit staff. The results are used to update the ANAO’s policies and guidance material, where appropriate, and are taken into account when preparing the performance audit learning and development program.
A peer review arrangement for performance audits involving the ANAO and the New Zealand Audit Office has been in place since 2000. Two performance audits from each office are reviewed every two years. This arrangement aims to strengthen performance audit practices in both offices through an exchange of constructive feedback and better practice. During 2010–11, the ANAO reviewed two New Zealand Audit Office audits and provided a report to the New Zealand Auditor-General.