Annual Performance Reporting, Audit Report No. 11 2003–04. The audit reviewed the 2001–02 annual reports of the departments of: Communications, Information Technology and the Arts; Education, Science and Training; Employment and Workplace Relations; Immigration and Multicultural and Indigenous Affairs; and the Australian Customs Service. The objectives of this audit were to determine whether agencies had: established a sound annual reporting performance information framework; developed arrangements to ensure performance information is accurate and coherent; and appropriately analysed performance information in their annual reports.



Australian Public Service (APS) agencies are required to prepare an annual report that is tabled in Parliament. In accordance with section 63(2) of the Public Service Act 1999, annual reports must comply with requirements that have been approved by the Joint Committee of Public Accounts and Audit (JCPAA). The Department of the Prime Minister and Cabinet (PM&C) prepares these requirements.

The PM&C Requirements for Annual Reports1 define their purpose as follows:

The primary purpose of annual reports of departments is accountability, in particular to the Parliament.

Annual reports should therefore inform Parliamentarians and other stakeholders about the performance of the agency and act as a key reference document.

The objectives of this audit were to determine whether agencies had:

  • established a sound annual reporting performance information framework;
  • developed arrangements to ensure performance information is accurate and coherent; and
  • appropriately analysed performance information in their annual reports.

The audit focused on whether annual reports demonstrated the overall characteristics needed to make them appropriate instruments of accountability.2

The audit involved:

  • an assessment of selected sections from the annual reports of five agencies against criteria for sound reporting;
  • review in those agencies of the organisational arrangements for the coordination of the annual report, and data quality and assurance arrangements for performance information in the annual report; and
  • detailed testing of an indicative sample of performance data.

Key findings

Annual reporting performance information framework (Chapter 2)

Generally, outcomes, agency outputs and administered item outputs3 were well specified in the sections of the annual reports reviewed. Outcomes tended to be clearly stated and the expected impact and the target groups were defined. Also, specification of outputs/administered items generally was clear and identified the products and services to be delivered.

The achievement of outcomes in each of the agencies examined relied on the efforts of a range of stakeholders, and this was acknowledged in the annual reports. However, there was little or no performance information that related to the individual contributions of each agency, and other stakeholders, to the achievement of the shared outcomes.

In a number of instances, agencies did not have suitable performance measures relating to the quality of outputs/administered items or effectiveness/impact indicators for outcomes.

However, performance indicators used in the sections of annual reports reviewed were generally appropriately defined. That is, generally, the indicators measured what they purported to measure and had supporting methodologies and assumptions that were clearly identified, including data sources for those indicators.

The performance information frameworks of many of the agency reports examined were not structured to allow an assessment of the efficiency of agency operations and the cost effectiveness of outputs delivered.

Also, targets or other bases for comparison for performance indicators were not being widely used.

Data assurance arrangements (Chapter 3)

In each agency reviewed, there was a central coordination area and senior-level approval processes for the annual report that aimed to ensure all levels of information in the annual report were accurate, coherent and consistent. As well as having coordination responsibilities, most of these central areas had implemented specific strategies to review and improve the process for production of the annual report.

In addition, to assist with ensuring the accuracy and relevance of performance information, the areas within agencies responsible for the delivery of outputs were also responsible for signing off the performance information in the annual report.

Most agencies had not developed standards and procedures in relation to data quality and coherence. This meant that there were no established minimum expectations of, or bases for improvement in, data quality and coherence.

However, despite the frequent absence of data quality standards, testing of selected performance indicators by the ANAO found that the agencies had established performance information management arrangements to produce accurate information in internal and external reports. Establishing and monitoring data quality standards would assist agencies to ensure that this situation is maintained. The ANAO suggests, therefore, that agencies consider implementing such arrangements.

Each agency reviewed used costing models to attribute costs to outputs and outcomes. The methodology and assumptions in the costing models were appropriate to produce accurate and reliable calculations of output and outcome costs. Also, testing by the ANAO found that the quality assurance control frameworks for costing information in the agencies were adequate.

However, costing system documentation and procedures covering such matters as user instructions, administration of the system and backup and recovery had not been compiled for the costing models being used by the agencies. Improved documentation of costing approaches would reduce reliance on the knowledge of key individuals, removing an unnecessary risk to the generation of important financial information.

The budgeting approaches in the agencies exhibited appropriate features in line with criteria for sound budgeting.

In a number of agencies, the production of internal and external performance information reports did not use the same systems and processes. During the course of the audit, the ANAO suggested that each of these agencies review their internal and external reporting frameworks with a view to achieving correlation between the two. This would be likely to enhance the accuracy, coherence and consistency of reported performance information, and could lead to operational efficiencies.

Presentation of results (Chapter 4)

While annual reports should present results and analyse performance information, the ANAO found that the sections reviewed during the audit generally only provided descriptive information about activities. Agencies also frequently provided performance information in tables and charts but did not specifically analyse this information in their discussions of performance. As well, in one agency, the complex presentation of information in the annual report would, in the ANAO's view, make assessment of its performance difficult for the reader.

As targets or other bases for comparison were not being widely used, it was generally not evident whether reported performance was above or below expectations. Where targets or other bases for comparison were included, agencies did not always analyse the relevant performance information in relation to them. Where it was obvious that performance had not met expectations, agencies generally reported only on positives and did not discuss the areas of shortfall.

Trend information on non-financial performance was provided in the sections of the annual reports reviewed, so that comparisons of some aspects of non-financial performance over time could be made. However, this was mostly high-level industry or sectoral information that, although relevant, did not relate directly to the agency's specific performance. That is, the reader would not be able to see trends in performance against agency-specific standards and targets. In addition, agencies did not analyse this trend information in the annual report to demonstrate their performance, leaving readers to draw their own conclusions.

In relation to financial performance, no information on trends was provided in the annual reports reviewed. As well, in a number of instances, there were problems with the discussion and/or presentation of results that prevented comparison of actual and budgeted expenditure. As a result, it was not possible to determine whether the agency's financial performance was in accordance with expectations, or what the implications were for the agency's outcomes and outputs.

As discussed above, agencies were experiencing difficulty in measuring and reporting quality and effectiveness/impact indicators. Evaluations can be useful in leading to development of measures in these areas. Although most agencies undertook a range of evaluations, the results of these were frequently not discussed in the annual report. Therefore, evaluations were not being used to support performance reporting in the annual report by providing information on quality and effectiveness that was otherwise not available.

The ANAO found that, overall, agencies had linked results reported in the annual report to commitments made in the Portfolio Budget Statements (PBS) and explained changes made in their outcomes and outputs frameworks in their annual reports. They also substantially complied with the PM&C Requirements for Annual Reports, although it should be noted in this regard that the audit assessment process provided a satisfactory rating if there was evidence that a particular requirement had been covered in the annual report. The ANAO assessment did not attempt to judge the appropriateness or otherwise of that coverage. However, certain of the PM&C requirements (for example, demonstrating trends in performance and providing an analysis of performance) are covered by some of the other audit criteria used to assess the performance information frameworks and analysis of results in annual reports.

Overall audit conclusion

The ANAO concluded, on the basis of the sections of the five 2001–02 annual reports reviewed, that outcomes, agency outputs and administered item outputs were well specified in most instances. However, in order to provide accountability and transparency to Parliamentarians and other stakeholders, agencies' annual reporting frameworks need to be improved, particularly in relation to:

  • the specification of agencies' influence on, and contribution to, shared outcomes;
  • performance measures relating to quality and effectiveness/impact;
  • the efficiency of agency operations and the cost effectiveness of outputs delivered; and
  • targets or other bases for comparison.

Performance information generally had not been presented and analysed in annual reports in a way that would allow Parliamentarians and other stakeholders to interpret and fully understand results. Particular issues concerned the need for annual reports to:

  • provide an analysis of performance, rather than list activities;
  • assess performance against targets or other bases for comparison;
  • provide and review trends in non-financial and financial performance; and
  • use the results of evaluations where appropriate to provide performance information on quality and effectiveness.

In these circumstances, annual reports did not fully meet their primary purpose of accountability, particularly to Parliament.

Agencies had developed arrangements to provide performance information in those areas of the annual reports examined that was accurate, coherent and consistent. However, establishing and monitoring agency data quality standards, improvement in documentation of costing approaches, and a review by particular agencies of the correlation between their internal and external reporting frameworks, would assist agencies to ensure that performance information in future annual reports continues to be accurate, coherent and consistent.

Agency Responses

The ANAO made two recommendations to improve accountability for, and transparency of, results in agencies' annual reports. All agencies agreed to the recommendations, except the Department of Employment and Workplace Relations (DEWR). DEWR agreed with qualification to part of Recommendation No.1 made in chapter 2 of this report.

Better practice guide

To assist agencies to develop their annual reporting performance information frameworks and analysis, the ANAO is jointly preparing, with the Department of Finance and Administration, a Better Practice Guide on this subject. The Better Practice Guide is scheduled for publication in early 2004.


1 Department of the Prime Minister and Cabinet, Requirements for Annual Reports for Departments, Executive Agencies and FMA Act Bodies, June 2002, Part 2, Section 5.

2 The audit criteria, along with a summary of whether they were broadly met by agencies, are listed in tables towards the beginning of chapters 2, 3 and 4.

3 The Department of Finance and Administration provides the following definitions of outcomes, outputs and administered items:
Outcomes: The impact sought or expected by government in a given policy arena. The focus is on change and consequences: what effect can the government have on the community, economy and/or national interest.
Outputs: The actual deliverables—goods and services—agencies produce to generate the desired outcomes specified by government.
Administered items: Those resources administered by the agency on behalf of the government (such as transfer payments to the States, grants and benefits) to contribute to a specified outcome. Administered items are identified separately from departmental items (that is, departmental outputs) because they involve different accountability requirements. (The Outcomes and Outputs Framework Guidance Document, November 2000, pp. 10, 19 and 16.)