This edition of audit insights draws together key learnings from Australian National Audit Office (ANAO) reports tabled in the Parliament of Australia from October to December 2017. It provides insights into policy and program design – evidence and evaluation, record keeping, and the Australian Government Procurement Contract Reporting information report.

Introduction

This edition of audit insights draws on the key learnings of the nine ANAO performance audit reports tabled in the Parliament during the second quarter of 2017–18.

These audits identified key learnings across a number of areas including:

  • Governance and risk management;
  • Business processes and decision making tools;
  • Policy design and implementation;
  • Managing cost savings and benefits;
  • Measuring performance and impact;
  • Regulatory reform;
  • Record keeping;
  • Reporting;
  • Quality assurance and continuous improvement; and
  • Procurement.

This insights edition focuses on two specific areas that were reported on multiple times in this quarter, and that have also appeared regularly in previous audit reports:

  • Policy and program design; and
  • Record keeping.

Policy and program design – evidence and evaluation

To maximise the opportunity for success, policy and program design and implementation should be informed by a strong evidence base and sound analysis. In addition, identifying monitoring and evaluation arrangements early in the design phase, including baseline data and access to reliable sources of data, is necessary to ensure that these arrangements are not influenced by implementation progress and that the intended impact can be measured and evaluated.

Three audits in the second quarter of 2017–18 found examples of both good practices and areas for improvement in entities' use of an evidence base and evaluation processes in policy and program development and review.

The audit of the design and implementation of the Community Development Program found a number of good practices, including:

  • policy design was supported by analysis and consultation across government (although the review of the predecessor program was based on incomplete analysis of the data); and
  • transparent performance monitoring and reporting arrangements were established for program providers that included performance indicators that were measurable and linked to policy objectives.

However, it also found weaknesses in the evaluation strategy: it was developed seven months after the program commenced, reducing the scope to identify and collect baseline data to measure the impact of the program more accurately; it was not peer reviewed by a reference group; and the timelines for implementation and the proposed review of the evaluation findings were not aligned to inform consideration of further funding.

The Monitoring the Impact of Australian Government School Funding audit found that the arrangements established by the Department of Education and Training to monitor the impact of Australian Government school funding did not provide a sufficient level of assurance that funding had been used in accordance with the legislative framework, in particular the requirement for funding to be distributed on the basis of need. Further, the department had not used available data to effectively monitor the impact of school funding and to provide greater transparency and accountability.

With respect to the use of a strong evidence base and sound analysis to implement legislative requirements and support policy development, key findings included:

  • no robust monitoring of implementation plans that were developed, published and maintained by authorities;
  • bilateral discussions and comprehensive reviews were not conducted, and annual progress reviews were not prepared as required by bilateral agreements;
  • information provided by authorities to account for funding was relied upon with insufficient validation; and
  • data was not fully utilised, including school funding data, to inform the development of current and future education policy.

The Low Emissions Technologies for Fossil Fuels audit examined the research and development of technologies under the suite of Low Emission Technologies for Fossil Fuels programs. Key findings of the audit included:

  • in one program, program guidelines were not developed at the commencement of the program but were later put in place;
  • program key performance measures provided limited insight into the extent to which the programs were achieving the strategic objective;
  • project level reporting was based on progress and funding expended without providing visibility and oversight of program achievements against its strategic objectives; and
  • an evaluation strategy was not developed at the commencement of the programs.

As a consequence, there was limited insight into the extent to which the programs were achieving the strategic objective of accelerating the deployment of technologies to reduce greenhouse gas emissions, and a limited basis on which to inform decisions on the future of the programs.

The following key learnings with respect to policy and program design – evidence and evaluation were identified in these reports:

Record keeping

Records contain information that is a valuable resource and important business asset. To support future business activities and decisions, sufficient records should be created and retained that demonstrate the basis on which key decisions are taken to ensure that lessons can be learnt from activities. Keeping sufficient evidence of the decision-making processes and business activities is also fundamental to accountability and transparency. Information assets should be effectively managed through a systematic approach that protects and preserves records.

Three audits in the 2017–18 second quarter found opportunities for record keeping improvements:

The Monitoring the Impact of Australian Government School Funding audit found that there was an absence of evidence to demonstrate authorities’ compliance with the legislated policy requirements.

  • There was an absence of documentation to support why the department responded differently to authorities' self-reported instances of non-compliance with policy requirements.
  • Due to differences in record-keeping arrangements between states and territories, the department did not request evidence of compliance from government authorities in a standard format. This resulted in a disparity in the evidence submitted for the same school year.

As a result of these weaknesses, including the reliance on self-reporting with limited verification work undertaken, the department had limited assurance in relation to the compliance of authorities with ongoing policy requirements.

The audit of the Australian Taxation Office's Use of Settlements found that there were comprehensive policies and procedures to provide guidance to settlement officers, but there was scope for improved conformance with requirements to retain adequate settlement case records.

To ensure transparency of settlement decision-making and adequate quality assurance mechanisms for settlement data, the ATO’s policy is that documentation must be recorded in the case management system. Areas with the lowest levels of conformance with the requirements included:

  • attaching appropriate evidence of approvals and the rationale for settlements;
  • accurately recording settlement amounts; and
  • adequately explaining differences between amounts in settlement submissions and settlement deeds.

Systematic recording and monitoring of cases where settlement was considered but did not proceed could also support the ATO in refining its settlement case selection processes over time.

The audit of the Delivery of the Moorebank Intermodal Terminal found weaknesses in evidence and documentation, including:

  • the use of non-government email services for official Australian Government business, including instances of transmission of confidential project documentation;
  • evidence did not support that all parties engaged in the market sounding process received the same information; and
  • that there were no minutes or formal documentation for a significant number of negotiation meetings between the parties.

The following key learnings with respect to record keeping were identified in these reports:

Information report

This quarter the ANAO also published Australian Government Procurement Contract Reporting, an information report using data contained in AusTender. Based on the value of contracts reported in AusTender in 2016–17, over $47 billion in taxpayer funds was committed through government contracts. This makes AusTender an important data source to tell a story about government procurement practices and to inform a debate about whether these procurement practices are serving their primary purpose – to achieve best value for money on behalf of taxpayers.

The information report was published under section 25 of the Auditor-General Act 1997, which enables the Auditor-General to table reports in either House of the Parliament on any matter, at any time. It was a piece of data analysis, undertaken at a point in time, on publicly available data held in AusTender – the Government's procurement information system and primary transparency tool for contracts being tendered or entered into.

The information report was neither an audit nor an assurance review and for this reason it does not make findings, draw conclusions, make recommendations or provide any assurance in respect of the reliability of the data. Some of the interesting insights and information included in the report are:

  • 83% of non-Defence contracts were amended with changes to contract value within 12 months of the contract being published.
  • Of contracts first published between November 2015 and November 2016, over 30% more than doubled in contract value in their first 12 months. There were over 1,800 contracts that increased in value by between 100% and 500%, and 200 with an increase in value of over 500%.
    • The above observations may indicate that either contract scope is expanding after entering the contract, or a higher price is being paid for the original scope than agreed at the outset.
  • The number of short-term (one month or less) contracts commencing during the month of June was on average 2.5 times higher than the number commencing in July through May (2012–13 to 2016–17).
    • While all examined agencies saw an increase, the size of the June spike varied dramatically between agencies, with increases in short-term contracts commencing in June ranging from 57% to over 700%.
  • There is a high number of contracts with a value between $79,000 and $80,000 (the threshold for approaching the market through an open tender process) compared to those between $80,000 and $81,000.
    • There were also 2,457 instances where two contracts were entered into by the same government entity, with the same supplier organisation and with similar start dates, with both contracts being for less than $80,000 (but combined greater than $80,000).
  • In 2016–17 just under 30% of procurement (by value) was put out to open market tender (non-panel). Over 50% was through a limited tender, which can mean a sole source contract or an approach to a selected number of possible suppliers.
  • Of the 285 procurement panels analysed that have five or more suppliers and met a threshold of activity (number of contracts), more than half had greater than 80% of the contract values awarded go to the top 20% of suppliers on that panel.
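The threshold-related observations above rest on two simple analyses of contract records: flagging contracts valued just under the open tender threshold, and pairing same-entity, same-supplier contracts with similar start dates that each sit under the threshold but exceed it when combined. The sketch below illustrates both checks on a small illustrative dataset; the field names (`entity`, `supplier`, `start`, `value`) and the example records are assumptions for illustration, not the actual AusTender data schema.

```python
# Illustrative sketch of the two threshold analyses described above.
# Data and field names are hypothetical, not the real AusTender schema.
from datetime import date
from itertools import combinations

THRESHOLD = 80_000  # open tender threshold referred to in the report

contracts = [
    {"entity": "Dept A", "supplier": "Acme", "start": date(2017, 6, 1), "value": 79_500},
    {"entity": "Dept A", "supplier": "Acme", "start": date(2017, 6, 8), "value": 45_000},
    {"entity": "Dept B", "supplier": "Beta", "start": date(2017, 3, 1), "value": 120_000},
]

def near_threshold(rows, band=1_000):
    """Contracts valued just under the open tender threshold."""
    return [r for r in rows if THRESHOLD - band <= r["value"] < THRESHOLD]

def possible_splits(rows, window_days=30):
    """Pairs of contracts from the same entity and supplier, with similar
    start dates, each under the threshold but over it when combined."""
    pairs = []
    for a, b in combinations(rows, 2):
        if (a["entity"] == b["entity"] and a["supplier"] == b["supplier"]
                and abs((a["start"] - b["start"]).days) <= window_days
                and a["value"] < THRESHOLD and b["value"] < THRESHOLD
                and a["value"] + b["value"] > THRESHOLD):
            pairs.append((a, b))
    return pairs

print(len(near_threshold(contracts)))   # the $79,500 contract
print(len(possible_splits(contracts)))  # the two Dept A / Acme contracts
```

Neither check establishes intent; as the report notes, the information report draws no conclusions, so flagged records are candidates for closer review rather than findings of non-compliance.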

The report will inform the ANAO's future risk analysis of government procurement practices and assist in determining the focus of current and future procurement-related performance audits.