Summary

Introduction

1. The Filling the Research Gap (FtRG) program, under the Carbon Farming Futures Program, was established in 2011 as a six-year research funding initiative. The FtRG is expected to deliver:

  • a suite of collaborative research projects that address the research priorities identified for each funding round;
  • practical management options to reduce on-farm greenhouse gas emissions and sequester carbon;
  • research reports from each project, as well as an overarching report on each funding round;
  • peer-reviewed journal papers; and
  • a range of communication material to disseminate key findings and outcomes.

2. Some $201 million was budgeted for FtRG over the period 2011–12 to 2016–17. Of this amount, $150.4 million was available for grants, $34.1 million for associated initiatives[1], and $16.6 million was allocated to the then Department of Agriculture, Fisheries and Forestry to administer the program. In September 2013, this department became the Department of Agriculture. Throughout this report, the department responsible for administering the FtRG program is referred to as Agriculture.

3. Agriculture established arrangements to support the program’s operation as a multi-round competitive grants program. This included a two-stage assessment process: departmental staff assessed the eligibility of applications, and an expert advisory panel assessed the merit of eligible applications against the selection criteria published in the program guidelines.

4. The first funding round attracted 234 applications and was completed in May 2012. Grant funding offers were made to 58 applicants with a total value of $47.3 million (GST exclusive). The grant offers were accepted by 57 applicants (involving $47.0 million) and all funding deeds were entered into by mid-October 2012. Most projects were contracted to be completed in May or June 2015, with three projects to be completed in 2013 and one in 2014.

5. The second funding round attracted 237 applications and was completed in April 2013. Grant funding offers were made to 31 applicants with a total value of $27.1 million (GST exclusive). The grant offers were accepted by each successful applicant and all funding deeds were entered into by mid‑September 2013. The research work for most projects is to be completed by May or June 2016, with one project to be completed in May 2015.

6. Further funding rounds had originally been planned but, in July 2013, the then Government announced the return of $143 million of unallocated funding from the Carbon Farming Futures program to the budget.

Audit objective, scope and criteria

7. The objective of the audit was to assess the effectiveness of the delivery of the first and second funding rounds of the FtRG program by Agriculture.

8. The audit criteria reflected relevant policy and legislative requirements for the expenditure of public money and the grant administration framework, including the Commonwealth Grant Guidelines. They also drew upon ANAO’s Administration of Grants Better Practice Guide.

Overall conclusion

9. As part of the then Government’s Clean Energy Future plan, the FtRG grants program was to invest up to $201 million in research to identify and develop new ways for land managers to reduce emissions, store carbon in the soil and enhance sustainable agricultural practices. Through the two funding rounds that have been undertaken, a total of $74.1 million was contracted to 88 applicants located across all states and the Australian Capital Territory. The Commonwealth Scientific and Industrial Research Organisation (CSIRO) and universities accounted for the majority of funding awarded under the program. The research work for most round one projects is to be completed in May or June 2015, and most round two projects are to be completed in May or June 2016.

10. The delivery of the first and second funding rounds by Agriculture was effective in a number of important respects. In particular:

  • clear roles and administrative responsibilities were established, including an expert advisory panel to assess and make recommendations on the most meritorious applications against the published selection criteria;
  • the program guidelines were comprehensive and the design and conduct of the funding rounds contributed to accessible and competitive application processes;
  • Agriculture’s briefing of the then Minister on the assessment outcomes of each round addressed those matters relevant to grants decision‑making and was timely; and
  • the grant agreements signed with the successful applicants adequately protected the Commonwealth’s interests.

11. There were, however, also some important shortcomings in Agriculture’s implementation of the FtRG program. From a governance perspective, the arrangements adopted to identify and manage conflicts of interest were not tailored to the circumstances of the program.[2] Shortcomings were also evident in the application assessment approach for each funding round, including departures from the assessment approach outlined in the published program guidelines. In addition, unsuccessful applicants were not provided with feedback that clearly identified the reasons they had not been awarded funding.[3]

12. A recurring theme in ANAO’s audits of grants administration over a number of years has been the importance of grant programs being implemented in a manner that accords with published program guidelines so that applicants are treated equitably, and those applications that are funded are the most likely to further the program’s objectives.[4] In this context, two of the four ANAO recommendations relate to Agriculture improving aspects of its assessment of grant applications with a particular focus on the department clearly following the approach outlined in the published program guidelines. The other recommendations relate, in turn, to tailoring the conflict of interest management arrangements to the circumstances of the particular granting activity, and providing clearer feedback to unsuccessful applicants.

Key findings by chapter

Program design and governance (Chapter 2)

13. Agriculture established a generally sound framework for the design and governance of the program. In this respect, a range of governance documentation was developed to assist in the delivery of each program funding round. However, while each funding round’s implementation plan outlined the intention to have a monitoring and evaluation component as part of the program, it was not until late 2012 and early 2013 that the FtRG program area began focusing in detail on developing a monitoring and evaluation plan (and it had not been finalised more than 18 months after the program’s establishment).[5]

14. Rather than implement the approach outlined in the program’s implementation plans, the department decided to develop one monitoring and evaluation framework to cover all three of its Carbon Farming Futures programs. In this context, a number of Key Performance Indicators (KPIs) have been developed addressing both the administration of the program and program outcomes. Performance against these KPIs was to be reported to the Parliament through annual reports of the Land Sector Carbon and Biodiversity Board, but this has yet to occur.[6] This reporting is separate to that required through the Portfolio Budget Statements (PBS) and Annual Report. Further, the KPI for the program published in the PBS is inadequate: it does not relate to whether the program is achieving its objectives but, rather, to the number of projects that have been awarded funding.

15. On a more positive note, clear roles and responsibilities were established for the program, including in respect of an expert advisory panel to play a key role in the merit assessment stage. The panel brought specific knowledge, experience and judgement to bear in assisting Agriculture to formulate funding recommendations to the then Minister. Nevertheless, as might be expected, there were particular conflict of interest issues to be addressed. For both funding rounds, a significant proportion of eligible applications and successful applications were subject to either a direct or indirect conflict of interest declaration by one or more panel members. This situation reflected the extent of the panel members’ involvement in carbon research in the agricultural sector. In this context, the rates of declared conflicts of interest were similar to those of other non-agricultural research grant programs.

16. The FtRG program area followed the standard departmental arrangements for managing conflicts of interest. However, there were a number of shortcomings with the departmental record of declared direct and indirect conflicts of interest and how they were addressed. In addition, the conflict of interest arrangements that were adopted did not address the issue of potential conflicts that arise as a result of past collaboration between panel members and applicants, including through publications and co-authorships. There were also a number of shortcomings in relation to the management of those conflict of interest situations that were not addressed in the program governance arrangements.

Access to the program (Chapter 3)

17. The grant application process for both rounds was accessible and was effectively designed to maximise the attraction of high quality applications for assessment. Each round attracted over 230 applications and very few of these were assessed as ineligible.[7]

18. One important factor that contributed to the low level of assessed ineligibility was that most of the eligibility requirements were clearly grouped and identified in the program guidelines for each round. However, there were some inconsistencies in the department’s eligibility assessment of whether applicants had provided all the necessary letters of support and detailed their in-kind contributions. In addition, the guidelines did not provide consistent advice to applicants concerning the required completion date for projects.

19. In round two, the program’s online SmartForm prevented applicants from entering some information that would otherwise have made the application ineligible. Nevertheless, departmental records do not demonstrate how queries raised in the course of the eligibility checking process were addressed before it was decided that applications were eligible. Further, the guidelines provided that in-kind contributions were ‘required’, but the department did not assess this statement as an eligibility requirement.[8]

Merit assessment for the first round (Chapter 4)

20. Consistent with sound grants administration practice, the merit assessment approach planned for the first round involved all eligible, compliant applications being assessed in the same manner against the same criteria. Also consistent with sound grants administration, the planned approach involved eligible applications being scored on a scale of zero to 10 against each of the seven equally weighted published criteria.[9] However, the program’s assessment arrangements operated in such a way that there was no requirement that applications satisfy each assessment criterion to be considered for funding. Such an approach does not recognise that applications that are assessed as not satisfactorily meeting the published merit assessment criteria are most unlikely to represent value for money in the context of the program objectives.

21. The panel’s assessment report, prepared by the department, formed the basis for advice to the Minister on applications recommended for approval. However, there were a number of inconsistencies and inaccuracies in the report. In addition, the report did not draw attention to a significant departure from the published program guidelines in the assessment of the seventh assessment criterion, which was described as ‘appropriate budget’.

22. Specifically, the program guidelines outlined that the assessment would take into account the extent to which proposals demonstrated value for money. However, this criterion was not scored on a zero to 10 scale as the other six assessment criteria were. The effect of this approach was that the appropriate budget criterion was not equally weighted and similarly assessed with the other criteria[10], as required in the guidelines. Further, the overall merit score for eligible applications did not incorporate an assessment of the extent to which they had included an ‘appropriate budget’ for the project.[11]
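The effect of this departure can be illustrated with a small hypothetical calculation (the applications and scores below are invented for illustration only and are not drawn from the audit): when one of seven equally weighted, 0–10 scored criteria is excluded from the overall merit score, the relative ranking of closely matched applications can change.

```python
# Purely illustrative sketch of the round one scoring scheme described in the
# guidelines: seven criteria, each scored 0-10 and equally weighted.
# The applications and scores below are hypothetical, not taken from the audit.

def merit_score(scores):
    """Overall merit as the simple average of equally weighted criterion scores."""
    return sum(scores) / len(scores)

# Hypothetical application A: modest scores, but a strong 'appropriate budget' (7th) score.
app_a = [6, 6, 6, 6, 6, 6, 9]
# Hypothetical application B: stronger on the first six criteria, weak on budget.
app_b = [7, 7, 7, 7, 7, 7, 2]

# With all seven criteria scored, as the guidelines required, A outranks B.
assert merit_score(app_a) > merit_score(app_b)

# With the seventh criterion excluded from the overall score, as occurred
# in practice, the ranking reverses and B outranks A.
assert merit_score(app_a[:6]) < merit_score(app_b[:6])
```

The sketch simply shows that dropping one equally weighted criterion is not a neutral simplification: it can reorder otherwise comparable applications, which is why the departure noted above could have affected the relative ranking of eligible applications.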

Merit assessment for the second round (Chapter 5)

23. Adjustments to the published program criteria were made for the second funding round, reflecting lessons learned from the first round. In particular, separate criteria (and sub-criteria) were published for research projects and research coordination projects (see Table S.1).

Table S.1: Number of assessment criteria and sub-criteria

Source: ANAO analysis of round two program guidelines.

24. Similar to the first round, the round two program guidelines outlined that the merit assessment process would involve eligible applications being assessed against the published criteria and a merit ranking being allocated to each. However, Agriculture adopted an online assessment tool for the program that used four scoring categories (‘method’, ‘capacity’, ‘value’ and ‘risk’), rather than enabling direct scoring against the program’s seven merit assessment criteria (‘addressing research priorities’, ‘defined activities and outcomes’, ‘achieve FtRG outcomes’, ‘technical feasibility’, ‘risk management’, ‘financial and managerial competency’ and ‘value for money’). In this context, the online assessment tool was not fit for purpose given the particular requirements of the FtRG program.

25. The FtRG program area developed 18 questions closely aligned with, or reflective of, the program’s assessment criteria. This went some way towards implementing an assessment approach that reflected the criteria and sub-criteria[12] published in the program guidelines. However, for four of the assessment criteria, a quantitative assessment of one or more of the published sub-criteria could not be undertaken through the tool. As a result:

  • certain sub-criteria were not addressed by the questions used for scoring applications against the four categories, which meant that a number of assessment criteria were not fully assessed;
  • applications were not scored, ranked and reported according to the seven assessment criteria contained in the program guidelines, but on the basis of four scoring categories; and
  • the feedback that was provided to unsuccessful applicants was couched in terms of the four scoring categories rather than the published criteria.

Advice to the Minister, and funding decisions (Chapter 6)

26. While some aspects of the assessment process were not well documented by the department, the then Minister was provided with timely briefings on the outcomes of the two funding rounds that addressed those matters relevant to grants decision-making. The briefings included clear recommendations that the Minister should approve those applications assessed by the expert advisory panel as the most meritorious for grant funding. The only significant shortcoming in the departmental briefings was that they assured the Minister that the assessment process for each round had complied with the program guidelines. However, as indicated above, this was not the case. The Minister agreed to the funding recommendations he received from the department.

Funding distribution, feedback to applicants and implementation of funding deeds (Chapter 7)

27. The first and second rounds of the program resulted in funding being distributed in a way that was consistent with the program’s objective to draw upon industry, science and government sectors for practical research outputs.[13] There was also no evidence of any political bias in the approval of funding.

28. The provision of feedback to unsuccessful applicants on the merit assessment was generally sound in round one, although applicants whose applications had been assessed as ineligible were not informed of this situation (or the reasons). As indicated at paragraph 25, where detailed feedback was requested by unsuccessful applicants in round two, the feedback was not presented in terms of comparative performance against the assessment criteria contained in the program guidelines. This reflected the approach taken to the merit assessment process through the online assessment tool (see paragraphs 24 and 25).

29. The department adopted sound governance arrangements for the approved grants. Further, grant funding deeds were generally in accord with the terms approved by the then Minister and the expert panel’s advice. However, there were a number of errors and oversights in Agriculture’s reporting of grants.

Summary of agency response

30. The proposed audit report was provided to Agriculture and members of the expert advisory panel. Some adjustments were made to the report in response to suggestions made by the panel. The department provided formal comments on the proposed report and these are summarised below, with the full response included at Appendix 1:

The Department of Agriculture (department) welcomes the ANAO’s findings in relation to the design and delivery of the Filling the Research Gap program, including the establishment of comprehensive guidelines enabling open and competitive application processes for both rounds of the program.

Facilitated by the clear guidance provided to applicants, the department attracted a range of high quality research applications that addressed all of the program’s research priorities. Through these applications, the department has established a comprehensive suite of research projects that will enable it to meet the program’s objectives of identifying practical outcomes that Australia’s farmers can use to reduce agricultural greenhouse gas emissions while maintaining productivity.

The department acknowledges the overall findings of the audit report including the identification of some areas where it can make further improvements to its program design and delivery processes. The department is implementing changes to its Grants Management Manual to further emphasise the need to tailor each aspect of a program’s design and consider how constructive feedback can be provided to unsuccessful applicants.

The department agrees with each of the recommendations made in the audit report and is taking action to implement these recommendations as part of its current review of its grant administrative processes to ensure that they address the key principles and requirements outlined in the updated Commonwealth Grant Guidelines.

Recommendations

Set out below are the ANAO’s recommendations and the Department of Agriculture’s abbreviated responses. More detailed responses are shown in the body of the report immediately after each recommendation.

Recommendation No.1

Paragraph 2.74

To improve the conflict of interest management arrangements for competitive, merit-based grant programs, ANAO recommends that the Department of Agriculture:

(a) emphasise in its Grants Management Manual the importance of these arrangements being tailored to the circumstances of the particular granting activity; and

(b) when employing expert advisory panels to assist with the implementation of research grant programs, address in the program governance arrangements the potential conflicts of interest that arise from recent collaborations between applicants and panel members through publications and co-authorships.

Agriculture’s response: Agreed

Recommendation No.2

Paragraph 4.40

To improve the assessment of applications to competitive, merit-based grant programs, ANAO recommends that the Department of Agriculture:

(a) establish appropriate minimum scores that an application must achieve against each assessment criterion in order to progress in the assessment process as a possible candidate to be recommended for funding; and

(b) develop guidance for producing assessment reports that provide an accurate outline of the application and selection process that was followed, the results of this work and the basis for the recommendations that are made to the decision-maker.

Agriculture’s response: Agreed

Recommendation No.3

Paragraph 5.36

To improve the assessment of applications to competitive, merit-based grant programs, ANAO recommends that the Department of Agriculture properly test and authorise any information technology-based system to support assessments before its introduction.

Agriculture’s response: Agreed

Recommendation No.4

Paragraph 7.66

To improve the feedback it provides to unsuccessful grant program applicants, ANAO recommends that the Department of Agriculture clearly outline:

(a) whether the application(s) had been assessed as ineligible and, if so, the reasons for this; and

(b) for applications that proceeded to merit assessment, the relative performance of their application(s) against the published assessment criteria.

Agriculture’s response: Agreed

Footnotes

[1] This included $30.5 million for the Australian Bureau of Statistics to undertake a biennial land management practice survey.

[2] Such tailoring was needed given the specialised nature of the program’s research and the breadth of the panel members’ own engagement in this research. Because the arrangements were not tailored, panel members were not required to declare past authorship collaborations with applicants.

[3] In round one, applications assessed as ineligible were not informed of this situation. In round two, feedback to unsuccessful applicants was not presented in terms of comparative performance against the assessment criteria.

[4] Similarly, the grants administration framework was developed based, in part, on recognition that potential applicants and other stakeholders have a right to expect that program funding decisions will be made in a manner, and on a basis, consistent with the published program guidelines.

[5] This approach was adopted notwithstanding that, consistent with the Commonwealth Grant Guidelines (CGGs), Agriculture’s Grants Management Manual highlights that it is important in the planning and design phase of the program to draft a monitoring and evaluation strategy as the elements of the strategy will feed into the whole grant and program administration processes and the development of key documents.

[6] The 2011–12 Annual Report, the only report tabled to date, outlined that: ‘Most Land Sector Package measures will commence in 2012–13. As these programs are still under development their performance indicators have not yet been fully defined. During the course of 2012–13 performance indicators for each of the measures will be developed and agreed and reported on in the 2012–13 Annual Report.’

[7] Of the 234 applications received in round one, 222 (or 95 per cent) were assessed as eligible to proceed to the merit assessment stage. In round two, none of the 237 applications were assessed as ineligible, but one application was rejected because it was late.

[8] Three research project applications did not detail in-kind contributions, but were not considered ineligible applications by the department (none of these applications were awarded funding). The application form for the nine coordination project applications did not provide applicants with the opportunity to include a proposed budget, including in-kind contributions (one of these applications was awarded funding).

[9] While none of the 58 recommended and approved applications scored less than six out of 10 against the first five assessment criteria, eight of those applications (14 per cent) scored between 3.2 and 4.8 out of 10 against the sixth criterion (‘identifies key risks and mechanisms to treat these risks’).

[10] Early feedback from some panel members about challenges in scoring this criterion from the budget information provided by applicants led to assessors giving a yes or no assessment response, rather than scoring applications out of 10 against the criterion.

[11] The only circumstance under which this departure could not have had an effect on the relative ranking of eligible applications would be in the unlikely situation that all 222 eligible applications had submitted an application that was equally meritorious in terms of demonstrating how the grant funding they had budgeted would contribute directly to the program objectives; the extent to which the value for money of their project was enhanced through cash or in-kind contributions; and the level of contribution evident in letters of support from co-contributors and/or consortium members.

[12] For each criterion, the published guidelines had stated that to rank highly against the criterion an application would need to demonstrate certain listed characteristics (using the phrase ‘applications that will rank highly against this criterion will…’). In this report, ANAO has described those listed points under each criterion as sub-criteria.

[13] Overall, universities accounted for 45 per cent of the total number of grants awarded in rounds one and two, followed by the CSIRO which accounted for 26 per cent of grants. State government departments/agencies and industry/other applicants each accounted for a similar share of grants—almost 15 per cent in each case.