The objective of the audit was to assess the Department of Social Services’ (Social Services) implementation and evaluation of the Cashless Debit Card trial.

Summary and recommendations

Background

1. Welfare quarantining, in the form of income management, was first introduced in 2007 as part of the Australian Government’s Northern Territory National Emergency Response.1 The aim of income management is to assist income support recipients to manage their fortnightly payments — such as Newstart/Youth Allowance, parenting or carer payments, and the Disability Support Pension — for essentials like food, rent and bills.2

2. On 1 December 2014, the Government agreed to trial a new approach to income management — the Cashless Debit Card (CDC), in Ceduna and the East Kimberley. The Cashless Debit Card Trial (CDCT or the trial) aimed to: test whether social harm caused by alcohol, gambling and drug misuse can be reduced by placing a portion (up to 80 per cent) of a participant’s income support payment onto a card that cannot be used to buy alcohol or gambling products or to withdraw cash; and inform the development of a lower cost welfare quarantining solution to replace current income management arrangements.

3. On 14 March 2017, the Minister for Human Services and the Minister for Social Services announced the extension of the trial in Ceduna and the East Kimberley for a further 12 months. In addition, funding was allocated as part of the 2017–18 Budget to trial the CDC in two new locations, with the Government announcing in September 2017 that the CDC would be delivered to the Goldfields region of Western Australia and to the Hinkler Electorate (Bundaberg and Hervey Bay Region) in Queensland.3 Subsequently, the Social Services Legislation Amendment (Cashless Debit Card) Act 2018 received royal assent on 20 February 2018. The amendments restricted the expansion of the CDC, with the cashless welfare arrangements continuing to 30 June 2019 in the current trial areas of the East Kimberley and Ceduna, and one new trial site in the Goldfields.

Rationale for undertaking the audit

4. Recent ANAO audits have highlighted the need for entities to articulate mechanisms to determine whether an innovation is successful and what can be learned to inform decision making regarding scaling up the implementation of that innovation. The CDCT was selected for audit to identify whether the Department of Social Services (Social Services) was well placed to inform any further roll-out of the CDC with a robust evidence base. Further, the audit aimed to provide assurance that Social Services had established a solid foundation to implement the trial including: consultation and communication with the communities involved; governance arrangements; the management of risks; and robust procurement arrangements.

Audit objective and criteria

5. The objective of the audit was to assess the Department of Social Services’ implementation and evaluation of the Cashless Debit Card Trial.

6. To form a conclusion against the audit objective, the ANAO adopted the following high level audit criteria:

  • Appropriate arrangements were established to support the implementation of the Cashless Debit Card Trial.
  • The performance of the Cashless Debit Card Trial was adequately monitored, evaluated and reported on, including to the Minister for Social Services.

Audit methodology

7. The audit methodology included:

  • examining and analysing documentation relating to the implementation, risk management, monitoring and evaluation for the Cashless Debit Card Trial; and
  • interviews with key officials in the departments of Social Services and Prime Minister and Cabinet and with external stakeholders including Indue Limited (Indue), ORIMA Research (ORIMA), Community Leaders, Local Partners and others in the trial sites.

Conclusion

8. The Department of Social Services largely established appropriate arrangements to implement the Cashless Debit Card Trial; however, its approach to monitoring and evaluation was inadequate. As a consequence, it is difficult to conclude whether there was a reduction in social harm and whether the card was a lower cost welfare quarantining approach.

9. Social Services established appropriate arrangements for consultation, communicating with communities and for governance of the implementation of the CDCT. Social Services was responsive to operational issues as they arose during the trial. However, it did not actively monitor risks identified in risk plans and there were deficiencies in elements of the procurement processes.

10. Arrangements to monitor and evaluate the trial were in place although key activities were not undertaken or fully effective, and the level of unrestricted cash available in the community was not effectively monitored. Social Services established relevant and mostly reliable key performance indicators, but they did not cover some operational aspects of the trial such as efficiency, including cost. There was a lack of robustness in data collection and the department’s evaluation did not make use of all available administrative data to measure the impact of the trial including any change in social harm. Aspects of the proposed wider roll-out of the CDC were informed by learnings from the trial, but the trial was not designed to test the scalability of the CDC and there was no plan in place to undertake further evaluation.

Supporting findings

Implementation of the Cashless Debit Card Trial

11. Social Services conducted an extensive consultation process with industry and stakeholders in the trial sites. A communication strategy was developed and implemented that was largely effective, although Social Services identified areas for improvement for future rollouts.

12. There were appropriate governance arrangements in place with clearly defined roles and responsibilities across key departments and stakeholders for reporting and oversight of the CDCT.

13. Social Services demonstrated an integrated approach to risk management across the department linking enterprise, program and site-specific risk plans. While a CDCT program risk register was developed, the identified risks were not actively managed, some risks were not rated in accordance with the Risk Management Framework, there was inadequate reporting of risks and some key risks were not adequately addressed by the controls or treatments identified. In particular, treatments were inadequate to address evaluation data and methodology risks that were ultimately realised. Social Services managed and effectively addressed operational issues as they arose.

14. Aspects of the procurement process to engage the card provider and evaluator were not robust. The department did not document a value for money assessment for the card provider’s IT build tender or assess all evaluators’ tenders completely and consistently.

15. Social Services effectively established or facilitated arrangements to deliver local support to CDCT communities, although there were delays in the deployment of additional support services. As part of the CDCT, Social Services also trialled Community Panels and reviewed their effectiveness to inform broader implementation.

Performance monitoring, evaluation and reporting

16. A strategy to monitor and analyse the CDCT was developed and approved by the Minister. However, Social Services did not complete all the activities identified in the strategy (including the cost-benefit analysis) and did not undertake a post-implementation review of the CDCT despite its own guidance and its advice to the Minister that it would do so. There was scope for Social Services to more closely monitor vulnerable participants who may engage in social harm, and their access to cash.

17. Key performance indicators (KPIs) developed to measure the performance of the trial were relevant and mostly reliable, but not complete, because they focused only on evaluating the effectiveness of the trial based on its outcomes and did not cover the operational and efficiency aspects of the trial. There was no review of the KPIs during the trial and KPIs have not been established for the extension of the CDC.

18. Social Services developed high level guidance to support its approach to evaluation, but the guidance was not fully operationalised. Social Services did not build evaluation into the CDCT design, nor did it collaborate and coordinate data collection to ensure an adequate baseline to measure the impact of the trial, including any change in social harm.

19. Social Services regularly reported on aspects of the performance of the CDCT to the Minister but the evidence base supporting some of its advice was lacking. Social Services advised the Minister, after the conclusion of the 12 month trial, that ORIMA’s costs were greater than originally contracted and ORIMA did not use all relevant data to measure the impact of the trial, despite this being part of the agreed Evaluation Framework.

20. Social Services undertook a review and reported to the Minister on a number of key lessons learned from the 12 month trial of the CDC. Learnings about the effectiveness of the Community Panels were based on the number of applications received and delays in decision making, rather than on the evaluation findings, which noted a delay in the establishment of the Community Panels and a lack of communication with participants. The 12 month trial did not test the scalability of the CDC but tested a limited number of policy parameters identified in the development of the CDC. Many of the findings from the trial were specific to the cohort (predominantly Indigenous) and remote location, and there was no plan in place to continue to evaluate the CDC to test its roll-out in other settings.

Recommendations

Recommendation no.1

Paragraph 2.20

Social Services should confirm risks are rated according to its Risk Management Framework and ensure mitigation strategies and treatments are appropriate and regularly reviewed.

Department of Social Services’ response: Agreed.

Recommendation no.2

Paragraph 2.31

Social Services should employ appropriate contract management practices to ensure service level agreements and contract requirements are reviewed on a timely basis.

Department of Social Services’ response: Agreed.

Recommendation no.3

Paragraph 2.36

Social Services should ensure a consistent and transparent approach when assessing tenders and fully document its decisions.

Department of Social Services’ response: Agreed.

Recommendation no.4

Paragraph 3.14

Social Services should undertake a cost-benefit analysis and a post-implementation review of the trial to inform the extension and further roll-out of the CDC.

Department of Social Services’ response: Agreed.

Recommendation no.5

Paragraph 3.47

Social Services should fully utilise all available data to measure performance, review its arrangements for monitoring, evaluation and collaboration between its evaluation and line areas, and build evaluation capability within the department to facilitate the effective review of evaluation methodology and the development of performance indicators.

Department of Social Services’ response: Agreed.

Recommendation no.6

Paragraph 3.69

Social Services should continue to monitor and evaluate the extension of the Cashless Debit Card in Ceduna, East Kimberley and any future locations to inform design and implementation.

Department of Social Services’ response: Agreed.

Summary of the Department of Social Services’ response

21. The Department of Social Services was provided with a copy of the proposed audit report for comment. A summary of the department’s response is below and the full response is at Appendix 1.

The Department of Social Services (the department) welcomes the ANAO’s conclusions and agrees with the six recommendations. The department notes there are a number of areas it needs to focus on including in relation to Risk Management, contract management arrangements and utilising available data to measure performance.

22. An extract of the proposed report was provided to the organisations mentioned in the proposed report: Indue Limited; ORIMA Research; Boston Consulting Group; Maddocks; Clayton Utz; Deloitte Access Economics; and Colmar Brunton.

Key learnings for all Australian Government entities

Below is a summary of key learnings and areas for improvement identified in this audit report that may be considered by other Australian Government entities when trialling new initiatives and designing, implementing and evaluating programs.

Performance and impact measurement

1. Background

Introduction

1.1 Welfare quarantining, in the form of income management, was first introduced in 2007 as part of the Australian Government’s Northern Territory National Emergency Response.4 The aim of income management is to assist income support recipients to manage their fortnightly payments — such as Newstart/Youth Allowance, parenting or carer payments, and the Disability Support Pension — for essentials like food, rent and bills.5

1.2 Income management is targeted towards specific groups, ‘based upon their higher risks of social isolation and disengagement, poor financial literacy, and participation in risky behaviours’.6 Subsequent to its introduction in the Northern Territory, income management has been implemented in other locations in Australia, including the East Kimberley (2008) and Ceduna (2014) with more than 25 000 people subject to income management as at 29 December 2017.

The Cashless Debit Card Trial

1.3 On 1 December 2014, the Government agreed to trial a new approach to income management — the Cashless Debit Card (CDC).7 The Cashless Debit Card Trial (CDCT or the trial) aimed to:

  • test whether social harm caused by alcohol, gambling and drug misuse can be reduced by placing a portion of a participant’s income support payment onto a card that cannot be used to buy alcohol or gambling products (such as poker machines, but not including lottery tickets), or to withdraw cash; and
  • inform the development of a lower cost welfare quarantining solution to replace current income management arrangements.

1.4 The card’s design was informed by existing income management trials and the Forrest Review: Creating Parity8 report which recommended the introduction of a Healthy Welfare Card to support welfare recipients to: manage their income and expenses; save for large purchases; and invest welfare income in a healthy life.9

1.5 The parameters that were tested during the CDCT included: the delivery of welfare quarantining by restricting 80 per cent of income support payments into a commercially provided debit card account; the use of contracted Local Partners; and establishing Community Panels that were authorised to make some administrative decisions.10 The CDCT was co-designed, with input from community leaders and the wider community during its development.

1.6 The CDCT was conducted in two locations (see Figure 1.1) for 12 months, commencing on:

  • 15 March 2016 in Ceduna and surrounding areas — Koonibba, Oak Valley, Penong, Scotdesco, Smoky Bay, Thevenard and Yalata — in South Australia; and
  • 26 April 2016 in the East Kimberley region — Kununurra and Wyndham — in Western Australia.

Figure 1.1: Cashless Debit Card Trial locations

A map of Australia showing the Cashless Debit Card trial locations of Wyndham, Kununurra and Ceduna.

Source: ORIMA Research documentation.

1.7 The two sites were selected for the CDCT following consultation with, and support from, the identified community leaders. Income management measures had been in place at both sites prior to implementation of the CDCT, with the majority of those on income management at these sites being voluntary participants. Key aspects of the existing income management model and the CDCT model are compared in Appendix 3.

1.8 The trial sites had populations of 4110 and 5139 in Ceduna and the East Kimberley respectively and both were geographically remote.11 Within these populations, there were 783 CDCT participants in Ceduna and 1320 CDCT participants in the East Kimberley as at 31 March 2017. Around 30 per cent of the population in the trial sites identified as Indigenous Australians (compared to fewer than three per cent of the total population) with Indigenous participants making up more than three-quarters of all CDCT participants (77 per cent of participants in Ceduna and 81 per cent of participants in the East Kimberley).

1.9 The CDCT was administered by the Department of Social Services (Social Services) with support from the Department of the Prime Minister and Cabinet (PM&C) and the Department of Human Services (Human Services). In addition, Social Services contracted:

  • Indue Limited (Indue) to deliver the IT build, Cashless Debit Card, banking services and local support for the CDCT; and
  • ORIMA Research (ORIMA) to undertake an evaluation of the CDCT. ORIMA’s Wave 1 Interim Evaluation Report was completed in February 2017 and the Cashless Debit Card Trial Evaluation: Final Evaluation Report (CDCT final evaluation report) was completed in August 2017, and released to the public in March 2017 and September 2017 respectively.

1.10 A timeline of key events in the development and implementation of the CDCT is at Figure 1.2.

Figure 1.2: Timeline of key events

A diagram of key events for the implementation and delivery of the Cashless Debit Card:

  • 1 August 2014: Creating Parity (Forrest Review) is released, recommending the implementation of a Healthy Welfare Card
  • 1 December 2014: Government agrees to trial a new approach to income management
  • 19 December 2014: Approval given to procure consultancy and probity advisors to advise on the new Healthy Welfare Card
  • 22 January 2015: Inaugural meeting of the New Welfare Card Steering Committee
  • 16 March 2015: Ministerial approval provided for direct tender approach to Indue
  • April–May 2015: Parliamentary Secretary to the Prime Minister travels to potential trial communities for consultations with leaders
  • 4 August 2015: Ceduna and surrounds announced as the first trial site; MoU signed between Ceduna Leaders and the Commonwealth
  • 8 August 2015: Announcement of support services package for Ceduna
  • 18 August 2015: Social Security Legislation Amendment (Debit Card Trial) Bill 2015 introduced to Parliament
  • 31 August 2015: IT Build contract signed with Indue
  • 12 November 2015: Social Security Legislation Amendment (Debit Card Trial) Act 2015 receives Royal Assent
  • 16 November 2015: Announcement of the East Kimberley as the second trial site and announcement of support services package
  • 1 December 2015: Monitoring and Evaluation Strategy submitted to Assistant Minister for approval
  • 22 February 2016: ORIMA Research commissioned to perform the independent evaluation
  • 15 March 2016: Trial commences in Ceduna and surrounds
  • 18 March 2016: Ceduna Community Panel first meeting
  • 4 April 2016: Operational contract signed with Indue
  • 21 April–26 May 2016: Initial conditions evaluation fieldwork undertaken by ORIMA
  • 26 April 2016: Trial commences in the East Kimberley
  • April–May 2016: Cards rolled out in Ceduna and surrounds
  • June 2016: Cards rolled out in the East Kimberley

Source: ANAO analysis of Social Services’ documentation.

Cost of the Cashless Debit Card Trial

1.11 The total cost of the 12 month trial, including implementation costs, was approximately $18.3 million. A breakdown of costs associated with the delivery of the CDCT is outlined in Table 1.1.

Table 1.1: Cost of the Cashless Debit Card Triala

                        2015–16    2016–17    2017–18      Total
                           $m         $m         $m           $m
Human Services          3.427      1.273      0            4.700
Social Services         0.825      1.785      0            2.610
Indue                   5.823      3.063      0            8.886
Other contractors       0.907      1.049      0.149        2.105
Totalb                 10.982      7.170      0.149       18.301

Note a: Costs are calculated on an accrual basis, that is, costs are calculated according to when the service was received, not when payment was made.

Note b: CDC-related costs excluded from this total are the support services provided to both trial sites ($1.0 million for Ceduna and $1.6 million for the East Kimberley region) and $240,000 paid to Boston Consulting Group, which was contracted to provide advice on the initial development of the CDCT.

Source: ANAO analysis of Social Services’ documentation.

1.12 On 14 March 2017, the Minister for Human Services and the Minister for Social Services announced the extension of the trial in both sites for a further 12 months. In addition, funding was allocated as part of the 2017–18 Budget to trial the CDC in two new locations, with the Government announcing in September 2017 that the CDC would be delivered to the Goldfields region of Western Australia and to the Hinkler Electorate (Bundaberg and Hervey Bay Region) in Queensland.12 Subsequently, the Social Services Legislation Amendment (Cashless Debit Card) Act 2018 received royal assent on 20 February 2018. The amendments restricted the expansion of the CDC, with the cashless welfare arrangements continuing to 30 June 2019 in the current trial areas of the East Kimberley and Ceduna, and one new trial site in the Goldfields.

Rationale for undertaking the audit

1.13 Recent ANAO audits13 have highlighted the need for entities to articulate mechanisms to determine whether an innovation is successful and what can be learned to inform decision making regarding scaling up the implementation of that innovation. The CDCT was selected for audit to identify whether the Department of Social Services (Social Services) was well placed to inform any further roll-out of the CDC with a robust evidence base. Further, the audit aimed to provide assurance that Social Services had established a solid foundation to implement the trial including: consultation and communication with the communities involved; governance arrangements; the management of risks; and robust procurement arrangements.

Audit approach

Audit objective, criteria and scope

1.14 The objective of the audit was to assess the Department of Social Services’ implementation and evaluation of the CDCT.

1.15 To form a conclusion against the audit objective, the ANAO adopted the following high level audit criteria:

  • Appropriate arrangements were established to support the implementation of the CDCT.
  • The performance of the CDCT was adequately monitored, evaluated and reported on, including to the Minister for Social Services.

Audit methodology

1.16 The audit methodology included:

  • examining and analysing documentation relating to the implementation, risk management, monitoring and evaluation for the CDCT; and
  • interviews with key officials in the departments of Social Services and Prime Minister and Cabinet and with external stakeholders including Indue, ORIMA, Community Leaders, Local Partners14 and others in the trial sites.

1.17 The audit was conducted in accordance with ANAO Auditing Standards at a cost to the ANAO of approximately $483,000.

1.18 Team members for this audit were Sandra Dandie, Judy Jensen, Michael Jones and Sally Ramsey.

2. Implementation of the Cashless Debit Card Trial

Areas examined

This chapter examined whether Social Services established appropriate arrangements to support the implementation of the Cashless Debit Card Trial (CDCT) including for: consultation and communication; governance; risk management; procurement; and supporting local communities.

Conclusion

Social Services established appropriate arrangements for consultation, communicating with communities and for governance of the implementation of the CDCT. Social Services was responsive to operational issues as they arose during the trial. However, it did not actively monitor risks identified in risk plans and there were deficiencies in elements of the procurement processes.

Areas for improvement

The ANAO made three recommendations aimed at improving risk management practices, contract management and procurement processes.

Was there appropriate consultation and an effective communication strategy implemented?

Social Services conducted an extensive consultation process with industry and stakeholders in the trial sites. A communication strategy was developed and implemented that was largely effective, although Social Services identified areas for improvement for future rollouts.

2.1 Social Services began communications planning well in advance of the Cashless Debit Card Trial (CDCT) commencement. Consultations with the banking and retail sectors began in late 2014, and initial implementation plans were developed and approved in January 2015. In December 2015, an Overarching Communication Strategy was prepared by Social Services in consultation with the Department of the Prime Minister and Cabinet (PM&C).

2.2 There was considerable consultation and engagement with stakeholders15, including with Indigenous community leaders, prior to the commencement of the CDCT. Community leaders in the Ceduna area and the East Kimberley supported trialling the Cashless Debit Card (CDC) in their communities and were actively involved in many CDCT decisions, including the decision to set the quarantined portion of a participant’s income support at 80 per cent.

2.3 To support the implementation of the trial, Social Services documented timelines and activities in site specific implementation plans including advertising campaigns, the distribution of communication materials and ‘engaging the community’ activities such as workshops and BBQs in the months leading up to the trial starting.

2.4 The card provider, Indue Limited (Indue), had representatives based in the trial site locations for four weeks at the start of the trial, supplemented by officials from Social Services and PM&C for the trial period. Local Partners16 and Financial Wellbeing and Capability program providers (for example, Centacare in Ceduna and Wunan in the East Kimberley) provided participants with ongoing support, information and assistance, including, for example, to activate and set up their online accounts. Commencement of the CDCT was supported by advertising — print, radio (in language and in English) and social media channels — as well as communications through factsheets, posters, Question & Answer sheets, letters to trial participants, info-graphic posters, and local media engagement.

2.5 Despite these efforts, Social Services acknowledged that communications to participants could be improved for future rollouts. Local Partners advised the ANAO that participants required more face-to-face information and support from knowledgeable staff, as many had not attended the public meetings or asked questions, and there was some confusion and misinformation among participants.

Were there appropriate governance arrangements in place with clearly defined roles and responsibilities to oversight the Cashless Debit Card Trial?

There were appropriate governance arrangements in place with clearly defined roles and responsibilities across key departments and stakeholders for reporting and oversight of the CDCT.

2.6 Management of the delivery of the CDCT was supported by appropriate governance arrangements that clearly outlined the roles and responsibilities of Australian Government departments — Social Services, PM&C and the Department of Human Services (Human Services). The governance groups, as they evolved, are depicted in Figure 2.1.

2.7 Social Services was responsible for coordinating governance arrangements for the CDCT, and for CDC policy, administration and delivery of the trial. Social Services’ responsibilities involved: implementation oversight; community and stakeholder engagement strategy; the CDC Hotline17; trial logistics; risk and issue management; card provider procurement and contract management; and monitoring and evaluation of the trial. Social Services’ state offices were responsible for managing agreements with relevant support services, and supporting engagement and data requests with state governments and other organisations.

2.8 Delivery of the CDCT was supported by PM&C’s National Office and Regional Network Offices in Ceduna and the East Kimberley. In particular, PM&C provided on-the-ground support with community and leader engagement, and managing and providing data on its funded support services.

Figure 2.1: Governance arrangements underpinning delivery of the Cashless Debit Card Trial

A diagram showing the governance arrangements for the delivery of the Cashless Debit Card.

Note a: The New Welfare Card (NWC) Steering Committee provided high-level policy and implementation advice for planning and the co-design process. They were responsible for agreeing the rolling brief prior to it being submitted to Ministers, oversighting the risk and issues registers, and monitoring project progress.

Note b: The New Welfare Card Inter-Departmental Committee (NWC IDC) comprised Deputy Secretaries from Social Services, Human Services and PM&C and provided strategic direction, including overseeing risk management arrangements.

Note c: The New Welfare Card Working Group members were executive level staff with the working group providing a forum for information sharing, and discussing risks and issues. The Working Group would escalate issues to the NWC IDC where necessary.

Note d: The Steering Committee provided guidance and oversight; and had a key role in discussing policy, legal aspects, administration, budget, and risk management.

Note e: The Project Board had responsibility for coordinating, disseminating, tasking, tracking and monitoring implementation progress, and guiding cross-departmental issues. This body merged with the Steering Committee in June 2016 and retained the name, Project Board.

Source: ANAO analysis of Social Services’ documentation, including terms of reference documents.

2.9 Human Services was responsible for providing technological advice and support, including: administration of payments; placing eligible participants onto the trial; exemption processing; applying quarantined percentages; delivery of a customer service centre18; and providing support to participants regarding their payments.

2.10 Social Services built on its existing high level arrangements with PM&C and Human Services regarding interdepartmental relationships and responsibilities to provide decision making and oversight across the CDCT, with roles and responsibilities outlined and updated when required. The following governance arrangements were also established to support the CDCT:

  • interdepartmental steering committees and boards;
  • project meetings held between Social Services and Human Services to monitor trial implementation, issue management and service delivery;
  • local governance arrangements (as outlined in Figure 2.1) and the Community Panels; and
  • an Evaluation Steering Committee.

2.11 The operation of these governance arrangements is discussed further in this chapter and in Chapter 3.

Were there fit-for-purpose risk management arrangements in place and were they actively managed?

Social Services demonstrated an integrated approach to risk management across the department linking enterprise, program and site-specific risk plans. While a CDCT program risk register was developed, the identified risks were not actively managed, some risks were not rated in accordance with the Risk Management Framework, there was inadequate reporting of risks and some key risks were not adequately addressed by the controls or treatments identified. In particular, treatments were inadequate to address evaluation data and methodology risks that were ultimately realised. Social Services managed and effectively addressed operational issues as they arose.

2.12 The Commonwealth Risk Management Policy defines risk as ‘the effect of uncertainty on objectives’ and risk management as the ‘coordinated activities to direct and control an organisation with regard to risk’.19 Social Services’ risk management practices were supported by a centralised area that provided guidance and a framework to support enterprise, group and program level risk plans. Risks were identified and documented in a register for the CDCT from early 2015. In June 2015, the risk register was circulated for feedback to PM&C, Human Services, and the Departments of Finance, Communications and the Treasury, and discussed at the New Welfare Card (NWC) Working Group (previously the NWC Steering Committee). Pre-trial issues and decisions were logged and this formed the basis of the issues register maintained by Social Services, covering policy and operational issues, and incident management.

Risk management

2.13 Responsibility for managing the risk register was not visible as a task within the CDCT team structure.20 Risk registers were updated with additional risks during the trial period, although for the risks previously identified the descriptions, status, controls and treatments were not updated and information was not complete. For example, at the end of the trial the risk of ‘legislation doesn’t pass’, with a treatment of ‘review…August 2015’ had not been updated and was still open — this risk should have been closed in November 2015.21

2.14 All interdepartmental Steering Committees included risk management in their terms of reference. The NWC Steering Committee initially received a summary of risks in a ‘rolling brief’ report. The final rolling brief was presented in February 2015 and there was no systematic reporting of risks to the subsequently formed Project Board to support them to oversee risk management during the trial period.

2.15 Risks were reported internally to the Executive Management Group in the Project Status Reports throughout the trial period. However, this reporting was based on risk registers that were incomplete. Some of the risks being reported on were not identified in the risk register, and Social Services’ Risk Management Framework for rating risks had not been correctly applied to the risks,22 resulting in high risks not being flagged. Six risks were rated as medium whereas the risk matrix indicates they should have been rated high. The risks rated as medium instead of high (as per the ‘Risk Register December 2016–February 2017’ used for the ‘Project Status Report December 2016–February 2017’, which reported that there were no high rated risks) were:

  • critical incident by a trial participant attributed to the trial;
  • circumventions by trial participants;
  • legislation not passing;
  • Human Services unable to complete the IT build in time;
  • trial policy in relation to credit cards leads to: participants with credit card debt being unable to make re-payments; and participants being restricted from getting a credit card from mainstream banking institutions due to the trial restrictions; and
  • aged pensioners being harassed for cash.

2.16 Controls did not fully address the identified risks, and treatments were also inadequate. For example, the risk that ‘trial objectives and impacts are not measured through effective evaluation process’ was identified as a risk in the risk register and a separate risk register was drafted for the evaluation of the trial in October 2015.

2.17 The November 2015 version of the evaluation risk register included the risks of ‘Data provided by state governments not sufficient to analyse appropriate variables’ and ‘Method and analytic approach not appropriate for purpose’. Risks were assigned to the Evaluation Unit and/or the Welfare Debit Card Taskforce, but were not updated subsequently. Further, there were errors in applying risk ratings in the evaluation risk register, the treatments were not complete (predominantly ‘ongoing review’), and there were no further updates or apparent active management of the identified risks. The lack of active management and ineffective treatments contributed to the data and methodology issues encountered with the evaluation process (discussed further in Chapter 3).

2.18 Under the contract, Indue was required to prepare a risk register and provide Social Services with an extract of all risks specific to the CDCT by 15 April 2016, but did not do so during the trial period.23 Social Services was not formally monitoring Indue-related risks, reporting them to the Executive, or ensuring post-incident treatments were reflected in its register. For example, additional controls identified after a data incident, including further treatments and the likelihood of recurrence, were not reflected in Social Services’ risk register, and the incident was not included in the Project Status Report to the Executive. The incident itself was appropriately handled by Indue and Social Services and documented as per the incident management process and in the issues register.

Issue management

2.19 Issues24 were regularly reported to the Steering Committees. Responsibility for issue management was clearly assigned within the CDCT team. Issues were documented in the issues register and in the Joint Issues Register (jointly managed with Indue), regularly updated, and effectively managed throughout the trial period.

Recommendation no.1

2.20 Social Services should confirm risks are rated according to its Risk Management Framework and ensure mitigation strategies and treatments are appropriate and regularly reviewed.

Department of Social Services’ response: Agreed.

2.21 The department agrees with the recommendation and will improve its application of the Risk Management Framework.

Were procurement processes to engage a card provider and evaluator robust?

Aspects of the procurement process to engage the card provider and evaluator were not robust. The department did not document a value for money assessment for the card provider’s IT build tender or assess all evaluators’ tenders completely and consistently.

Procurement of a card provider

2.22 In early 2015, Social Services discussed the proposed CDC with representatives from the four major banks (Westpac, National Australia Bank, Australia and New Zealand Banking Group Limited, and Commonwealth Bank) and the BasicsCard provider, Indue. Written submissions provided by the banks indicated that a rules-based approach, including merchant code blocking (as opposed to a technology-based approach25), was possible, and there was a mixed response regarding interest in participating in the trial.

2.23 In early March 2015, Social Services undertook an internal desktop review of 18 financial institutions to rate potential card providers for the trial. Only four of these were Authorised Deposit-taking Institutions (ADIs). Social Services advised the ANAO that the major banks ‘… were not interested in delivering a small scale trial of the nature of the CDC’. The institutions selected for review were rated against nine selection criteria. The criteria included whether the institution was an ADI and had previous experience with delivering welfare quarantining programs.26 Social Services did not contact the institutions to verify information, capabilities or interest in the trial.

2.24 Through this desktop review, Indue was identified as the preferred provider as it was the only institution to meet all the criteria. Subsequently:

  • Ministerial approval to directly source Indue27 was sought and granted on 16 March 2015; and
  • Social Services and PM&C began consultations with lndue on 19 March 2015.

2.25 To assist it with the procurement process, Social Services contracted:

  • Boston Consulting Group for commercial and technical advice, including preliminary estimates of the expected total costs of the IT build and operation, which were consistent with Indue’s initial quote (around five million dollars compared to the final cost of nine million dollars), and analysis to assist with the initial assessment of Indue’s tender;
  • Maddocks as the probity advisor28; and
  • Clayton Utz for legal advice.

2.26 Indue’s tender was evaluated in accordance with departmental procurement processes with the evaluation undertaken by a six member panel comprised of officials from Social Services, PM&C, Human Services and the Department of Finance. It was noted in the Evaluation Report that a value for money assessment (as required by the Commonwealth Procurement Rules) would be conducted prior to finalisation of the contract. Social Services subsequently issued: two contracts, separating the IT build and the Implementation and Operational phases; and a letter of agreement.29

2.27 The approval for the:

  • IT build contract with Indue was not supported by a documented value for money assessment, although it referenced the cost modelling work performed by Boston Consulting Group30; and
  • Implementation and Operational phase contract31 with Indue did encompass a value for money assessment, including analysis of pricing components and comparisons with industry standards.

Contract management

2.28 Social Services did not develop a Contract Management Plan to assist in the management of the Indue service agreement until after the trial period ended, despite its departmental contract management guidance and the recommendation of its legal adviser.

2.29 The contracts with Indue did specify roles and responsibilities including: the IT build and the delivery of local support (including its CDCT specific Customer Service Centre32 and Local Partners); arrangements with merchants33; Indue’s involvement in steering committee meetings, progress reporting requirements, and risk and incident management procedures. In addition, the contract required Indue to regularly provide two reports to Social Services, the:

  • Trial Outcomes Monthly Report34 (one for each trial site); and
  • Trial Outcomes Service Level Agreements (SLA) report.35 Under the SLA, Indue was required to report against a number of key performance indicators (KPIs) on a weekly, then monthly basis.

2.30 Social Services did not have formal monitoring arrangements for the performance measures in the SLA at trial commencement. Discussions with Indue to verify invoice calculations and SLA measurements relating to contract performance were not held until November 2016, seven months after the trial commenced in Ceduna. The delays in paying invoices were inconsistent with Social Services’ Financial Instructions.36

Recommendation no.2

2.31 Social Services should employ appropriate contract management practices to ensure service level agreements and contract requirements are reviewed on a timely basis.

Department of Social Services’ response: Agreed.

2.32 The department agrees with the recommendation and will work to improve its contract management practices.

Procurement of an evaluator

2.33 The Monitoring and Evaluation Strategy, developed 1 December 2015, outlined the monitoring and evaluation objectives, high level approach and the projects that would be collated to create a final consolidated review of the program. The strategy outlined the intention to externally commission the ‘Independent Community Change Evaluation’. Subsequently, Social Services conducted a procurement process to identify an evaluator with expertise in the collection and analysis of primary survey data.

2.34 Social Services developed a procurement plan and identified a scoring approach, with non-weighted criteria (capability and capacity; past performance and current work; risk management; methodology; and financial), to assess and select the successful evaluator. The procurement plan was approved by a Social Services Senior Executive on 13 November 2015 and tenders were sought from five members of the Social Policy Research and Evaluation Panel.37 The evaluation of the tenders and contracting of the evaluator were scheduled to be finalised in December 2015, with the trial due to commence in Ceduna in March 2016.

2.35 Table 2.1 shows Social Services’ scoring of the three tenders it received. Social Services’ evaluation of the tenders was incomplete and inconsistent with its own documented procurement evaluation process, method and report: the assessment of cost/price (for each ‘… cost element’) was not finalised for one of the tenders as per Social Services’ evaluation score sheet.38 Further, Social Services applied an inconsistent approach to evaluating the tenders, including by linking Criteria 4 and 5 in scoring the successful tender but not the unsuccessful tenders.39 Social Services’ Evaluation Score Sheet template for assessing tenders notes that ‘In assessing Capability the nominated referees must be contacted to confirm experience, competence and capability’, and its Tender Assessment Rating/Scoring Scale template indicates each score is based on the tender and a response from referees. Referees were not contacted for one of the tenders. The price of the successful tender (ORIMA) was around double that of the other tenders; Social Services subsequently negotiated with ORIMA to reduce the final tender price.40

Table 2.1: Scoring of tenders submitted to undertake the evaluation

 

Tenderer                     C1    C2    C3    C4    C5 (Financial)    Total
ORIMA Research                8     8     7     8     6 ($922,592)      37
Colmar Brunton                8     7     6     7     7 ($435,677)      35
Deloitte Access Economics     6     7     6     4     7 ($443,658)      30

Criteria: C1 = Capability and capacity; C2 = Past performance and current work; C3 = Risk management; C4 = Methodology; C5 = Financial.

Source: ANAO analysis of Social Services’ documentation.
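The non-weighted scoring approach behind Table 2.1 amounts to a simple unweighted sum of the five criterion scores. A minimal sketch reproducing the totals (tenderer names and scores are taken from the table; the dictionary structure itself is only illustrative):

```python
# Unweighted tender scoring: each tender's total is the plain sum of its
# five criterion scores (no weighting is applied between criteria).
scores = {
    "ORIMA Research":            [8, 8, 7, 8, 6],
    "Colmar Brunton":            [8, 7, 6, 7, 7],
    "Deloitte Access Economics": [6, 7, 6, 4, 7],
}

totals = {tenderer: sum(criteria) for tenderer, criteria in scores.items()}
print(totals)
# {'ORIMA Research': 37, 'Colmar Brunton': 35, 'Deloitte Access Economics': 30}
```

One consequence of unweighted summation is that the price criterion carries no more influence than any other criterion, which is how the highest-priced tender could still achieve the top total score.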

Recommendation no.3

2.36 Social Services should ensure a consistent and transparent approach when assessing tenders and fully document its decisions.

Department of Social Services’ response: Agreed.

2.37 The department agrees with the recommendation. The department will improve its internal control systems in the form of policy, procedures and guidance material to undertake and document efficient and effective procurement practices that achieve value for money.

Were arrangements in place to provide local support?

Social Services effectively established or facilitated arrangements to deliver local support to CDCT communities, although there were delays in the deployment of additional support services. As part of the CDCT, Social Services also trialled Community Panels and reviewed their effectiveness to inform broader implementation.

2.38 To provide local support in CDCT communities, Social Services put in place arrangements including Local Partners, additional support services, and Community Panels.

Local Partners

2.39 Local Partners41 provided general support, including facilitating initial card set up, account balance checking, bill payments, temporary and replacement cards and assisting participants to address issues as they arose. Local Partners were also tasked with providing information to participants on: the Community Panels; and the application process to reduce the proportion of their restricted funds below 80 per cent.

2.40 Local Partners provided on-the-ground support. There were two Local Partners in Ceduna and one each in Yalata, Oak Valley, Scotdesco and Koonibba. In the East Kimberley there were two Local Partners in Kununurra and one in Wyndham. Australia Post42 also provided some card services. All Local Partners, except one in Ceduna, were contracted for the commencement of each trial site and received training from Indue in the days following the commencement of the trial. Local Partners were supported by Social Services and Indue and provided with reference materials such as the Local Partner Training Manual and the Local Partner Administration Manual. Local Partners worked closely with Indue and the call centres to support trial participants over the trial period.

2.41 Local Partners advised the ANAO that as the trial progressed, fewer resources were needed for general CDC information and acute issue management as more participants became familiar with the CDC and were able to use the self-help kiosks to perform their account balance enquiries and funds transfers without assistance. One common issue encountered by Local Partners was participants not remembering the personal identification number (PIN) for their card and not having a mobile phone to which a new PIN could be securely sent. Local Partners advised the ANAO that they initially allowed their personal mobiles to be used by participants. As this was a security concern, Indue encouraged participants to use the Indue online web-portal (accessed via self-help kiosks) to reset their PIN, if they could remember their password, or use temporary cards which have a preset PIN.

Additional support services

2.42 In September/October 2015, Social Services and PM&C undertook a gap analysis of support services and, in conjunction with the local leaders, identified areas where additional support services would be needed to assist CDCT participants. For the:

  • Ceduna region — additional funding of $1 million was announced in October 2015 to provide: drug and alcohol counsellors; a new 24/7 mobile outreach team; and support to access rehabilitation services; and
  • East Kimberley — additional funding of $1.3 million was announced in November 2015 and the final funding amount was $1.6 million to provide: drug and alcohol counsellors; improved access to drug and alcohol rehabilitation for adolescents; and support for families facing challenges, for example, low school attendance.

2.43 Not all services were available for the commencement of the trial and there was limited uptake and usage of the services during the trial period. Social Services monitored and reported usage of support services to the Steering Committee. East Kimberley community leaders reported, including to the ANAO, that if there had been ongoing consultation regarding the delivery and focus of support services, then outcomes may have improved.

Community Panels

2.44 As part of the CDCT, Social Services established Community Panels — a forum led by a community leader where members of the community assessed applications from trial participants to vary the 80:20 quarantine ratio. Participants could submit an application to the Community Panel for a reduction in the quarantined portion of their income support payment from 80 per cent down to a legislated floor of 50 per cent.43 The application was assessed against pre-determined criteria and any supporting statement.44 The development and operation of a Community Panel, including when it would be established and whether members were paid, were community-led decisions.

2.45 Social Services, as part of the Project Closure Report process, reviewed the effectiveness of the Community Panels. It determined that the Community Panels were not as effective as envisaged, with lengthy delays in making decisions, and that they would not be introduced into new localities. Further discussion of the Community Panels is at paragraphs 3.61 to 3.63.

3. Performance monitoring, evaluation and reporting

Areas examined

This chapter examined whether the performance of the Cashless Debit Card Trial was adequately monitored, evaluated and reported, including: arrangements in place to monitor implementation; the establishment of key performance indicators; the approach and data underpinning the evaluation; the quality and accuracy of information reported to the Minister; and the extent to which the extension and roll-out of the Cashless Debit Card (CDC) was informed by evaluation of the trial.

Conclusion

Arrangements to monitor and evaluate the trial were in place although key activities were not undertaken or fully effective, and the level of unrestricted cash available in the community was not effectively monitored. Social Services established relevant and mostly reliable key performance indicators, but they did not cover some operational aspects of the trial such as efficiency, including cost. There was a lack of robustness in data collection and the department’s evaluation did not make use of all available administrative data to measure the impact of the trial including any change in social harm. Aspects of the proposed wider roll-out of the CDC were informed by learnings from the trial, but the trial was not designed to test the scalability of the CDC and there was no plan in place to undertake further evaluation.

Areas for improvement

The ANAO made three recommendations aimed at Social Services undertaking a post-implementation review and a cost-benefit analysis; improving arrangements and capability in monitoring, performance measurement and evaluation; and continuing to evaluate the CDC in current and future locations.

Were appropriate arrangements in place to monitor and analyse the implementation of the Cashless Debit Card Trial?

A strategy to monitor and analyse the CDCT was developed and approved by the Minister. However, Social Services did not complete all the activities identified in the strategy (including the cost-benefit analysis) and did not undertake a post-implementation review of the CDCT, despite its own guidance and its advice to the Minister that it would do a review. There was scope for Social Services to more closely monitor vulnerable participants who may engage in social harm, and their access to cash.

3.1 Social Services consulted with relevant stakeholders and embedded monitoring arrangements by developing the combined CDCT Monitoring and Evaluation Strategy (the Strategy), dated 1 December 2015. The Strategy was discussed and approved by the Assistant Minister for Social Services on 17 February 2016 and the Minister summarised aspects of the strategy, in letters regarding data sharing, to the South Australian and West Australian Premiers in February and March 2016 respectively (discussed further in paragraph 3.29).

3.2 The Strategy identified the aims of monitoring and evaluation including:

  • to ensure that any unintended consequences of the programme are observed in real time and resolved in a timely manner. The trial also will give early and ongoing indicators of the overall impact of the trial on indicators of social harm.
  • to assess the implementation of the trial and examine the impacts of cashless welfare delivery, and will do so by answering key evaluation questions and to produce an analysis which explores the social impacts which may be attributed to the trial.45

3.3 Specific projects to achieve the aims outlined in the Strategy included:

  • a Monitoring Project — to be undertaken by the Welfare Debit Card Taskforce (later to become a line area in Social Services) with information from the project expected to be provided to the ‘… independent evaluator’; and
  • an Evaluation Project — to be externally commissioned to ORIMA by Social Services. This is discussed from paragraph 3.19 onwards.

The Monitoring Project

3.4 The Monitoring Project identified various sub-projects and key activities to be completed. Table 3.1 lists the analysis and data proposed in the Monitoring Project and whether it was completed as planned.

Table 3.1: Comparison of proposed data and actual data used in monitoring the Cashless Debit Card Trial

Data source proposed in the Monitoring Project

Data used in the monitoring of the CDCT?

Data Collection Project — including the use of the Research and Evaluation Database.a

Yes, although the ANAO found that the Research and Evaluation Database was not used.

Merchant Sales Analysis — to report on changes to alcohol and gambling expenditure in the trial sites based on analysis of merchant sales reports.

No, Social Services indicated to the ANAO that the data was not available.

Specialised Product Viability Analysis — comprised of: a comprehensive analysis of the electronic functionality of the card; an analysis of the impacts on the local marketplace; relevant comparisons to other products; and a comprehensive cost-benefit analysis.

Partly — the ANAO found that the cost-benefit analysis was not undertaken during the trial period. The cost-benefit analysis was due to be completed in late 2017 (more than six months after the trial concluded). Social Services advised it is an ongoing project.

Activities to monitor the operation of the CDCTb including from the analysis of the following data sources:

  • Indue Management Information Systemc;
  • Human Services;
  • feedback from local sources including Community Panels;
  • service provider information; and
  • Social Services’ mailbox and the CDC hotline.

Yes, however the ANAO found that service provider information was incomplete or inconsistently reported. Additionally, Social Services’ Data Exchange (DEX)d service provider data was not used, as several providers only began reporting through DEX in the six months to June 2016 and there was no benchmark data available for the six-month period to December 2015. Reporting was also limited for services that were not fully operational or that had low take-up.


Note a: The monitoring strategy proposed the use of the Research and Evaluation Database, an amalgamation of Australian Government administrative data including the Department of Human Services’ Income Security Integrated System (ISIS) data and Integrated Employment System (IES) data, and information on demographics, income support payments received, family characteristics, housing information, and other income reported.

Note b: Monitoring card activation rates, community panel applications, external transfers and any issues reported by stakeholders, participants and the public through feedback and tip offs.

Note c: As noted in Chapter 2, Social Services contracted Indue to monitor participant account and merchant transaction information; non-compliance and fraud and regular reporting from Indue.

Note d: Data Exchange (referred to as DEX) is the IT system where performance information on service provision is shared between Social Services and service providers. It is also used by other departments. Department of Social Services, Data Exchange: About, [Internet] DSS, 2018, available from https://dex.dss.gov.au/about/ [accessed March 2018].

Source: ANAO analysis of Social Services’ documentation.

3.5 Social Services acknowledged that some aspects of the Monitoring Project had not been undertaken but advised that monitoring had been thorough.

3.6 Operational monitoring analysis was not shared between the two sections within Social Services that were responsible for monitoring and evaluating the CDCT, or with its contracted evaluator, ORIMA, despite ORIMA having responsibility for reporting on some output KPIs. To effectively monitor and evaluate activities that the department was responsible for implementing, and which form the basis for future decisions regarding the commitment of government expenditure, there would be merit in Social Services adopting a more coordinated and collaborative approach to sharing information within the department.

Monitoring quarantined income support payments

3.7 As noted in Chapter 1, the trial was to test any change in social harm (habitual abuse and community harm related to alcohol, gambling and drugs) from a reduction in the total amount of cash available in trial sites by quarantining 80 per cent of an income support recipient’s payment into their restricted Indue bank account.46 In addition to 20 per cent of a participant’s income support payment being unrestricted, local leaders in both trial sites agreed that an additional amount (up to $200) could be externally transferred by a participant out of their Indue account to their personal unrestricted account every 28 days. There was no requirement for these external transfers for other expenses to be approved by Social Services.

3.8 Social Services analysed the Indue data relating to trial participants who had made use of the external transfer facility and noted that the limit ‘…is not being misused by the majority of participants’. Social Services reporting indicated participants were receiving around an additional three per cent of funds as unrestricted cash as a result of the facility.

3.9 ANAO analysis showed that over 60 per cent of CDC participants used this facility during the trial, and over 40 per cent transferred amounts between $190 and $200 within a 28-day period at least once. Figure 3.1 shows the number of individual participants making transfers, by value range, in both sites during the trial period.

Figure 3.1: Number of participants making transfers by value in both sites, during the trial period


Source: ANAO analysis of Indue trial participant account data.

3.10 The ANAO analysed the change in the 80:20 ratio for a trial participant if an additional $200 was transferred to their unrestricted bank account every 28 days. For a single person aged 22 years or older on the Disability Support Pension (with or without children), an additional $200 every 28 days increased their unrestricted income support amount to 32 per cent; for a single person under 18 on Youth Allowance with no children, living with their parents, the unrestricted portion increased to 61 per cent of the total payment.47
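The arithmetic behind paragraph 3.10 can be sketched as follows. The effective unrestricted share is the base 20 per cent plus a fixed $200 per 28 days, so smaller payments yield a larger unrestricted proportion. The 28-day payment amounts below are illustrative back-calculations from the reported percentages, not actual payment rates:

```python
def unrestricted_share(payment_28_days: float, transfer: float = 200.0,
                       base_unrestricted: float = 0.20) -> float:
    """Effective unrestricted proportion of a 28-day income support payment
    once a full $200 external transfer is added to the base 20% portion."""
    return (base_unrestricted * payment_28_days + transfer) / payment_28_days

# Illustrative 28-day payment amounts (assumed figures for this sketch):
# a larger payment dilutes the fixed $200, a smaller one amplifies it.
print(round(unrestricted_share(1667.0) * 100))  # ~32 (Disability Support Pension example)
print(round(unrestricted_share(488.0) * 100))   # ~61 (Youth Allowance example)
```

This also illustrates why the impact of the transfer facility depended on the type of income support payment the participant received: the fixed $200 cap represents a much larger fraction of a lower-rate payment.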

3.11 This facility had the potential to significantly change the restricted portion of income support for participants in trial sites (dependent on the type of income support payment the participant received); and increased cash availability in communities. Further, this situation compromised the ability of the local leaders to improve social outcomes as they had less leverage to reward favourable behaviour (by decreasing participants’ quarantined income support). Social Services did not use available data on indicators of vulnerability and social harm, such as the Research and Evaluation Database, to monitor the proportion of additional cash withdrawn, using the external transfer facility, by vulnerable CDCT participants.48

Post-implementation/Program performance/Program implementation reviews

3.12 Social Services’ guidance advised that ‘line areas contribute and are responsible for ensuring programmes are evaluation ready through the conduct of post-implementation or programme performance reviews and monitoring’.49 Social Services indicated to the Minister in September 2016 (around five months into the CDCT) that it would provide advice on implementation issues as it ’…progresses a Program Implementation Review’.

3.13 An implementation review did not proceed. Social Services advised the ANAO that ‘an additional implementation review, on top of the independent evaluation, would seem duplicative and not value for money’. Social Services further advised that aspects of the agreed review were included in the CDC evaluation. However, this was counter to Social Services’ guidance which states that an implementation review is separate to an evaluation50 and contrary to the CDCT final evaluation report which noted that the CDCT ‘… is primarily an outcomes evaluation and not an implementation review and, as such, its coverage of the implementation process is limited’.51

Recommendation no.4

3.14 Social Services should undertake a cost-benefit analysis and a post-implementation review of the trial to inform the extension and further roll-out of the CDC.

Department of Social Services’ response: Agreed.

3.15 The department agrees with the recommendation and has commenced a cost-benefit analysis and post-implementation review of the initial CDC trial. The department has also initiated a new independent evaluation of the CDC to commence in mid-late 2018.

Were there suitable key performance indicators developed to measure the performance of the Cashless Debit Card Trial?

Key performance indicators (KPIs) developed to measure the performance of the trial were relevant and mostly reliable, but not complete, because they focused only on evaluating the effectiveness of the trial based on its outcomes and did not cover its operational and efficiency aspects. There was no review of the KPIs during the trial and KPIs have not been established for the extension of the CDC.

3.16 Social Services developed a draft program logic and identified expected outcomes that would result from the implementation of the CDCT52 in its Strategy. Social Services commissioned ORIMA Research (ORIMA) to further refine the program logic and develop key performance indicators (KPIs) to measure the effectiveness of the CDCT. The commissioning of ORIMA on 22 February 2016 (22 days before the trial commenced) allowed little time to develop, test, agree and establish baseline data to measure against the KPIs. The KPIs were developed by ORIMA, in consultation with stakeholders, over the first few months of the trial and provided to Social Services on 26 July 2016, over four months after the trial had commenced. Measures were developed for trial outputs, short-term outcomes and medium term outcomes in consultation with various stakeholders. The ANAO assessed53 the KPIs for their:

  • Relevance — KPIs developed for the CDCT were largely relevant. However, a number of KPIs, for instance those relating to support services, were not focused enough to specifically measure effectiveness. The indicator developed to assess the operational performance of the Community Panels did not take into account feedback from trial applicants.
  • Reliability — KPIs developed for the CDCT were mostly reliable, although there were issues with measurement and bias.
  • Completeness — KPIs developed for the CDCT were not complete. The output KPIs did not provide a balanced examination of the CDCT. In addition, KPIs relating to the administration and operational aspects of the trial54, and the associated metrics, were mostly absent; without an implementation review, this left a gap.55

3.17 The KPIs that were developed did not address efficiency aspects of the CDCT, a key aim of the trial, but Social Services did estimate and report on the run (variable) costs of the CDC trial compared to the run costs for delivering income management.56 On 1 March 2017, Social Services provided advice to the Minister on the estimated higher run costs ($3713) for delivering the CDC trial per participant (in two remote locations) compared to the costs ($3280) of delivering income management (in a number of urban, regional and remote locations).57 The Minister was also advised of the estimated costs for the future expansion of the CDC, on the basis of two options — delivery in two remote and two regional locations ($1503) or three remote and one regional location ($1942).58

3.18 There was also a lack of ongoing reporting against KPIs, and as at December 2017, there were no KPIs in place for the extension of the CDCT in the trial sites.59 On this basis, arrangements were not established to effectively measure the wider aim of ‘encouraging socially responsible behaviour’.60

Was the evaluation of the Cashless Debit Card Trial supported by a sound approach and data?

Social Services developed high level guidance to support its approach to evaluation, but the guidance was not fully operationalised. Social Services did not build evaluation into the CDCT design, nor did it collaborate and coordinate data collection to ensure an adequate baseline to measure the impact of the trial, including any change in social harm.

Evaluation project

3.19 The Evaluation Unit in Social Services developed a range of high level guidance in mid-201561, which included advice on types of evaluations and appropriate methodologies to guide its business areas.62 The guidance indicated the role of the Evaluation Unit was to approve ‘…the scope, design, and timeframe for evaluations’, collaborating:

with line areas and Program Office to facilitate “evaluation readiness”–building evaluation into policy design and ensuring collection of appropriate data and performance information to support evaluation.

3.20 Although it had responsibility, the Evaluation Unit did not build evaluation into the policy design for the CDCT and did not determine the evaluation approach and how the evaluation would ‘…be conducted (e.g. in-house, consultancy, peer review) taking into account level of complexity, scale and funding available from program area’.

3.21 Instead, the Welfare Debit Card Taskforce developed Social Services’ Evaluation Project Brief and its proposed methodologies and measurements paper, and consulted with the Evaluation Unit and government stakeholders including the Department of the Prime Minister and Cabinet (PM&C), in July 2015. The Welfare Debit Card Taskforce also led the development of the Monitoring and Evaluation Strategy and consulted with the Evaluation Unit on 10 September 2015 prior to forwarding a copy of the Strategy to the Evaluation Unit on 22 September 2015.

3.22 In response to the Minister’s commitment to undertake a ‘…thorough evaluation of the impact of the Cashless Debit Card Trial’, the Monitoring and Evaluation Strategy outlined an approach to evaluate the trial using qualitative survey data, administrative income support and state data (South Australia and Western Australia). The Strategy noted that the evaluation would be externally commissioned and would ‘…primarily involve qualitative data collection and assess whether community members, especially individuals on the trial, observes changes in key indicators of community and social harm’ with the Strategy noting that information from Social Services and state governments would inform the final report on measuring the impact of the trial. Alternative evaluation approaches to establish a baseline and measure any changes in social harm were not considered.

Evaluation data and analysis

3.23 Social Services first met with the evaluation consultant, ORIMA, on 25 February 2016, when details of the contract including the evaluation methodology and data were discussed. Social Services noted at the meeting that: ‘Negotiations with WA and SA regarding the sharing of data are not as advanced as anticipated’.

3.24 The ANAO identified that the data available for the evaluation included: ORIMA survey data; state data on crime, gambling, hospital admissions, public housing, disruptive behaviour and complaints; local area data on service use; Commonwealth income support data; Indue card transaction data; and comparison site data.

3.25 ORIMA indicated to the ANAO that it was unable to collect baseline survey data to evaluate the trial given the evaluation was commissioned three and nine weeks prior to the commencement of the trial in Ceduna and the East Kimberley respectively.

ORIMA survey data

3.26 Primary survey data was collected by ORIMA from trial participants, families of trial participants and non-CDCT community members for the purpose of undertaking quantitative analysis. Interviews and focus groups were also arranged with community leaders, stakeholders and merchants to collect qualitative information. There were two waves of data collected for quantitative analysis and three waves of data collected for qualitative analysis.

3.27 Data collection at Wave 1 was achieved by interviewing every third or fourth person in a range of locations (an intercept survey63) until the target sample sizes were achieved.64 ORIMA’s method involved re-interviewing Wave 1 survey respondents in the Wave 2 survey, to enable a longitudinal approach to analysis. In practice, interview targets were not met and attrition (previous respondents who were not able to be re-interviewed) was higher than anticipated, with only 28 per cent of trial participants responding to the survey in both waves of data collection.65 To address this, ORIMA undertook additional interviews with new respondents, which reduced the evaluator’s ability to compare outcomes for the same participants between waves.

3.28 ORIMA noted that they used the quantitative surveys as their primary data source to measure the impact of the trial, including measuring any change in social harm.66

State and community level administrative data

3.29 The Minister67 wrote to the South Australian and Western Australian Premiers, in February and March 2016 respectively, approximately one month prior to the trial commencing in each site, seeking the Premiers’ support, including in the provision of relevant data to monitor and evaluate the performance of the trial. Requests for specific data items were discussed with, or sent to, the relevant South Australian and Western Australian departments one and three months into the 12 month trial, respectively.

3.30 Administrative data received by Social Services from the states for the evaluation included monthly and quarterly statistics on crime and intoxication apprehensions, hospital admissions (identifying those that were alcohol related), school attendance, housing debt, poker machine revenue, child abuse notifications, disruptive housing tenancies, outpatient counselling for substance abusers and homelessness. Community-level information included data on attendance at rehabilitation/sobering-up units and night patrol.

3.31 The ANAO found that limitations in the state data, and the department’s approach to using it in the evaluation, reduced the extent to which the impact of the trial on social harm could be isolated and evaluated. For example, the state statistics available to measure change in crime or gambling covered a greater geographical area than Ceduna, with 60 per cent of the population captured in this data outside the trial area. Further, for some data items: statistics were not collected (e.g. hospital admissions in Kununurra and Wyndham); there was an insufficient time series to measure any changes in the trial areas over time; and there was a lack of consistency between the data available for the comparison sites (for instance, there is no hospital in Derby) and the trial sites. Further discussion on comparison sites is at paragraph 3.39.

Commonwealth and Indue administrative data

3.32 Various Social Services documentation, including the Strategy, the Evaluation Framework and advice drafted for a Social Services meeting with the Minister in April 201668, noted that Commonwealth administrative data, including the Research and Evaluation Database, would be analysed to measure the impact of the CDCT.

3.33 Social Services undertook internal analysis of longitudinal Commonwealth administrative data69 in October 2016, designed to ‘…supplement the Evaluation of the CDC Trial — Initial Conditions Report’. The analysis provided some aggregate descriptive statistics on the CDCT participants from 30 June 2015 to 30 June 2016, two and three months into the trial in the East Kimberley and Ceduna respectively.

3.34 Social Services asked ORIMA for assurance that the data and approach ORIMA was using would be sufficiently robust to assess the trial outcomes so that its senior executive could brief the Minister on its 12 month progress report on 15 June 2017. ORIMA, in its response, indicated that it would use participant survey data, qualitative interview/focus group data and state administrative data — highlighting its limitations. Social Services accepted ORIMA’s approach, despite ORIMA identifying that it was not intending to use Commonwealth administrative data in its approved approach to ‘triangulate’ the data sources to estimate the impact of the CDCT.

3.35 ORIMA reported to the ANAO that it linked Commonwealth income support administrative data70 and Indue data to produce CDC account balance and transaction dashboards ‘as part of program implementation feedback’ for Social Services. The linked data was also used to assess output KPIs to confirm that CDCT participants: had timely access to the Income Support Payments deposited into their CDC accounts; were able to successfully activate their CDCs; and could make purchases with their CDCs.

3.36 ORIMA noted in its CDCT final evaluation report that Human Services administrative income support data was used for a:

Comparison of the demographic characteristics of the Wave 1 and Wave 2 CDCT participant response samples against population benchmarks (age, gender and Aboriginal and/or Torres Strait Islander origin71

3.37 ORIMA outlined, in the final evaluation report, the administrative data72 it examined to assess performance on the effectiveness of the CDCT.73 This did not include the use of Indue Card and Commonwealth administrative data. ORIMA indicated to the ANAO that Social Services did not mention the potential use of the Research and Evaluation Database and did not provide ORIMA with this database.74

3.38 Social Services’ ability to robustly measure the impact of the trial and any change in social harm was reduced because Commonwealth administrative data including the Research and Evaluation Database75 was not used as originally intended and there were limitations in obtaining adequate state data.

Comparison site data

3.39 Social Services initially indicated its intention to use comparison sites in its Strategy, noting that:

The evaluation and monitoring strategy will attempt to isolate and eliminate variables that are beyond the scope of influence of the debit card. Comparison location[s] may be observed directly if they present a reasonable and accurate comparison; however, the main comparison source will be wider statistical trends in the broader geographical locations.

3.40 The Evaluation Framework76 developed by ORIMA and agreed by the CDC Evaluation Steering Committee (discussed at paragraph 3.43) did not include an approach to observe individuals in a comparison location where the card was not implemented, instead relying on comparing statistical trends in the comparison sites proposed by the relevant states.

3.41 ORIMA reviewed the comparison sites (Coober Pedy and Port Augusta for Ceduna, and Derby for the East Kimberley) proposed by the South Australian and Western Australian Governments to determine their suitability on the basis of four Australian Bureau of Statistics’ Socio-Economic Indexes for Areas scores. The review did not consider whether consistent data was available in the comparison and trial sites that would allow the same outcome variables to be measured.

3.42 Comparison sites were selected to compare aggregate outcomes in the trial sites to locations where the card was not implemented. However, the comparison locations selected (Port Augusta and Derby) were areas that trial participants were known to visit regularly (some merchants in these sites were blocked due to possible circumvention). As a result, the data and measurement of outcomes in the comparison sites (crime, gambling, and alcohol related hospital admissions) could have included trial participant behaviour, which would have introduced bias into the evaluation results.

Cashless Debit Card Trial Evaluation Steering Committee

3.43 The CDCT Evaluation Steering Committee (committee) was established to oversee the evaluation. The committee’s membership included subject matter experts, state government representatives, and regional and national representatives of Social Services and PM&C.

3.44 The committee’s first and only meeting was convened on 31 March 2016, where key issues, including the methodology to determine the impact of the trial (attribution) and the selection of comparison sites, were discussed. Further communication was through email, with Social Services seeking input to key evaluation deliverables including the Evaluation Framework and subsequent evaluation reports. The committee’s feedback was provided to ORIMA. The most consistent areas of concern were: the methodology; the robustness of the data and its general limitations; the tone; and the need for greater clarity in aspects of the reports.

Cost of the evaluation

3.45 The evaluation commenced on 22 February 2016, nearly three months later than originally scheduled. The cost of the evaluation was $1.6 million77, over double the initial amount agreed with ORIMA. During the evaluation, three variations to the original contract price were agreed.

3.46 ORIMA attributed the increase in cost to an increase in the number of interviews of trial participants it conducted, as noted at paragraph 3.27, and the following factors:

  • The complexity of the survey fieldwork has exceeded expectations;
  • The administrative data has proven to be far less coherent than anticipated;
  • The level and diversity of stakeholder involvement has exceeded what we could anticipate prior to commencement; and
  • There has been an increasingly high profile of the trial during the critical fieldwork periods – including visits and statements from the Minister, Prime Minister, other politicians and social commentators.

Recommendation no.5

3.47 Social Services should fully utilise all available data to measure performance, review its arrangements for monitoring, evaluation and collaboration between its evaluation and line areas, and build evaluation capability within the department to facilitate the effective review of evaluation methodology and the development of performance indicators.

Department of Social Services’ response: Agreed.

3.48 The department agrees with the recommendation and is in the process of implementing this recommendation. From August 2017, departmental monitoring and evaluation arrangements were reviewed which led to substantial improvements in how the department conducts evaluations including the appointment of an internal Chief Evaluator. The establishment of the department’s new Evaluation Policy clearly delineates responsibilities for developmental, trial, performance monitoring, post-implementation review, process evaluation and impact evaluation and the extension of the centralised evaluation model to include performance monitoring and post-implementation review.

Was advice to the Minister supported by an appropriate evidence base?

Social Services regularly reported on aspects of the performance of the CDCT to the Minister, but the evidence base supporting some of its advice was lacking. Social Services advised the Minister, after the conclusion of the 12 month trial, that ORIMA’s costs were greater than originally contracted and that ORIMA did not use all relevant data to measure the impact of the trial, despite this being part of the agreed Evaluation Framework.

Monitoring advice

3.49 Social Services reported on the performance of the CDCT to the Minister for Human Services through regular fortnightly meetings, ad hoc briefings on issues, and monthly data reports.78 Monthly data reports to the Minister79 commenced in September 201680 and provided updates on available81 state statistics related to crime, gambling and the uptake of services such as community patrols, St John Ambulance and public housing in areas that included the trial locations. A version of this report, the Cashless Debit Card Trial Progress Report, was published in October 2016, with the addition of anecdotal information from community leaders in support of the trial.82

3.50 Social Services did not always account for seasonality83 in its analysis and reporting. For example, in reporting to the Minister, Social Services noted the decline in the average number of alcohol related hospital admissions in Ceduna from March to June 2016 compared with November 2015 to February 2016, but acknowledged that seasonality was not taken into account. Seasonality was also not taken into account in the department’s reporting on other social indicators to the Minister, despite the data being available. For example, Social Services reported to the Minister on the decline in the number of pick-ups by the Kununurra Miriwoong Community Patrol from April to May 201684, but did not analyse all available data (April to August), did not account for seasonality and did not measure pick-ups related to alcohol to more accurately assess change.85 86 ANAO analysis showed a consistent decline in alcohol related pick-ups over time, not just over the trial period: a decline from April to August 2014 (61.7 per cent), with further declines from April to August 2015 (35.4 per cent) and April to August 2016 (36.1 per cent).

3.51 The Minister was advised that there was a decrease in the total number of St John Ambulance call-outs in September 2016 compared to September 2015. Accounting for seasonality and analysing the data over a longer period, the ANAO found a 17 per cent increase in call-outs from April to October 2016 compared with the previous year.

3.52 Anecdotal information reported to the Minister suggested an increase in school attendance, but ANAO analysis of state data available to Social Services showed that, after the implementation of the trial, attendance was relatively stable for non-indigenous students but had declined by 1.7 per cent for indigenous students, compared to the same period (May to August) in 2015.

3.53 Social Services indicated to the ANAO that the monitoring statistics provided inconclusive results due to the short time frame of the trial. The ANAO found that other factors also impacted on the performance results, including that:

  • some data was not collected; and
  • some data was inconsistent and not fit for purpose, for example, covering wider geographical areas (see paragraph 3.31).

Evaluation advice

3.54 The Evaluation of the CDC Trial — Initial Conditions Report and the Cashless Debit Card Trial Evaluation: Wave 1 Interim Evaluation Report were provided to the Minister to note the findings and to approve the reports for public release. Advice to the Minister highlighted that the CDC trial ‘may have contributed to a reduction in all three targeted behaviours’ as well as other positive impacts such as participants being able to save more money and take better care of their children.

3.55 Social Services noted in its submission to the Minister on the interim evaluation findings the mixed overall response to whether the CDC trial had made respondents’ lives better. Responses varied significantly depending on whether the respondent was a trial participant or family member of a participant (of whom a greater proportion indicated that the CDC had made their lives worse rather than better) or a non-participant (who had the ‘…reverse perception’ of the impact of the card on the community).

3.56 Social Services, in advising the Minister on the CDC final evaluation findings on the impact of the trial on 29 August 2017, indicated that:

The CDCT has had a positive impact in lowering alcohol consumption across the two trial sites. Across both sites, of the 231 participants who reported drinking alcohol before the trial commenced, 41 per cent reported drinking alcohol less often…

…Analysis of the final report indicates that ORIMA Research utilised data provided from the Department of Social Services data sources. However, of note, ORIMA Research did not draw on Indue data and only some of the administrative data provided by the Department of Human Services and State governments. Additionally, raw data is not provided, only a list of data sources. This means conclusions in the report will not be able to be independently verified and this is likely to draw criticism from academics.

3.57 Social Services did not advise the Minister, over the course of the CDCT, that ORIMA’s approach to evaluation had changed from what was previously agreed and that Commonwealth administrative income support data was not used to measure the impact of the trial.87

3.58 Social Services advised the Minister on 23 May 2017, after the 12 month trial, that the variability of the state administrative data available for the evaluation had been an issue:

For example, crime statistics for Ceduna and surrounds is provided at the Local Government Area (LGA) and does not align with data provided by Western Australia. The data and the reporting periods of the data for the comparison sites of Coober Pedy, Port Augusta and Derby also varies across both sites, including for health services, sobering up units, crisis accommodation centres, night patrols, and mental health and drug services.

and that ‘…unanticipated complexities’ encountered in the evaluation fieldwork for the CDCT had affected the evaluation delivery timeframe, with ORIMA requesting an additional $630,762 to cover unanticipated expenses to complete the evaluation.

Was the extension and roll-out of the Cashless Debit Card informed by the trial evaluation findings?

Social Services undertook a review and reported to the Minister on a number of key lessons learned from the 12 month trial of the CDC. Learnings about the effectiveness of the Community Panels were based on the number of applications received and delays in decision making, rather than on the evaluation findings, which noted a delay in the establishment of the Community Panels and a lack of communication with participants. The 12 month trial did not test the scalability of the CDC, but tested a limited number of the policy parameters identified in the development of the CDC. Many of the findings from the trial were specific to the cohort (predominantly indigenous) and remote location, and there was no plan in place to continue to evaluate the CDC to test its roll-out in other settings.

Lessons learned from the trial

3.59 Social Services undertook an analysis of lessons learned from the CDCT and reported this to the Minister in October 2017. A number of positive features related to the implementation were identified by Social Services, including: ‘…the establishment of Local Partners, close engagement with key community members to address emerging issues quickly as they arise, and staff resourcing on location to ensure participants are well supported as they transition onto the program’.

3.60 The Minister was also provided with advice on some of the proposed changes to an approach for any future expansion of the CDC including: ‘…clearer and consistent communication with key stakeholders and participants in the lead up to the rollout of the card in a region is needed, and consistency of high quality staff to enhance service delivery for participants’ and providing increased support to participants that have low technological and financial skills.

3.61 Areas identified to achieve efficiencies were to have an ‘…automated merchant management solution’ and ‘…not making community panels a mandatory feature, or focal point’, in the future roll-out.

3.62 Social Services’ reasoning behind removing Community Panels as a mandatory feature, was that:

The community panels established for the project were not as effective as envisaged and resulted in lengthy delays in making decisions.

3.63 Social Services did not refer to the evaluation of the trial, which noted other factors that impacted on the effectiveness of Community Panels, including the ‘…delay in establishing and commencing the Community Panels from the start of the trial’ and that ‘…the panel process was not adequately known and communicated’ to the trial participants and communities. The evaluation report indicated that community leaders and stakeholders believed the Community Panel was ‘…a good and necessary safeguard process in the trial to ensure that personal/family circumstances and needs were taken into consideration’.88

The potential for future learnings

3.64 Social Services originally advised the Minister, on 16 March 2015, that the CDCT would not be able to inform government on:

… the full commercial delivery of a restricted debit card by major banks as set out in the Forrest Review, including their role and potential costs, and is unlikely to provide evidence around economies of scale for a broader roll-out.

3.65 Implementation of the trial in the remote sites of Ceduna and the East Kimberley, where income support participants predominantly identified as indigenous, restricted the trial’s ability to inform government on any future implementation of the card in non-remote sites, due to the different type of service arrangements and support required.

3.66 The extension of the CDC, announced on 14 March 2017, was informed by some of the reported trial site outcomes from the Cashless Debit Card Trial Evaluation: Wave 1 Interim Evaluation Report, although outcomes were only informed by data collected over a six to seven month period.

3.67 In responding to the Parliamentary Joint Committee on Human Rights regarding the Social Services Legislation Amendment (Cashless Debit Card) Bill 2017 to allow for the extension of the ‘… CDC program’ and the expansion to two new sites, the Minister noted that:

The expansion of the Cashless Debit Card is necessary to allow the Government an opportunity to build on the research findings of the interim and final reports, to help test the card and the technology that supports it in more diverse communities and settings.

3.68 With the agreed extension of the CDC in the trial sites for a further year from 1 July 2017 and the proposed expansion of the CDC, an updated monitoring strategy was developed in December 2017. Social Services indicated at Senate Estimates on 2 November 2017, and confirmed in a subsequent meeting with the ANAO on 25 January 2018, that it was not recommending a further evaluation of any future roll-out.89

Recommendation no.6

3.69 Social Services should continue to monitor and evaluate the extension of the Cashless Debit Card in Ceduna, East Kimberley and any future locations to inform design and implementation.

Department of Social Services’ response: Agreed.

3.70 The department agrees with the recommendation. The department will continue to monitor CDC implementation using available data to improve future design and implementation. As the report notes, an updated CDC Data Monitoring Strategy was developed in December 2017 and will be applied to the CDC expansion.

3.71 As noted in the report, on 15 May 2018 the Government announced a new independent evaluation of the CDC, which will assess the program’s effectiveness in all three CDC sites. This evaluation will build on the results of the ORIMA Research evaluation.

Appendices

Appendix 1 Department of Social Services’ response


Appendix 2 Parameters tested through the Cashless Debit Card Trial

  • Quarantine 80 per cent of income support payments to restrict cash in trial site communities: A default rate of 80 per cent of a recipient’s income support payment was quarantined in a restricted account, linked to the CDC. The quarantined amount was available to purchase essential goods and services only.
  • Mainstream delivery: Delivery of the CDC banking services through a commercial card provider, utilising existing payment and merchant category code infrastructure to apply restrictions.
  • Local Partners: Local organisations were contracted by the card provider Indue to provide banking related support services to participants.
  • Mandatory participation: Mandatory participation for income support recipients of working age living in trial areas.a Recipients of the Age pension and Veterans’ pension, and people outside the trial sites, could volunteer to participate.
  • Community Panels: Community Panels, consisting of local leaders and community members, could reduce the quarantine rate applied (to a legislated minimum of 50 per centb of an income support payment) if requested by the recipient.
  • Other expense transfers: Participants could transfer up to $200 every 28 days from their restricted account to a non-restricted bank account to allow for incidental cash purchases. No approval was required for these transfers.

Note a: That is, the trial was not explicitly targeted to community members vulnerable to participating in social harm.

Note b: The Wyndham Community Panel decided to limit the reduction to 70 per cent.

Source: ANAO analysis of Social Services’ documentation.

Appendix 3 Key aspects of the approach to welfare quarantining under the Income Management model and the Cashless Debit Card model

Aspect

Income Management model

Cashless Debit Card model

Objective

Directing Income Support payment expenditure away from restricted goods towards necessities and helping participants to financially manage their needs.

Restricting access to discretionary cash with the aim of reducing habitual abuse and community harm related to alcohol, gambling and drugs.

Participant triggers

Variable depending on location and Income management measures in place. Participants are targeted based on disengagement, vulnerability and referrals from child protection and other state and territory authorities. Those on Centrelink and DVA payments can volunteer.

All working age welfare recipients in a CDCT site, excluding those on age and veterans payments. Other persons can volunteer.

Default rate of quarantining

Varies between 50 and 90 per cent.

80 per cent. May be varied down to 50 per cent at the discretion of the Community Panels.

Restricted goods

Alcohol, gambling products, tobacco, pornographic material and cash.

Alcohol, gambling products and cash.

Policy owner

Social Services.

Social Services.

Account provider

Primarily delivered by Human Services.

Primarily delivered by the commercial provider, Indue Limited, with support from Social Services and Human Services.

Participant support

Individual support is provided by Human Services social workers when a participant enters the trial.

Support is provided by Human Services to process payments to non-BasicsCard vendors.

Locally based support services for assistance with services such as financial counselling, and alcohol and drug rehabilitation.

Banking services are provided by a commercial entity and its Local Partners.

Social Services provides support for housing and emergency payments.

Merchant management

Merchants must apply to Human Services to accept the BasicsCard.

There is ongoing manual management and compliance checks.

Merchants automatically accept the CDC.

Manual management, compliance checks and agreements are only required for mixed merchants (those selling both excluded and non-excluded goods and services).

     

Source: ANAO analysis of Social Services’ documentation.

Appendix 4 Assessment of evaluation KPIs against the criteria of relevant, reliable and complete

KPIs are assessed below. Where a KPI has been assessed as mostly, partially or not relevant/reliable, an explanation is provided. ‘Yes’ indicates that a KPI is relevant/reliable; ‘mostly’ indicates that the KPI has been assessed as more than 50 per cent relevant/reliable; ‘partially’ indicates that the KPI is less than 50 per cent relevant/reliable; and ‘no’ indicates that a KPI is deemed to be not relevant/reliable. The KPIs as a whole were assessed as partially complete, with the ANAO’s assessment deeming them to give a less than 50 per cent complete picture of the trial’s performance.
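The rating scale described in the paragraph above can be expressed as a simple threshold rule. A minimal sketch, assuming the assessed degree of relevance/reliability is represented as a fraction; how a boundary case of exactly 50 per cent is rated is an assumption here, as the report does not specify it.

```python
# Illustrative sketch of the ANAO rating scale described above: more than
# 50 per cent relevant/reliable is 'Mostly', less than 50 per cent is
# 'Partially', with 'Yes' and 'No' at the extremes. Treating exactly 0.5
# as 'Partially' is an assumption, not stated in the report.

def rate_kpi(fraction: float) -> str:
    """Map an assessed fraction (0.0 to 1.0) to the rating labels."""
    if fraction >= 1.0:
        return "Yes"
    if fraction > 0.5:
        return "Mostly"
    if fraction > 0.0:
        return "Partially"
    return "No"

print(rate_kpi(0.75))  # Mostly
print(rate_kpi(0.25))  # Partially
```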

Assessment of evaluation KPIs against the criteria of relevant, reliable and complete

Performance indicator specification (Target)

Relevant/Reliable

Comment

Output Performance Indicators

1

Number of community leaders who endorse the program

Number of community leaders who:

Feel the program design is appropriate for their community characteristics; believe the program will be/is a good thing for their community; speak positively about the program; and believe Trial Parameters were developed using a co-design approach

(No target rate specified; to be defined with a qualitative indication of number: all, most, many, some, few)

Yes/Yes

2

% of participants who understand card conditions

% of participants who are aware:

How much of their welfare income is quarantined in terms of cash withdrawals; what they can and cannot purchase on the card; which merchant types they can and cannot use the card at; they can use the card wherever Visa is accepted, including online (except where a Merchant is blocked); they can use the card to make online payment transfers for housing and other expenses, and to pay bills; and what to do if the card is lost or stolen

(No Target)

Yes/Mostly

Reliability comment: No baseline understanding of card conditions set.

3

% of compulsory Trial participants sent a debit card

% of compulsory Trial participants sent a debit card

(100% Within two months of program launch)

Yes/Yes

4

% of distributed cards that are activated

Of all cards distributed to participants, % of these that are activated

(95%a Within one month of receiving card)

Yes/Yes

5

80% of income support payments are quarantined

Income support payments are quarantined and 20% are received in cash (excluding approved adjustments)

(100% of recipients within two months of program launch)

Yes/No

Reliability comment: This measures the institution of trial arrangements rather than performance. Additionally, the $200 other expense transfer facility was not measured at an individual participant level, meaning that the effective quarantine rate could be lower than the target that was set.

6

# of support services available in the community

# and type of additional support services in operation as planned

(100% within three months of program launch)

No/No

Relevance and reliability comment: This was not measured in either the interim or final evaluation report due to a lack of suitable Social Services data.

7

% participants with reasonable access to merchants and products

Excluding the purchase of alcohol and gambling products, % of participants who agree that they can still shop where and how they usually shop

(90%)

% reporting concerns over access to allowable products

(10% maximum)

Yes/Yes

8

# of community leaders who believe appropriate adjustments are made to income restrictions on a case-by-case basis

Number of community leaders who believe community panels are assessing applications in a timely, consistent and fair manner; and number of community leaders who believe community panels are making just and reasonable decisions about changing percentage of welfare payments quarantined

(Most)

Mostly/Partially

Relevance comment: The views of applicants to the panels are not taken into account, affecting the ability to assess the community panels’ performance from the perspective of a key stakeholder group.

Reliability comment: Community leaders are being asked about a process over which they have control, raising a risk of bias. Additionally, these panels were not in place at the time Wave 1 data was collected in the East Kimberley.

Short Term Outcome Performance Indicators

9

Frequency of use/volume consumed of drugs & alcohol

Number of times alcohol consumed by participants per week; % of participants who say they have used non-prescription drugs in the last week; number of times per week spend more than $50 a day on drugs not prescribed by a doctor; number of times per week have six or more drinks of alcohol at one time (binge drinking); and % of participants, family members and general community members reporting a decrease in drinking of alcohol in the community since commencement of Trial;

(N/A due to absence of baseline data)

Number of on-the-ground stakeholders reporting a decrease in drinking of alcohol in the community since commencement of Trial.

(Many)

Yes/Mostly

Reliability comment: No baseline set and no specific targets set.

10

Frequency/volume of gambling and associated problems

Number of times Trial participants engage in gambling activities per week; number of days a week spend three or more hours gambling; number of days a week spend more than $50 gambling; % of participants indicating that they gamble more than they can afford to lose or borrow money or sell things to gamble; and % of participants, family members and general community members reporting a decrease in gambling in the community since commencement of Trial;

(N/A due to absence of baseline data)

Number of on-the-ground stakeholders reporting a decrease in gambling and associated problems in the community since commencement of Trial

(Many)

EGM (‘poker machine’) revenue in Ceduna and Surrounds

(Lower than before trial)

Yes/Mostly

Reliability comment: No baseline set and no specific targets set.

11

% aware of drug & alcohol support services

% participants who are aware of drug and alcohol support services available in their community

(No sound evidentiary basis for setting a target)

Mostly/Mostly

Relevance comment: This KPI does not measure the success of support services that were part of the trial.

Reliability comment: No baseline set and no specific targets set.

12

% aware of financial & family support services

% participants who are aware of financial and family support services (including domestic violence support services) available in their community

(No sound evidentiary basis for setting a target)

Mostly/Mostly

Relevance comment: This KPI does not measure the success of support services that were part of the trial.

Reliability comment: No baseline set and no specific targets set.

13

Usage of drug & alcohol support services

% of participants who have ever used drug and alcohol support services; number of times services used per participant; and intention to / likelihood of using service in future

(Higher at Wave 2 than at Wave 1 (statistically significant))

Number of people in community using services

(Higher than before Trial)

Mostly/Yes

Relevance comment: This KPI does not measure the success of support services that were part of the trial.

14

Usage of financial & family support services

% of participants who have ever used financial or family support services (including domestic violence support services); number of times services used per participant; and intention to / likelihood of using service in future

(Higher at Wave 2 than at Wave 1 (statistically significant))

Number of people in community using services

(Higher than before Trial)

Mostly/Yes

Relevance comment: This KPI does not measure the success of support services that were part of the trial.

Medium Term Outcome Performance Indicators

15

Frequency of use/volume consumed of drugs and alcohol

See: short-term indicators of frequency of use/volume consumed of drugs & alcohol

(Frequency/ volume not higher at Wave 2 than at Wave 1)

Yes/Mostly

Reliability comment: No baseline set and no specific targets set.

16

Frequency/volume of gambling and associated problems

See short-term indicators of frequency/volume of gambling and associated problems

(Frequency/ volume not higher at Wave 2 than at Wave 1)

Yes/Mostly

Reliability comment: No baseline set and no specific targets set.

17

Incidence of violent & other types of crime and violent behaviour

Police reports of assault and burglary offences; drink driving/drug driving; domestic violence incidence reports; drunk and disorderly conduct; outstanding driving and vehicle fines; % of participants, family members and the general community who report being the victim of crime in the past month; and % of participants, family members and the general community who report a decrease in violence in the community since commencement of Trial

Number of on-the-ground stakeholders reporting a decrease in violence in the community since commencement of Trial

(Lower than before Trial)

Yes/Mostly

Reliability comment: No baseline set and no specific targets set.

18

Drug/alcohol-related injuries and hospital admissions

Drug/alcohol-related hospital admissions/emergency presentations/sobering-up service admissions

(Lower than before Trial)

% of participants/family members who say they have been injured after drinking alcohol/taking drugs in the last month

(Not higher at Wave 2 than at Wave 1)

Yes/Yes

19

% reporting feeling safe in the community

% of participants, family members and other community members who report feeling safe in their community

(Higher at Wave 2 than at Wave 1 (statistically significant))

Yes/Yes

20

% reporting feeling safe at home

% of participants, family members and other community members who report feeling safe at home

(Higher at Wave 2 than at Wave 1 (statistically significant))

Yes/Yes

Completeness analysis of KPIs

Overall Completeness: Partial

Issues:

Key areas of the CDCT relating to its administrative and operational aspects, such as the Social Services call centre, wellbeing exemptions, community visits, levels of cash available in the community and staff training, were not measured with KPIs. This reduced Social Services’ ability to drive improvement in the operational aspects of the CDC.

There is a lack of ongoing KPIs, following ORIMA’s evaluation, to measure the CDC’s wider goal of ‘encouraging socially responsible behaviour’ under the trial legislation, and the other goals of the trial over the long term.

There were no efficiency-focused KPIs to drive measurement of the CDC’s efficiency, particularly against existing welfare quarantining measures. This means that a key intent of the CDC, as noted in the Forrest Review (driving down the cost and resources involved in welfare quarantining), was not measured by a KPI.

 

     

Note a: A five per cent margin was allowed for people moving in and out of income support payments.

Source: CDCT Evaluation Framework and ANAO analysis.

Footnotes

1 The Northern Territory National Emergency Response Act 2007 was passed and given royal assent in August 2007. The Act outlined measures to address child abuse and other issues in Indigenous communities in the Northern Territory.

2 Department of Human Services, How Income Management Works [Internet], DHS, February 2018, available from https://www.humanservices.gov.au/individuals/enablers/how-income-management-works [accessed March 2018].

3 The Minister for Human Services announced the roll-out of the CDC to the Western Australian Goldfields region on 1 September 2017 and to the Hinkler Electorate on 21 September 2017.

4 The Northern Territory National Emergency Response Act 2007 was passed and given royal assent in August 2007. The Act outlined measures to address child abuse and other issues in Indigenous communities in the Northern Territory.

5 Department of Human Services, How Income Management Works [Internet], DHS, February 2018, available from https://www.humanservices.gov.au/individuals/enablers/how-income-management-works [accessed March 2018].

6 Department of Social Services, Guide to Social Security Law: 11.1.1.20 Outline of Income Management Measures [Internet], DSS, September 2015, available from http://guides.dss.gov.au/guide-social-security-law/11/1/1/20 [accessed February 2018].

7 There are examples of a number of government social policy trials and pilots aimed at testing policies, including Income Management trials, initiatives to improve school attendance in remote communities and encouraging income support recipients into employment. Income Management trials were implemented at sites including: the Kimberley (Western Australia); Playford (South Australia); Greater Shepparton (Victoria); Bankstown (New South Wales); Rockhampton (Queensland); and Logan (Queensland).

8 A Forrest, The Forrest Review Creating Parity, Commonwealth of Australia, Canberra, 2014, available from https://www.pmc.gov.au/sites/default/files/publications/Forrest-Review.pdf [accessed March 2018].

9 A Forrest, The Forrest Review Creating Parity, Commonwealth of Australia, Canberra, 2014, p. 13, available from https://www.pmc.gov.au/sites/default/files/publications/Forrest-Review.pdf [accessed March 2018].

10 The CDCT parameters are presented in further detail in Appendix 2.

11 Australian Bureau of Statistics, Census of Population and Housing, cat. No. 2008.0, ABS, Canberra 2016.

12 The Minister for Human Services announced the roll-out of the CDC to the Western Australian Goldfields region on 1 September 2017 and to the Hinkler Electorate on 21 September 2017.

13 See Auditor-General Report No.10 2017–18 Design and Monitoring of the National Innovation and Science Agenda and Auditor-General Report No.32 2017–18 Funding Models for Threatened Species Management.

14 Local Partners are organisations located in the trial sites, contracted by the card provider Indue, to assist participants with activating and using the card and with other card-related issues.

15 Social Services reported that approximately 300 consultations were held in the Ceduna Region and 110 consultations in the East Kimberley region.

16 Indue engaged Local Partners to deliver local support. See paragraph 2.39 for further discussion.

17 A telephone hotline created by Social Services for queries related to the CDC.

18 Centrelink’s (Human Services) customer service centre (13 23 07) for queries relating to payments, including Centrepay or rent deduction.

19 Department of Finance, Commonwealth Risk Management Policy, Finance, 2014, paragraph 2.

20 While the CDCT team structure during the trial did not identify who was responsible for risk management, one team member’s personal development plan did identify risk management as a task. Post-trial, in August 2017, responsibility for risk management was clearly assigned within the CDC Operations team structure.

21 The Social Security (Debit Card Trial) Amendment Act 2015 received Royal Assent on 12 November 2015.

22 The Department of Social Services’ Risk Management Framework incorporated an Excel-based risk register template with macros which ensured risks were correctly rated according to the Risk Matrix; however, this template was not used for the CDC project.

23 Indue provided Social Services with the first extract of its risk register on 1 February 2018.

24 An issue can be defined as a current incident, impeding factor or problem. An issue can arise due to a risk materialising.

25 Banks indicated that standardising Stock Keeping Units (SKUs) across all businesses, in order to standardise barcoding across Australia (which would be necessary for point-of-sale restriction of excluded goods and services), would be costly and not possible to implement in the short term.

26 There were seven criteria regarding ‘an ability to’: 1. Deliver the card solution within six months; 2. Deliver the card solution to up to 3,500 participants; 3. Deliver the card solution in two disadvantaged communities, including one in a remote location; 4. Quarantine 70 per cent of participants’ funds with the option to increase this percentage; 5. Use technology to block the purchase of excluded items (alcohol and gambling goods and services) in around 50 stores, or if technology is not available, to block stores whose primary sales are of excluded goods; 6. Receive payment through the RBA as an Authorised Deposit-taking Institution (ADI); and 7. Store received payments on the card. There were two ‘highly preferable’ criteria: previous experience in delivering welfare quarantining programmes; and previously expressed interest in income quarantining issues, such as responses to the Healthy Welfare Card proposal in the Creating Parity report.

27 Approval was given on the basis that the procurement met the conditions of limited tender under Division 2, clause 10.3(g) of the Commonwealth Procurement Rules, which permitted a limited tender ‘when a relevant entity procures a prototype or a first good or service that is intended for limited trial or that is developed at the relevant entity’s request in the course of, and for, a particular contract for research, experiment, study, or original development’.

28 Maddocks provided a probity sign-off letter on 1 September 2015, noting that ‘to the extent we have been involved or observed the Procurement to the date of this letter, the Procurement has been conducted in accordance with the Commonwealth Procurement Rules, relevant Commonwealth legislation, policies and probity principles’.

29 Social Services and Indue signed a Letter of Agreement on 23 February 2016 so that Indue could commence preliminary activities to meet the proposed implementation timetable as there were delays in signing the operational contract due to commercial and legal issues.

30 CPRs section 10.5 requires a value for money assessment to be performed for each contract awarded through limited tender. Department of Finance, Commonwealth Procurement Rules, Commonwealth of Australia, Canberra, July 2014, p. 28, available from https://www.legislation.gov.au/Details/F2014L00912 [accessed March 2018]. The CPRs were amended in March 2017 and January 2018.

31 Indue requested that the breakdown of the contract price be kept confidential and Social Services approved this based on Clayton Utz advice. The confidentiality request does not meet the four criteria set out in the Confidentiality Test. See Department of Finance (Finance), Buying for the Australian Government, Confidentiality throughout the Procurement Cycle: Practice, Awarding a Contract, [Internet], available from http://www.finance.gov.au/procurement/procurement-policy-and-guidance/buying/contract-issues/confidentiality-procurement-cycle/practice.html [accessed April 2018].

32 Indue’s Customer Service Centre (1800 710 265) was established shortly before the trials commenced and began taking calls on the first day of the Ceduna trial, 15 March 2016. Indue prepared scripts for anticipated common questions and reported call statistics to Social Services as part of its regular reporting. Initially, Indue’s 1300 number was listed on the back of the card; this number incurred the cost of a local call from a landline and additional charges from a mobile. During the trial period the 1300 number on the back of the cards was replaced with the Indue Customer Service Centre 1800 number to reduce costs for participants.

33 Local merchants who sold a mixture of restricted (alcohol, gambling products or cash equivalent products) and non-restricted goods had to enter into a mixed merchant agreement with Indue.

34 This report provides Social Services with aggregate information on transactions, card activations, card declines and other metrics from cards in the relevant region; information on merchants, including non-compliance, complaints and declined transactions; and information on calls received by Indue’s call centre for the month.

35 This report provides information on the key metrics Indue is required to meet under its service level agreement with Social Services.

36 The Financial Instructions state that: ‘Relevant departmental officials must ensure, on receipt of a correctly rendered invoice for goods or services provided that payment is processed within the 30 day standard trading terms of the Department’.

37 The Social Policy Research and Evaluation Panel (2007–2016) was a panel of experts, established by the former Department of Families, Housing, Community Services and Indigenous Affairs (FAHCSIA), available to deliver social policy research, evaluation, investment in data and professional development services to Commonwealth entities.

38 Social Services’ committee report to its senior executive also indicated that: ‘In summary, a qualitative assessment method was used to assess each tender against the Conditions of Participation and Evaluation Criteria as stated in the RFQ [request for quote]. Each tender was evaluated and a value for money determination derived for the tender and tenders were ranked relative to the value for money each offers. A copy of the assessment rating/scoring method is in the Evaluation Plan’ but indicated further in the report that a value for money assessment for Deloitte Access Economics was not undertaken.

39 Social Services noted in its assessment of ORIMA’s tender, ‘Costs could be more transparent and the total quoted price is high but broadly this appears reflective of the ideal methodological approach.’

40 The final cost of the evaluation is discussed in paragraphs 3.45 and 3.46.

41 Social Services’ contract with Indue included a requirement for Indue to engage Local Partners. Services Agreement Debit Card Trial Services – Implementation and Operational Phase, Department of Social Services and Indue Limited, Contract number 90007395.

42 Australia Post provided temporary and replacement card services.

43 Under section 124PE of the Social Security (Administration) Act 1999, the Minister for Human Services authorised the Ceduna, Kununurra and Wyndham Community Panels to consider and, on a case-by-case basis, make a decision to vary the restricted portion of funds being placed on the Cashless Debit Card. The Ceduna and Kununurra Community Panels decided that the restricted portion of funds could be reduced to 50 per cent. The Wyndham Implementation Group and Wyndham Community Panel agreed that the restricted portion of funds could only be reduced to 70 per cent for participants in Wyndham.

44 Criteria for consideration included, for example: criminal record; children’s school attendance record; housing evictions; and outstanding debts.

45 Quote from Social Services’ Monitoring and Evaluation Strategy.

46 Not all CDCT participants had 20 per cent of their income support payment available as cash. As at 31 March 2017, 128 participants had received agreement from the Community Panel to have the quarantined proportion of their income support payment reduced.

47 This analysis does not take into account any additional income received by the participant.

48 The Research and Evaluation Database contains a multitude of relevant variables (including information on cognitive or neurological impairment, drug/alcohol dependence, homelessness, illness/injury requiring frequent treatment, psychiatric problem or mental illness, significant lack of literacy and language skills, recent trauma/domestic violence, released prisoner (in gaol 14 days+), significant caring responsibilities, job in transition, and having served an eight-week non-payment period in the last 12 months) that were presented to the Minister in a table showing vulnerabilities in the income support population in the trial sites (see paragraph 3.32).

49 Social Services’ guidance defined a post-implementation review as: ‘… a review that asks and answers the question of whether an initiative was implemented in the manner envisaged, on time and within budget - and whether the relevant systems, governance and program information and reporting are in place.’ The guidance notes that the responsibility for undertaking a post-implementation review is the line/policy/program area, and indicates that the post implementation review will often inform evaluations.

50 Social Services’ guidance, the Evaluation and Review Type Selection Tool, included advice on: identifying two types of performance reviews including an evaluation and a post-implementation review (PIR); the high level process for conducting the reviews; and what area in Social Services was responsible.

51 ORIMA Research, Cashless Debit Card Trial Evaluation Final Evaluation Report, ORIMA, August 2017, p. 99, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed February 2018].

52 Immediate outcomes expected within one to three months included: 1. For individuals showing behaviour contributing to wider social harm, a reduction in: the purchase of alcohol and gambling products; indicators of socially harmful community behaviour, including crime and violence. 2. For all individuals subject to the trial, the debit card is used responsibly with minimal or no: incidents of circumventing the system; perceived stigma or discrimination. 3. Participants with alcohol, drug and gambling addictions to access available supports. 4. Community leadership bodies support the implementation of the Cashless Debit Card. Immediate outcomes expected within four to twelve months included: 1. Communities are safer, with a reduction in indicators of social harm, such as substance-associated ill-health or injury, assault or drug possession charges. 2. Individuals have minimal barriers to engaging in normal daily activities, with decreased drug/alcohol intake, decreased spending on gambling, and a more stabilised financial situation.

53 The results of the ANAO’s analysis are provided at Appendix 4. The criteria applied were initially developed in Auditor-General Report No.21 2013–14 Pilot Project to Audit Key Performance Indicators and subsequently updated in Auditor-General Report No.58 2015–16 Implementation of the Annual Performance Statement Requirements to factor in updated guidance from the Department of Finance. Additionally, KPIs and performance measurement were considered broadly in relation to the Department of Finance, Resource Management Guide No. 131 Developing good performance information, Commonwealth of Australia, Canberra, April 2015, available from https://www.finance.gov.au/sites/default/files/RMG%20131%20Developing%20good%20performance%20information.pdf [accessed February 2018].

54 Circumvention, the Social Services call centre, wellbeing exemptions, community visits, levels of cash available in the community (in contrast to the indicator that 80 per cent of income support payments are quarantined, which only shows that Human Services has switched on trial participants) and staff training were not measured with KPIs.

55 For example, KPIs related to circumvention, the Social Services call centre, wellbeing exemptions, community visits, levels of cash available in the community and staff training.

56 The ANAO noted in its 2012–13 audit report on the Administration of New Income Management in the Northern Territory that Human Services was unable to directly isolate the costs of income management. The report noted that ‘…the departments [Human Services/Families, Housing, Community Services and Indigenous Affairs] estimate that the cost of providing Income Management services is in the order of $6600 to $7900 per annum’, and that ‘…there is also scope for DHS to strengthen its internal monitoring and reporting arrangements by developing performance indicators that better measure the efficiency and effectiveness of Income Management service delivery’. Auditor-General Report No. 19, 2012–13 available from https://www.anao.gov.au/sites/g/files/net4981/f/201213%20Audit%20Report%20No%2019.pdf

57 The income management extension was in the Perth Metropolitan, Peel and Kimberley regions, Laverton, Kiwirrkurra and Ngaanyatjarra Lands in Western Australia; Anangu Pitjantjatjara Yankunytjatjara Lands, Ceduna and Playford in South Australia; Cape York, Rockhampton, Livingstone and Logan in Queensland; Bankstown in New South Wales; Greater Shepparton in Victoria; and in the Northern Territory.

58 The Minister announced an extension of the CDC on 14 March 2017.

59 Resource Management Guide 131 provides guidance in matching the level of performance reporting to its intended audience and use: ‘The level of detail required for performance information will depend on what it will be used for. Information used to make decisions about how to deliver an activity (tactical/management) is likely to be at a finer level of detail than that used by ministers to make decisions about resource allocations (strategic/accountability).’ Department of Finance, Resource Management Guide No. 131 Developing good performance information, Commonwealth of Australia, Canberra, April 2015, p. 11, available from https://www.finance.gov.au/sites/default/files/RMG%20131%20Developing%20good%20performance%20information.pdf [accessed February 2018].

60 Social Services Legislation Amendment (Cashless Debit Card) Bill 2017, Explanatory Memorandum, p. 2.

61 The guidance was endorsed by Social Services’ Policy and Regulatory Reform Committee (chaired by a Deputy Secretary, Department of Social Services).

62 Guidance includes: the Evaluation and Review Type Selection Tool, Policy Office Evaluation Unit Service Offer, Policy Office: Roles and Responsibilities for Evaluation Activity, Evaluation Prioritisation Tool, the Evaluation and Review Activity Definitional Tool, Evaluation Process Overview, Evaluation Framework and the Fact Sheet Program Logic.

63 An intercept survey is a research method used to collect information. For the Cashless Debit Card Trial, potential respondents were approached in a range of locations and asked if they would be willing to be surveyed. ORIMA noted that to reduce bias, a systematic selection process was used, that is, every third or fourth person was selected to be interviewed.

64 Target sample sizes included 325 and 200 trial participants in Wave 1 and Wave 2 respectively, as outlined in the Evaluation Framework. Cashless Debit Card Trial Evaluation Final Evaluation Report, p. 160, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed February 2018].

65 ORIMA had anticipated a 30 per cent attrition rate.

66 Cashless Debit Card Trial Evaluation Final Evaluation Report, p. 9, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed February 2018].

67 ‘The Minister’ refers to the Minister for Human Services.

68 The Minister was advised that the Research and Evaluation Database, which provided measures of income support recipient vulnerability in the trial sites, was an example of the data that would be used for the evaluation. See also footnote 48.

69 Developed for the Australian Priority Investment Approach to Welfare.

70 The demographic variables used were: income support benefit type, location, gender, age, ATSI status and marital status.

71 Cashless Debit Card Trial Evaluation Final Evaluation Report, p. 19, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed February 2018].

72 ibid., pp. 13 and 25.

73 ibid., Table 1, p. 14.

74 For discussion on the Research and Evaluation Database, see paragraph 3.11 and Table 3.1.

75 The Research and Evaluation Database has been used in earlier government trial evaluations, including the evaluation of the School Enrolment and Attendance Measure. Department of Education, Employment and Workplace Relations, Improving School Enrolment and Attendance through Welfare Reform Measure (SEAM) Trial (2009–2012): Final Evaluation Report, Commonwealth of Australia, May 2014, available from https://www.pmc.gov.au/sites/default/files/publications/Improving_School_Enrolment_Attendance_through_Welfare_Reform_Measure_trial.pdf [accessed February 2018].

76 Cashless Debit Card Trial Final Evaluation Report, Appendix A, pp. 119–152, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed May 2018].

77 Inclusive of the Goods and Services Tax.

78 The monthly data reports were also provided to community leaders and ORIMA to inform the evaluation of the CDC trial.

79 Copies of the monthly data reports were also circulated to the Evaluation Unit in Social Services.

80 The report included data up to August 2016.

81 Data used in Social Services’ monitoring analysis was not consistent across states.

82 Department of Social Services, Cashless Debit Card Trial Progress Report: October 2016 [Internet], DSS, 2016, available from https://www.mhs.gov.au/sites/g/files/net1006/f/cashless-debit-card-trial-data.pdf [accessed February 2018].

83 Seasonality can be observed in time series data where regular and predictable changes or patterns recur every calendar year. For instance, the wet season prevents the movement of individuals between some remote communities, and the consumption of alcohol and drugs and the incidence of gambling generally increase over this period.

84 Department of Social Services, Cashless Debit Card Trial Progress Report: October 2016 [Internet], DSS, 2016, available from https://www.mhs.gov.au/sites/g/files/net1006/f/cashless-debit-card-trial-data.pdf [accessed February 2018].

85 Social Services provided no caveats when presenting its analysis.

86 The night patrol vehicle provides safe transport, during its hours of operation, for intoxicated persons, parents and children, people with disabilities, people with physical or mental health problems, elderly people, and those who are a risk to themselves or others. Ngnowar-Aerwah, Sobering up Shelter [Internet], available from https://www.ngnowar-aerwah.org/sobering-up-shelter-night-patrol [accessed February 2018].

87 Cashless Debit Card Evaluation Framework Summary, available from https://www.dss.gov.au/families-and-children/programs-services/welfare-conditionality/cashless-debt-card-trial-evaluation-framework-summary [accessed February 2018].

88 Cashless Debit Card Trial Evaluation Final Evaluation Report, p. 110, available from https://www.dss.gov.au/sites/default/files/documents/08_2017/cashless_debit_card_trial_evaluation_-_final_evaluation_report.pdf [accessed February 2018].

89 On 15 May 2018, the Ministers in the Social Services portfolio announced that ‘A second evaluation of the Cashless Debit Card across the three trial sites will assess the ongoing effectiveness of the program’, available from https://ministers.dss.gov.au/media-releases/3066.