
Reviewer skewers report into $19b Covid subsidy scheme


Tasked with commissioning a “timely” review of the $19 billion Covid-19 wage subsidy scheme, the Ministry of Social Development defended the fact that it would take more than a year to complete some of the work.

“We want to ensure any evaluation process is robust, considers the various complexities and is done correctly – this does take time,” George Van Ooyen, MSD’s group general manager client service support, said in September 2021.

The assessment reports – four in all – were only published in July last year.

They gave the scheme an overall pass mark, with eligibility criteria and rules deemed “about right”.

Now, thanks to documents released under the Official Information Act, Newsroom can reveal details of a scathing external assessment of one of those reports – a draft of the “process evaluation” review by consultants MartinJenkins.

Rather than being robust, the withering peer review suggested the draft report might have negative value.

Questions were also raised about the close relationship between the public service and regularly used consulting firms.

The genesis of the MSD-commissioned reviews was a recommendation by Auditor-General John Ryan, in a report released in May 2021.

Such a move shouldn’t surprise.

The wage subsidy scheme was designed quickly, and rolled out urgently in a high-trust model, to keep people in jobs and keep businesses afloat during lockdowns.


As noted in the opening of the finalised MartinJenkins report, it “was the Government’s single-largest area of spending in response to Covid-19, and indirectly supported about 1.8 million workers”.

Wage subsidy reviews were overseen by a cross-agency working group and steering group, headed by MSD and with representatives from Treasury, Inland Revenue, and the Business Ministry, MBIE.

The main questions to be answered by the so-called process evaluation were: how well did the policy development process work, given the crisis context and time and resource constraints; and how well was the scheme implemented, and risks managed?

The 113-page draft was delivered by MartinJenkins on August 12, 2022.

Peer reviewer Dr Simon Chapple, then a director of the Institute for Governance and Policy Studies at Victoria University of Wellington, submitted his report six days later.

(The deadline was extended by four days because the draft was larger than expected, but the academic stuck to the original date as he was going on annual leave.)

So exercised was Chapple by what he’d read that his review – stamped “in-confidence” – ran to more than 10,000 words.

The body of the peer review noted many things, including basic problems such as spelling mistakes, poor grammar, insufficient sourcing, and a table that wasn’t numbered or indexed.

“Where low cost-to-identify and low cost-to-remedy problems like this are identified, a lack of care and attention to detail is demonstrated,” Chapple wrote.

“If the waiter’s hands are a bit grubby, it’s often an indication that the restaurant kitchen is filthy.”

The peer review also took issue with conclusions reached without “acceptable minimum evidential standards”. “It suggests a tendency to jump to predetermined conclusions in advance of the evidence.”

Chapple was at his most trenchant in his report’s conclusion, in which he expressed surprise and disappointment in the draft, as MartinJenkins is a “large, well-known, and long-standing private consulting organisation with a strong reputation”, with staff experienced in evaluation.

“The document lacks depth, nuance, subtlety, and self-reflection.”

One passage, about transparency and the care not to overstate the extent of engagement and the representativeness of samples, “makes me alternately laugh and weep”, Chapple wrote. Many of the basics of good science were “missing”, he said, while conclusions lacked a “valid evidential base”.

There was considerable risk that a high degree of confirmation bias was locked in to the report, Chapple wrote.

The tacit methodological approach was “not sound”, the peer review said, and therefore conclusions were “not clear and logical”.

“The obvious risk is that conclusions are a hot mish-mash of evaluator cognitive and political biases and shibboleths peculiar to the Wellington public policy milieu. Surely, we can do much better.”

Chapple recommended a new evaluation be written “in order to produce a deliverable which meets minimum acceptable standards”.

One way forward, he suggested, was for a completely different team to undertake the remaining work, “possibly blind to this draft”.

‘It’s worse than that’

Readers of Chapple’s peer review might conclude he dismissed the evaluation as having low or even no value.

“This is not the case,” he wrote. “It is actually worse than that.

“This evaluation should be about setting standards for other evaluations in the New Zealand public sector. By setting the bar so low for such a high-profile and important evaluation as this one is, there are negative spill-overs across the public service in terms of what is acceptable in the future in another internal or external evaluation.

“Second, if the public service responds to information where there is a strong possibility it is misleading (in this or any other evaluation influenced by it), significant harm may be done, and substantial net costs incurred.”


Newsroom approached the Public Service Commission for comment.

Chapple widened the lens somewhat on the issue of bias.

The draft report’s title page noted the evaluation was independent. However, Chapple wrote that MartinJenkins (MJ) “has close links to the New Zealand public service via ongoing large-scale, and regular contracting out of a variety of policy advice and evaluation functions”.

The peer reviewer noted a Stuff opinion piece from 2022 which opened with the line: “There is a saying floating around Wellington which, like most jokes, hides an uncomfortable truth: ‘There are three branches of government: the legislature, the judiciary and MartinJenkins’.”

Wrote Chapple: “It is unclear to me how MJ have successfully managed the risks of loss of independence in running this evaluation which arise out of their very close and long-standing relationship with the New Zealand public sector as a major client.”

Ananish Chaudhuri, a professor of economics at the University of Auckland, told Newsroom he was unfamiliar with MartinJenkins, so was unsure if the draft report on the wage subsidy was typical of its work.

“Maybe a junior person ended up having to deal with this.”

While Chaudhuri found some of Chapple’s comments “a bit extreme” – “you would expect this to be more professional” – he felt many criticisms of the draft report seemed reasonable.

“You want to spend public funding wisely, and you want to make sure that there’s some accountability, and the people who we’re asking to look over these things are qualified.”

Newsroom asked MSD if it agreed with the majority of Chapple’s criticisms of the MartinJenkins report.

Fleur McLaren, group general manager of insights, says Chapple reviewed an early draft.

“He offered some advice which aligned with points raised by the working group and MSD staff.

“Some points were out of scope, not accepted, or not considered an appropriate course of action for a first evaluation draft. This was documented in our file note of August 4, 2023.”

(We’ll come back to the file note later.)

Given the tone of Chapple’s peer review, did MSD express its disappointment about the quality of MartinJenkins’ work, and did it prompt a wider review?

McLaren says: “We provided feedback to MartinJenkins as part of our normal process. Quality assurance is an expected part of any evaluation.”

MartinJenkins partner Sarah Baddeley says the methodology for the process evaluation was approved by the cross-agency working group, and was informed by feedback from an ethics panel.

Did Chapple’s peer review shock MartinJenkins, and did it believe the draft was of such low quality?

Baddeley says: “MartinJenkins agreed with MSD that the peer review commented on a number of elements related to the wage subsidy that were not relevant to the scope of our evaluation nor were usual for this type of early-stage peer review.”

Several times in late 2022, the consulting firm asked for a full copy of the Chapple peer review but MSD initially declined “as the feedback was intended for internal feedback only and salient points had already been shared with the supplier”.

MartinJenkins then asked for the report “on equity issues”, and, with Chapple’s agreement, the report was sent in December of that year.

Once it received the full peer review, Baddeley says the firm raised concerns about “the process of commissioning and quality of the peer review”.

“MSD confirmed the points of the peer review that they considered were relevant to the evaluation, noting that large numbers of comments were deemed out of scope and irrelevant.”

The final report considered feedback from the peer review “that MSD considered relevant”, she says, alongside other feedback from the cross-agency working group.

“This was a straightforward process and was consistent with our usual methodology,” Baddeley says.

“The main changes we made were presentational, more clearly setting out our method, and did not go to the substantive findings. 

“Our intention was to prepare an independent evaluation that would be helpful to future policymakers; it was not an academic evaluation nor was it commissioned as such.”

Asked if MartinJenkins had strengthened its internal quality assurance processes, Baddeley says its assessment was that a rigorous quality process had been followed.

“Throughout the assignment, we were very mindful of the significance and importance of this process evaluation, and we continue to stand by our process and findings.

“By welcoming robust review processes, we ensure agencies and the public can have confidence in the relevance and accuracy of the process evaluation findings.”

Back to the file note

The file note from August last year shows Chapple wasn’t out on a limb with his criticisms.

First off, the file note – circulated “in confidence” within MSD after the quartet of review reports were published – detailed how Chapple was selected on the advice of MSD’s research and evaluation experts, and how his appointment was endorsed by the ministry’s chief science adviser, Tracey McIntosh.

The note said Chapple’s feedback overlapped with that of MSD’s working group. In fact, MSD staff met with MartinJenkins on August 17, the day before Chapple’s report landed.

MSD’s “initial internal feedback” noted “the need for the report to include a more developed methodology section and for the report to be restructured”.

The peer review “recommended some important quality improvements”, the note said.

“The feedback was discussed with the supplier before providing a report of synthesised feedback, and it was agreed that the supplier would be given additional time to make necessary changes.”

A section of the file note is headlined: Summary of peer review advice not considered an appropriate course of action. One passage said: “Some of the external peer review included commentary or advice that was out of scope, not accepted, or not considered an appropriate course of action for a first evaluation draft.”

Chapple’s suggestion the evaluation be stopped and re-started “was not relevant for an evaluation conducted in a public sector context”. In MSD’s view, his proposed remedy for the risks of confirmation bias if MartinJenkins continued its work “was not practical or necessary”.

Instead, the ministry “gave the supplier comprehensive feedback that the supplier addressed in the next iteration of the report”.

The report’s second draft was sent to MSD on October 7, 2022, and once further interviews were undertaken, the final draft was submitted in February last year. The cross-agency steering group approved the finalised reports in April, and, as mentioned earlier, they were published in July.

In an ironic, plot-twisting postscript, Chapple is now an affiliate at non-profit research institute Motu, which also undertook assessments of the wage subsidy scheme for MSD.

Criticism of MartinJenkins’ work comes as consultants working for the public sector are in the spotlight.

Pre-election, the National Party pledged to slash $400 million from central government spending on consultants and contractors, which reached $1.2 billion in 2021/22.

On the campaign trail, the party leader who went on to become prime minister, Christopher Luxon, said: “Under National, this gravy train’s gonna stop at the station.”

National’s ‘100 Day Action Plan’ said it would: “Instruct public sector chief executives to begin reducing consultant and contractor expenditure, and to report on current spending within 100 days.”

In December’s mini-budget, Finance Minister Nicola Willis said she’d asked government agencies to find savings of about $1.5 billion a year.

Willis, who is also Public Service Minister, told Newsroom yesterday: “Further updates on this savings programme will be provided in due course. Individual agencies have been asked to factor consultant and contractor expenditure into their savings plans as we prepare Budget 2024.”
