The Performance Dimensions in Budget > The design and use of PEFA and other assessments

Edwin Starr and PEFA?


STONE:
I was in a taxi in London recently with a driver going on a bit, so I tuned out of his chatter (the state of British prisons - inside story, apparently) and heard Edwin Starr on the radio criticising that long-established human institution - War.  The recurring question Starr poses is what it is good for.  His conclusion is pretty clear - absolutely nothin(g).  As these things usually do, the question rang a recent bell, and I was reminded of a piece sent to me by ODI posing the same subsidiary question, "PEFA: What is it good for?" - see attached...

I haven't had time to read it, but it seems it caused a bit of a stir in Buda and Pest at the PEFA conference and has now elicited a response from Lawson and Bucknall at Fiscus Ltd.

The PFM Board was set up in the light of the PEFA thing, and as this corner of the Board is for the design and use of PEFA, I thought we should try and bring the debate in, or at least follow and nurture it.

Hadley and Miller (2016) is attached, and Lawson and Bucknall's response appears in the original on the Fiscus website, but I got it from here: https://www.linkedin.com/pulse/2016-pefa-framework-lets-throw-baby-out-bath-water-andrew-lawson

John Short:
“Using PEFA as a blueprint for reform runs counter to the advice of the PEFA Secretariat, which explicitly states that ‘PEFA does not provide recommendations for reforms’ (PEFA Secretariat, 2016a:4)”. (Fourth paragraph section 4)

Spot on, PEFA Sec, but the next step in the sequence may be to use the PEFA assessment to develop a PFM Reform Action Plan.  The PEFA does one thing and a PFMRAP does another.  The ODI paper implies that the PEFA is a catch-all doing both and appears to be critiquing the PEFA on that ground rather than for what it is and what it attempts to be: a diagnostic assessment of PFM systems and procedures at a moment in time.  Nothing more and nothing less - but something that has to be done properly - many in the past have not: hence the PEFA Check.

STONE:
On a first read of Hadley and Miller and Lawson and Bucknall, I agree with the latter and also with John Short.  But I suspect that Hadley and Miller are essentially trying to argue that PEFA should be used as intended - a diagnostic tool providing an information base to be shared between a government desirous of leading its own reform programme and the co-ordinated donors and IFIs supporting it.

If the PEFA instrument is being abused should we not highlight signs of abuse - a PEFA watch perhaps?

Signs of PEFA abuse might include the following -

"Programme decisions and the provision of budget support are ... linked to specific PEFA indicators" - Hadley and Miller (2016:9)

"Contractors and consultants working on PFM commonly monitor PEFA scores as part of their logical (sic) frameworks" - Hadley and Miller (2016:9)

"If it took much longer [than the 5 months the PEFA Secretariat suggests for a PEFA assessment and a draft Performance Assessment] the results might be out of date by the time they are published, and costs might make organisations think twice before commissioning a PEFA assessment" - Hadley and Miller (2016:11)

"... the framework is relatively easily understood, even by someone with relatively limited knowledge of PFM"  - Hadley and Miller (2016:9)

PEFA letter scores being converted into numbers and then having arithmetical operations performed on them....

Napodano:
Yes, I myself have seen PEFA scores used as indicators of performance (baselines and targets) in many budget support operations...(sic)

John Short:
Signs of PEFA abuse might include the following -

"Programme decisions and the provision of budget support are ... linked to specific PEFA indicators" - Hadley and Miller (2016:9)

I suspect the reason for this is that PEFA is used as a fiduciary risk tool.  The DFID FRA is also based on a PEFA, though with a specific corruption section added.  This then allows an objective (hopefully) assessment of whether there is a reduction in fiduciary risk over time or whether there is a need for TA or a reform programme to mitigate FR.

"Contractors and consultants working on PFM commonly monitor PEFA scores as part of their logical (sic) frameworks" - Hadley and Miller (2016:9)

Not too sure what this means exactly, but in the SMART world we inhabit, monitoring indicators provides some measure of performance against a benchmark.

"If it took much longer [than the 5 months the PEFA Secretariat suggests for a PEFA assessment and a draft Performance Assessment] the results might be out of date by the time they are published, and costs might make organisations think twice before commissioning a PEFA assessment" - Hadley and Miller (2016:11)

This reflects a fixation with scores alone and not enough emphasis on the narrative that accompanies the score.  A few lines indicating whether any reform is in place, and its timeframe, should take care of this.  PFM reform is not so instantaneous that the time between doing the fieldwork, scoring and report writing is going to mean that the scores are ancient history.  Besides, much of that time at the end of the process is taken up with dealing with comments - government itself will have seen an early version - and this of course argues for getting the initial draft and scoring right in the first place!

"... the framework is relatively easily understood, even by someone with relatively limited knowledge of PFM" - Hadley and Miller (2016:9)

Surely a virtue, and a testament to those who put the framework together.

PEFA letter scores being converted into numbers and then having arithmetical operations performed on them....

Arithmetic operations relating to what?  I can't see the point within an individual PEFA.  For making aggregations of countries in a region and getting an average to compare against, possibly.
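The worry about arithmetic on letter scores can be made concrete with a quick sketch (purely illustrative; the A=4 ... D=1 numeric mapping is an assumption for the example, not part of the PEFA framework): once ordinal letters are converted to numbers and averaged, very different score profiles can collapse to the same mean.

```python
# Illustrative only: map PEFA letter scores to numbers (A=4 ... D=1)
# and average them - the mapping is an assumption, not PEFA methodology.
NUM = {"A": 4, "B": 3, "C": 2, "D": 1}

def mean_score(scores):
    """Average a list of letter scores under the assumed numeric mapping."""
    return sum(NUM[s] for s in scores) / len(scores)

# An A/D profile and a B/C profile produce the same mean (2.5),
# even though they describe very different PFM systems.
print(mean_score(["A", "D"]))  # 2.5
print(mean_score(["B", "C"]))  # 2.5
```

Which is the point: the average hides exactly the profile the narrative is there to explain.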

I think it is important to be less fixated on the scores, which of course is the easy thing to do - just scan the summary table.  Moving the summary assessment (2005/2011 PEFA methodology) to a separate Chapter 4, Conclusions of the analysis of PFM systems (2016 methodology), is positive.  However, the narrative of each dimension is also vital.  To take an example, PI-30 deals with external audit, but the scoring of 30.2, Submission of audit reports to the legislature, is not necessarily an assessment of the efficiency of the SAI.  The SAI may complete the audit within 3 months to score an A, but the procedure in a country may be that the SAI submits to the Minister of Finance, who lays the report before the legislature, and this may be done after 12 months.  Thus the dimension score is a D, and the most the whole indicator gets is a D, or a D+ if one of the other dimensions scores greater than a D.  They all may be As.  Scanning the scores and not reading the narrative in this case may give the impression that the SAI is poor, whereas in reality the SAI is efficient.  (A true case, by the way!)
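The PI-30 example above follows the "weakest link" aggregation described in the passage: the indicator takes the lowest dimension score, with a "+" added if any other dimension scores higher.  A minimal sketch of that rule (an illustration of the logic as stated here, not the PEFA Secretariat's code):

```python
# Sketch of the "weakest link" aggregation rule as described above:
# the overall indicator score is the lowest dimension score, with "+"
# appended when any other dimension scores higher than that lowest one.
ORDER = {"D": 1, "C": 2, "B": 3, "A": 4}

def weakest_link(dimension_scores):
    """Aggregate dimension letter scores, e.g. ['A', 'A', 'D', 'A']."""
    lowest = min(dimension_scores, key=ORDER.get)
    plus = lowest != "A" and any(ORDER[s] > ORDER[lowest] for s in dimension_scores)
    return lowest + ("+" if plus else "")

# The PI-30 case: an efficient SAI scoring A on every dimension except
# 30.2 (submission to the legislature), which scores D - the indicator
# shows D+, and a score-scanner never sees the three As.
print(weakest_link(["A", "A", "D", "A"]))  # D+
```

One D drags the whole indicator down, which is exactly why the narrative matters.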
