Author Topic: Assessing the (PFM) assessors  (Read 832 times)

marybetley

  • Moderator
  • PFM Member
  • *****
  • Posts: 9
Assessing the (PFM) assessors
« on: October 04, 2010, 13:52:16 GMT »
A new research paper [1] aims to answer the question of how well the current framework for measuring PFM performance widely used by aid agencies and developing country governments (the PEFA Framework) is aligned with good practice as identified by academic research on performance measurement.  This research uniquely applies lessons from that academic research to the PEFA Performance Measurement System (PMS).  Given the potentially critical role played by the PEFA PMS, particularly in co-operation between governments and aid agencies, and its wide application across the world, this is an important area of research.

The research develops a three-level scorecard to assess each criterion selected from the literature on good practice for the design of performance measurement systems.  The analysis shows that the PEFA performance measurement system compares reasonably well against these criteria, particularly in the structure of its design (i.e. the performance measures themselves).  It performs relatively less well in terms of its application in practice.
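For readers who find an example easier to follow, below is a minimal sketch (in Python, with entirely made-up criteria, ratings and groupings, not those used in the paper) of how a three-level scorecard of this kind might be tallied across design and application criteria:

```python
# A minimal, illustrative sketch of a three-level scorecard: each good-practice
# criterion is rated on a simple ordinal scale and grouped by whether it
# concerns the design of the measurement system or its application.
# The criteria and ratings below are hypothetical, not taken from the paper.

from collections import defaultdict

SCALE = {"met": 2, "partially met": 1, "not met": 0}

criteria = [
    # (criterion, dimension, rating)
    ("Measures are clearly defined",        "design",      "met"),
    ("Measures cover all key PFM areas",    "design",      "met"),
    ("Measures are easy to apply",          "application", "partially met"),
    ("Reports are consistent across users", "application", "partially met"),
]

totals = defaultdict(lambda: [0, 0])  # dimension -> [score achieved, maximum possible]
for criterion, dimension, rating in criteria:
    totals[dimension][0] += SCALE[rating]
    totals[dimension][1] += max(SCALE.values())

for dimension, (score, maximum) in totals.items():
    print(f"{dimension}: {score}/{maximum}")
```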

Overall, the findings show that the PEFA PMS has been well designed, and generally designed in accordance with good practice for performance measurement systems as determined in the wider academic literature.  However, it is in its application (e.g. how easy the performance measures are to use, or whether the performance reports are consistent) that more difficulties occur.  This result is perhaps not surprising, given that many of the criteria related to the design of the PMS, which was extensively tested and based on previous experience of measuring PFM performance across countries.  On the other hand, the documentary evidence suggests that quality assurance has played an important role in improving understanding of the PMS and the consistency of its reporting across countries, thereby increasing the level of compliance.

[1] Betley, M. (2010), “Assessing the Assessors: How well does a key performance measurement system for public finance measure up?”, Dissertation, Warwick University, UK.
« Last Edit: October 04, 2010, 20:59:35 GMT by marybetley »

Napodano

  • Administrator
  • PFM Member
  • *****
  • Posts: 682
Re: Assessing the (PFM) assessors
« Reply #1 on: October 04, 2010, 14:10:49 GMT »
marybetley,

Interesting research, indeed. Your document needs careful reading.

In the meantime, I attach a presentation made by the PEFA Secretariat on the links between PEFA and other assessment tools.
« Last Edit: October 05, 2010, 10:45:52 GMT by Napodano »

harnett

  • Global Moderator
  • PFM Member
  • *****
  • Posts: 204
    • REPIM
Re: Assessing the (PFM) assessors
« Reply #2 on: October 09, 2010, 10:09:35 GMT »
Mary

This is great.  Thanks a lot for your work, which provides a welcome stocktake of where we are with PEFA.

A couple of comments: 

1. It might have been interesting to note that there have been no changes to the indicators in five years, though three changes have recently been proposed, which is surely a vindication of how well PEFA was prepared.

2. You state that: "Direct experience suggests that the measurement of some qualitative indicators can be a source of interpretation. Evidence of this may be found in instances of changes to scores following presentation of initial results."  This may be true, but I would also assert that initial scores often focus the minds of government officials, with the result that new evidence is presented before the finalisation of the report.  This was especially so during the first round of assessments.  Furthermore, such changes may also have come about as a result of the Secretariat's QA role, indicating that interpretation was, in fact, consistent.

I also wonder to what extent you could have discussed the difficulty of measuring PFM across so many countries.  When training, I often offer myself as an example of scepticism about the PEFA framework, as I couldn't believe that PFM could be usefully reduced to 31 indicators, yet applying the process actually reveals just how well it was designed and how useful it is.  It's not perfect (what is?), but it is certainly useful, more so over time, as repeat assessments indicate progress adequately and inform further required reforms.
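To illustrate that last point, here is a rough sketch (with assumed indicator labels and scores, not drawn from any real assessment) of how two rounds of PEFA-style ordinal scores might be compared to show the direction of change per indicator:

```python
# Illustrative comparison of a baseline and a repeat assessment.
# The indicator labels and scores below are assumptions for the example only.

# Map PEFA-style ordinal scores to numbers purely so they can be compared.
ORDER = {"D": 1, "D+": 1.5, "C": 2, "C+": 2.5, "B": 3, "B+": 3.5, "A": 4}

baseline = {"PI-1": "C", "PI-4": "D+", "PI-10": "B"}   # first assessment
repeat   = {"PI-1": "B", "PI-4": "C",  "PI-10": "B"}   # follow-up assessment

for indicator in sorted(baseline):
    before, after = baseline[indicator], repeat[indicator]
    if ORDER[after] > ORDER[before]:
        trend = "improved"
    elif ORDER[after] < ORDER[before]:
        trend = "deteriorated"
    else:
        trend = "unchanged"
    print(f"{indicator}: {before} -> {after} ({trend})")
```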
« Last Edit: October 09, 2010, 10:23:29 GMT by harnett »

marybetley

  • Moderator
  • PFM Member
  • *****
  • Posts: 9
Re: Assessing the (PFM) assessors
« Reply #3 on: December 17, 2010, 13:50:24 GMT »
Very good points, which I aim to address in the follow-on research paper.

Specifically, regarding your point 2: this is true, but it is still the case that some of the dimensions require judgement to decide between the gradations of different scores (e.g. where is the line between "some" and "substantial"?), and it remains an open question whether that judgement is applied consistently across all assessments.

Napodano

  • Administrator
  • PFM Member
  • *****
  • Posts: 682
Re: Assessing the (PFM) assessors
« Reply #4 on: January 24, 2011, 08:37:17 GMT »
^

 
