The impact of European Commission support to PSD? We’re still none the wiser.
With ever-increasing pressure for donors to “show impact” for their aid programmes, there was a lot of interest in last week’s presentation of the Evaluation of EC Support to Private Sector Development in Third Countries from 2004 to 2010. But what can the European Commission (EC) actually take from this evaluation?
The report estimates that the EC spent €2.4bn directly on PSD over the period (plus €4.5bn if you count indirect support through general budget support and the EIB-managed ACP Investment Facility). The Agenda for Change also laid out a strong case for promoting and working with the private sector for development in the future. As such, there were expectations that this evaluation would get to the heart of where PSD support has been channelled — to which countries, regions and sectors, through what types of instruments, and to what effect. But it didn't.
Despite 3kg’s worth of project data and discussion(!), gathered in three tomes, the evaluation provides very little to chew on. Even for the EC.
Too much to evaluate?
This is not all down to the evaluators, as the EC recognised, but partly stems from the broad scope of the work. To evaluate all types of private sector support across all countries and then draw lessons is no mean feat. And this clearly wasn't the place to discuss detailed project-level impact evaluations.
The conclusions are consequently vague and uninformative. There are mentions of the “heterogeneous package of activities”, “different types of value added”, the EC’s “generalist approach” and the difficulty of obtaining “a clear and complete picture of the observed results”, with little on how or where support was spread within regions. So there is little in the way of learning opportunities.
One clear conclusion
Of the 15 conclusions made, the clearest and most striking is the “very distant” linkage between PSD and job creation in EU support. This is especially surprising given that PSD is to a certain extent all about promoting job creation by creating the conditions for investment and firm expansion. How could it not be linked?
In fact this is down to the EC's institutional division of labour between employment, considered a social issue, and PSD, an economic issue – so strengthening that institutional link could make sense. "Jobs" has, after all, become a key mantra of the development agenda following the Arab Spring, and recognition that this is really what people want and need, particularly youth and women.
Apart from this, the main conclusions presented related largely to EU “visibility” and “stakeholder awareness” – the EU was not seen as a major player in supporting the private sector – as though this was really the impact being sought.
So apart from a need to “raise the visibility of EU PSD support”, what lessons did the evaluators draw for implementing the Agenda for Change?
They suggest that the EC "build on its value-added". This makes sense, but at the same time requires you to judge where the value-added is. The EC's value-added for PSD support lies, according to the presenters, in i) the large scale of finance for PSD relative to other donors; ii) a long-term presence; and iii) the untied nature of the support. Building on that kind of value-added may not be easy.
It’s not easy
This is not to underplay the difficulty of measuring the impact of development policy, for PSD or otherwise – a major and ever-growing preoccupation of academia and the development community. The challenges were made apparent at our December meeting on measuring impact, where we also discussed tools to support donor efforts such as the DCED Standard for Results Measurement. While not providing the answer, it nonetheless does provide guidance to help ensure that support can be better evaluated.
Other recommendations for the EC’s future support could relate to:
Evaluations are a key part of the policy cycle, allowing governments and policy makers to adjust policy according to the lessons learnt. But that only happens if an evaluation provides concrete analysis. This was therefore a missed opportunity.
More broadly, a key recommendation would be to “ensure the conditions are in place for an effective evaluation of all future programmes”. Then we might be a little wiser in five years’ time.