Evaluating past actions to draw lessons for current and future strategies is always worthwhile. In this regard, the report ‘Evaluation of the European Union’s support to private sector development in third countries’ is important for improving our understanding of best practice and lessons learned. In line with Eurodad’s concerns, the report highlights some key failures in EU support to private sector development (PSD), particularly regarding implementation, added value, impacts, and monitoring and evaluation. However, it overstates the role of blending mechanisms as the key added value of European Commission (EC) aid at a time when there are still clear problems in evaluating their impacts.
This evaluation is broad in its coverage, as it includes “all support provided during the period 2004-2010 in all regions where the Commission support was implemented”. In terms of funding, it covers important instruments such as the European Development Fund (EDF), the Development Cooperation Instrument (DCI) and the EC’s budget support programmes. In total, it concerns €2.4 billion of direct EC support to the private sector over the period covered.
How does EU support to PSD work?
According to the evaluation, the EU “is now equipped to address quite comprehensively the range of PSD needs in the different regions,” due to the wide array of mechanisms at its disposal, such as bilateral support, regional and centralised programmes and regional blending facilities. However, the evaluation concludes that “there is little evidence of a structured EU approach to exploiting the potential and complementarities of the set of support mechanisms and aid modalities at country level in support of the private sector”.
As Eurodad has recently pointed out, access to finance, particularly for small and medium-sized enterprises (SMEs), is often seen as a key constraint in many developing countries. However, the EC is still struggling with how to deliver aid money to this sector: although access to finance was the second largest area of EC direct support over the evaluation period, “Commission activities were not based on strong diagnostic approaches, and there is some evidence to suggest that the relevance of these particular activities suffered as a result”.
According to the evaluation, the impact of EC activities was mixed: “some activities at macro-level, including institutional and regulatory reform, showed positive impact, but others less so; at meso-level, significant capacity building support was given to financial intermediaries; but little evidence was found of improved access to finance by SMEs”. This casts doubts on whether these public resources are targeted in the right way to the sector most in need.
What is the EC’s real added value?
The evaluation looks at different types of EC value added. There are two important things to mention here.
First, controversially, the evaluation concludes that “a key value added provided by the Commission was that its grant money could be blended with loans.” The reason for stressing blending mechanisms lies in the assumption that the EC can leverage investment from international institutions and even private actors. However, this conclusion overlooks the significant challenges that leveraging entails, including assessing the financial and development additionality of the public resources involved.
Second, the evaluation attempts to address financial additionality by analysing “whether the EC was not displacing other players when providing such support, in the first place the beneficiaries themselves”. Once again, this point relates to the concern of Eurodad and others about using scarce official development assistance (ODA) to ‘crowd out’ or replicate existing sources of finance, and to worries about using ODA to subsidise private sector activities.
Given the relevance of the issue, the evaluation’s conclusion is surprising: “Commission documents provide little information on the importance of avoiding support that could also be provided by the beneficiaries.” The appendix also mentions that “most reports did not address the extent to which beneficiaries could or could not implement a programme without the Commission’s support.” This lack of clarity suggests that the EC’s private sector interventions may lack any clear additionality, either in terms of finance or in terms of development impact on the local economies of developing countries.
What about coordination, impacts and monitoring?
The evaluation also draws conclusions about key issues, such as coordination of EU instruments, impacts and monitoring and evaluation. Its conclusion rightly points out some of the major problems already highlighted by Eurodad and our partners. Here are some quotes to illustrate these points:
• On the architecture of aid: “Most support mechanisms had their own logic and mode of operation, with little internal coordination. The portfolio of PSD support in a country often stemmed from a juxtaposition of activities rather than from a structured PSD strategy with logical sequencing and distribution.”
• On maximisation of impact: “The EU’s PSD support in the different countries responded to needs but was generally not part of a strategy aiming at maximising the EU’s impact through clear prioritisation, a focus on value added, and on synergies with other actors and activities.”
• On Monitoring and Evaluation: “The EU carried out a wide range of monitoring and evaluation activities… Nevertheless, it remained difficult to obtain a clear and complete picture of the observed results, notably because of weaknesses in terms of monitoring and evaluation.”
Key recommendations and next steps
In a nutshell, the evaluation recommends that “the EU should continue to be a provider of a wide range of different types of private sector development support”. This broad statement gives the authors the opportunity to recommend, among other things, that “adopting a ‘generalist’ approach should not preclude the Commission from making sure that the conditions for maximising the impact of its PSD support are fulfilled, notably through diagnoses, prioritization, coordination between EU support mechanisms and appropriate M&E practices”.
The results of the evaluation will be officially presented on 22 April during a seminar aimed at a large audience, including member state representatives, European institutions and embassies of the countries where case studies were undertaken. So far, civil society organisations have not received an invitation to this event. We certainly look forward to discussing the findings of this evaluation with all relevant stakeholders.