A summary of a stakeholder consultation on the OECD DAC evaluation criteria for development assistance was recently posted on the OECD website. It lists factors considered to merit additional attention in the DAC environment. These include taking Sustainable Development Goal requirements more into account, and acknowledging the complexity of decision analysis and the need for systems models that represent the interconnection of relevant factors. Another area cited as requiring change was the application of evaluation criteria to policy, programme, institutional and strategic evaluations and systems. Cross-cutting themes on gender, equity and human rights were also mentioned. The last item referred to the need for support providing stronger guidance for project implementation.
Although the consultation did not produce strong feedback in favour of change, rather the reverse, little additional official feedback, analysis or conclusion has so far been published. However, a recent report by OQSI, which will be available through The Boolean Library, reviews the DAC evaluation criteria as well as the stakeholder consultation summary report (SCSR). OQSI is very much geared to the development of practical solutions to issues concerning project cycle and portfolio management, and its report provides recommendations to improve the effectiveness of the application of the DAC criteria across the areas of change highlighted in the SCSR.
Stakeholder analysis
The DAC stakeholder consultation summary report (SCSR) involved the "evaluation community", which provides professional services to donors, executing agencies and governments in the management of evaluation assignments. These practitioners are therefore part of the project and development assistance stakeholder group. Most evaluation assignments are classed as independent exercises, thereby removing any risk of conflict of interest. It is apparent, however, that the lack of personal interest in what is evaluated can lead to situations that reduce the quality, effectiveness and utility of evaluation.
The main stakeholders of projects are:
- the teams who design a project
- the funding organizations, the donors or agents who select projects for funding
- the teams who manage a project's implementation
- the beneficiaries
- by extension a sector
- by extension a region
- by extension a country
Although the DAC criteria are associated with the OECD, their origin can be traced to the "process approach" now promoted under the ISO 9000 and BSI series of standards, applied by organizations, manufacturers and product and service development teams perfecting the design and delivery of processes, products and services.
The significance of design stage evaluation
The initial and successful applications of these criteria involved tight-knit groups within single organizations, collaborating closely as a cluster of specialists dedicated to working on a design-and-testing cycle to perfect a product or service. This approach is the foundation of successful quality management and zero-defect manufacturing.
It is worth reflecting on the main characteristics of this form of development that have contributed to its success. It is a diagnostic procedure applied iteratively to improve the design of a product or process through testing and evaluation. The testing and diagnosis of the outcome of a development cycle is, in reality, an evaluation. In this system evaluators have a central role, as members of the development group, in contributing to the perfection of the product or service under development. On this basis, "lessons learned" are identified and applied immediately to the next part of the design cycle, not after the product or service is launched. These close-knit teams consist of funder representatives, designers, evaluators, sometimes potential users, workshop personnel and other technicians. The system has proven highly successful in developing reliable, minimum-defect products and services. This operation is illustrated below.
The most significant benefits of this approach are:
- The establishment of a coherent project memory. This signifies a high level of detailed knowledge of the process and product, so that someone within the product/service cluster can answer almost any question on any aspect of the current or past status of the cycle. This follows from the team's close association and the necessity for a joint understanding of all tasks in the cycle.
- The critical function in this process is held by evaluators.
- The repetition of the same agreed analytical methods covering the cycle steps
- The high standard of monitoring, record keeping and documentation as a basic quality assurance procedure, which provides any newcomer with a comprehensive review of procedures and of any aspect of a system's development
The case of international development projects
In the general field of economic development, this type of tight cluster seldom exists. Very often different people work on each part of the cycle, sometimes from different departments and even different organizations. An example of typical participants is shown below.
This type of arrangement can result in two challenges.
- A common occurrence is that some members of these groups, during the course of a project cycle, leave for other occupations or move within their organization and no longer participate in the project. As a result, parts of the joint project memory can be lost.
- Compensating for the loss of project memory is sometimes difficult because project documentation is of inadequate quality. This is often the result of multi-departmental or multi-organizational involvement in different parts of the project cycle, which frequently prevents the production and maintenance of a single, coherent, centralised and fully up-to-date documentation of the whole project cycle to date.
Analytical methods
Unlike the intensive development cycle in the process approach, where evaluators are part of the team designing and implementing the process, economic development project evaluation aims to ensure independent and unbiased evaluations. Evaluators are therefore usually not involved in project design or other cycle phases, and will often have had no prior contact with those who designed, are managing or have managed the projects. As a result, agreed analytical methods are often only weakly established, because evaluators are not involved in the original design procedures in which specific analytical methods were applied.
A report by the World Bank Evaluation Group, reviewing the Bank's project portfolio, noted that only around 22% of projects had been subjected to cost-benefit analysis and that, where such analysis had been carried out, most of it had not been done correctly. If project designers are applying incomplete analyses, quality control is clearly deficient. However, no propositions were made on how to resolve this important reality.
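To make concrete the kind of record a basic cost-benefit calculation should leave behind for later evaluation, the following is a minimal sketch in Python; the cash flows and the 10% discount rate are illustrative assumptions, not figures from the Bank's portfolio.

```python
# Minimal cost-benefit sketch: discounted costs and benefits over a project's life.
# All figures and the 10% discount rate are illustrative assumptions.

def present_value(flows, rate):
    """Discount a list of annual flows (year 0 first) back to year 0."""
    return sum(flow / (1 + rate) ** year for year, flow in enumerate(flows))

costs = [100_000, 20_000, 20_000, 20_000]    # capital outlay, then annual running costs
benefits = [0, 60_000, 70_000, 80_000]       # benefits build up as the project matures
rate = 0.10                                  # discount rate (assumption)

pv_costs = present_value(costs, rate)
pv_benefits = present_value(benefits, rate)

print(f"PV costs:           {pv_costs:,.0f}")
print(f"PV benefits:        {pv_benefits:,.0f}")
print(f"NPV:                {pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")
```

Keeping calculations in this explicit, replayable form is what allows an evaluator to check both the arithmetic and the assumptions behind it.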
Evaluators invariably undertake their assignments at the end of a project, although evaluations are also carried out mid-term or at other predetermined project "milestones".
Evaluators therefore rely on interviews, meetings and documentation on project design, together with any implementation records, to come up to speed in understanding the project's justification(s), planned objectives, design as a specified sequence of processes, resources used and implementation performance.
Documentation
For any evaluation process, access to a good project memory and coherent records is of fundamental importance. Project memory and documentation need to provide sufficient information to permit the application of the DAC criteria to the different phases of a project cycle.
Cycle phases
There are important phases where evaluation has particular contributions to make to performance analysis. Each phase involves specific types of decisions which need to be evaluated. These phases are:
- Design and project plan
- Evaluation of the project context, appropriate scales and implementation policy frameworks
- Project selection process and decision-maker preferences applied
- Oversight and recording of:
  - Implementation and the impacts of change
  - Decisions taken in response to change
  - The impact on project performance of decisions made
  - The overall level of impact achieved
- Future operational sustainability
Each phase involves activities and reporting that are related but quite distinct. If DAC criteria are applied to each one it is possible to obtain a very transparent and useful breakdown of the contributing factors to the overall project performance as a detailed diagnostic evaluation. Each phase is elaborated below.
1. Design and project plan
One of the recommendations for change in the DAC stakeholders' review report is:
".........taking Sustainable Development Goal requirements more into account, acknowledging the complexity of decision analysis and the need for systems models which represent the interconnection of relevant factors."
Since 2015 the complexity of decision analysis associated with economic development initiatives has been brought into focus by Agenda 2030. The Sustainable Development Goals, with over 230 national-level indicators, require solutions that involve multi-factor analyses and systems models taking into account a wide range of specialist topics and analyses. One of the generalised problems with project designs is the significant difference in expertise applied to the design process. This is related to the difficulty some institutions face in establishing multi-disciplinary teams with all of the necessary disciplinary expertise and the ability to carry out the necessary analytical methods thoroughly, to a high standard and relevant to the domain of the project.
The default situation has tended to be for teams to fill in pre-prepared documentation templates and Log Frames as the basis for presenting a project's design. It is often difficult to determine if those who completed the proposals fully understood the intent of each section. At the extreme, there are cases of proposers regarding the required documents and templates as a necessary chore for having any chance of securing funding.
Within proposals based on conventional documentation alone, the weakest element tends to be the quality of the analyses, leading to situations where it is difficult to determine whether a project is over-optimistic or under-optimistic.
Over-optimistic projects are destined to fail.
Under-optimistic projects represent a potentially inefficient use of resources because the lower-than-feasible objectives only guarantee a lower-than-achievable impact on development. In the case of under-optimistic projects there is a need to guard against these being default options in order to pass the “cost-effectiveness test”.
The importance of data quality
The ability to ascertain the feasibility of a project proposal depends heavily on the quality of the design, as reflected in the quality of the information and data making up the evidence presented in the form of benchmarks and analyses. Because guidelines tend to be provided only as documents, there is usually no additional support for design teams in the actual application of the necessary analytical methods. Evaluators need some means of checking that analyses and their conclusions are correct.
Decision analysis and analytical tools
If evaluators cannot be present during the design phase activities to provide feedback on developments, they at least need transparent access to how design data was generated and validated.

1. Better design
Analytical tools and due diligence procedures
The use of appropriate decision analysis and analytical tools has an important role to play in enabling the standardisation of analytical methods and the recording of results. They enable evaluators to "replay" design assumptions and findings and to assess whether calculations are correct.
Analytical tools need to be "sequenced" to ensure that all relevant factors are taken into account. Their application therefore needs to be linked to a standardised, sequential due diligence design procedure so as to avoid data gaps. The due diligence procedure needs to cover global or national analyses of gaps, the target communities with most need, and the constraints that have caused the gaps and that need to be managed by any solution. This baseline information identifies a basic feasible solution (BFS), which is then optimised by applying analytical tools in terms of costs, efficiency, effectiveness, likely impact and sustainability, based on sensitivity and dependency analysis. This helps evaluators detect any gaps in analysis that may have occurred, as sketched below.
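As an illustration of the sequencing idea, the following is a minimal sketch of a due diligence record that enforces the order of the stages and exposes any remaining data gaps; the stage names, fields and enforcement rule are assumptions made for the example, not an OQSI specification.

```python
# Sketch of a sequenced due diligence record. The stage names and fields are
# hypothetical; the point is that each stage's outputs are stored in order so
# that an evaluator can later replay the design and spot gaps in analysis.
from dataclasses import dataclass, field

STAGES = ["gap_analysis", "needs_analysis", "constraints_analysis",
          "basic_feasible_solution", "optimisation"]

@dataclass
class DueDiligenceRecord:
    results: dict = field(default_factory=dict)

    def record(self, stage: str, data: dict) -> None:
        # Enforce the agreed sequence: a stage can only be recorded once
        # every earlier stage has been completed.
        missing = [s for s in STAGES[:STAGES.index(stage)] if s not in self.results]
        if missing:
            raise ValueError(f"Cannot record '{stage}': incomplete stages {missing}")
        self.results[stage] = data

    def data_gaps(self) -> list:
        """Stages not yet completed -- visible to evaluators as gaps in the analysis."""
        return [s for s in STAGES if s not in self.results]

record = DueDiligenceRecord()
record.record("gap_analysis", {"national_indicator_gap": "access to safe water, 18%"})
record.record("needs_analysis", {"target_communities": ["district A", "district B"]})
print(record.data_gaps())
# ['constraints_analysis', 'basic_feasible_solution', 'optimisation']
```

An evaluator inspecting such a record can see at a glance which analyses were completed, in what order, and with what data.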
2. Evaluation of the project context, appropriate scales and implementation policy frameworks
Another recommendation for change in the DAC stakeholders' review report was:
".......the application of evaluation criteria to policy, programmes, institutional and strategic evaluations and systems."
How a specific objective such as a Sustainable Development Goal will be accomplished depends upon the operational environment and the availability of resources. This involves assessing the scale of the required solution and reviewing the best options, including a single project, a programme of multiple distributed projects and an appropriate supportive policy framework.

2. Projects, programmes and policy frameworks
High quality documentation with evidence-based benchmarked information
The introduction of analytical tools and due diligence procedures improves the transparency and quality of the data used in a project design model. Analytical tools can generate report and proposal content as high quality documentation that helps selection committees distinguish between project options. Some analytical tools provide narrative analysis in natural language, which helps prevent misinterpretation of complex analyses.
Taking into account the gaps, needs and constraints analyses that should be key phases in a due diligence design procedure, it is evident that on completion of a design it is possible to assess and compare the options for resolving an issue such as an SDG gap. The options include a single large project, a programme of several projects and various policy incentives to involve other actors in contributing to an initiative. For each of these options the likely time required to change national indicators and the required budgets can be quantified, and the strategic options therefore compared, as sketched below. However, this important level of optimisation of an initiative beyond the frontier of a single project requires computer-assisted decision analysis models and analytical tools.
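The following is a minimal sketch of the option comparison described above; the options, budgets and indicator timelines are invented for illustration and would in practice come from the due diligence analyses and decision analysis models.

```python
# Illustrative comparison of strategic options for addressing one SDG gap.
# All figures are invented for the sketch; in practice they would come from
# the due diligence analyses and the optimised project design models.

options = [
    {"option": "single large project",           "budget_musd": 40, "years_to_shift_indicator": 8},
    {"option": "programme of 5 local projects",  "budget_musd": 55, "years_to_shift_indicator": 6},
    {"option": "policy incentives + 2 projects", "budget_musd": 35, "years_to_shift_indicator": 7},
]

# Rank the options by time-to-impact, then budget -- one possible decision
# rule among many; the choice of rule is itself a strategic decision.
for opt in sorted(options, key=lambda o: (o["years_to_shift_indicator"], o["budget_musd"])):
    print(f'{opt["option"]:<32} budget ${opt["budget_musd"]}m, '
          f'~{opt["years_to_shift_indicator"]} years to shift the national indicator')
```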
3. Project selection process and decision-maker preferences applied
An important stage in a project cycle is the selection process, which needs to be clear and transparent for project designers, donors and selection committees. This specific issue is not dwelt upon in the DAC stakeholders' review report, but it is an important decision-making step that involves evaluation, and an evaluation that sets in train a course of action which will itself be evaluated at some later stage. Clearly this selection, or baseline, evaluation is of fundamental importance in the selection of feasible projects.
Funding organizations and donors should have access to as much well-ordered information of adequate quality as possible upon which to base their selection of projects they consider worthy of support. There is a need to demonstrate a design procedure that has reviewed gaps, needs and constraints, and to provide a transparent process for identifying solution options, with recommendations for what is considered to be the best solution.
The OQSI report observes the lack of specific attention that has been given to project design and selection. These are considered by OQSI to be fundamental issues that establish the operational framework for a project's implementation.

3. Project selection
High quality documentation with evidence-based benchmarked information
The introduction of analytical tools and due diligence procedures improves the transparency and quality of the data used in a project design model. Analytical tools can generate report and proposal content as high quality documentation that helps selection committees distinguish between project options. Some analytical tools provide narrative analysis in natural language to prevent misinterpretation of complex analyses. The creation of such a high quality contribution to the project memory should result in more feasible projects, selected on the basis of an evaluation of the evidence presented. Subsequent evaluations are in essence evaluations of the selection process: an evaluation of an evaluation. Records of proposal assessment outcomes and feedback should therefore be part of the project memory, to enable the detection of any oversights or misinterpretations linked to the quality of information and the assessment decision analysis process, as sketched below.
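As one way of keeping such records, the following is a minimal sketch of logging a proposal assessment outcome and its feedback as part of the project memory; the identifiers, scores, field names and file format are assumptions, not a DAC or OQSI standard.

```python
# Sketch of logging a proposal assessment outcome as part of the project memory.
# Field names, scores and the file format are hypothetical illustrations.
import json
import datetime

assessment = {
    "proposal_id": "WASH-2024-017",                      # invented identifier
    "assessed_on": datetime.date.today().isoformat(),
    "criteria_scores": {                                 # DAC criteria used as scoring headings
        "relevance": 4, "effectiveness": 3, "efficiency": 3,
        "impact": 4, "sustainability": 2,
    },
    "decision": "funded with conditions",
    "feedback": "Sustainability analysis weak: no post-funding operating model presented.",
}

# Appending each assessment to a log keeps the selection decision available to
# later evaluators, who are in effect evaluating this earlier evaluation.
with open("assessment_log.jsonl", "a") as log:
    log.write(json.dumps(assessment) + "\n")
```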
4. Oversight and recording of:
i. Implementation and the impacts of change;
ii. Decisions taken in response to change;
iii. The impact on project performance of decisions made;
iv. The overall level of impact achieved.
The DAC stakeholders' review report does refer to necessary changes with respect to:
".......the need for support to provide stronger guidance for project implementation"

4. Oversight and recording of: i. Implementation and the impacts of change; ii. Decisions taken in response to change; iii. The impact on project performance of decisions made; iv. The overall level of impact achieved.
Most significantly, a well designed project model is an important resource for guiding decisions made during the implementation of a project. This can be developed through a more rigorous analysis of the potential impacts of changes (both internal and external) that might occur during implementation. Sensitivity or dependency analysis can help identify those project options that are more resilient to expected changes. All of this knowledge becomes an essential part of the project memory (see below), to be called upon during implementation to guide decisions in response to changes in real time. This type of investment in raising the quality of real-time decision-making during implementation has the purpose of reducing the number of lessons learned that appear only at the end of a project, rather than being applied during the course of the project to the benefit of its overall performance.
Real-time oversight, in the form of a Real Time Audit, can provide real-time analysis and reporting on any aspect of project performance.
It is important to ensure that, as each activity concludes, an internal monitoring and evaluation report is prepared on the results achieved, identifying any causes of variance with initial expectations, as sketched below.
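A minimal sketch of the kind of end-of-activity variance check such an internal M&E report could draw on is shown below; the indicators, figures and 10% variance threshold are assumptions made for illustration.

```python
# Sketch of an end-of-activity variance check feeding the project memory.
# Planned figures would come from the design model, actuals from monitoring
# records; the indicators, values and threshold are illustrative assumptions.

planned = {"boreholes_drilled": 20, "unit_cost_usd": 5_000, "weeks": 12}
actual  = {"boreholes_drilled": 17, "unit_cost_usd": 6_100, "weeks": 15}

def variance_report(planned: dict, actual: dict, threshold: float = 0.10) -> list:
    """Flag indicators whose relative variance from plan exceeds the threshold."""
    flags = []
    for key in planned:
        variance = (actual[key] - planned[key]) / planned[key]
        if abs(variance) > threshold:
            flags.append(f"{key}: planned {planned[key]}, actual {actual[key]} "
                         f"({variance:+.0%}) -- cause to be explained in the M&E report")
    return flags

for line in variance_report(planned, actual):
    print(line)
```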
Project memory
By linking the results of decision analysis models and analytical tool calculations to a database, all design team members can participate in scenario simulations during the design phase. It is well established that the impact of instructional, or educational, simulation on teams can be quite profound in raising awareness of the critical relationships that determine project performance. This understanding is fundamental to the exploration of options and the identification of solutions from the standpoint of the DAC criteria of relevance, effectiveness, efficiency, impact and sustainability. With all such analyses kept on record, the project design approach can establish a very well-informed team and comprehensive, centralised reference documentation. Not only are these vital sources of information for evaluators but, as mentioned, they also help improve the quality of decision-making during implementation.
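As a simple illustration of such a linkage, the following sketch stores scenario simulation results in a shared database so that team members, and later evaluators, can replay what was assumed and projected; the schema, scenarios and figures are assumptions made for the example.

```python
# Sketch of keeping scenario simulation runs in a shared database as part of
# the project memory. The schema, scenarios and figures are illustrative
# assumptions, not a prescribed format.
import sqlite3

conn = sqlite3.connect("project_memory.db")
conn.execute("""CREATE TABLE IF NOT EXISTS scenario_runs (
                   run_id INTEGER PRIMARY KEY,
                   scenario TEXT,
                   assumption TEXT,
                   projected_impact REAL)""")

runs = [
    ("base case",       "input costs stable",            0.72),
    ("fuel price +30%", "transport costs rise by 30%",   0.61),
    ("early rains",     "construction season shortened", 0.55),
]
conn.executemany(
    "INSERT INTO scenario_runs (scenario, assumption, projected_impact) VALUES (?, ?, ?)",
    runs)
conn.commit()

# Any team member or evaluator can later replay what was assumed and projected.
for row in conn.execute("SELECT scenario, assumption, projected_impact FROM scenario_runs"):
    print(row)
conn.close()
```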
The effective participatory development of the project memory helps make team members aware of the probabilities of activity success and as a result internal monitoring and evaluation reports prepared by them can promote a more productive analysis of events with which they are intimately involved. Objectivity can be maintained by concentrating on quantitative indicators.
When external evaluators have access to the project memory and internal M&E reports, they not only gain particularly informative, high quality documentation but also deal with a well-informed team able to add relevant details to the diagnostic process and to answer any questions put to them.
5. Future operational sustainability
Normally, evaluation is applied as an external exercise involving "independent" evaluators during the funding periods of projects. Funding agencies often limit their evaluations to specific times during project implementation, as well as to post-implementation evaluations. However, the fund of knowledge gained during implementation can provide a detailed review of the adjustments needed to improve performance beyond the funding period. This would be normal for investments that are established to be self-supporting. In these cases the evaluation reports and the project memory resources can provide a useful fund of knowledge for helping to optimise future operations and maintain sustainability.

5. Future operational sustainability
All evaluation assignments should be required to produce a sustainability report which identifies the adjustments required to raise the probability of adequate performance in the future. Lessons learned would then be assessed by project managers and, where considered valid, applied immediately. In this way "lessons learned" are more likely to find immediate beneficial application, as opposed to being applied only to a subsequent project or simply being archived.
Lastly, all cross-cutting issues should be introduced as constraints so that project designs address the specific needs identified. Project designs can then accommodate these issues by creating specific actions as components of the project to address each one in a positive fashion. This embedding of such issues would address the DAC report item:
"....cross-cutting themes on gender, equity and human rights were also mentioned."
In this context, and referring to the Sustainable Development Goals, which cover many cross-cutting issues as mainstream objectives, there is a need for more detailed specification of the significance, meaning and different contexts of gender equality, equity in human rights and inequality. These are topics of a forthcoming OQSI report on the application of analytical tools.