



Leading Issues







DIO
Development Intelligence Organization


Open Quality Standards Initiative's project due diligence design procedures - are they practical?
Part 3


Reproduction authorized by DIO and APEurope.org


The Development Intelligence Organization's annual "Leading Issues" workshop took place in Alexandria, Virginia, 9-11 August, 2019. The Keynote address by Hector McNeill of the George Boole Foundation (GBF) was extensively discussed. This outlined a set of new evaluation criteria for projects, programmes and policies, in the context of Agenda 2030 Sustainable Development Goals. In particular, these have been designed to accommodate and adjust project design and implementation management to support Reduced Inequality (SDG10), Responsible Consumption and Production (SDG12) and Climate Action (SDG13). Hector McNeill, the Director of GBF, is a leading development economist who has worked in this field now for over 50 years. He is the world's leading developer of the Real Incomes Approach to economic development having initiated this work in 1975. This specific topic was covered in a separate interview that took place in 2015.

This interview is divided into three parts. The first part deals with the problems the OQSI work has been designed to solve. These relate to gaps in current practice that exacerbate the state of affairs of SDGs 10, 12 and 13, as well as many others. Part 2 covered some of the shortcomings in the common administrative arrangements and procedures applied to project cycle management. In this, Part 3, the OQSI's proposed solutions are described and I probe to test their practicality.

Nevit Turk
Economics Correspondent
Agence Presse Européenne, Centre Montparnasse
Paris
email: nevit.turk@apeurope.org
January, 2020    




Hector McNeill
Nevit Turk: We have discussed issues needing to be resolved with respect to the disappointing performance of SDG 10 (reduced inequalities), SDG 12 (responsible consumption and production) and SDG 13 (climate action). In this section of our interview I hope to understand how the OQSI recommendations on procedures and criteria can contribute to resolving these issues. As a reference, I wish to maintain the linkage to the way in which these contribute to resolving the issues posed by the poor performance of SDGs 10, 12 and 13.

Hector McNeill: In the previous sessions we touched on the need for projects, programmes and policies to act in a coherent fashion to deliver SDG targets. In the pre-Agenda 2030 environment, projects were drawn up in response to existing policies. Since 2015, this has undergone a transformation as a result of the complexity of the analysis required to come up with optimized SDG solutions. There is an issue of too few qualified human resources to cover the range of disciplines and experience necessary within project teams to design and manage multi-project programmes created to serve specific SDGs. Most project cycle management guidelines emphasize the creation of a plan and time line linking activities. Quite often the information on gaps, needs and constraints analysis is absent because the need has been declared in a policy document. However, in order to optimize, in the sense of maximizing the effective use of scarce human and financial resources, each project needs to adapt the policy objectives to the specific conditions of the project environment and the characteristics of the target community it is addressing. The current multi-tiered administrative structures associated with project identification and design through to implementation lack coherence, caused by communication issues and documentation of variable quality. This contributes to fragile project memories.

The OQSI considers the establishment of a high-quality project memory to be an essential requirement for coherent documentation of project identification, design and implementation, acting as a central real-time reference for all concerned with a project or programme. However, the development of the project memory needs to be organized and managed at the project level so as to minimize the atomization of information that typifies the conventional administrative systems deployed today. Therefore the necessary procedures, methods and relevant advanced analytical capabilities need to be placed in the hands of the communities and project teams concerned. There is a need to remove the barriers that divide those who should be closely involved as a cluster around specific issues.

I would add here that part of the problem is the concentration of some types of expertise in places such as universities and research labs, whose full potential would be greatly enhanced through more horizontal work with other disciplines and, above all, with people with established practical project-level experience, including stakeholders from the target communities. So an effective solution needs to support multi-disciplinary and multi-competence systems groups to identify and design solutions.

Nevit Turk: That appears to be quite a challenge. Are there any examples of this type of operation?

Hector McNeill: Yes, there are, in the form of agricultural extension systems. The performance of agricultural extension systems is highly variable according to government policies. Those geared to the support of research and development and the dissemination of practice for nationally important export commodities tend to be well supported, whereas smaller family farming sectors tend to be less well supported. Under Agenda 2030, most of the issues facing delivery on SDG performance are linked to the lowest 40th percentiles in terms of consumer income levels, farm sizes and aggregate farm incomes. So the issue here is to bend more relevant knowledge and analysis to address the issues facing these segments. The OQSI model, as part of the George Boole Foundation's function as an extension system, is designed to support this philosophy.

(Insert note: There is a general description of extension systems here, and a Boolean Library "Short" (slide show) on the role of the extension model in the application of environmental data here. - Nevit Turk)

So the basic idea is to transport the whole organization of solutions to where they are needed while upgrading the quality standards of work to improve the performance of projects, gain sustainability and reduce waste. This is now totally feasible with the W3 and state-of-the-art scripting.

Nevit Turk: Can you explain how this will work in practice?

Hector McNeill: The best way to explain this is to describe the system that has implemented the OQSI recommendations. This is the SDGToolkit, which should come online by the end of June 2020. It is a cloud-based application that is in essence an advanced project cycle management system. However, there are significant differences. SDGToolkit is designed to provide extension support to all users through well-supported online help and a technical support service. So the system doubles up as a project cycle management system and an in-work training system designed to enhance understanding and competence in applying relevant concepts. There is sufficient support to enable people from different disciplines to build up a common operational reference medium so as to improve cross-disciplinary communications. For example, people from different disciplines observing simulation outputs all learn quite quickly what the common goals are, as opposed to each pulling in a different direction. Indeed, because the system is so accessible, a wider range of people can use it, helping to balance the range of inputs as well as foster a growth in understanding of an objective and its solutions across a broader community, including stakeholders.

The SDGToolkit will deliver three basic packages, each containing several analytical tools. The packages are:
  • GCA - Global Constraints Analysis
  • 3DP - Due Diligence Design Procedures
  • RTME - Real Time Monitoring & Evaluation
[Figure: Conventional project cycle structure]

[Figure: OQSI project cycle structure as implemented in the SDGToolkit. Legend: GCA - Global Constraints Analysis, including DIM (Dimensioning); 3DP - due diligence design procedures, focused on the design of a specific project or programme; RTME - Real Time Monitoring & Evaluation, creation and maintenance of the Project Memory.]


Nevit Turk: Could you elaborate on the function of each package?

Hector McNeill: OK. The Global Constraints Analysis (GCA) toolbox contains a series of analytical tools concerned with the establishment of the global constraints facing a nation. The GCA does not rely on the SDG indicators because many low-income countries do not have all of the required data; in any case, 65% of the indicators for climate action (SDG 13) and responsible consumption and production (SDG 12) have not yet been specified.

The concept of the GCA is basically for local teams to rework the national analyses that exist or have been presented as policy justifications, so as to build up an understanding of the existing constraints in the context of gaps and needs, and eventually to relate these analyses to the reality on the ground in the area of interest of a community. I should mention that the GCA analysis starts at national level and then dimensions the national issues under question into macro-dimensioned actions. This helps identify and estimate the size of the target communities and their characteristics. At this point a lot of information can be generated that is of interest to policy-makers, since the dimensioning can help indicate whether a single multi-project programme is required or a series of localized projects. Depending upon the decision on this question, policy-makers can rationalize their policy actions in relation to the provision of incentives to gain traction in delivering results in the area of concern.

The tools are grouped into related sets under the general titles of:

  1. Population projection
  2. Comparative national projections for current and desired per capita consumption levels
  3. National consumption transition profiles from current to desired per capita consumption levels
  4. Current national commodity balance sheets
  5. Estimates of national production areas required to meet consumption needs
  6. Sensitivity of real income to population dynamics and inflation
  7. Disposable real incomes and unit prices
  8. Target gross margins
  9. Critical production constraints and input factor costs
  10. Production areas required, exploring solution options
  11. Gross budgetary requirements
  12. Financial appraisals of identified solutions
  13. Policy framework and instruments
  14. Policy portfolio management, monitoring and evaluation

Although there are 14 groups, these link to over 30 analytical tools, and Critical production constraints and input factor costs (group 9) links to the 3DP series, which contains over 40 analytical tools.
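(Insert note: To give a feel for the arithmetic behind group 1 and group 6, the sketch below is a minimal illustration, not the SDGToolkit's own code, which is not published here. The geometric growth model, function names and figures are assumptions for illustration only.)

```python
# Illustrative sketch only: a simple geometric model standing in for the kind of
# calculation behind GCA group 1 (population projection) and group 6 (sensitivity
# of real income to population dynamics and inflation). All figures are invented.

def project_population(current_pop, annual_growth_rate, years):
    """Project population forward with a constant geometric growth rate."""
    return [current_pop * (1 + annual_growth_rate) ** t for t in range(years + 1)]

def real_income_per_capita(nominal_gdp, gdp_growth_rate, inflation_rate, populations):
    """Real per capita income series: nominal GDP grows each year, is deflated by
    cumulative inflation, and is divided by the projected population."""
    series = []
    for t, pop in enumerate(populations):
        nominal = nominal_gdp * (1 + gdp_growth_rate) ** t
        deflator = (1 + inflation_rate) ** t
        series.append(nominal / deflator / pop)
    return series

if __name__ == "__main__":
    pops = project_population(current_pop=25_000_000, annual_growth_rate=0.027, years=10)
    incomes = real_income_per_capita(nominal_gdp=70e9, gdp_growth_rate=0.04,
                                     inflation_rate=0.06, populations=pops)
    # With 2.7% population growth and inflation outpacing nominal GDP growth,
    # real per capita income falls even though nominal GDP rises.
    print(f"Year 0:  {incomes[0]:,.0f} per head")
    print(f"Year 10: {incomes[-1]:,.0f} per head")
```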

Nevit Turk: That sounds impressive, but how do users access this system? What equipment and downloadable programs or apps are required?

Hector McNeill: This system is designed to be fully accessible to practitioners in low-income countries, so there are no software requirements other than a browser, and the system runs on mobiles, tablets or PCs. Naturally there is a need for internet access, but the bandwidth requirement is very low since all of the calculations and projections are generated on the central servers, so the overall specification of the receiving device can be quite modest and therefore cheap.

Nevit Turk: Just to clarify, why are the tools grouped in this way? What, for example, might make up a group, and, in the case of group 9, what are the 41 analytical tools you refer to?

Hector McNeill: Under, say, group 4, concerning national commodity balance sheets, the analytical tools cover Crops, including Grain, Oilseeds, Orchard produce, Roots, Soft fruits, Vegetables, Vines and Beverages, as well as Animal produce, including Meat and Milk. These deal with raw equivalents of each commodity complex, so, for example, soft and hard cheese, cream and butter are all converted back into milk equivalent according to the type of animal. As you will appreciate, there are many types of grain and oilseeds and even more orchard crops, roots, soft fruits and vegetables. Beverages include such commodities as tea and coffee.
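(Insert note: As an illustration of the raw-equivalent idea in group 4, the sketch below converts dairy products back into milk equivalent using conversion factors. The factor values and product names are hypothetical placeholders, not OQSI coefficients.)

```python
# Illustrative sketch of a raw-equivalent conversion for a commodity balance
# sheet (GCA group 4). Factors are hypothetical placeholders, not official
# coefficients: they express tonnes of raw milk per tonne of product.
MILK_EQUIVALENT_FACTORS = {
    "liquid_milk": 1.0,
    "butter":      6.6,
    "hard_cheese": 10.0,
    "soft_cheese": 4.5,
    "cream":       5.0,
}

def milk_equivalent(production_tonnes):
    """Convert a dict of dairy product tonnages into total raw-milk equivalent."""
    return sum(MILK_EQUIVALENT_FACTORS[product] * tonnes
               for product, tonnes in production_tonnes.items())

if __name__ == "__main__":
    national_output = {"liquid_milk": 120_000, "butter": 3_000, "hard_cheese": 8_000}
    print(f"Raw milk equivalent: {milk_equivalent(national_output):,.0f} tonnes")
```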

Group 5 relates to estimates of the national production areas required to meet the projected consumption needs generated in group 4 (related to population growth and per capita requirements). Its tools enable the estimation of required production areas based on productivity across poor, average and good yields in Rainfed, Field-based protected, Irrigated, Covered closed-circuit, and Shifting clearance and cultivation production systems. All individual analytical tools permit inputs to be altered in order to simulate different scenarios, so across all of the groups and the analytical commodity targets or options there is a very wide degree of freedom, which results in there being thousands of feasible input combinations and generated projections.
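(Insert note: The group 5 arithmetic is essentially projected consumption divided by expected yield, explored across production systems and yield scenarios. The sketch below illustrates this; the yield figures and names are invented for illustration and are not reference data.)

```python
# Illustrative sketch of GCA group 5: estimating production areas required to
# meet projected consumption, across production systems and yield scenarios.
# Yields (tonnes per hectare) are invented placeholders, not reference data.
YIELDS = {
    "rainfed":   {"poor": 1.2, "average": 2.0, "good": 3.0},
    "irrigated": {"poor": 2.5, "average": 4.0, "good": 5.5},
}

def required_area(consumption_tonnes, system, scenario):
    """Hectares needed to produce the projected consumption under one scenario."""
    return consumption_tonnes / YIELDS[system][scenario]

if __name__ == "__main__":
    projected_demand = 450_000  # tonnes of grain, for example
    for system, scenarios in YIELDS.items():
        for scenario in scenarios:
            area = required_area(projected_demand, system, scenario)
            print(f"{system:9s} {scenario:7s}: {area:10,.0f} ha")
```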

Nevit Turk: Isn't that going to cause a good deal of confusion?

Hector McNeill: This is the usual first reaction, but by starting with the constraints established by population dynamics, inflation, GDP growth rates, interest rates and per capita incomes and their distribution, one ends up with a very realistic picture of the operational constraints envelope within which a country needs to find solutions. This imposes a realistic limitation on explorations undertaken to find solutions. This is one of the objectives of the GCA: a stepwise, reasonably accurate revelation of the reality facing a country. This can be missing from policy documents, which sometimes provide no quantitative parameters, resulting in misdirected projects. Most evaluations completed so far have confirmed that many of these steps are missing from conventional project cycle guidelines. The effect of the GCA approach is to eliminate or reduce over-optimistic and unrealistic project proposals that are destined to fail and waste valuable resources.

3DP - Due Diligence Design Procedures (sections 1-23)

Pre-registration of project to associate data

1. Registration

Social, economic, environmental and ecosystem

2. Population
3. Culture
4. Education & training
5. Diet
6. Health status
7. Economy
8. Income distribution
9. Inflation
10. Factor markets
11. Produce markets
12. Logistics
13. Environmental issues
14. Bioclimatic factors
15. Renewable NR-based activities
16. Technology & techniques
17. Water
18. Soils
19. Energy
20. Ecosystem issues
21. Carrying capacity
22. Sustainability
23. Other locational factors

I should add that the Development Intelligence Organization (DIO) has decided to make use of the GCA to handle the statistical series generation for a series of publications they intend to launch next year on the sustainable development prospects of each country.

Nevit Turk: That is an interesting point. I will follow up with DIO next year to enquire as to the way they are applying SDGToolkit. But I am curious, what are these 41 analytical tools in the 3DP?

Hector McNeill: The 3DP is a due diligence design procedure that follows a stepwise analysis to focus the results of the GCA, which is not necessarily concerned with any particular project; it simply establishes what is feasible either as a single project or a multi-project programme. The 3DP, on the other hand, can be used to prepare the design of an identifiable project with a title and internal ID number. I mentioned in our previous sessions some of the issues that lead projects astray, for example inappropriate administrative procedures imposed on project teams. The 3DP is designed to identify each of these in order to ensure they are embedded in the final project plan and time line.
3DP - Due Diligence Design Procedures (sections 24-41)

Recorded selection preferences

24. Eligibility
25. Action objectives
26. Constituent membership of team
27. Financial criteria
28. Environmental criteria
29. Other selection criteria

Administrative constraints

30. Legal & regulatory
31. Levies and taxation
32. Financial transfers
33. Procurement
34. Accounting issues
35. Technical and financial audit

Additional information

36. Relevant projects & initiatives
37. Any other matters

Utilities

38. Data
39. Financial

References

40. SDG indicators
41. Evaluation criteria

The 3DP is divided into seven sections. The first pre-registers a project, providing a title and internal identity number to which all of the 3DP data will be attached within a database. The second section covers information on social, economic, environmental and ecosystem details. The third section covers the recorded selection preferences the project proposal needs to meet, and the fourth covers administrative constraints such as financial controls and procurement. Additional information is contained in the fifth section, and the last two contain utilities and reference links. In the utilities section there is access to all of the reports, graphs and projection tabulations generated in the GCA, which are used to justify statements made in relevant sections of the 3DP. The other utility is a very comprehensive financial analysis tool for completing cost-benefit and cost-effectiveness analyses on the options under consideration. The reference section contains online lookups for the SDG indicators and evaluation criteria.
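(Insert note: As a rough sketch of how a 3DP project memory record might be organized as a database object, the seven sections could map onto something like the structure below. The actual SDGToolkit schema has not been published, so the field names are assumptions for illustration.)

```python
# Hypothetical sketch of a 3DP project record grouped into the seven sections
# described above. Field names are illustrative assumptions, not the actual
# SDGToolkit database schema.
from dataclasses import dataclass, field

@dataclass
class Project3DPRecord:
    # 1. Pre-registration
    title: str
    internal_id: str
    # 2. Social, economic, environmental and ecosystem details (sections 2-23)
    situation_analysis: dict = field(default_factory=dict)
    # 3. Recorded selection preferences (sections 24-29)
    selection_preferences: dict = field(default_factory=dict)
    # 4. Administrative constraints (sections 30-35)
    administrative_constraints: dict = field(default_factory=dict)
    # 5. Additional information (sections 36-37)
    additional_information: dict = field(default_factory=dict)
    # 6. Utilities: links to GCA outputs and financial analyses (sections 38-39)
    gca_report_refs: list = field(default_factory=list)
    financial_analyses: list = field(default_factory=list)
    # 7. References: SDG indicators and evaluation criteria (sections 40-41)
    references: dict = field(default_factory=dict)

if __name__ == "__main__":
    record = Project3DPRecord(title="Smallholder irrigation upgrade", internal_id="PRJ-0042")
    record.situation_analysis["population"] = "see GCA group 1 projection"
    print(record.title, record.internal_id)
```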

Nevit Turk: This seems to be quite extensive; I wonder whether people will think all of this information is necessary. So how is this information arranged as a project memory?

Hector McNeill: The procedure is surprisingly easy to complete and there is adequate onboard help. Much of the information can be pulled across from the GCA. The important point is that gaps and oversights are avoided, especially on those areas that are often overlooked.

The project memory is managed by the last package, the Real Time Monitoring and Evaluation (RTME) system, which is used to input the final project design and details of activities, timing, resource requirements and the like.

The RTME sets up the final design option which becomes the plan and time line. The projected performance parameters become the benchmarks against which performance will be evaluated.
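(Insert note: The sketch below illustrates the benchmarking idea: an activity's recorded performance is compared against the benchmark projected in the final design. It is a sketch under assumed field names and tolerances, not the RTME implementation.)

```python
# Illustrative sketch of the RTME benchmark comparison: the final design's
# projected performance parameters become benchmarks, and recorded activity
# results are compared against them in real time. Names and the tolerance
# are assumptions for illustration.
def compare_to_benchmark(activity, benchmark, actual, tolerance=0.10):
    """Return a simple status for one activity given benchmark and actual values."""
    deviation = (actual - benchmark) / benchmark
    if abs(deviation) <= tolerance:
        status = "on track"
    elif deviation > 0:
        status = "ahead of benchmark"
    else:
        status = "below benchmark - decision needed"
    return {"activity": activity, "benchmark": benchmark,
            "actual": actual, "deviation": round(deviation, 3), "status": status}

if __name__ == "__main__":
    print(compare_to_benchmark("hectares planted, month 6", benchmark=1200, actual=950))
```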

Nevit Turk: In a conversation with the editors of Agricultural Innovation there was mention that OQSI doesn't follow the OECD DAC evaluation criteria. Is that the case?

Hector McNeill: The George Boole Foundation is an extension service concerned with the dissemination of best practice and the delivery of training to support better quality projects and more effective implementation management. Therefore the OQSI focus is on the project teams and communities served. In the context of the SDGs that are not performing, we have changed the list of evaluation criteria. We accept all good practice, and amongst this I would include the OECD DAC evaluation criteria. I don't know how the OECD operates, and we are a separate private non-profit. Based on our cumulative field experience we have added what we consider to be additional relevant criteria. As you know, the current criteria list is relevance, efficiency, effectiveness, impact and sustainability, to which we have added coherence and resilience. Coherence is an obvious requirement, both in terms of project memory components needing to be complementary and of parallel projects needing to support, rather than frustrate, progress in other projects in a programme. Resilience allows design teams to assess risk and to have their design solutions assessed in terms of resilience, or degree of risk.

The OECD DAC evaluation criteria are stated to be normative by the OECD DAC. However, they also seem to indicate that these criteria should be applied to different circumstances at the discretion of the evaluator. In effect this undermines the effectiveness of the criteria because of the very different personal experience, training and capabilities of evaluators. In the prototyping trials of the RTME system it was noticed that what is being evaluated at each stage of the project cycle is not the same type of performance target. Therefore the evaluation criteria need to be applied in a different way, because the targets change during the cycle.

Nevit Turk: Could you give some examples?

Hector McNeill: The evaluation of design decisions and of a final design is an assessment of the evidence and of the coherence between that evidence and what is projected to be feasible. The comparison of an activity's performance with the original benchmarks is a simple comparative exercise. If things change during implementation, the emphasis under the OQSI recommendations is to take decisions to resolve any issues in real time so as to keep the project on course. Therefore the project memory needs to record the changes and the decisions, and then assess the decision outcomes. Each of these steps relating to an implementation decision involves a different type of evaluation.
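(Insert note: A minimal sketch, with hypothetical field names, of the kind of record a project memory would keep for one implementation decision: the change in circumstances, the decision taken and its logic, and the later assessment of its outcome.)

```python
# Hypothetical sketch of a project-memory entry for one implementation decision,
# capturing the change, the decision and its logic, and the later evaluation of
# the decision's outcome. Field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DecisionRecord:
    activity: str
    change_observed: str          # what changed during implementation
    decision: str                 # what the team decided, in real time
    decision_logic: str           # evidence and reasoning behind the decision
    decided_on: date
    outcome_assessment: Optional[str] = None  # filled in once the outcome is known

if __name__ == "__main__":
    rec = DecisionRecord(
        activity="seed distribution",
        change_observed="late onset of rains delayed the planting window",
        decision="switch 30% of area to a shorter-cycle variety",
        decision_logic="simulation indicated yield loss otherwise exceeds 20%",
        decided_on=date(2020, 3, 15),
    )
    rec.outcome_assessment = "yield within 5% of benchmark; decision effective"
    print(rec.activity, "-", rec.outcome_assessment)
```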

In fact the production of evaluation criteria guidance that maintains coherence and comparative evaluation records is not an easy task. It needs to be intelligible and easy to measure and record; we are working with OQSI on this, and they intend to place this recommendation online as a global reference.

Nevit Turk: That is really interesting. I have never come across this specific topic before. I wonder why this was not raised many years ago. It seems to me that the takeaway here is that evaluation under current procedures does not address all of the key issues.

Hector McNeill: You may recall that in the multi-tiered administrative structure the process of evaluation is a separate process. This is related to the notion that evaluation needs to be independent and unbiased, which reinforces evaluators' isolation. But this approach is only as successful as the quality of the information made available to evaluators. We know that documentary quality is not the best and that evaluators are seldom involved in project design. There is also another issue, which varies according to the project and circumstances: evaluation schedules are often not really part of project design but are usually a framework applied to a project externally. This has several consequences. Project teams are less concerned with the evaluation process, and when external evaluation teams visit projects there can be communication problems in terms of team members' views on what constitutes performance. Occasionally assessment can be interpreted as criticism of professional competence. This can occur when an evaluator is unaware of aspects of a project's design in relation to constraints that were the cause of some performance deficiency, and where there is a perception that the individual responsible for managing the activity concerned is being held responsible for the performance.

Nevit Turk: That would appear to me to be a common problem. Is there any way to improve the quality of information for evaluators, given that they have no previous involvement in a project?

Hector McNeill: The RTME system contains a utility for project teams not only to monitor progress but also to complete evaluations. Unless teams are required to do this, decision-making becomes defective. At the extreme, there are examples where teams have waited for interim evaluations before taking important decisions. Because the delay between an event and evaluators arriving can be prolonged, the normal result is to downgrade performance indicators to keep the project alive. Teams should have the information and confidence to take timely decisions without resorting to external evaluators. This situation is the almost inevitable result of time delays, the lack of full involvement of the team in decision-making, and a default outcome condemning the project to deficient performance or even failure. Because of the multi-tiered nature of oversight, this sort of thing is more common than many will admit.

In order to avoid this type of situation the OQSI philosophy is to bring project teams up to a highly competent level of professional operations through the use of analytical tools. A fundamental underlying issue is the importance of grounding the logic of evidence-based design in decision analysis and a sound understanding of information quality and of the probabilities of events, or uncertainty. In other words, teams do their best to identify and design feasible projects within existing constraints. By basing design, and management decision-making during implementation, on these principles, the team becomes used to observing unexpected events in the design stage, in a simulation for example, from the perspective of additional knowledge, as opposed to treating them as a mistake by the person who input a particular set of data. On this basis a team takes a dispassionate ownership of the project, with any emotional commitment being related more to the desired outcome. This philosophy is promoted by providing teams with sound analytical tools and guidance so that their work is highly productive. The RTME system requires that team members undertake evaluations of designs, note feedback at proposal assessment stages, evaluate all activities as they conclude, and record changes in circumstances, decisions, decision logic and decision outcomes. The spirit of this process is essentially one of discovery and fundamentally one of learning as a basis for gaining the practical experience that contributes to professional competence.

This leads to real time decision-making and a likelihood that projects designed and implemented on this basis will be more successful.

Nevit Turk: This is really interesting but what about the need for independent evaluation, where does that fit in?

Hector McNeill: These continue as always; however, under this system they will not only have the highest quality documentation to refer to but also a highly motivated and well-informed team to communicate with. In this way, external evaluators will be able to learn about critical issues more quickly and be in a position to prepare very informative, more analytical and useful reports.

Nevit Turk: I have found this to be really interesting and not what I expected when I started these interviews. I do keep abreast of this area of development but have not come across this approach before. There was a consultation on the DAC evaluation criteria completed last year, but that seemed to suggest no significant changes were needed. However, I think this OQSI approach is more thorough, and I particularly liked the handling of team competence in this way, because you did point out the issues of the deficit in adequately trained human resources. It occurs to me that this has implications for development funding organizations and donors; could you mention what these might be?

Hector McNeill: International development organizations, donors and executing agencies can operate at a far more efficient level if they support this approach and benefit from oversight based on access to the project memory. All of the packages provide access for authorized individuals to obtain on-demand analyses and reports on any aspect of the project cycle. In terms of national governments and executing agencies, the 3DP and RTME can be used to design and manage any number of projects at the same time. So an agency that needs to coordinate several projects, even if the activities are quite different, can gain immediate access to project-level documentation. Therefore, rather than relying on slow administrative processes and multi-tiered reporting structures, a portfolio manager in a donor organization can access the system in real time to check on any aspect of any project. Because the internal monitoring and evaluation aspect keeps this level of analysis up to date, this obviates the need to await an independent evaluation. In essence, independent evaluations would only have the function of validating existing internal evaluations. The savings in administration time and human resources, the economy in travel and carbon footprints, and the low cost of this on-the-job training are very significant, and yet the quality of design and documentation is far superior. The transfer of know-how to where it is needed can also be significantly increased.

[Figure: The OQSI Critical Path for Climate Action]

Nevit Turk: Looking through my notes, there is an outstanding issue here which has been slightly lost in this interesting exchange. You mentioned previously the need to balance economic rates of return against rates of return to the environment as a basic condition for addressing SDG 10, SDG 12 and SDG 13. It is still not clear to me how or where this is managed within the OQSI system.

Hector McNeill: In reply to your question concerning the OECD DAC evaluation criteria, I should have added that, in addition to coherence and resilience being added as evaluation criteria, there has been an addition to what are referred to as balance or equilibrium criteria, which appear under the heading of sustainability. These are commonly levels of acceptability of the social, technical, economic, financial, environmental and ecosystem considerations of sustainability. In order to link these sustainability criteria to climate change (SDG 13), a score is applied as a measure of the impact of a proposed project profile on the carrying capacity of the environment in terms of those determinants that also affect climate. As we know, current practice is depleting carrying capacity and impacting climate negatively, so SDG 12 and SDG 13 go backwards. The carrying capacity impact is ranked as positive, neutral or negative. The OQSI has arranged all of these criteria into a critical path which defines, for a given constraints envelope specified by the GCA, the boundaries within which any project must operate in order to have a neutral or positive impact on climate action. Any project showing a potential for a negative impact on carrying capacity is rejected; the carrying capacity impact needs to be neutral or positive. The actual measurement of this impact is calculated on the basis of the rate of change in the carrying capacity, or rate of return to the environment (RRE). In the 3DP procedures, economic rates of return (ERR) are handled under financial criteria (group 27) and rates of return to the environment (RRE) are handled under environmental criteria (group 28). The narrative describing the impacts is in group 21.
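(Insert note: The sketch below illustrates the screening logic just described: a project profile whose rate of return to the environment implies a negative change in carrying capacity is rejected. The OQSI scoring method itself is not published here, so the threshold, names and classification logic are assumptions for illustration.)

```python
# Illustrative sketch of the carrying-capacity screen on the OQSI critical path:
# a project whose rate of return to the environment (RRE) implies a negative
# rate of change in carrying capacity is rejected; neutral or positive passes.
# The threshold and classification logic are assumptions for illustration.
def classify_carrying_capacity_impact(rre, neutral_band=0.005):
    """Classify the rate of change in carrying capacity implied by the RRE."""
    if rre > neutral_band:
        return "positive"
    if rre < -neutral_band:
        return "negative"
    return "neutral"

def screen_project(err, rre):
    """Accept only designs whose carrying-capacity impact is neutral or positive;
    the economic rate of return (ERR) is then assessed under financial criteria."""
    impact = classify_carrying_capacity_impact(rre)
    accepted = impact in ("neutral", "positive")
    return {"ERR": err, "RRE": rre, "carrying_capacity_impact": impact,
            "accepted": accepted}

if __name__ == "__main__":
    print(screen_project(err=0.12, rre=-0.03))   # rejected: negative impact
    print(screen_project(err=0.09, rre=0.01))    # accepted: positive impact
```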

In reality, to undertake this effectively one is relying on seven performance criteria and seven equilibrium, or trade-off, criteria, that is fourteen in all, and not just the five commonly referred to.

Nevit Turk: This has been very informative. There is a last item which maybe does not relate directly to the OQSI recommendations. This is the issue of the reduction of inequalities (SDG 10), which I think in this case translates into inequalities in income levels. Could you say something on how this is to be tackled?

Hector McNeill: We are working with the Development Intelligence Organization, who support the principles of the Real Incomes Approach to macroeconomic policy. This approach emphasises supply-side production and questions of learning, the acquisition of tacit and explicit knowledge, so as to advance productivity through innovation. The SDGToolkit has an important role here. But at the policy level the provision of incentives can make a large difference to this process, so that the real incomes of producers and the employed can rise so as to improve the quality of consumption and to reduce income disparity. The DIO will be publishing a series on sustainable development prospects for each country, and these will contain policy suggestions depending upon the particular circumstances of each country. I know that one of the important aspects of this work is the deployment of strategies that minimise the need for debt.

Nevit Turk: Do you know when these reports will be released?

Hector McNeill: In the last tentative discussions on this topic it seemed to me that they will initiate these reports as a series in mid-2020, which is roughly the time the SDGToolkit will be launched.

Nevit Turk: This has been a very interesting interview. Thank you.

Posted: 20200412
We welcome questions and feedback:

To submit questions or comments on the contents of this article please contact the author or main reference source by email.
The relevant emails are provided below:

  Author:   Nevit Turk:   nevit.turk@apeurope.org
  Source:   Hector McNeill:   hector.mcneill@boolean.org.uk