Agricultural Innovation






Second International Workshop on Analytical Tools

Nevit Turk
Economics Correspondent,
Agence Presse Européenne.

This workshop, as announced, was relatively small and confined to a review of operational analytical tools with deliverable impacts on agricultural project design and decision analysis. The review was limited to the output of SEEL, which has been in this business since 1983. Some of this work provided the baseline logic and technology for the new SDGToolkit.

The basic theme of this workshop was "moving away from interesting concepts and to delivering practical, beneficial and measurable impact".


Introduction to the evolution of analytical tools (ATs)

The introductory presentation by Hector McNeill described the likely direction in which the use of ATs is evolving. ATs are defined as software modules that complete logical analytical procedures on specific topics. They are simply algorithms fed by data inputs, generating projections, tabulations, graphs and, in the case of more advanced tools, narratives. Some tools are a set of essential questions on a form which require answers. In this way they prompt users as to the essential information that needs to be collected and analysed before progressing along a due diligence design procedure. For small teams this is important so that there is an awareness of what to ask domain experts. In this way the work involved is better defined, saving time and money. Questions that cannot be answered immediately can be left until later but should, where possible, eventually be answered.
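As a rough illustration of this definition, the Python sketch below shows an AT-like module that turns a handful of inputs into a projection and an automatically generated narrative. The class name, fields and model are illustrative assumptions, not the SEEL or SDGToolkit implementation.

```python
from dataclasses import dataclass

@dataclass
class YieldProjectionAT:
    area_ha: float          # cultivated area in hectares (hypothetical input)
    base_yield_t_ha: float  # current yield in tonnes per hectare
    annual_gain: float      # expected yearly yield improvement, e.g. 0.05 = 5%

    def run(self, years: int) -> dict:
        # Projection: total output per year under compound yield growth.
        projection = [
            round(self.area_ha * self.base_yield_t_ha * (1 + self.annual_gain) ** y, 1)
            for y in range(years + 1)
        ]
        # Narrative: an automatically generated human-readable summary.
        narrative = (
            f"Output is projected to rise from {projection[0]} t to "
            f"{projection[-1]} t over {years} years at a "
            f"{self.annual_gain:.0%} annual yield gain."
        )
        return {"projection_t": projection, "narrative": narrative}

result = YieldProjectionAT(area_ha=120, base_yield_t_ha=3.2, annual_gain=0.05).run(5)
print(result["narrative"])
```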

McNeill referred to the "early days" of "canned" stats programs which, quite often, were applied to analyses the programs were not designed to address. This could give rise to misleading claims, often based on the misinterpretation that correlation signifies cause and effect.

Since early studies in the late 1960s, too little of the data collected in the agricultural sector, including in experimentation, has taken into account the most significant causes of production variance, such as locational-state bioclimatic variables and terrain characteristics. Even now, 50 years later, with the development of agroecological zoning and increased awareness of climate change, there remains a significant mismatch between what is being measured and analysed and the practical problems facing projects and farmers in the context of climate change.

The need to move from random to systematic sampling

The statistical issues surrounding survey design remain an essential tool in minimizing survey costs and maximizing the quality and relevance of data. With relevant data, ATs can do an effective job of supporting decisions. However, the quality of data used as inputs needs to be guided by other due diligence procedures applied to data collection. It is notable that there is confusion between sampling as a process of randomized selection of sample points, which is blind in order to avoid bias, and the more systematic procedure of stratification, which requires an informed logic based on production site or farm typologies built on knowledge of the actual target population. In the past, the project and farm typologies used for stratification and sample weighting were limited to just project or farm size and production systems; the Farm Accountancy Data Network (FADN) system in the EU is an example. Today this is insufficient, and the factors that define types need to be extended to include:

  • project and/or farm size
  • production systems
  • genotypes
  • variable inputs
  • soil texture
  • latitude
  • longitude
  • altitude
  • rainfall
  • temperatures
  • terrain characteristics
  • water deficit

In statistical terms, random sampling reduces the proportion of explained variance and augments the proportion of unexplained variance in, for example, the factors determining yields. With good typologies it is possible to augment explained variance substantially, which becomes a central factor in possessing sufficient data to advise project managers or farmers on what decisions to take, or plans to execute, to sustain cash flow (illustrated in the sketch below).

[Diagram: three-factor yield surface relating crop yield to locational-state bioclimatic variables, including water]

With random sampling it is difficult to augment the number of factors analysed to explain yield with precision and thereby provide good decision-making advice. With typological systematic sampling, the 3-factor yield surface illustrated above can be transformed into a precise 10-factor surface with very high explained variance. This improves diagnosis and enables effective and efficient adjustment of practice.
Please note: when first posted, the diagram above had an incorrect water axis. This has been corrected.
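To make the explained variance point concrete, here is a minimal Python sketch comparing the sampling error of simple random sampling against proportional stratified sampling based on a typology. The strata, yields and sample sizes are purely hypothetical assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of 10,000 farms in three typology strata
# (e.g. lowland irrigated, upland rainfed, mixed), each with its own
# mean yield (t/ha) and internal variability.
strata = {
    "lowland_irrigated": dict(size=2000, mean=6.0, sd=0.6),
    "upland_rainfed":    dict(size=5000, mean=3.0, sd=0.5),
    "mixed":             dict(size=3000, mean=4.5, sd=0.7),
}
population = {k: rng.normal(v["mean"], v["sd"], v["size"]) for k, v in strata.items()}
all_farms = np.concatenate(list(population.values()))
N = all_farms.size

def random_sample_mean(n):
    # Simple random sampling: a blind draw from the whole population.
    return rng.choice(all_farms, size=n, replace=False).mean()

def stratified_sample_mean(n):
    # Proportional stratified sampling: draw within each typology
    # stratum and weight each stratum mean by its population share.
    estimate = 0.0
    for farms in population.values():
        k = max(1, round(n * farms.size / N))
        estimate += (farms.size / N) * rng.choice(farms, size=k, replace=False).mean()
    return estimate

# Repeat each design many times to compare sampling error.
reps = 2000
srs = np.array([random_sample_mean(100) for _ in range(reps)])
strat = np.array([stratified_sample_mean(100) for _ in range(reps)])
print(f"Simple random sampling, std error: {srs.std():.4f}")
print(f"Stratified sampling,    std error: {strat.std():.4f}")
```

With the same total sample size, the stratified design typically shows a markedly lower standard error because between-stratum variance is removed from the sampling error; this is the statistical sense in which good typologies "augment explained variance".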


There remains a reluctance among many to embrace the complexity of the agricultural reality by structuring data collection and analysis to manage the information more effectively. McNeill cited Ronald Fisher who, although the designer of methods that simplified experimental and survey design and their analysis, was completely aware of the dangers of limiting the number of variables considered:

"No aphorism is more frequently repeated in connection with field trials, than that we must ask Nature few questions, or, ideally one question at a time. The writer is convinced that this view is wholly mistaken. Nature will best respond to a logical and carefully thought out questionnaire, indeed, if we ask her a single question, she will often refuse to answer until some other topic has been discussed"

The needed transition from random sampling to systematic sampling based on typologies requires more initial effort to gain information on a population. However, it results in far lower costs in collecting very high quality data, in terms of both error and how well the data represent the target population and its environment. The combination of due diligence procedures and ATs goes a long way towards resolving this issue, but there is a need to support the process with clearer guidance on how to specify and apply typologies, because these will vary according to the target population.

There is therefore a strong interplay between AT utility and the quality of data collected, which can result in ATs evolving to involve more complex algorithms that generate more useful results.


State of the art ATs

Angus Raeburn of SEEL made use of some of the SDGToolkit ATs to demonstrate the advance in the state-of-the-art in these tools. Concerning their development, he referred to a triple testing routine: taking OQSI (Open Quality Standards Initiative) recommendations and completing a demonstration of concept to answer the question "is the requirement workable?"; implementing the AT and testing its operation in the cloud lab by engineers; and then testing the same module with stakeholders. There used to be an issue around what is intuitive to engineers versus what is intuitive to stakeholders, causing some degree of interface redesign. However, by standardizing the interfaces by type of AT and providing adequate guidance in the due diligence procedures, this divergence on what is intuitive no longer arises, even with new stakeholders. As a result the testing cycles are now quite short. This is, in reality, the result of engineers descending a learning curve on "what works", and it is invaluable for their possible future roles in technical support and training of users, as well as in advancing developments.

Although it would be convenient to leave ATs as "standards", this is complicated by the advance in knowledge and user needs. As increasing numbers of ATs are required, there is a feedback aspect to learning which helps identify ways to improve existing ATs. This is linked to cognitive ergonomics in the sense that if specific considerations are introduced in early ATs, these concepts are easier to comprehend in the context of calculations that occur later in the due diligence sequence and involve more complex considerations and datasets. The best example of this is the current work on the multi-factor yield model referred to by McNeill. As the data collection response to the advance of knowledge in the area of locational-state bioclimatic variables becomes more systematic, it becomes worth investing in more sophisticated ATs to make use of the data in a practical fashion.

Setting the scene within which a project is designed

Projects and farm production occur within a national macroeconomic context, and quite often the key factors need to be assessed from the state of affairs in the location where a project or farm is operational.

ATs that support Global Constraint Analyses have an important role in justifying projects in terms of realistic challenges such as the real incomes of low income segments, inflation and population growth. Productivity is a critical element in reducing the negative consequences of population growth and inflation.



Reference: SDGToolkit Global Constraints Analysis, Demonstration of real income projections (March 2021)
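As an indication of the kind of calculation involved, the following Python sketch projects real income per capita from nominal income growth, inflation and population growth. The function, figures and horizon are hypothetical illustrations, not the SDGToolkit's Global Constraints Analysis itself.

```python
# A minimal sketch (not the SDGToolkit implementation): real income
# per capita projected from nominal income growth, inflation and
# population growth. All figures below are hypothetical.

def project_real_income_per_capita(income, population, years,
                                   nominal_growth, inflation, pop_growth):
    """Yield (year, real income per capita) for each projected year."""
    real_income = income
    for year in range(1, years + 1):
        # Deflate nominal growth by inflation, then divide by a
        # growing population to get purchasing power per head.
        real_income *= (1 + nominal_growth) / (1 + inflation)
        population *= (1 + pop_growth)
        yield year, real_income / population

# Example: 3% nominal growth, 6% inflation, 2.5% population growth.
for year, ripc in project_real_income_per_capita(
        income=1_000_000_000, population=5_000_000, years=5,
        nominal_growth=0.03, inflation=0.06, pop_growth=0.025):
    print(f"year {year}: real income per capita = {ripc:,.2f}")
```

Runs in which inflation outpaces nominal growth show real purchasing power per head falling even as nominal income rises, which is precisely the constraint that can cap demand for project output.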


A document, "Why was SDGToolkit developed?", was released as part of the information pack of the Workshop. The relationship between due diligence procedures and the ATs is illustrated in a diagram that appears in this document, as shown below.

[Diagram: the relationship between due diligence procedures and ATs, from "Why was SDGToolkit developed?"]
As can be seen, the role of ATs throughout the cycle from project design to operations means that they address decision analysis and evidence gathering at many different levels of consideration. For example, project teams are quite often not concerned with analyzing national level gaps and needs, since these are frequently defined by governments or by sector analyses from development agencies such as the World Bank. By including national level considerations within project design and development procedures, project teams are able to better understand the nature of the challenges projects need to address. For example, population growth rates, inflation and real income growth are important determinants of the size of the challenge nationally, while real incomes can set a cap on the feasibility of projects selling output three or four years downstream.

Sensitivity to the macroeconomic reality helps teams, in their design and especially in operational decision making, to keep a project moving towards feasibility. Too many projects find that low income segments cannot afford the output, leading to it being redirected to cities and export; as a result, the original beneficiaries of the project are ignored and their plight is not solved. On the consumption side, the very same macroanalysis helps focus the attention of projects not only on financial return but also on the ability to pay higher real incomes in the context of lowering income disparity.

ATs sit within a logical sequence established by due diligence design procedures, where the function of each AT is made apparent. The automatic generation of human-readable narratives helps avoid misinterpretation of results, which in the case of some of the more complex analyses is a real risk.

The actual utility of ATs is related to the degree to which they provide reliable and precise information upon which to base project design decisions. However, there needs to be coherence between the tasks at each phase in a project cycle and the decisions taken, to ensure that the desired objectives linked to that phase's activities are achieved. This is because, following the decision on a project design, the assumptions on likely implementation conditions made to guide the design can, and invariably do, change. Therefore, in each phase, ATs have a role in providing support to decision analysis and to a realistic reassessment of likely achievement under changed and changing conditions. ATs are thus not simply design stage support; they provide an important component of project resilience in supporting the ongoing response to change.

Project portfolios

Runa Jansen of Plasma.Systems gave a short presentation on the relationship between ATs and databases. Jansen leads the work at Plasma.Systems concerned with the data storage developments at SDGToolkit.com.

She emphasised two points. First, some ATs have an operating system function of locking database sections by applying Accumulog technology, so as to create immutable sections of what is referred to as a Project Memory (PM). The PM holds the record of all of the data, simulations, decisions and decision outcomes, including end-of-task evaluations. Accumulogs are also used to hold "locked" finally agreed project plan specifications and performance targets, as well as any contractual agreements, somewhat like one of the functionalities of the Ethereum cryptocurrency.
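Accumulog itself is proprietary, so the following Python sketch is only a generic illustration of the underlying idea of an immutable record: each locked entry is hash-chained to its predecessor, so any later alteration is detectable. The class and record fields are invented for illustration.

```python
import hashlib
import json

class ProjectMemory:
    """Append-only record store; each entry is hash-chained to the last."""

    def __init__(self):
        self.records = []

    def lock(self, payload: dict) -> str:
        # Hash the payload together with the previous record's hash so
        # the whole history is fixed once an entry is locked.
        prev_hash = self.records[-1]["hash"] if self.records else "0" * 64
        body = json.dumps(payload, sort_keys=True)
        digest = hashlib.sha256((prev_hash + body).encode()).hexdigest()
        self.records.append({"payload": payload, "prev": prev_hash, "hash": digest})
        return digest

    def verify(self) -> bool:
        # Recompute the chain; any edited record breaks verification.
        prev = "0" * 64
        for rec in self.records:
            body = json.dumps(rec["payload"], sort_keys=True)
            if rec["prev"] != prev:
                return False
            if hashlib.sha256((prev + body).encode()).hexdigest() != rec["hash"]:
                return False
            prev = rec["hash"]
        return True

pm = ProjectMemory()
pm.lock({"type": "plan", "target_yield_t_ha": 4.2, "agreed": "2021-03-15"})
pm.lock({"type": "decision", "option": "B", "note": "irrigation upgrade"})
print(pm.verify())  # True while the chain is intact
```

The same hash-chaining principle underpins blockchain ledgers such as Ethereum, which is presumably the sense of the comparison made in the presentation.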

The other ATs work with the database in an open and dynamic way during real time simulation iterations, permitting users to save the more useful options to the PM. One important benefit of the AT structures is that something like 90% of storage capacity is eliminated by saving only the input values of each option. When options are later recalled for comparative analysis, the system regenerates the projections in tabular, graphic and narrative form in real time. These can be used and even downloaded to MS Office applications and formats.
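The save-inputs-only design can be illustrated with a small sketch; the stand-in model below is hypothetical, the point being that stored options are just parameter sets from which full projections are regenerated on demand.

```python
# Stand-in projection model: whatever calculation the AT performs,
# it can be re-run from the saved inputs alone.
def project_yield(inputs: dict) -> list:
    base, gain = inputs["base_yield"], inputs["annual_gain"]
    return [round(base + gain * year, 2) for year in range(5)]

# Only the input values of each saved option go into the Project
# Memory, not the full tables, graphs and narratives they produce.
saved_options = {
    "option_a": {"base_yield": 3.0, "annual_gain": 0.15},
    "option_b": {"base_yield": 2.8, "annual_gain": 0.30},
}

# On recall, projections are regenerated in real time for comparison.
for name, inputs in saved_options.items():
    print(name, project_yield(inputs))
```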

Each user has a dedicated database server and, with the Plasma.Systems techniques, these databases can hold any number of projects in a portfolio, together with all of the datasets for each project, and with the capability of running in-depth cross-portfolio comparative analyses such as relative performance and the identification of common issues used to improve the design system procedures. Lessons learned do not sit in some dusty, ignored archive but are integrated into ongoing systems development.

Future developments include a major investment in locational-state logic in database structures, and in analysis and aggregation models, to facilitate more detailed project comparisons across very large multi-project portfolios.
Posted: 20210505
We welcome questions and feedback:
  To submit questions or comments on the contents of this article, please contact the author or main reference source by email.
The relevant emails are provided below:
  Author:   Nevit Turk:   nevit.turk@apeurope.org
  Source:   SDGToolkit:   angus.raeburn@sdgtoolkit.com

APEurope.org