
Decision analysis as a basis for more effective agricultural innovation

©2018 APE


Part 4: The cloud-based decision analysis tool box

by

Hector McNeill1
Summary

The analytical techniques, software and hardware used to build project cycle and portfolio management systems have advanced significantly over the last 70 years. Since 1940, more than 350 programming languages have been developed, averaging over 40 new additions each decade, yet only a handful dominate today's implementations. Hardware has undergone a revolution in processing capacity, speed, dimensions, energy consumption and cost.

Most of the robust analytical methods applied in decision analysis were developed over 70 years ago, but there have been recent advances in the identification of appropriate data sets, information and knowledge, and in the design of data structures. Database structures continue to advance by accommodating the management of more dynamic data sets, such as those characteristic of complex natural resource systems.

We certainly have not exhausted what can be done with decision analysis: several important techniques are still evolving, some of which are being developed by the George Boole Foundation Group through research at SEEL or delivered as applied services through Navatec.

This article summarizes how these advances in the state of the art are being integrated into Navatec System to create a next-generation project cycle and portfolio management system.



1. Due diligence

Our core team's applications experience over the last 50 years has been in agricultural and economic development project identification, design, management and evaluation. This experience makes it apparent that there is a wide range of practice, in spite of existing guidelines, and often a lack of balance in the types of information used to design projects. Our approach with Navatec System is to apply a due diligence project design procedure that identifies all of the factors that need to be considered, ensuring that the relevant information is collected. The due diligence procedures are not a simple tick-box list: each procedure is accompanied by methods of data specification, collection and analysis.

This is important for small teams that lack the full complement of disciplines required to design complex projects, and for younger, less experienced team members, whom we can support with training and mentoring input through our partners at STRIDES or through our IT team. This approach is essential to ensure that adequate information is available to support project design and decision analysis model simulation to identify the best project options.
Some primary constraints

Constraints analysis is used to define a feasibility envelope within which a project can remain resilient. However, constraints themselves can also change during implementation.

  • Cultural & social issues
  • Economic issues
  • Factor markets
  • Produce & service markets
  • Logistics
  • Environmental factors
  • Ecosystems
  • Key location co-ordinates
  • Key location distances
  • Other locational factors
  • Eligibility
  • Legal & regulatory
  • Administrative procedures
  • Financial criteria & budget
  • Other relevant initiatives
  • Wellbeing
  • Environmental sustainability
  • Climate change & impacts
  • Governance of x-cutting issues

Based on the OQSI:1 (2017) recommendation, subject to update in June 2018.


1. (i) Stakeholder participation

Currently, the degree to which stakeholders are involved in project design and execution varies significantly. At the extreme, we have encountered project proposals where the listed stakeholders were unaware that their names were on a project application for funding. Direct stakeholder participation is imperative for securing project design relevance and quality, and a project management system must facilitate productive stakeholder involvement throughout a project's life cycle. The initial identification of gaps and needs involves stakeholders directly. Details of their involvement in final project design are covered in items 2. (ii) and (iii), below.

1. (ii) Higher level development goals

There are risks associated with the presumption that higher level development goals necessarily reflect the specific circumstances of a particular community; they invariably don't. Each community, or the mix of stakeholders involved in a value chain, faces very specific gaps in provisions and has needs which need to be exposed, analysed and prioritized by the stakeholders, so as to place general goals such as the Sustainable Development Goals (SDGs) or Country Strategies in an objective context. Where these goals do have a role is covered in item 1. (v), below.

1. (iii) Gaps and needs analysis

The first step in the due diligence design procedures is gaps and needs analysis. Even if it is assumed that the objective and methods of a project are known, the results of gaps and needs analysis provide the basic justification to donors that the project will address a relevant need.

In Navatec System the due diligence recommendations are designed to avoid the common presumption that gaps can be identified as processes. Emphasis is given to the identification and measurement of gaps in provisions expressed as nouns, not as verbs. Verbs describe processes, and framing a gap as a process presumes that the gap has already been identified and assessed and that a particular solution is the gap. This is an important distinction emphasized by Roger Kaufman1.

With the constant advance of knowledge and state-of-the-art in technology and techniques, it is risky to assume solutions before identifying gaps. One role of decision analysis is to review the full range of available technologies and techniques in order to identify feasible options from which to select the most appropriate as the basis for a project's initial plan.

1. (iv) Constraints analysis & the feasibility envelope

The due diligence procedures on gaps and needs identify the issues that stakeholders wish to resolve, largely by closing gaps. This process provides a good foundation for transparent stakeholder involvement in agreeing a list of prioritized needs. The next step in due diligence is constraints analysis. This is used to identify and determine the degree to which existing factors (see the box on the right) will impose limitations on the ability to close gaps and satisfy needs. The result is a dimensioned feasible operational envelope that combines the ranges of all project constraints, which can have up to 100 data elements associated with them. A minimal sketch of such an envelope appears below.
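
As an illustration only, a feasibility envelope can be represented as a set of ranges, one per constraint, against which any candidate design is tested. The constraint names and values below are hypothetical, not Navatec System code:

    // Hypothetical sketch: a feasibility envelope as ranges per constraint.
    const envelope = {
      budget:       { min: 50000, max: 120000 },  // financial criteria (USD)
      transportKm:  { min: 0,     max: 45 },      // logistics: distance to market
      waterM3PerHa: { min: 1200,  max: 2600 },    // environmental factor
    };

    // A candidate design is feasible only if every constrained value
    // falls inside the corresponding range.
    function isFeasible(design, env) {
      return Object.keys(env).every(
        (key) => design[key] >= env[key].min && design[key] <= env[key].max
      );
    }

    const candidate = { budget: 95000, transportKm: 30, waterM3PerHa: 2000 };
    console.log(isFeasible(candidate, envelope)); // true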

During this phase of a project design the significance of the ranges of impacts of these factors is not always fully apparent and can be confusing. However, the relative quantitative impacts of the different types of constraint become clear when Monte Carlo Simulation is applied to the baseline design in the simulation phase (see Item 2).

1. (v) OEA-Organizational Elements Analysis

Decision analysis research at SEEL

SEEL-Systems Engineering Economic Lab was established in 1983 to monitor and conduct research on the evolving communications technologies associated with the Internet. Our main applications experience was in agricultural research, agricultural economics, project design, assessment, management and evaluation. Our initial interest was to introduce decision analysis to agricultural development policy and project cycle management. In 1985, SEEL purchased all of the output of the Stanford Research Institute Decision Analysis Group as well as additional references, in all around 150 documents. We have since been developing procedural frameworks for agricultural projects to improve the effectiveness of decision analysis in handling the complexity of natural resource systems.

Over the last 30 years we have integrated several existing procedural methods and developed others. During this period we changed our focus from client side stand-alone modeling to server side cloud models. SEEL spun off the George Boole Foundation to organize the Decision Analysis Initiative (DAI, 2010-2015) to develop a new generation of agricultural development project decision analysis. In 2015 the George Boole Foundation established the Open Quality Standards Initiative (OQSI) to coordinate the establishment of due diligence procedure recommendations for project cycle and portfolio management. Between 2015 and 2018, Navatec.com, the services division of the Foundation, developed and implemented Navatec System.

SEEL continues as the Group's applied research unit. Since 1987 SEEL has been the world leader in the development of Locational State Theory and Accumulog applications, such as the object oriented Plasma DataBase.
Source: Decision Analysis Initiative 2010-2020, Interim Report, 2018.
It is helpful to create a map of the linkages and relationships between community or stakeholder needs that exist outside a project and the sequence of processes within a project. This can be generated using Organizational Elements Analysis (OEA), a technique developed by Roger Kaufman (see Organizational Elements Analysis).

The baseline initial project solution and model make up what is referred to as a micro-OEA, and the external, broader model is referred to as a macro-OEA because it extends from the immediate vicinity of a project through successive organizational elements up to national level.

This approach is useful for placing elements such as SDGs and Country Strategic Plans in context in the macro-OEA, while the essential workings, solutions, resource requirements and budget remain in the micro-OEA.

Depending upon a project's focus, there are comprehensive OEAs, which are particularly detailed in their macroeconomic analysis, as well as normal OEAs, which have a simpler format. In both cases the micro-OEA containing the basic project design remains the same. A minimal sketch of the micro/macro structure appears below.
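
As an illustration only, the micro/macro split can be pictured as nested data in which external goals are linked to, but kept apart from, the project's internal elements; all names below are hypothetical:

    // Hypothetical sketch of a micro/macro OEA map; not Navatec System code.
    const macroOEA = {
      nationalGoals: ["SDG 2: Zero Hunger"],           // external, contextual
      countryStrategy: "National irrigation plan",
      communityNeeds: ["reliable dry-season water supply"],
      microOEA: {                                      // the project itself
        solution: "drip irrigation scheme",
        processes: ["survey plots", "install lines", "train operators"],
        resources: { budget: 95000, staff: 4 },
      },
    };

    // Trace a community need to the project processes intended to close it.
    console.log(macroOEA.communityNeeds[0], "->", macroOEA.microOEA.processes);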

2. Simulation

2. (i) Methods

The baseline model is used to run simulations of the operational options of project setup, implementation and operations1. Simulation helps decision-makers select among preferences such as unit costs, timing, quantity of project throughput, quality of output and alternative budgets, together with the risk associated with each solution.

Some of the most relevant simulation techniques (Monte Carlo Simulation, linear programming and Markov chains) were discussed in Parts 1 and 2 of this series.
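
As a minimal sketch of how Monte Carlo Simulation exposes the relative impact of constraint ranges, the model below is hypothetical: yield, price and cost are drawn repeatedly from assumed ranges, and the resulting distribution gives the mean outcome and the probability of a loss:

    // Minimal Monte Carlo sketch; model, ranges and units are hypothetical.
    function uniform(min, max) {
      return min + Math.random() * (max - min);
    }

    const RUNS = 10000;
    const margins = [];
    let shortfalls = 0;

    for (let i = 0; i < RUNS; i++) {
      const yieldTHa = uniform(2.5, 4.5);  // tonnes per hectare
      const priceT   = uniform(180, 260);  // price per tonne
      const costHa   = uniform(550, 700);  // cost per hectare
      const margin   = yieldTHa * priceT - costHa;  // gross margin per hectare
      margins.push(margin);
      if (margin < 0) shortfalls++;        // count loss-making outcomes
    }

    const mean = margins.reduce((a, b) => a + b, 0) / RUNS;
    console.log("mean margin/ha:", mean.toFixed(2));
    console.log("probability of loss:", (shortfalls / RUNS).toFixed(3));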

2. (ii) Instructional simulation

Simulation has an associated role as a means whereby a team and stakeholders can gain a refined and comprehensive insight into a project's resilience to change and review the relative benefits and drawbacks of any proposed changes in basic design or conditions. Different "scenarios" are probed by having the simulation process answer questions of the type:

"what is the likely outcome if factor X in increased, decreased or has a given range of possible values?".

Such questions should come from team members and stakeholders and, on occasion, from donors.

This iterative decision analysis cycle is a very valuable basis for improving team and stakeholder understanding of their project design, based on instructional simulation2. It is difficult to over-estimate the value of simulation in answering every question raised and leaving the team and stakeholders with an excellent understanding of the project's feasible options.

2. (iii) Stakeholder components

From our experience, stakeholder involvement in many conventionally designed projects has tended to be somewhat ephemeral, consisting at best of facilitated participatory workshops. We have witnessed cases where the facilitators came with preconceived ideas and agendas, and even excluded the consideration of certain types of impact analysis or constraints that represented sensitive topics for some parties. To be impartial, and to complete a competent review of options to an adequate professional standard, it is essential to accept that stakeholders have an essential role in assessing simulation outcomes. This way the full range of consequences of selected feasible designs can be better understood.

Although constraints analysis collects all of the relevant constraints information, in practice stakeholders' direct experience of the project environment is invaluable in fine-tuning final feasible expectations. For stakeholders to provide effective inputs they need to possess some ownership of parts of the design process. Navatec System therefore provides a practical interface for stakeholders through the creation of decision analysis models that extend upstream and downstream of the project's value chain, as well as providing analysis of the immediate project-level community aspects. Through these mechanisms stakeholder contributions and judgements can be made more explicit, taken into account and documented.

3. Oversight

Project design and implementation phases need to allow oversight by investors, donors, stakeholders, executing agency management, project managers, team members and evaluators. All should have secure but convenient access to different aspects of a project's activities and performance.
The Information Technology & Telecommunications Task Force - ITTTF (Brussels)

The ITTTF was a temporary programme development initiative at the European Commission in Brussels, established in the early 1980s. It was organized to coordinate a European response to what was perceived to be an emerging threat from Japan's plans, contained in the ICOT Report, to invest in 5th Generation systems and become world leaders in a new knowledge-based economy. At that time the trade balance between European IT exports and Japanese imports had become untenable. I joined the ITTTF in 1984; it had about a dozen staff and was managed by Roland Huber. This unit spun off a major revolution in IT in Europe in the fields of communications, such as broadband and mobile telephony (GSM), as well as leading applications in transport navigation, biomedicine and distance education systems. I was asked to coordinate the preparation of EU-funded initiatives in learning technology.

Having an agricultural, economics and systems engineering background and many years' experience in agricultural extension systems, I considered the brief to be one of developing IT systems to help develop tacit knowledge (operational skills) and to manage explicit knowledge (instructions and knowledge on the state of the art) more effectively. The objective would be to increase the rates of innovation in all sectors through workplace and personal life-long learning support. I used this notion to lead discussions with over 200 experts who participated in sector panels to identify their sector gaps, needs and constraints, covering agriculture, medicine, transport and educational establishments (schools, universities and technical colleges), as well as telecommunications experts, NGOs and labour unions.

In the end, the outcome of this work was a very simple proposal: to develop functionality similar to what exists today as the world wide web, a global network built initially on the existing Internet as its backbone. This proposal received widespread support from the panel members as well as from the consultants working on the initiative. The only panel member to oppose it was a representative from a distance education establishment who saw the whole requirement as being what they were already doing at that time.

I should add that at that time Tim Berners-Lee had not yet come up with the browser, the convenience of "links" or his whole W3 concept. What we were advancing at the time were somewhat complicated "intelligent switches" used to find content; Berners-Lee resolved all of this in a far simpler and more logical manner from a browser interface.

The range of concepts and associated potential applications arising from the panels was impressive. One expert (David Leiper) even produced a prototype design for a hand-held computerized mobile telephone to handle multi-media data streams (a smart phone) some 20 years before the iPhone. My own conclusions, drawn from a very comprehensive analysis of needs and constraints, centred on two essential issues which have become serious constraints on how W3 technologies are applied today, some 30 years later:
  • Locational state notation
  • Accumulogs
Locational state notation

It was apparent in 1985 that the merging of images, text, sound, data and other forms of communication and recording into digital streams was going to demand that information communicated for use in decision analysis be reliable, representative and as accurate as required. The questions that needed to be answered were:

What methods should be applied to specify information requested over a global network, to support decision analysis?

How could those responding with information to be sent over a global network be confident that what they transmitted was what was required?

Locational state notation started out as a "method of dimensions" designed to ensure the right units were defined. However, it soon became apparent in the area of natural resources and agriculture that one is not dealing with single units but rather with space-time issues and seasonal cycles, which alter data according to location in space and time. This bundle of concepts has evolved into Locational State Theory, which involves aspects of evolutionary processes and even Fourier analysis to integrate dynamic environmental and ecosystem cycles into models.

Today, on a broader front, the veracity of content has become a pervasive issue, permeating the media and the political domain and surfacing as media and partisan bias and the emerging issue of fake news.

see Locational State Theory

Accumulog

In terms of receiving data and recording it in a way that secures convenient recall and understanding, the concept of a permanent and precise temporal ledger, what today would be called a blockchain, was considered to be a basic requirement. At that time the concept of blockchains did not exist, and the name given to this data structure was Accumulog, derived from "accumulation" and "log" or record.

On the subject of project records and documentation, what has been learned from the due diligence procedural analyses and from simulation requires a clear and explicit record that facilitates recall and understanding. The Accumulog concept and structure have a direct role in managing data for decision analysis in project design, and in supporting transparent recall to inform decisions in response to changes during implementation.

see Accumulogs
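
As a minimal sketch of the idea, assuming nothing about the actual Accumulog implementation, an append-only temporal ledger can be modelled as entries that each carry a timestamp and a hash chaining them to the previous entry:

    // Minimal append-only temporal ledger sketch; illustrative only, not the
    // actual Accumulog implementation.
    const crypto = require("crypto");

    class Ledger {
      constructor() {
        this.entries = [];
      }
      append(record) {
        const prevHash = this.entries.length
          ? this.entries[this.entries.length - 1].hash
          : "GENESIS";
        const timestamp = new Date().toISOString();
        // Each entry's hash covers the record, its time and the previous hash,
        // so any later alteration breaks the chain.
        const hash = crypto
          .createHash("sha256")
          .update(prevHash + timestamp + JSON.stringify(record))
          .digest("hex");
        this.entries.push({ timestamp, record, prevHash, hash });
      }
    }

    const log = new Ledger();
    log.append({ event: "constraints analysis completed" });
    log.append({ event: "baseline design approved" });
    console.log(log.entries.map((e) => e.hash.slice(0, 8)));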

3. (i) Real time audit

Navatec System makes effective use of W3 communications and access protocols to provide a real time monitoring system in the form of a Real Time Audit (RTA) with global reach. The supportive components allow authorised users to access the system from any location, including from a mobile device, and to drill down to the specific information they require, in any project in any location, in a few clicks.

The granularity of this process is facilitated by the programming and database structural approach applied (see Items 4 and 5).
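
As an illustration only, drill-down access can be pictured as path-based navigation through nested project records; the structure, names and function below are hypothetical and not the actual RTA interface:

    // Hypothetical drill-down sketch; not the actual RTA interface.
    const portfolio = {
      "project-041": {
        activities: {
          "irrigation-install": {
            milestones: {
              "pump-delivery": { due: "2018-09-01", status: "on time" },
            },
          },
        },
      },
    };

    // Resolve a dot-separated path in a few "clicks".
    function drillDown(root, path) {
      return path
        .split(".")
        .reduce((node, key) => (node ? node[key] : undefined), root);
    }

    console.log(
      drillDown(
        portfolio,
        "project-041.activities.irrigation-install.milestones.pump-delivery"
      )
    );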

4. Logic & language
OOP was developed as a simulation language structure

In the 1960s computers were programmed in procedural languages such as FORTRAN, oriented to scientific and business calculation. Kristen Nygaard had become concerned with changing the structure and syntax of programming languages to enable a more precise description of reality, so as to create better simulation models. He was later joined by Ole-Johan Dahl, and together they developed what is now known as object oriented programming (OOP) and a series of simulation models built in their object oriented simulation language, SIMULA. They both worked at the Norwegian Computer Centre in Oslo. OOP became more widely adopted after Bjarne Stroustrup, building on his PhD work at the University of Cambridge, introduced it into C in the early 1980s, creating the now popular C++ language.

George Boole's seminal contribution

Some 83 years after the publication of George Boole's book, "The Laws of Thought", Claude Elwood Shannon identified the role of Boolean logic in the optimization of the design of electrical switching circuits in a master's thesis completed at MIT in 1937. As a result, Boolean logic became the foundation of modern digital technologies (hardware and software). Because it is founded on the basis of human expression and deduction, computers can mimic human logical processes as well as describe human understanding of complex systems. This is how artificial intelligence (AI) and expert systems operate.


The logic used to express how a model is built and operates is an application of human reason, logic and deduction. The way humans deduce and reason was described as a mathematical logic in 1854 by George Boole in his book entitled "The Laws of Thought". This work provided the rationale and methodology for reducing complex logical relationships to a smaller set of simpler relationships able to reproduce all of the possible relationships from which the set was derived2. Boolean logic remains the fundamental linguistic and mathematical logic of all computer programming.
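
To make the reduction idea concrete, the short check below uses De Morgan's law, a standard example, to show a more complex expression reproduced by a simpler equivalent over every possible input:

    // Verify De Morgan's law, !(a && b) === (!a || !b), over all truth values.
    for (const a of [true, false]) {
      for (const b of [true, false]) {
        const complex = !(a && b);  // the original compound relationship
        const reduced = !a || !b;   // the simpler equivalent set
        console.log(a, b, complex === reduced); // always true
      }
    }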

Scripting reality

The process of decision analysis model building for project design is a challenge in scripting reality. The better the emulation of reality the more effective will be the contribution of this process in identifying feasible projects. This involves an analysis of relationships and the translation of these into code or scripts. To date the most effective basis for doing this is the object oriented approach to programming (OOP). Indeed, simulation of reality was the reason OOP was developed in the first place by Kristen Nygaard and Ole-Johan Dahl in the 1960s at the Norwegian Computer Centre in Oslo1.

A paradox is that many programmers see OOP as just another paradigm, as opposed to a fundamentally important basis for building simulation models. OOP identifies phenomena as objects (people, animate and inanimate things, flora and fauna) which have specific properties or attributes (dimensions, weight, colour etc.) and methods (what the objects can do, how they carry out these actions, or how the objects respond to the impacts of other phenomena). This basic description enables any real world phenomenon to be described and coded as a computer-based simulation model. In reality there is no need for people to program in a computer language, but it is useful to know the basics of the object oriented approach which, like Boolean logic and deduction, reflects how we model relationships mentally.
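
As an illustrative sketch, and not Navatec System code, a real world phenomenon such as a crop can be expressed as an object whose attributes and methods follow exactly this description; the names and numbers are hypothetical:

    // Illustrative OOP sketch: a crop as an object with attributes (properties)
    // and behaviour (methods).
    class Crop {
      constructor(species, areaHa, expectedYieldTHa) {
        this.species = species;                   // attribute: what it is
        this.areaHa = areaHa;                     // attribute: dimension
        this.expectedYieldTHa = expectedYieldTHa; // attribute: expected yield
      }
      // Method: how the object responds to an external impact.
      applyDrought(severity) {                    // severity in [0, 1]
        this.expectedYieldTHa *= 1 - 0.5 * severity;
      }
      expectedOutputTonnes() {
        return this.areaHa * this.expectedYieldTHa;
      }
    }

    const maize = new Crop("maize", 12, 3.8);
    maize.applyDrought(0.3);
    console.log(maize.expectedOutputTonnes().toFixed(1), "tonnes expected");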

The Navatec System builds object oriented decision analysis models automatically based on the information collected by the due diligence procedures. This makes this process a smooth seamless operation for users. All of the simulations and analyses of impacts of potential changes run on these models.

Navatec System deploys a powerful object oriented server side JavaScript built as an extension of the ECMA standard, ECMAScript (ECMA Standard 262 and ISO/IEC Standard 16262). Unlike client side web page JavaScript, server side JavaScript has complete control over the processing environment, with a dedicated database and operating system.

5. Datasets & structures

The basic requirement in decision analysis is the identification of the required information to build and run a decision analysis model used to analyse and optimize project designs. This has three basic components:

  • identifying the fundamentally dynamic factors that determine the inter-relationships between objects, processes and outputs
  • translating these into a dataset as a specification of data, information and knowledge requirements and a set of functional relationships (algorithms) that become components of a model
  • structuring and recording data, information and knowledge in a way that provides instant support to decision-making beyond the design stage and throughout the implementation stages

Following gaps, needs and constraints analysis, dataset requirements and specifications can be determined using a Data Reference Model (DRM), developed at SEEL and now an OQSI recommendation. This is a relatively simple structure that relates required information to a process and method of calculation, and these to the basic dataset requirement. Although DRMs are the device applied in Navatec System model building, users only have to complete the due diligence information inputs, because these are structured on DRMs; the process is simple and seamless. A minimal sketch appears below.
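
As an illustration only, with hypothetical field names rather than the OQSI specification, a DRM entry can be pictured as a record tying a required piece of information to the process and calculation that produce it and to the dataset it requires:

    // Hypothetical DRM entry sketch; field names are illustrative.
    const drmEntry = {
      requiredInformation: "gross margin per hectare",
      process: "crop budget analysis",
      calculation: (d) => d.yieldTHa * d.pricePerTonne - d.costPerHa,
      datasetRequirement: ["yieldTHa", "pricePerTonne", "costPerHa"],
    };

    const data = { yieldTHa: 3.8, pricePerTonne: 220, costPerHa: 610 };
    console.log(drmEntry.requiredInformation, "=", drmEntry.calculation(data));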

In the case of agricultural and natural resources systems, a refinement can often be added by applying Locational State Theory considerations to data. These reflect the normal locational state adjustments that occur on a cyclic basis each year and/or each day. Locational State Theory has an important role in explaining why data on natural phenomena change according to well-defined deterministic relationships, which can be built into models and simulations. In this way statistical sampling calculations can be adjusted to move a larger proportion of variance into the "explained variance" bin, thereby enabling better estimates of risk and uncertainty and an improved capability to predict changes that can impact project performance. A minimal sketch of the idea appears below.
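
To sketch the idea, assume a hypothetical observed variable with a known annual sinusoidal cycle; subtracting the deterministic cycle before computing variance moves that portion into the "explained variance" bin, leaving a much smaller residual:

    // Sketch: remove a known annual cycle from observations so that only
    // residual (unexplained) variance remains. Values are hypothetical.
    const DAYS = 365;
    const observations = [];
    for (let d = 0; d < DAYS; d++) {
      const cycle = 10 * Math.sin((2 * Math.PI * d) / DAYS); // seasonal term
      const noise = (Math.random() - 0.5) * 4;               // residual variation
      observations.push(20 + cycle + noise);                 // temperature-like series
    }

    const variance = (xs) => {
      const m = xs.reduce((a, b) => a + b, 0) / xs.length;
      return xs.reduce((a, b) => a + (b - m) ** 2, 0) / xs.length;
    };

    const residuals = observations.map(
      (x, d) => x - (20 + 10 * Math.sin((2 * Math.PI * d) / DAYS))
    );
    console.log("total variance:   ", variance(observations).toFixed(2));
    console.log("residual variance:", variance(residuals).toFixed(2)); // much smaller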

6. Project sustainability factors

6. (i) Managing the impacts of change

There are two types of major impact on the performance of projects during implementation. These are:

  • external changes
  • internal changes

The relationships between external and internal changes were explained in Part 3 of this series under "impact cascades".

6. (ii) Management economy

A priority objective in cloud-based project cycle and portfolio management systems is to minimize the costs of entry and of ongoing utilization of the system. This makes a major contribution to the ability of organizations to maintain the operation of a sophisticated project cycle and portfolio management system. To achieve this, Navatec System combines the following provisions:

  • remove the need for higher cost hardware by supporting lower capacity "thin clients", including PCs, laptops, tablets and mobiles
  • remove the need for additional software and associated costs by carrying all requirements within the service, accessed through a browser
  • substitute mobile applications (Apps) with server side utilities (SSUs) to bring extraordinary processing power to client devices
  • run all functionality server side
  • deploy object oriented server side scripting with native support for access to databases and Accumulogs
  • maximize operational responsiveness and the number of simultaneous users through microscripts

The fee-based model applied by Navatec System is one of the most efficient and effective in this field; it is described in an additional article in this series entitled "Paying for successful projects".

NOTE: This reference section is being updated; publication authorised in order for this article to cross-relate to others already posted

Missing references relate mainly to Organizational Elements Analysis.

1. Curtis Franklin Jr., "11 Programming Languages That Lost Their Mojo", InformationWeek. https://www.informationweek.com/it-life/11-programming-languages-that-lost-their-mojo/d/d-id/1321678?



