Saturday, 18 January 2020

The Prioritization of Risk Mitigation Using the Turtle Island Matrix


In Ojibway (Anishinabek) mythology, North America is referred to as Turtle Island.  The Environmental Review Panel decided to adapt the concept of Turtle Island to fit their evaluation of the environmental, social and economic impact of the development proposals they will be receiving.  The Environmental Review Panel believes that all 13 characteristics, shown in Figure 1, must work in harmony for the community to achieve sustainable development.  Figure 1 shows the traditional Ojibway names for these characteristics along with their English translations.



Figure 1: The Turtle Island Matrix



Although the Serpent River First Nation is highly concerned about all of the elements in the Turtle Island Matrix, not every project will impact the elements in the same way.  Some projects might have strong impacts on certain elements and little impact on others.  This suggests that risk mitigation resources might be allocated differently for different projects.  We developed a tool based on the Analytic Hierarchy Process [1] to assist in allocating resources among the elements of the Turtle Island Matrix, both on a project-by-project basis and for a combination of projects overall (see Figure 2 for an example).



Figure 2: Example Allocations for Risk Mitigation for Two Economic Development Projects

To determine the allocation of resources, the members of the Environmental Review Panel evaluate the relative importance of each of the proposals.  This is done by completing a series of pair-wise comparisons.  First, the Panel decides which of two projects is more important and clicks the appropriate button on the user interface.  Then the Panel expresses how much more important that project is than the other using a scroll bar on a scale from “1 – equally important” to “9 – absolutely more important” (see Figure 3).  From these pair-wise comparisons, the program in the Excel spreadsheet [2] calculates the weights of the proposals.  These weights are represented as percentage values that sum to 100%.


Figure 3: Pair-wise Comparison of Economic Development Proposals
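The underlying arithmetic is Saaty's AHP weight calculation.  The spreadsheet implements this in VBA [2], but as a rough illustration, here is a minimal Python sketch using the row geometric-mean approximation (the comparison values below are hypothetical, not from an actual proposal):

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using the row geometric-mean method."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical judgments for three proposals on the 1-9 scale:
# proposal A judged 3x as important as B and 5x as important as C.
comparisons = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]
weights = ahp_weights(comparisons)  # fractions that sum to 1 (i.e., 100%)
```

Saaty's full method uses the principal eigenvector of the comparison matrix; the geometric mean is a common close approximation that is easy to compute in a spreadsheet.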

Then the Panel completes a similar set of pair-wise comparisons for the Turtle Island Matrix elements as they apply to each individual proposal (see Figure 4).  When all of the pair-wise comparisons have been provided, the program calculates the percentage of effort to invest in each element to avoid risk.  Again, for each project, the effort percentages for the Turtle Island Matrix elements sum to 100%.


Figure 4: Pair-wise Comparisons for the Turtle Island Matrix Elements for a Particular Economic Development Proposal

To get the overall resource allocation, the weight of each proposal is multiplied by its effort percentages for the Turtle Island Matrix elements, and the products are summed over the proposals.  The final results can be displayed as a graph, as shown in Figure 5.


Figure 5: Graphical Display of the Resource Allocation among the Turtle Island Matrix Elements Overall
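In other words, the overall allocation for each element is a weighted average of the per-proposal effort percentages.  A small sketch with hypothetical numbers (two proposals and only three of the 13 elements, for brevity):

```python
def overall_allocation(proposal_weights, effort_by_proposal):
    """Combine per-proposal effort percentages into an overall
    allocation: for each element, sum over proposals of
    (proposal weight * effort percentage)."""
    n_elements = len(effort_by_proposal[0])
    return [
        sum(w * efforts[e] for w, efforts in zip(proposal_weights, effort_by_proposal))
        for e in range(n_elements)
    ]

# Hypothetical: proposal 1 weighs 60%, proposal 2 weighs 40%.
proposal_weights = [0.6, 0.4]
# Per-proposal effort percentages over three elements (each row sums to 1).
effort_by_proposal = [
    [0.5, 0.3, 0.2],   # proposal 1
    [0.2, 0.5, 0.3],   # proposal 2
]
overall = overall_allocation(proposal_weights, effort_by_proposal)
# overall = [0.38, 0.38, 0.24], which still sums to 100%
```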

It is hoped that this methodology, implemented in an easy-to-use spreadsheet, will help the ERP allocate its available risk mitigation resources among the elements of the Turtle Island Matrix for the economic development proposals it receives.

References

1. Saaty, Thomas L. [1980], The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation, McGraw-Hill, New York.
2. Albright, S. Christian [2001], VBA for Modelers: Developing Decision Support Systems with Microsoft Excel, Duxbury, Pacific Grove, California.


Saturday, 4 January 2020

Environmental Review Panel Go/No-Go Checklist

When I retired, I was able to concentrate on Policy Dynamics Inc.  My first project was for the Serpent River First Nation.  They had established an Environmental Review Panel (ERP) to evaluate economic development proposals to ensure that environmental considerations were given a high priority.

The Serpent River First Nation’s ERP had developed a detailed questionnaire to be completed by managers of economic development proposals.  The onus is on the proposal manager to explain the goals and impact of their projects by fully completing the questionnaire.  The role of the ERP is to evaluate the proposals to ensure the First Nation receives economic and social benefits without sacrificing the environment.  To conduct this evaluation, the ERP needs to be satisfied with the data provided by the proposal manager in their completed questionnaires.

Go/No-Go Checklists have been used for many years in the aviation industry when pilots need to conduct a complex pre-flight inspection of their aircraft.  Go/No-Go Checklists have been successfully adapted for hospital use to ensure that medical procedures are conducted correctly by doctors and nurses prior to operations.  As the title suggests, if the checklist is completed satisfactorily, then the operation is a Go.  If the checklist is not completed to the operator’s satisfaction, it is No-Go.

I had worked with checklists in the Department of National Defence and although this wasn't a simulation problem, I thought a Go/No-Go Checklist implemented in Microsoft Excel macros might support the ERP’s evaluation of the questionnaires. The Go/No-Go Checklist I developed has nine satisfaction levels of increasing importance.  Each of the nine satisfaction levels in the evaluation has been assigned multiple questions from the questionnaire.  In the checklist, the questions in each satisfaction level have been cross-referenced with sections in the written questionnaire. 

There are cells to identify the proposal by name along with the proposal manager.  There is also a cell to identify when the Checklist was last updated.  The Checklist spreadsheet file would be saved individually with a proposal specific name.

Each satisfaction level is colour coded as Red (unsatisfactory), Yellow (partially satisfied) and Green (satisfactory).  The calculation of the colour code is done automatically by the macro software as the ERP conducts its evaluation of the questionnaire.  The colour codes are sequential.  That is, satisfaction level 2 cannot be reached (Green) until satisfaction level 1 is obtained (Green).  All nine satisfaction levels must be Green for the ERP to recommend the proposal be a Go.  Until that time, the proposal is a No-Go. 

Figure 1 shows an example of a Satisfaction Level Summary in which satisfaction level 1 is achieved (Green) and satisfaction level 2 is partially achieved (Yellow).  Some questions in satisfaction level 3 have been answered but the satisfaction level is still Red.  No questions in satisfaction levels 4 through 9 have been answered yet.
Satisfaction Level:   1      2       3    4  5  6  7  8  9
Achieved:             Green  Yellow  Red  –  –  –  –  –  –
Figure 1: Satisfaction Level Summary


There is a cell in the checklist labelled “Green set point is:” (see Figure 2).  The value in this cell should be between 0 and 100.  This value is used by the macro software to compute the colour code for each satisfaction level based on the percent of the questions in each level that have been satisfactorily answered.

Green set point is:
75%

Figure 2: If 75% of the Questions are Answered Satisfactorily the SL is Coded Green

It is recommended that the Green set point be 100% because the goal of the Go/No-Go checklist is to ensure that all of the questions in the questionnaire have been answered satisfactorily before the project is approved as a Go.

However, if the user wants, the value of the Green set point could be set to, for example, 75%.  The user would type the value 75 into the appropriate cell and press Enter.  Then, if there are 8 questions in satisfaction level 1, once 6 of those questions have been completed satisfactorily, the cell under satisfaction level 1 in the summary turns Green (see Figure 1).

Entering a value in the cell labelled Yellow set point (see Figure 3) works in the same way as entering a value in the Green set point cell.  Assume the value in the Yellow set point cell is, for example, 50% and the value in the Green set point cell is 75%.  If there are 14 questions in satisfaction level 2 and 7 are completed to the satisfaction of the ERP, the cell under satisfaction level 2 in the summary turns Yellow (see Figure 1).


Yellow set point is:
50%

Figure 3: If 50% of the Questions are Answered Satisfactorily the SL is Coded Yellow

If there are 4 questions in satisfaction level 3 and only 1 question is satisfactorily answered, the cell under the satisfaction level in the summary turns Red (see Figure 1).
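Putting the two set points together, the colour coding and the sequential rule can be sketched as follows.  This is a Python illustration of the described logic, not the actual VBA macro; in particular, showing a blocked level as Yellow when an earlier level is not yet Green is my assumption:

```python
def level_colour(pct_satisfied, green_sp=0.75, yellow_sp=0.50):
    """Colour-code one satisfaction level from the fraction of its
    questions answered satisfactorily."""
    if pct_satisfied >= green_sp:
        return "Green"
    if pct_satisfied >= yellow_sp:
        return "Yellow"
    return "Red"

def summary_colours(pcts, green_sp=0.75, yellow_sp=0.50):
    """Apply the sequential rule: a level cannot show Green until all
    earlier levels are Green."""
    colours, earlier_all_green = [], True
    for pct in pcts:
        colour = level_colour(pct, green_sp, yellow_sp)
        if colour == "Green" and not earlier_all_green:
            colour = "Yellow"  # assumption: blocked Green displays as Yellow
        colours.append(colour)
        earlier_all_green = earlier_all_green and colour == "Green"
    return colours

# Figure 1 example: SL1 has 6/8 answered, SL2 7/14, SL3 1/4, rest none.
colours = summary_colours([6/8, 7/14, 1/4, 0, 0, 0, 0, 0, 0])
# colours[:3] -> ['Green', 'Yellow', 'Red']
```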

There is a cell labelled “% Complete is now set at:” (see Figure 4).  The value in this cell should be between 0 and 100.  It is used by the software to determine when each question has been satisfactorily completed.  That is, if the value is below 100, an individual question can be only partially completed and the software will still consider the question to be satisfactorily answered.

% Complete is
now set at:
75

Figure 4: A Question Needs to be 75% Complete to be Considered Satisfactory

It is recommended that the value of % Complete be set to 100.  However, if the user would like to set the % Complete to, for example, 75%, then the user would type 75 in the cell and press Enter. 

One reason this approach might be useful is if, for example, three out of four members of the ERP thought the question was satisfactorily answered, the user might move the slider beside that question to 75% complete.  At that point, the cell beside the question would turn green suggesting the ERP is sufficiently satisfied with the answer given by the proposal manager.

There are comments cells provided in the project identification section and for each of the satisfaction levels.  These comment cells are for the user to make notes and are not used by the software.

Each of the nine satisfaction levels in the evaluation has been assigned multiple questions from the questionnaires.  The satisfaction level description table attempts to explain how the questions have been organized (see Figure 5).  




SATISFACTION LEVEL (SL 1-9) DESCRIPTIONS

SL 1: Project Identification is Satisfactory
SL 2: Project Activities and Preliminaries are Satisfactory
SL 3: Benefits to First Nations are Satisfactory
SL 4: Consultation with First Nation Communities is Satisfactory
SL 5: Turtle Island Issue Identification is Satisfactory
SL 6: Turtle Island Issue Considerations are Satisfactory
SL 7: Problem Avoidance is Satisfactory
SL 8: Risk Mitigation is Satisfactory
SL 9: Reclamation Issues are Satisfactory

Figure 5: Summary Description of the Satisfaction Levels

Each of the questions assigned to a satisfaction level is numbered and cross-referenced with the written questionnaire. 

For each of the questions in the checklist, there are three ways for the ERP to express their impression of the quality of the proposal manager’s answer. 

One way is to use a slider or the arrow keys to increase or decrease their rating of the quality of the answer.  For example, if the slider is used to move the cell beside the question to 65, the cell turns yellow, meaning the ERP is not sufficiently impressed with the answer given by the proposal manager.  This is because the % Complete set point is 75.  If the slider is used to set the cell to 75, the ERP considers the question sufficiently complete, the colour of the cell turns green, and the box beside the question becomes checked.

The second way for the ERP to show their impression of the quality of the answers in the checklist is simply to enter a number into the cell beside the question.  In this case, if the value entered is less than 75, the cell is coloured yellow, and if the value entered is equal to or greater than 75, the cell is coloured green.

The third way to record the ERP’s impression of the quality of the proposal manager’s answer is to click the checkbox beside the question.  By clicking the checkbox, the ERP indicates that the question is completely answered (100% and therefore green).
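All three input methods reduce to the same threshold test against the % Complete set point.  A hypothetical sketch of that per-question logic (the function name is mine; the checkbox is treated as entering 100):

```python
def question_state(percent, complete_sp=75):
    """Return (colour, checkbox_checked) for a single question.
    `percent` comes from the slider, a typed number, or 100 when the
    checkbox beside the question is clicked."""
    if percent >= complete_sp:
        return "Green", True
    return "Yellow", False

# Slider at 65 -> ("Yellow", False); typed 80 -> ("Green", True);
# a checkbox click is equivalent to question_state(100).
```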

One of the benefits of the Go/No-Go Checklist is that it can be used to identify areas where the proposal manager’s answers to the questions are not complete or not acceptable to the ERP.  Then the ERP can go back to the proposal managers and get them to complete that question in more detail, or change their proposal to assure the ERP that the environmental factors are being sufficiently addressed.

Although this work didn't involve simulation, it was well-received by the Serpent River First Nation Environmental Review Panel.



Saturday, 10 January 2015

Results from the Defence and the Economy System Dynamics Model


In this post, I will summarize the approach used in the System Dynamics Model of Defence and the Canadian Economy.  Then I will provide some findings for Defence expenditure for three threat scenarios.  Finally, I will provide the results for all of the sectors in the economy.

The processes in the model of Defence and the economy can be summarized by the figure below.  The need for Defence is modified by shocking the system through changes to the Defence per capita normal.  The discrepancy between the current Defence per capita in the model and the Defence per capita normal creates a pressure on the need for Defence.



If the Defence per capita is less than the Defence per capita normal, such as after September 11th, 2001, then the need for Defence will increase.

If the need for Defence increases, the desired expenditure on Defence increases, and the "desired" expenditure on labour, capital and resources for Defence will increase.

This leads to actual increases in labour, capital and resources devoted to Defence and thereby allows the production of Defence to increase.

When Defence production increases, the Defence per capita increases and the discrepancy between the Defence per capita and Defence per capita normal decreases.  So the need for Defence will decrease.

Therefore, these are balancing or goal-seeking loops, in the language of System Dynamics.

Of course, if the Defence per capita is greater than the Defence per capita normal, such as at the end of the Cold War, then the relationships would work in the opposite direction and the desired expenditure on Defence and the Defence production would decrease.
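The balancing loop described above can be illustrated with a toy first-order goal-seeking sketch.  This is not the actual System Dynamics model; the adjustment time and the shock values are invented for illustration:

```python
def simulate_defence(per_capita_normal_by_year, initial_per_capita,
                     adjustment_time=5.0, dt=1.0):
    """Toy goal-seeking loop: each year, Defence per capita closes a
    fraction of the gap to the Defence per capita normal."""
    per_capita = initial_per_capita
    history = []
    for normal in per_capita_normal_by_year:
        gap = normal - per_capita                 # discrepancy drives the need
        per_capita += dt * gap / adjustment_time  # production adjusts toward it
        history.append(per_capita)
    return history

# Invented shock: the normal steps up (a 9/11-style event) and later
# falls back to a lower plateau (end of the combat commitment).
normals = [100] * 5 + [150] * 10 + [110] * 15
path = simulate_defence(normals, initial_per_capita=100)
# path rises toward 150 after the shock, then settles back toward 110
```

Because the loop only ever closes the gap, it never overshoots the normal; that is the goal-seeking behaviour the text describes.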

The figure below shows the results from the model in terms of Defence expenditure for three scenarios.


The first scenario (the yellow curve) is the counter-factual situation in which there was no terrorist attack on September 11th, 2001.  We can see that the Defence budget is still assumed to increase because of an increase in the population of Canada.

The second scenario (the purple curve) mimics reality, in which there was an increased need for Defence after September 11th, 2001, and this period of increased need continued until the end of Canada's combat commitment in Afghanistan.  Then the need for Defence returned to a lower level after 2012.  We can see that the Defence expenditure gradually increases and then reaches a higher steady state than the counter-factual.

The final scenario that was examined (the blue curve) was a continued state of higher tension that resulted in a higher need for Defence indefinitely.  In this case, Defence is believed to require continually increasing investment.

Preliminary findings for all of the sectors of the economy shown in the figure below suggest that the economy will be dominated by the private services sector which will continue to increase for the next 50 years.  The other sectors of the economy are predicted to reach a steady-state around 2060.




The results from the model suggest that increased Defence expenditure will have little or no impact on total gross domestic product of the nation in the short-term.  Continuing increases in Defence expenditure will have a positive effect on the gross domestic product of the nation in the long-term but the effect is believed to be relatively marginal.

The results from the model appear to show that increased Defence expenditure will not "crowd out" development in other sectors because there is inherent slack in the economy to draw additional resources for Defence.

These results seem realistic to me.  I believe that Defence is a small part of the Canadian economy and that changes in Defence expenditure within the expected ranges will neither hurt nor aid the economy to a significant degree.  This is a somewhat negative result from the point of view of proponents of increased Defence expenditure.

One of the novel concepts provided by this model is Defence per capita as a measure of the appropriate expenditure on Defence.  The usual measure in Defence economics is Defence expenditure as a percent of GDP.

Defence as a percentage of GDP implies that richer countries will spend more to defend themselves than poorer countries. The threats faced by a country are not directly considered.  Instead, the idea might be that rich countries have more to lose from outside threats and therefore spend more on Defence to protect their assets.

As an alternative, in our model, the gap between Defence per capita and Normal Defence per capita suggests that the Defence expenditure would be based on the population's perception of the threats to their lives.  That is, citizens in countries with significant perceptions of threats to their lives would be willing to spend more per capita than countries with less of a perception of threat to their lives.  This might better explain why Canada spends relatively little on Defence; namely, because Canadians do not feel their lives are being threatened.

On this note, I retired from the Defence Research and Development Agency.