This blog is intended to share my work on computer simulation. I began doing computer simulation while completing my Master's degree in Information and Systems Science at Carleton University in Ottawa, Canada, in 1985. I continued doing computer simulation during my 33 years in the Centre for Operational Research in the Canadian Defence Research and Development Agency. I am currently doing computer simulation in my management consulting company, Policy Dynamics Inc.
Saturday, 18 January 2020
The Prioritization of Risk Mitigation Using the Turtle Island Matrix
In Ojibway (Anishinabek) mythology, North America is referred to as Turtle Island. The Environmental Review Panel decided to adapt the concept of Turtle Island to frame its evaluation of the environmental, social and economic impact of the development proposals it will be receiving. The Environmental Review Panel believes that all 13 characteristics shown in Figure 1 must work in harmony for the community to obtain sustainable development. Figure 1 shows the traditional Ojibway names for these characteristics along with their English translations.
Figure 1: The Turtle Island Matrix
Although the Serpent River First Nation is highly concerned about all of the elements in the Turtle Island Matrix, not every project will impact the elements in the same way. Some projects might affect certain elements strongly and other elements hardly at all. This suggests that risk mitigation resources might be allocated differently for different projects. We developed a tool based on the Analytic Hierarchy Process [1] to assist in allocating resources to the various elements of the Turtle Island Matrix on a project-by-project basis and for a combination of projects overall (see Figure 2 for an example).
Figure 2: Example Allocations for Risk Mitigation for Two Economic Development Projects
To determine the allocation of resources, the members of the Environmental Review Panel would need to evaluate each of the proposals for their relative importance. This is done by completing a series of pair-wise comparisons. First, the Panel must decide which of the two projects in a pair is more important and click the appropriate button on the user interface. Then the Panel would express how much more important that project is than the other using a scroll bar on a scale from “1 – equally important” to “9 – absolutely more important” (see Figure 3). From these pair-wise comparisons, the program, implemented as macros in an Excel spreadsheet [2], calculates the weights of the proposals. These weights are represented as percentage values that sum to 100%.
Figure 3: Pair-wise Comparison of Economic Development Proposals
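The tool itself is an Excel spreadsheet with VBA macros [2], but the calculation it performs can be illustrated with a short sketch. The Python snippet below is only a minimal illustration, not the spreadsheet's code: the three proposal names and the judgement values in the comparison matrix are invented for the example, and the geometric-mean (row) approximation stands in for whatever AHP weight calculation the spreadsheet actually uses.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three proposals (names made up).
# A[i][j] holds the Saaty-scale judgement of how much more important
# proposal i is than proposal j (1 = equally important, 9 = absolutely
# more important); the mirrored cell holds the reciprocal.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Approximate the AHP priority vector with the geometric-mean (row) method;
# the spreadsheet may instead use the principal-eigenvector calculation.
row_gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = row_gm / row_gm.sum()   # fractions that sum to 1 (i.e. 100%)

for name, w in zip(["Proposal A", "Proposal B", "Proposal C"], weights):
    print(f"{name}: {w:.1%}")
```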
Then the Panel would do a similar set of pair-wise comparisons among the Turtle Island Matrix elements for each individual proposal (see Figure 4). When all of the pair-wise comparisons have been provided, the program calculates the percentage of effort to invest in each element to avoid risk. Again, for each project, the effort percentages for the Turtle Island Matrix elements sum to 100%.
Figure 4: Pair-wise Comparisons for the Turtle Island Matrix Elements for a Particular Economic Development Proposal
To get the overall resource allocation, the weights of the proposals are multiplied by the effort percentages for each Turtle Island Matrix element and summed over the proposals. The final results can be displayed as a graph, as shown in Figure 5.
Figure 5: Graphical Display of the Resource Allocation among the Turtle Island Matrix Elements Overall
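A minimal sketch of this aggregation step (again in Python rather than the spreadsheet's VBA, with invented proposal weights and placeholder effort percentages) is:

```python
import numpy as np

# Made-up proposal weights from the Figure 3 comparisons (sum to 1).
proposal_weights = np.array([0.6, 0.4])

# Placeholder per-proposal effort percentages over the 13 Turtle Island
# Matrix elements; each row sums to 1. Real values would come from the
# Figure 4 pair-wise comparisons.
effort = np.array([
    np.full(13, 1 / 13),
    np.full(13, 1 / 13),
])

# Overall allocation: weight each proposal's effort vector by the proposal's
# importance and sum over the proposals. The result again sums to 100%.
overall = proposal_weights @ effort
print(overall.round(3), overall.sum())
```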
It is hoped this methodology, implemented in an easy-to-use spreadsheet, will help the Environmental Review Panel allocate the resources available for risk mitigation to the various elements of the Turtle Island Matrix for the economic development proposals it receives.
References
1. Saaty, Thomas L. [1980], The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation, McGraw-Hill, New York.
2. Albright, S. Christian [2001], VBA for Modelers: Developing Decision Support Systems with Microsoft Excel, Duxbury, Pacific Grove, California.
Saturday, 4 January 2020
Environmental Review Panel Go/No-Go Checklist
When I retired, I was able to concentrate on Policy Dynamics Inc. My first project was for the Serpent River First Nation. They had established an Environmental Review Panel (ERP) to evaluate economic development proposals and to ensure that environmental considerations were given a high priority.
The Serpent River First Nation’s ERP had developed a detailed questionnaire to be completed by managers of economic development proposals. The onus is on the proposal manager to explain the goals and impact of their project by fully completing the questionnaire. The role of the ERP is to evaluate the proposals to ensure the First Nation receives economic and social benefits without sacrificing the environment. To conduct this evaluation, the ERP needs to be satisfied with the data provided by the proposal manager in the completed questionnaire.
Go/No-Go Checklists have been used for many years in the aviation industry when pilots need to conduct a complex pre-flight inspection of their aircraft. Go/No-Go Checklists have been successfully adapted for hospital use to ensure that medical procedures are conducted correctly by doctors and nurses prior to operations. As the title suggests, if the checklist is completed satisfactorily, then the operation is a Go. If the checklist is not completed to the operator’s satisfaction, it is No-Go.
I had worked with checklists in the Department of National Defence, and although this wasn't a simulation problem, I thought a Go/No-Go Checklist implemented with Microsoft Excel macros might support the ERP’s evaluation of the questionnaires. The Go/No-Go Checklist I developed has nine satisfaction levels of increasing importance. Each of the nine satisfaction levels in the evaluation has been assigned multiple questions from the questionnaire. In the checklist, the questions in each satisfaction level have been cross-referenced with sections in the written questionnaire.
There are cells to identify the proposal by name along with the proposal manager. There is also a cell to identify when the Checklist was last updated. The Checklist spreadsheet file would be saved individually with a proposal-specific name.
Each satisfaction level is colour-coded Red (unsatisfactory), Yellow (partially satisfied) or Green (satisfactory). The colour code is calculated automatically by the macro software as the ERP conducts its evaluation of the questionnaire. The colour codes are sequential. That is, satisfaction level 2 cannot be reached (Green) until satisfaction level 1 is obtained (Green). All nine satisfaction levels must be Green for the ERP to recommend the proposal be a Go. Until that time, the proposal is a No-Go.
Figure 1 shows an example of a Satisfaction Level Summary in which satisfaction level 1 is achieved (Green) and satisfaction level 2 is partially achieved (Yellow). Some questions in satisfaction level 3 have been answered, but the satisfaction level is still Red. No questions in satisfaction levels 4 through 9 have been answered yet.
[Table: a row labelled “Satisfaction Level” numbered 1 through 9, with a colour-coded “Satisfaction Level Achieved” cell under each level]
Figure 1: Satisfaction Level Summary
There is a cell in the checklist labelled “Green set point is:” (see Figure 2). The value in this cell should be between 0 and 100. This value is used by the macro software to compute the colour code for each satisfaction level based on the percent of the questions in each level that have been satisfactorily answered.
Green set point is: 75%
Figure 2: If 75% of the Questions are Answered Satisfactorily the SL is Coded Green
It is recommended that the Green set point be 100% because the goal of the Go/No-Go checklist is to ensure that all of the questions in the questionnaire have been answered satisfactorily before the project is approved as a Go.
However, if the user wants, the value of the Green set point could be set to, for example, 75%. The user would type the value 75 into the appropriate cell and press Enter. Then, if there are 8 questions in satisfaction level 1, once 6 of the questions in that section have been completed satisfactorily (6/8 = 75%), the cell under satisfaction level 1 in the summary turns Green (see Figure 1).
Entering a value in the cell labelled “Yellow set point” (see Figure 3) works in the same way as entering a value in the Green set point cell. Assume the value in the Yellow set point cell is, for example, 50% and the value in the Green set point cell is 75%. If there are 14 questions in satisfaction level 2 and 7 of them (50%) are completed to the satisfaction of the ERP, the cell under satisfaction level 2 in the summary turns Yellow (see Figure 1).
Yellow set point is: 50%
Figure 3: If 50% of the Questions are Answered Satisfactorily the SL is Coded Yellow
If there are 4 questions in satisfaction level 3 and only 1 question is satisfactorily answered, the cell under the satisfaction level in the summary turns Red (see Figure 1).
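Putting the set points together, the colour coding of the satisfaction levels can be sketched as follows. This Python snippet is only an illustration of the logic described above, not the macro code itself; in particular, how the spreadsheet displays a later level that meets its own threshold before the earlier levels are Green is an assumption here (the sketch simply holds it at Yellow).

```python
def level_colour(answered, total, green_set=0.75, yellow_set=0.50):
    """Colour of one satisfaction level from the fraction of its questions
    answered satisfactorily, using the Green and Yellow set points."""
    fraction = answered / total
    if fraction >= green_set:
        return "Green"
    if fraction >= yellow_set:
        return "Yellow"
    return "Red"

def summary(levels, green_set=0.75, yellow_set=0.50):
    """Colour every level and apply the sequential rule: a later level is
    held back from Green until all earlier levels are Green (an assumption
    about the display). The proposal is a Go only when all nine are Green."""
    colours, earlier_green = [], True
    for answered, total in levels:
        colour = level_colour(answered, total, green_set, yellow_set)
        if colour == "Green" and not earlier_green:
            colour = "Yellow"          # met its own threshold, but gated
        colours.append(colour)
        earlier_green = earlier_green and colour == "Green"
    return colours, ("Go" if all(c == "Green" for c in colours) else "No-Go")

# The worked examples above: 6 of 8 (Green), 7 of 14 (Yellow), 1 of 4 (Red),
# and six untouched levels.
print(summary([(6, 8), (7, 14), (1, 4)] + [(0, 1)] * 6))
```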
There is a cell entitled “% Complete is now set at:” (see Figure 4). The value in this cell should be between 0 and 100. It is used by the software to decide when each question has been satisfactorily completed. That is, if this value is below 100, an individual question can be only partially completed and the software will still consider the question to be satisfactorily answered.
% Complete is now set at: 75
Figure 4: A Question Needs to be 75% Complete to be Considered Satisfactory
It is recommended that the value of % Complete be set to 100. However, if the user would like to set the % Complete to, for example, 75%, then the user would type 75 in the cell and press Enter.
This approach might be useful if, for example, three out of four members of the ERP thought a question was satisfactorily answered; the user could then move the slider beside that question to 75% complete. At that point, the cell beside the question would turn Green, suggesting the ERP is sufficiently satisfied with the answer given by the proposal manager.
There are comment cells provided in the project identification section and for each of the satisfaction levels. These comment cells are for the user to make notes and are not used by the software.
Each of the nine satisfaction levels in the evaluation has been assigned multiple questions from the questionnaire. The satisfaction level description table explains how the questions have been organized (see Figure 5).
SATISFACTION LEVEL (SL 1-9) DESCRIPTIONS
1  Project Identification is Satisfactory
2  Project Activities and Preliminaries are Satisfactory
3  Benefits to First Nations are Satisfactory
4  Consultation with First Nation Communities is Satisfactory
5  Turtle Island Issue Identification is Satisfactory
6  Turtle Island Issue Considerations are Satisfactory
7  Problem Avoidance is Satisfactory
8  Risk Mitigation is Satisfactory
9  Reclamation Issues are Satisfactory
Figure 5: Summary Description of the Satisfaction Levels
Each of the questions assigned to a satisfaction level is numbered and cross-referenced with the written questionnaire.
For each of the questions in the checklist, there are three ways for the ERP to express their impression of the quality of the proposal manager’s answer.
One way is to use a slider or the arrow keys to increase or decrease their rating of the quality of the answer. For example, if the slider is used to move the value in the cell beside the question to 65, the cell turns Yellow, meaning the ERP is not sufficiently impressed with the answer given by the proposal manager. This is because the % Complete set point was 75. If the slider is used to set the cell to 75, the question is considered by the ERP to be sufficiently complete, the colour of the cell turns Green, and the box beside the question becomes checked.
The second way for the ERP to show their impression of the quality of the answers in the checklist is simply to enter a number into the cell beside the question. In this case, if the value entered is less than 75, the cell is coloured Yellow, and if the value entered is equal to or greater than 75, the cell is coloured Green.
The third way to record the ERP’s impression of the quality of the proposal manager’s answer is to click the checkbox beside the question. By clicking the checkbox, the ERP indicates that the question is completely answered (100% and therefore Green).
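All three input methods reduce to the same per-question test against the % Complete set point. The sketch below is a minimal Python illustration of that test, not the workbook's macro code; the function name, and the treatment of a question that has not been touched at all, are assumptions.

```python
def question_status(value=None, checked=False, pct_complete=75):
    """Colour and checkbox state for one checklist question.

    value   -- 0-100 completion set with the slider/arrow keys or typed
               directly into the cell (None if the question is untouched)
    checked -- True if the checkbox beside the question was clicked
    """
    completion = 100 if checked else (value if value is not None else 0)
    satisfied = completion >= pct_complete
    colour = "Green" if satisfied else "Yellow"
    return colour, satisfied            # 'satisfied' also drives the checkbox

# The examples from the text, with the % Complete set point at 75:
print(question_status(value=65))        # ('Yellow', False)
print(question_status(value=75))        # ('Green', True)
print(question_status(checked=True))    # ('Green', True)
```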
One of the benefits of the Go/No-Go Checklist is that it can be used to identify areas where the proposal manager’s answers to the questions are not complete or not acceptable to the ERP. The ERP can then go back to the proposal manager and ask them to answer those questions in more detail, or to change their proposal, to assure the ERP that the environmental factors are being sufficiently addressed.
Although this work didn't involve simulation, it was well received by the Serpent River First Nation Environmental Review Panel.