
Kluwer Academic Publishers 2001

DOI: 10.1007/1-4020-0611-X_220

Decision trees
Stuart Eriksen, Candice Huynh, and L. Robin Keller

University of California, Irvine, USA


A decision tree is a pictorial description of a well-defined decision problem. It is a graphical representation consisting of nodes (where decisions are made or chance events occur) and arcs (which connect nodes). Decision trees are useful because they provide a clear, documentable and discussible model of either how the decision was made or how it will be made.

The tree provides a framework for the calculation of the expected value of
each available alternative. The alternative with the maximum expected value is
the best choice path based on the information and mind-set of the decision-
makers at the time the decision is made. This best choice path indicates the
best overall alternative, including the best subsidiary decisions at future
decision steps, when uncertainties have been resolved.

The decision tree should be arranged, for convenience, from left to right in the
temporal order in which the events and decisions will occur. Therefore, the
steps on the left occur earlier in time than those on the right.

DECISION NODES
Steps in the decision process involving decisions between several choice
alternatives are indicated by decision nodes, drawn as square boxes. Each
available choice is shown as one arc (or “path”) leading away from its decision
node toward the right. When a planned decision has been made at such a
node, the result of that decision is recorded by drawing an arrow in the box
pointing toward the chosen option. As an example of the process, consider a
pharmaceutical company president's choice of which drug dosage to market.
The basic dosage choice decision tree is shown in Figure 1. Note that the
values of the eventual outcomes (on the far right) will be expressed as some
measure of value to the eventual user (for example, the patient or the
physician).
Figure 1 The choice of drug dosage. (A decision node with three branches, Dosage A, Dosage B, and Dosage C, each leading to the value of that dosage.)

CHANCE NODES
Steps in the process which involve uncertainties are indicated by circles
(called chance nodes), and the possible outcomes of these probabilistic
events are again shown as arcs or paths leading away from the node toward
the right. The results of these uncertain factors are out of the hands of the
decision-maker; chance or some other group or person (uncontrolled by the
decision-maker) will determine the outcome of this node. Each of the potential
outcomes of a chance node is labeled with its probability of occurrence. All
possible outcomes must be indicated, so the sum of the potential outcome
probabilities of a chance node must equal 1.0. Using the drug dose selection
problem noted above, the best choice of dose depends on at least one
probabilistic event: the level of performance of the drug in clinical trials, which
is a proxy measure of the efficacy of the drug. A simplified decision tree for
that part of the firm's decision is shown in Figure 2. Note that each dosage
choice has a subsequent efficacy chance node similar to the one shown, so
the expanded tree would have nine outcomes. The probabilities (p1, p2, and
p3) associated with the outcomes are expected to differ for each dosage.

Figure 2 The choice of drug dosage based on efficacy outcome. (One dosage branch is expanded into a chance node with outcomes Efficacy Level E1, E2, and E3, with probabilities p1, p2, and p3 and the value of each level; the other dosage branches have similar nodes.)
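As a minimal sketch (mine, not part of the original article), a chance node can be represented in Python as a list of (probability, value) pairs; the helper below checks that the outcome probabilities sum to 1.0 and computes the node's expected value. The efficacy probabilities and values used here are purely illustrative.

def expected_value(outcomes):
    """Expected value of a chance node given as (probability, value) pairs."""
    total_prob = sum(p for p, _ in outcomes)
    if abs(total_prob - 1.0) > 1e-9:
        raise ValueError(f"outcome probabilities sum to {total_prob}, not 1.0")
    return sum(p * v for p, v in outcomes)

# Hypothetical efficacy chance node for one dosage: (p1, value of E1), etc.
efficacy_node = [(0.5, 80.0), (0.3, 55.0), (0.2, 10.0)]
print(expected_value(efficacy_node))  # 0.5*80 + 0.3*55 + 0.2*10 = 58.5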

There are often several nodes in a decision tree; in the case of the drug
dosage decision, the decision will also depend on the toxicity as demonstrated
by both animal study data and human toxicity study data, as well as on the
efficacy data. The basic structure of this more complex decision is shown in
Figure 3. The completely expanded tree has 27 eventual outcomes and
associated values. Notice that, although this need not always be the case, here the probabilities (q1, q2, and q3) of the toxicity levels are independent of the efficacy level.

Figure 3 The choice of dosage based on uncertain efficacy and toxicity. (Each dosage branch leads to an efficacy chance node with outcomes E1, E2, and E3 and probabilities p1, p2, and p3; each efficacy outcome in turn leads to a toxicity chance node with outcomes T1, T2, and T3, probabilities q1, q2, and q3, and values such as the value of E2 & T1.)

One use of a decision tree is to clearly display the factors and assumptions
involved in a decision. If the decision outcomes are quantified and the
probabilities of chance events are specified, the tree can also be analyzed by
calculating the expected value of each alternative. If several decisions are
involved in the problem being considered, the strategy best suited to each
specific set of chance outcomes can be planned in advance.

PROBABILITIES
Estimates of the probabilities for each of the outcomes of the chance nodes
must be made. In the simplified case of the drug dose decision above, the
later chance node outcome probabilities are modeled as being independent of
the earlier chance nodes. While not intuitively obvious, careful thought should
show that the physiological factors involved in clinical efficacy must be
different from those involved in toxicity, even if the drug is being used to treat
that toxicity. Therefore, with most drugs, the probability of high human toxicity
is likely independent of the level of human efficacy. In more general, non-drug situations, however, the probabilities at later steps in a sequence are often dependent conditional probabilities, since their values depend on the earlier chance outcomes.

For example, consider the problem in Figure 4, where the outcome used for the drug dose decision is based on the eventual sales of the drug. The values of the eventual outcomes are now expressed as sales for the firm.
Figure 4 The choice of dosage based on efficacy and toxicity and their eventual effect on sales. (Each dosage, efficacy, and toxicity combination leads to a sales chance node with outcomes High, Medium, and Low Sales and the value of each sales level to the firm.)

The probability of high sales depends on the efficacy as well as on the toxicity,
so the dependent conditional probability of high sales is the probability of high
sales given that the efficacy is level 2 and toxicity is level 2, which can be
written as p(High Sales|E2&T2).
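One way such dependent conditional probabilities could be stored is sketched below as a Python dictionary keyed by the conditioning (efficacy, toxicity) pair. The E2 & T2 entry uses the probabilities that appear later in Figure 5; the E2 & T1 entry is invented purely for illustration. Each conditional distribution over sales levels must itself sum to 1.0.

# p(Sales | Efficacy, Toxicity), stored per conditioning event.
p_sales_given = {
    ("E2", "T2"): {"High": 0.30, "Medium": 0.50, "Low": 0.20},  # from Figure 5
    ("E2", "T1"): {"High": 0.45, "Medium": 0.40, "Low": 0.15},  # illustrative only
    # ... one entry for each (efficacy, toxicity) combination
}

# Every conditional distribution must sum to 1.0.
for condition, dist in p_sales_given.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9, condition

print(p_sales_given[("E2", "T2")]["High"])  # 0.3, i.e., p(High Sales|E2&T2)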

OUTCOME MEASURES
At the far right of the tree, the possible outcomes are listed at the end of each
branch. To calculate numerical expected values for alternative choices,
outcomes must be measured numerically and often monetary measures will
be used. More generally, the “utility” of the outcomes can be calculated. Single
or multiple attribute utility functions have been elicited in many decision
situations to represent decision makers' preferences for different outcomes on
a numerical (although not monetary) scale.
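As one hedged illustration (the article does not prescribe a functional form), a single-attribute exponential utility function can place monetary outcomes on a common 0 to 1 scale; the risk-tolerance parameter below is hypothetical and would in practice be elicited from the decision maker.

import math

def exponential_utility(x, low, high, risk_tolerance):
    """Constantly risk-averse exponential utility, rescaled so u(low) = 0 and u(high) = 1."""
    def u(z):
        return 1.0 - math.exp(-z / risk_tolerance)
    return (u(x) - u(low)) / (u(high) - u(low))

# Example: place $9.2 M of sales on a 0-1 utility scale between $6.3 M and $11.5 M.
print(round(exponential_utility(9.2, low=6.3, high=11.5, risk_tolerance=10.0), 3))
# About 0.62, slightly above the risk-neutral value (9.2 - 6.3)/(11.5 - 6.3) = 0.56.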

THE TREE AS AN AID IN DECISION MAKING


The decision tree analysis method is called “fold-back” and “prune.” Beginning
at a far right chance node of the tree, the expected value of the outcome
measure is calculated and recorded for each chance node by summing, over
all the outcomes, the product of the probability of the outcome times the
measured value of the outcome. Figure 5 shows this calculation for the first
step in the analysis of the drug-dose decision tree.

Figure 5 The first step, calculating the expected value of the chance node for sales. (For Dosage B with efficacy E2 and toxicity T2, the sales outcomes are High Sales at $11.5 M with P(11.5|B,E2,T2) = 0.30, Medium Sales at $9.2 M with P(9.2|B,E2,T2) = 0.50, and Low Sales at $6.3 M with P(6.3|B,E2,T2) = 0.20.) EV = 0.3(11.5) + 0.5(9.2) + 0.2(6.3) = 9.31.
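The same fold-back step can be reproduced directly from the figure's numbers in Python:

# Expected sales ($M) at the chance node for dosage B with efficacy E2 and toxicity T2.
ev_sales = 0.30 * 11.5 + 0.50 * 9.2 + 0.20 * 6.3
print(round(ev_sales, 2))  # 9.31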

This step is called “folding back the tree” since the branches emanating from
the chance node are folded up or collapsed, so that the chance node is now
represented by its expected value. This is continued until all the chance nodes
on the far right have been evaluated. These expected values then become the
values for the outcomes of the chance or decision nodes further to the left in
the diagram. At a decision node, the best of the alternatives is the one with the
maximum expected value, which is then recorded by drawing an arrow
towards that choice in the decision node box and writing down the expected
value associated with the chosen option. This is referred to as “pruning the
tree,” as the less valuable choices are eliminated from further consideration.
The process continues from right to left, by calculating the expected value at
each chance node and pruning at each decision node. Finally the best choice
for the overall decision is found when the last decision node at the far left has
been evaluated.
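A minimal Python sketch of this fold-back and prune procedure follows. The nested-dictionary tree representation and the name fold_back are my own choices, not from the article: chance nodes are collapsed to their probability-weighted expected value, and decision nodes are pruned to the branch with the maximum expected value.

def fold_back(node):
    """Fold back and prune a decision tree, returning (expected value, best choice label).

    A node is either a number (a terminal outcome value) or a dict:
      {"type": "chance",   "branches": [(probability, child), ...]}
      {"type": "decision", "branches": [(label, child), ...]}
    """
    if isinstance(node, (int, float)):            # terminal outcome value
        return float(node), None

    if node["type"] == "chance":
        # Fold back: probability-weighted sum of the children's expected values.
        ev = sum(p * fold_back(child)[0] for p, child in node["branches"])
        return ev, None

    # Decision node: prune all but the branch with the maximum expected value.
    best_label, best_ev = None, float("-inf")
    for label, child in node["branches"]:
        child_ev, _ = fold_back(child)
        if child_ev > best_ev:
            best_label, best_ev = label, child_ev
    return best_ev, best_label

Folding back the sales node of Figure 5, for example, amounts to calling fold_back({"type": "chance", "branches": [(0.30, 11.5), (0.50, 9.2), (0.20, 6.3)]}), which returns an expected value of about 9.31.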

EXAMPLE
In this example, we consider a decision faced by a patient who is contemplating lasik eye surgery to improve her vision. The basic decision process is shown in Figure 6. The initial decision the patient encounters is whether to have the surgery, wait for more technological advances, or not have the surgery at all.
Figure 6 The initial decision point. (The decision node, labeled Lasik Surgery, has three branches: Surgery, Wait 5 Yrs, and No Surgery.)

Suppose that if a patient chooses to wait at the first decision node, she will
observe the outcome of possible technological advances at the first chance
node, and then will have to make the decision of whether to have the surgery
or not. Figure 7 shows a detailed decision tree of this patient’s decision
process. The entries at the end of the branches can be seen as a measure of
health utility to the patient, on a 0-100 scale, where 100 is the best level of
health utility. Other patients can customize this tree to their personal
circumstances using a combination of chance and decision nodes.

Figure 7 Complete mapping of the decision process of whether or not to have lasik surgery. The branches, probabilities, and health utilities are:

Surgery now: Successful (0.75), utility 100; Successful w/ Setbacks (0.21), utility 70; Unsuccessful (0.04), utility 0.
Wait 5 years, then Significant Tech Improvements (0.70): Surgery - Successful (0.95), 95; Successful w/ Setbacks (0.04), 65; Unsuccessful (0.01), 0; or No Surgery, 40.
Wait 5 years, then Moderate Tech Improvements (0.20): Surgery - Successful (0.92), 95; Successful w/ Setbacks (0.06), 65; Unsuccessful (0.02), 0; or No Surgery, 40.
Wait 5 years, then No Tech Improvements (0.10): Surgery - Successful (0.75), 95; Successful w/ Setbacks (0.21), 65; Unsuccessful (0.04), 0; or No Surgery, 40.
No Surgery: 40.
Following the method of "folding back the tree," we find that the expected health utility of having the surgery immediately is 89.70, of waiting 5 years is 91.74, and of not having the surgery at all is 40.00, where each chance node is replaced by its expected health utility. Waiting 5 years is therefore the optimal decision for the patient in this example.
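This fold-back arithmetic can be reproduced in a few lines of Python from the numbers in Figure 7; the helper best_after_tech is my own shorthand for the downstream decision, where the patient takes the better of the surgery chance node's expected utility and the no-surgery utility of 40.

# Immediate surgery (Figure 7): 0.75 -> 100, 0.21 -> 70, 0.04 -> 0.
surgery_now = 0.75 * 100 + 0.21 * 70 + 0.04 * 0             # = 89.70

def best_after_tech(p_success, p_setback, p_fail):
    # After waiting, choose the better of surgery and no surgery (utility 40).
    return max(p_success * 95 + p_setback * 65 + p_fail * 0, 40)

wait_5_years = (0.70 * best_after_tech(0.95, 0.04, 0.01)    # significant improvements: 92.85
                + 0.20 * best_after_tech(0.92, 0.06, 0.02)  # moderate improvements: 91.30
                + 0.10 * best_after_tech(0.75, 0.21, 0.04)) # no improvements: 84.90
# wait_5_years = 0.70*92.85 + 0.20*91.30 + 0.10*84.90 = 91.745, reported as 91.74

no_surgery = 40.0

print(surgery_now, wait_5_years, no_surgery)  # roughly 89.7, 91.745, and 40.0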

See Bayesian decision theory; Decision analysis; Decision making; Decision problems; Group decision making; Influence diagrams; Multi-attribute utility theory; Preference theory; Utility theory.

