Calculation of Win-Loss Distributions for Blackjack
by
JOHN HAN CHANG
Submitted to the Department of
Electrical Engineering and Computer Science
in Partial Fulfillment of the
Requirements of the Degree of
BACHELOR OF SCIENCE IN COMPUTER SCIENCE AND ENGINEERING
at the
MASSACHUSETTS INSTITUTE OF TECHNOLOGY
August 1985
© John Han Chang 1985
The author hereby grants to M.I.T. permission to reproduce and to distribute copies of this thesis document in whole or in part.
Signature of Author:
Department of Electrical Engineering and Computer Science
August 21, 1985
Certified by:
Nesmith Ankeny
Thesis Supervisor
Accepted by:
David Adler
Chairman, Undergraduate Thesis Committee
Blackjack is the only casino game that can be beaten fairly, without the aid of any
external devices. Since Professor Thorp published Beat the Dealer in 1962, its
popularity has increased dramatically. Although blackjack can be beaten, the casinos
obviously would not continue to offer the game if they lost money on it. Winning play
requires a combination of proper play and money management.
This paper discusses a procedure to calculate the distribution of win and loss for any
hand in the game of blackjack. (For an explanation of the rules of blackjack, consult the
Appendix.) Why is this important? Knowledge of the distribution of outcomes allows
you to determine precisely how much to bet in favorable situations. Knowledge of this
distribution also enables you to determine how much to bet when playing multiple
hands. One interesting fact is that the dealer wins an arbitrary hand significantly more
often than a player does, even when the composition of the shoe favors the player. It
is just that when the player wins, he often wins after doubling or splitting. Moral: if
you're out to double your money in one shot, don’t play blackjack.
There are various methods that have been used to calculate the basic strategy for
blackjack. Originally, a close approximation was achieved by simulating the results
of thousands of hands on a computer. The precise advantage of the player and the
optimal play in close decisions remained unknown. In addition, small variations in the
rules would necessitate a resimulation. The major disadvantage of the Monte Carlo
technique (simulation) is that it breaks down when looking at subsamples that occur
infrequently. The technique requires a large number of random samples to achieve
any given accuracy. One situation in which the technique fails is in determining the
probability of losing four bet units from one initial hand. (This unfortunate event could
arise if one were to split a pair, double down on each of the resulting hands, and lose
everything.) This event occurs perhaps one in three thousand hands; a sample of 1000
would require 3 million hands to be simulated. If one were interested in an even rarer
event, say losing four when having a pair of eights, that number would be even larger.
My approach to deriving the basic strategy makes one major assumption. I assume
that the cards played by the player do not change the probabilities of the dealer totals.
Actually the assumption is more encompassing. I assume that the outcome of any
hand does not change the probability of the outcome of any other hand. For the
casino game, which involves at least one deck of cards, this assumption is valid. Only
when a few cards remain does such an approximation become invalid. Consider the
following situation: two cards remain unseen, a ten and an 8. The dealer has a ten
showing, and the player has a total of 10 on his first two cards. This program would
assign independent probabilities of .5 to both the player and the dealer getting 20 and
18. Consequently, the program would calculate a .5 probability of a tie, which results
from the .25 probability of tying at 18 and the .25 probability of tying at 20. It also
calculates that the player has a .25 probability of winning and a .25 probability of
losing. Of course, the probabilities are dependent: if the player has 20, the dealer must
have 18, and vice versa. A tie is impossible. This is, however, an extreme case.
In general, this assumption produces a slightly pessimistic expectation for the
player. Epstein describes the error as ‘miniscule,’ amounting to an expectation er-
ror of .003 for one deck, a figure determined by simulating the results of millions of
hands. The error for more than one deck is even less, since the effects of removal of
any one card become smaller when the number of cards increases. The converse is also
true: with fewer decks, the error of the approximation grows.
Griffin describes the difference in expectation for n decks as varying as 1/n. For
example, if the expectation for one deck is x and that for an infinite deck is y, then the
expectation for 2 decks is approximately halfway between, at (x + y)/2; in general, for
n decks, the expectation is y + (x - y)/n. An ‘infinite’ deck is one in which removal
of any card does not affect the probability for the next card. My approximation has
no error when dealing with an ‘infinite’ deck, since independence between hands is a
property of such a deck. For four decks, the error is approximately .003/4 = .00075.
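Stated as code, Griffin's interpolation is a one-liner; the helper below is only an illustrative sketch and is not part of the program described later:

/* Griffin's 1/n interpolation: x = single-deck expectation,
 * y = infinite-deck expectation, n = number of decks.
 * The .003 one-deck independence error scales the same way: about .003/4
 * when four decks are used.                                              */
double n_deck_expectation(double x, double y, double n)
{
    return y + (x - y) / n;   /* n = 1 gives x; as n grows the result approaches y */
}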
In general it is more favorable to play a game with as few decks as possible. Griffin
attributes the gain in playing fewer decks mostly to the increased favorability of dou-
bling; it usually favors the player to remove the cards composing a hand that adds to
9, 10, or 11, a hand that is often doubled. These cards tend to favor the dealer when
they are prevalent in the deck because they have a tendency to help his hand reach a
total from 17 to 21. Tens, on the other hand, when drawn as hit cards, more often than
not bust the hand to which they are drawn. For a hand totaling 9, 10, or 11, however,
they are most favorable. Since it is more likely that a ten or ace will be drawn after
smaller cards have been removed, it is also more likely that either the dealer will bust,
or the player will have a good hand. The likelihood of these events increases as the
number of decks the game is dealt from decreases, because, again, the effect of removal
of any one card increases as the number of decks decreases.
Discussion of the assumption
The reason for this assumption is to reduce the order of the algorithm from O(n^m)
to O(nm), where:

n = number of card sequences that can develop from the original cards
m = number of hands possible
  = dealer's hand + player's hand + number of additional hands due to splitting
  = 2 + number of times splitting is allowed.
If the probability for each hand is calculated exactly, a split pair would necessitate
calculating the probability for each possible sequence of cards in a hand that could
develop from the initial hand, then calculating probabilities for all possibilities for the
second hand, and finally calculating those for the dealer. For each possible development of the player's hand, one must calculate the dealer probabilities, which results in n^2
calculations. If the player splits, for each possible developed hand from the first, the
program must enumerate all possible hands from the second as well as those of the
dealer, resulting in n^3 calculations.
An initial hand contains, on average, approximately 1000 possible card sequences.
Therefore, if n is 1000, a billion (n^3) calculations ensue. If resplitting up to three
times (four hands) is allowed, the analysis requires O(n^5) ~ 10^15 operations. For a
computer that does 10^7 operations per second, this would take at least three years, an
unreasonable length of time for just one hand. To consider all 550 dealer-player initial
hand combinations would take over a millennium.
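The arithmetic behind these estimates can be reproduced directly; the fragment below is only illustrative:

#include <stdio.h>
#include <math.h>

/* Rough cost of the exact calculation: about n = 1000 card sequences per
 * hand and m = 5 hands (the dealer plus four player hands after resplitting). */
int main(void)
{
    double n = 1000.0, m = 5.0;
    double ops = pow(n, m);                       /* about 10^15 operations  */
    double seconds = ops / 1.0e7;                 /* at 10^7 operations/sec  */
    double years = seconds / (365.0 * 24 * 3600);
    printf("%.1e operations, roughly %.1f years\n", ops, years);
    return 0;
}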
A clever approach might avoid recalculating the probabilities for each subsequent
split hand, since the hands that can result are a subset of the ones being considered.
If the number of different probabilities is sufficiently small, a table can be constructed
containing the probabilities for each possible hand. Nonetheless, the analysis given
above still holds; the amount of time each operation takes can be reduced (because
table lookup is about 1000 times faster than these calculations), but the aggregate
number is still too large to deal with. The running time might be reduced to a year on
a powerful minicomputer, such as a VAX with floating-point hardware support.
The assumption of independence among hands allows one to calculate all probabil-
ities for one player hand. The dealer probabilities can be calculated separately (in fact,
they are calculated first). The probabilities of each total for both player and dealer are
then compared to produce an expectation and a distribution of win, loss, or tie. For
example, if the dealer probabilities of various totals are calculated as:

Dealer total:   17     18     19     20     21     bj    bust
P(total):      .114   .113   .114   .333   .036   .078   .213
(These are approximate probabilities when the dealer has a 10 showing.)
And the player probabilities of totals are:

Player total:  <17     17     18     19     20     21     bj   bust
P(total):     .385   .078   .073   .078   .078   .307     0      0

(These are approximate probabilities when the player has a total of 11, given optimal play against the dealer 10.)
Then, the distribution is:
Win:      -2     -1      0     +1     +2
P(win):  .388   .078   .063     0    .471
The program maximizes expectation to determine the optimal play; in this example,
the optimal play is to double down the 11 against the dealer's 10. The result is either
to win twice the original bet, to lose twice, or to tie. A third possibility, to lose the
original bet, occurs only when the dealer has blackjack.
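The comparison step itself is straightforward once the two distributions are in hand. The sketch below is illustrative (the names are invented, and the actual program additionally weights each outcome by the amount bet and by the 3:2 blackjack payoff):

/* Categories: 0 = stood below 17, 1..5 = totals 17..21, 6 = blackjack, 7 = bust. */
#define NCAT 8

void compare(const double dealer[NCAT], const double player[NCAT],
             double *pwin, double *ptie, double *plose)
{
    int d, p;
    *pwin = *ptie = *plose = 0.0;
    for (d = 0; d < NCAT; d++)
        for (p = 0; p < NCAT; p++) {
            double pr = dealer[d] * player[p];        /* independence assumption   */
            if (p == 7)                *plose += pr;  /* player busts (loses even
                                                         if the dealer also busts) */
            else if (d == 7)           *pwin  += pr;  /* dealer busts              */
            else if (p == 6 && d != 6) *pwin  += pr;  /* untied player blackjack   */
            else if (d == 6 && p != 6) *plose += pr;  /* untied dealer blackjack   */
            else if (p > d)            *pwin  += pr;  /* higher total wins         */
            else if (p < d)            *plose += pr;
            else                       *ptie  += pr;
        }
}

Applied to the distributions tabulated above, this comparison gives the .063 tie probability shown in the example; weighting the wins and losses by the doubled bet and treating the dealer blackjack separately gives the remaining entries.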
In any case, the independence assumption results in O(2n) ~ 2000 calculations, re-
gardless of whether the player has split. In comparison to the exact calculations, the
results are instantaneous. If he splits, each hand is considered identical to the first;
only calculating the distribution of win/loss becomes more involved. An example of
one of the cases that must be enumerated is calculating the probability of tying after
a split. This can result from the player losing one hand and winning the other; it can
also result from tying both hands; it can even result from losing one double down and
winning another.

Program discussion
Usage
The program, written in C, was compiled and run on both an IBM PC using
Microsoft C 3.0 and a VAX 11/750 under 4.2 BSD Unix. It was written as a large
group of sub-programs, listed on pages 24-38. Two calling routines were written to
access these sub-programs. One, on pages 17-19, takes as input player and dealer hands.
Each hand is delimited by a 0. The number of decks is specified on the command line
when the program is run. If zero decks is input, the program takes the next 10 numbers
as the number of cards corresponding to its position in the list: the first number is the
number of aces; the second, the number of twos; and so on.
A second calling routine, on pages 20-23, is included for completeness. It generates
all possible hands for a specified number of decks and adds their contributions to the
total expectation, win-loss distribution, and distribution of hand totals.
The program not only derives the optimal strategy for any player and dealer hands,
but also calculates the distribution of outcomes for such a strategy. Given such a
distribution, the precise amount to bet for a given level of risk can be calculated exactly.
Dealer probabilities
Dealer probabilities are calculated independently of the player's probabilities. The
algorithm for the routine is as follows:
Dealer probability routine (dprobe)
If the dealer's hand is less than 17,
For each card that does not bust the hand,
Take a card from the shoe and add it to the dealer’s hand
Call the dealer probability routine.
Remove this card from the hand and replace it in the shoe.
Sum probabilities of dealer totals; subtract from one to get P(bust).
Otherwise, add the probability of this hand to the probability for its total.
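The recursion translates almost directly into C. The following is only an illustrative sketch; the names and data layout are invented here and are not those of the actual listing:

/* shoe[1..10] = number of each rank left (10 covers tens and faces), shoe[0] = total.
 * dp[0..4] accumulates P(17)..P(21); P(bust) is one minus their sum.               */
double dp[5];

int soft_total(const int hand[], int ncards)      /* best total not over 21 */
{
    int i, t = 0, aces = 0;
    for (i = 0; i < ncards; i++) { t += hand[i]; if (hand[i] == 1) aces++; }
    if (aces > 0 && t + 10 <= 21) t += 10;         /* count one ace as eleven */
    return t;
}

void dprob(int shoe[], int hand[], int ncards, double p)
{
    int t = soft_total(hand, ncards), c;
    if (t >= 17) {                                 /* dealer stands on all 17s  */
        if (t <= 21) dp[t - 17] += p;              /* busts fall out of the sum */
        return;
    }
    for (c = 1; c <= 10; c++)
        if (shoe[c] > 0) {
            double pc = (double)shoe[c] / shoe[0];
            shoe[c]--; shoe[0]--;                  /* take the card from the shoe */
            hand[ncards] = c;
            dprob(shoe, hand, ncards + 1, p * pc);
            shoe[c]++; shoe[0]++;                  /* replace it in the shoe      */
        }
}

The routine would be started with the up card already in the dealer's hand, for example dprob(shoe, dealer_hand, 1, 1.0), with dealer_hand[] declared large enough to hold any possible sequence of draws.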
Hitting routine
This is another recursive routine. The routine compares the expectation of drawing
with the expectation of standing. The question is not resolved until the criterion to
stop is achieved. This criterion is that the expectation of standing must be greater
than -1, and that the hand must be at least a hard 17 or a soft nineteen. Why? From
first principles, the stopping criterion should be a total of 21. For the vast majority of
deck compositions, however, the expectation of hitting hard 17 or soft 19 versus any
dealer upcard is less than that of standing. Also in the vast majority of cases, if it is
best to stand on a total t, then it is best also to stand on any total greater than t. A
refinement to the routine is that only cards that do not bust the hand are dealt; this
increases the speed of the routine as well, since floating point calculations take place
only when cards are dealt.
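An illustrative sketch of the comparison follows (again, the names are invented; soft_total() and dp[] are as in the dealer sketch above, and the dealer blackjack is left out for simplicity):

int soft_total(const int hand[], int ncards);      /* as in the dealer sketch   */
extern double dp[5];                               /* dealer P(17)..P(21)       */

static int is_soft(const int hand[], int ncards)   /* an ace counted as eleven? */
{
    int i, t = 0, aces = 0;
    for (i = 0; i < ncards; i++) { t += hand[i]; if (hand[i] == 1) aces++; }
    return aces > 0 && t + 10 <= 21;
}

double stand_ev(int total)          /* expectation of standing, per unit bet */
{
    int d;
    double ev = 0.0, pbust = 1.0;
    if (total > 21) return -1.0;
    for (d = 17; d <= 21; d++) {
        pbust -= dp[d - 17];
        if (total > d)      ev += dp[d - 17];
        else if (total < d) ev -= dp[d - 17];
    }
    return ev + pbust;              /* a dealer bust wins one unit */
}

double hit_ev(int shoe[], int hand[], int ncards)
{
    int t = soft_total(hand, ncards), c;
    double ev_stand = stand_ev(t), ev_hit = 0.0;

    /* Stopping rule from the text: stand once standing is not a sure loss
     * and the hand is at least a hard 17 or a soft 19.                    */
    if (ev_stand > -1.0 && (t >= 19 || (t >= 17 && !is_soft(hand, ncards))))
        return ev_stand;

    for (c = 1; c <= 10; c++)
        if (shoe[c] > 0) {
            double pc = (double)shoe[c] / shoe[0];
            shoe[c]--; shoe[0]--;
            hand[ncards] = c;
            if (soft_total(hand, ncards + 1) > 21)
                ev_hit += -1.0 * pc;               /* this draw busts the hand */
            else
                ev_hit += pc * hit_ev(shoe, hand, ncards + 1);
            shoe[c]++; shoe[0]++;
        }
    return ev_hit > ev_stand ? ev_hit : ev_stand;  /* take the better play */
}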
Splitting routine
This is the most complicated routine of all. The routine treats aces and non-aces
separately as follows:
Aces
Splitting aces is the easier case to handle, because casinos generally allow only one
card to be dealt to each ace. Resplitting is generally not allowed. The routine assumes
an independence between each of the split aces. Therefore, one and only one card is
dealt to one ace; the split distribution routine is then called with the appropriate player
and dealer probabilities of totals.
Non-aces

After a player splits, his options remain essentially the same as when he started; of
course, he may stand or hit. In many casinos, he may also resplit or double down. As a
result, all the options must be compared in order to determine which is best. After this
is done for one hand, the split distribution routine is invoked with the distribution of
player totals and amounts bet for each total. This routine generates the probabilities
of winning from -4 to +4 bet units. It simply considers all possible permutations of
results for both hands. This program does not handle the case of resplitting, since
the number of possibilities for three or more hands becomes unwieldy, though not
ridiculously unmanageable.
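Under the independence assumption the two post-split hands are identical and independent, so the enumeration is just the convolution of the single-hand distribution with itself. An illustrative sketch (the real routine also carries the amount bet on each total):

/* one[0..4] = probability that a single post-split hand nets -2, -1, 0, +1, +2
 * units (the doubled outcomes come from doubling down after the split).
 * both[0..8] receives the probability of netting -4 .. +4 over the two hands.  */
void split_dist(const double one[5], double both[9])
{
    int i, j;
    for (i = 0; i < 9; i++) both[i] = 0.0;
    for (i = 0; i < 5; i++)                    /* result of the first hand  */
        for (j = 0; j < 5; j++)                /* result of the second hand */
            both[i + j] += one[i] * one[j];    /* index 4 is a net of zero  */
}

The overall push, both[4], collects exactly the cases described above: a win on one hand against a loss on the other, a tie on both, or a lost double down offset by a won one.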
Discussion of results
A prudent gambler (investor) never bets all he has unless he is absolutely guaran-
teed to win. If the investor wishes to increase his money as quickly as possible, the
appropriate amount to wager in an advantageous situation is governed by the Kelly
criterion, which says to bet in proportion to your expectation. In addition to the ex-
pectation, however, the distribution of results also governs the correct bet size. For
example, if the expectation of some game were .01 and the distribution of its results
included zero chance of losing, we would attempt to bet as much as possible on this
game. If, on the other hand, the expectation were the same, but the distribution were
.495 probability of losing and .505 probability of winning, we would be fools to bet any
significant fraction of our wealth.
An application of the determination of the distribution of outcomes is to solve this
problem. Once the distribution is known and a utility function of money is chosen,
the problem becomes one of simple numerical analysis. The utility function is simply
maximized with respect to the bet size. It is not the intention of this paper to discuss
utility functions or betting systems in general. Rather, it is to introduce a tool that
can be used in conjunction with the results of these subjects.

Consider the following situation: At the end of a round, the dealer accidentally
shows the next card, a ten, to the player. What fraction of his bankroll should the
player bet on the first hand in the next round?
In order to maximize the player's rate of capital growth, he should bet an amount
which maximizes the expectation of his utility, i.e., E(ln M), where M is his bankroll
and ln M is his utility function. The program gives the probability distribution for a
player receiving a ten as his first card in a four-deck game as approximately:

Win:     -1      0     +1    +1.5
P(win): .316   .119   .491   .074

We want to maximize .316 ln(M - B) + .491 ln(M + B) + .074 ln(M + 1.5B). B is
the amount that should be bet under the criterion of maximal growth rate. The result
is B = .0927 M.
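The maximization itself is elementary numerical work; an illustrative sketch follows (the distribution used here is made up for the example and is not output of the program):

#include <stdio.h>
#include <math.h>

/* Expected log-wealth when a fraction b of the bankroll is bet on a gamble
 * whose net result per unit bet is x[i] with probability p[i].              */
double expected_log(const double p[], const double x[], int n, double b)
{
    int i;
    double e = 0.0;
    for (i = 0; i < n; i++)
        e += p[i] * log(1.0 + b * x[i]);
    return e;
}

int main(void)
{
    /* Hypothetical outcome distribution: lose 1, push, win 1, win 1.5. */
    double x[] = { -1.0, 0.0, 1.0, 1.5 };
    double p[] = { 0.45, 0.10, 0.40, 0.05 };
    double b, best_b = 0.0, best = -1.0e30;

    for (b = 0.0; b < 0.5; b += 0.0001) {          /* simple grid search */
        double e = expected_log(p, x, 4, b);
        if (e > best) { best = e; best_b = b; }
    }
    printf("fraction of bankroll to bet: %.4f\n", best_b);
    return 0;
}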
The program can be run for all possible dealer and player hands. For a single deck
game played under current Atlantic City rules (described in the Appendix) the net
expectation is .00136. The win-loss distribution is as follows:
Win     Probability
-4       .0002242
-3       .0015758
-2       .0445612
-1       .4326174
 0       .0843654
+1       .3242601
+2       .0634407
+3       .0020058
+4       .0004570
bj       .0464924
The same optimization of the bet gives B = .00109 M. If late surrender is allowed,
the probabilities for winning one, losing one, and tying change to:

Win     Probability
-1       .4088225
 0       .0822621
+1       .3171211
-.5      .0330372
As noted above, the probability of losing one-half compensates for the reduced
probabilities of losing one, winning one, and tying. The expectation increases to .00150.
The optimal bet is .00114 M. The value of late surrender in a single deck game is small,
only +.014%.
For four decks, the expectation becomes -0.00403. The win-loss distribution is:
Win     Probability
-4       .0001997
-3       .0016831
-2       .0423956
-1       .4346690
 0       .0870881
+1       .3268699
+2       .0591181
+3       .0021410
+4       .0003983
bj       .0454372
With late surrender, the expectation for four decks increases to -.00332, an increase
of .071%. These results demonstrate that single deck blackjack is in fact more favorable
than the four deck game. Surrender is only taken when the player has a bad hand.
Therefore, surrender has a greater effect in the four deck game because it is a worse
game; consequently, the opportunities for surrender are more frequent.
These results are only a sampling of those possible. The program contains various
flags that can be changed in order to calculate the effect of various rule changes such
as no double down after split, or dealer hits soft seventeen.
Other games that can be analyzed with this approach include double exposure,
and possibly some aspects of tournament blackjack. Double exposure is a game similar
to blackjack, except that the dealer's second card is shown to the player before his
decisions are made. To compensate for this edge, the house wins all ties and pays
blackjack at 1:1. To analyze double exposure simply requires changing the expectation
evaluation to make ties a loss and blackjacks pay only even money. The program takes
as input any cards as dealer or player initial hands, so that no further changes are
necessary.
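In terms of the program, the change is confined to how a resolved hand is scored; an illustrative sketch of the idea (the flag names here are invented):

/* Value of a resolved hand, per unit bet.  For the ordinary game,
 * ties_lose = 0 and bj_pays = 1.5; for double exposure, ties_lose = 1
 * and bj_pays = 1.0.                                                   */
double outcome_value(int player_wins, int tie, int player_bj,
                     int ties_lose, double bj_pays)
{
    if (tie)
        return ties_lose ? -1.0 : 0.0;      /* double exposure: house wins ties */
    if (player_wins)
        return player_bj ? bj_pays : 1.0;
    return -1.0;
}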
The analysis of tournament blackjack is a much more difficult problem. Tournament
blackjack differs from ordinary blackjack in that the opponent is not the dealer, but
the other players. Although the rules are the same, the object of the game is not
necessarily to win the most amount of money; instead it is to have more than anyone
else at the table at the end of a session that typically lasts 30 rounds. In order to negate
the advantage of betting last, a marker that indicates where the first player must bet
rotates around the table. The program presented here can help to analyze the last
rounds of this game by calculating the probabilities of win, loss, or tie. Of course, the
criterion for the optimal play must be changed; the expectation of the hand is not the
only important value, since winning the session involves a bonus significantly larger
than the money on that hand. Consequently, a more useful criterion is maximizing the
probability of winning the round, rather than maximizing the expectation of the hand.
Unfortunately it seems difficult to prescribe a simple strategy; the actions of the other
players, their bankrolls, and their bets all influence the proper play.
[Appendix: C program listings. These comprise the shared header file defining the DISTRIBUTION, OUTCOME, RESULT, and ANSWER types; the two calling routines described above (pages 17-19 and 20-23); and the sub-programs (pages 24-38) for the dealer-probability, standing, hitting, doubling, and split-distribution calculations.]