Chapter 4
A Tutorial on Meta-Heuristics for
Optimization
Shu-Chuan Chu, Chin-Shiuh Shieh, and John F. Roddick
Nature has inspired computing and engineering researchers in many
different ways. Natural processes have been emulated through a
variety of techniques including genetic algorithms, ant systems and
particle swarm optimization, as computational models for optimiza-
tion. In this chapter, we discuss these meta-heuristics from a practi-
tioner's point of view, emphasizing the fundamental ideas and their
implementations. After presenting the underlying philosophy and al-
gorithms, detailed implementations are given, followed by some dis-
cussion of alternatives.
1 Introduction
Optimization problems arise in almost every field, ranging from
academic research to industrial application. In addition to other (ar-
guably more conventional) optimization techniques, meta-heuristics,
such as genetic algorithms, particle swarm optimization and ant
colony systems, have received increasing attention in recent years
for their interesting characteristics and their success in solving prob-
lems in a number of realms. In this tutorial, we discuss these meta-
heuristics from a practitioner's point of view. After a brief explanation
of the underlying philosophy and a discussion of the algorithms, de-
tailed implementations in C are given, followed by some implemen-
tation notes and possible alternatives.
The discussion is not intended to be a comprehensive review of re-
lated topics, but a compact guide for implementation, with which
readers can put these meta-heuristics to work and experience their
power in a timely fashion. The C programming language was adopted
because of its portability and availability. The coding style is delib-
erately made as accessible as possible, so that interested readers can
easily transfer the code into any other programming language as pre-
ferred.¹
In general, an optimization problem can be modeled as follows:

F(x̄), x̄ = (x₁, x₂, …, xₙ) ∈ X,   (1)

where F(x̄) is the object function subject to optimization, and X is
the domain of the independent variables x₁, x₂, …, xₙ. We are asked to
find a certain configuration of x̄ = (x₁, x₂, …, xₙ) to maximize or
minimize the object function F(x̄). The search can be complicated by
various factors, such as the high dimensionality of x̄, constraints
imposed on x̄, the non-differentiability of F(x̄), and the existence of
local optima.
2 Genetic Algorithms
Based on long-term observation, Darwin asserted his theory of natu-
ral evolution. In the natural world, creatures compete with each other
for limited resources. Those individuals that survive the competi-
tion have the opportunity to reproduce and generate descendants. In
so doing, any exchange of genes may result in superior or inferior
descendants, with the process of natural selection eventually filtering
out inferior individuals and retaining those best adapted to their envi-
ronment.
"The code segments can be downloaded from
hetp://kdm. first. flinders.edu.au/IDM/.
Inspired by Darwin's theory of evolution, Holland (Holland 1975,
Goldberg 1989) introduced the genetic algorithm as a powerful com-
putational model for optimization. Genetic algorithms work on a
population of potential solutions, in the form of chromosomes, and
try to locate a best solution through the process of artificial evolu-
tion, which consists of repeated artificial genetic operations, namely
evaluation, selection, crossover and mutation.

A multi-modal object function F₁(x, y), as shown in Figure 1, is used
to illustrate this. The global optimum is located at approximately
F₁(1.9931, 1.9896) = 4.2947.
F₁(x, y) is a multi-modal function of two variables, defined over the
domain −5 ≤ x, y ≤ 5.   (2)

Figure 1. Object function F₁
The first design issue in applying genetic algorithms is to select an
adequate coding scheme to represent potential solutions in the search
space in the form of chromosomes. Among other alternatives, such
as expression trees for genetic programming (Willis et al. 1997) and
city-index permutations for the travelling salesperson problem, binary
string coding is widely used for numerical optimization. Figure 2
gives a typical binary string coding for the test function F₁. Each
genotype has 16 bits to encode an independent variable. A decoding
function will map the 65536 possible combinations of b₁₅ … b₀ onto
the range [−5, 5) linearly. A chromosome is then formed by cascad-
ing the genotypes for each variable. With this coding scheme, any 32-bit
binary string stands for a legal point in the problem domain.
x = D(b₁₅b₁₄ … b₁b₀) = −5 + ( Σᵢ₌₀¹⁵ bᵢ · 2ⁱ ) × 10/65536   (3)

Figure 2. A binary coding scheme for F₁: 16 bits for x, followed by
16 bits for y
A second issue is to decide the population size. The choice of popu-
lation size, N, is a tradeoff between solution quality and computation
cost. A larger population size will maintain higher genetic diversity
and therefore a higher possibility of locating the global optimum, how-
ever at a higher computational cost. The operation of genetic algo-
rithms is outlined as follows:
Step 1 Initialization
Each bit of all N chromosomes in the population is randomly set
to 0 or 1. This operation in effect spreads chromosomes randomly
over the problem domain. Whenever possible, it is suggested to
incorporate any a priori knowledge of the search space into the
initialization process to endow the genetic algorithm with a better
starting point.
Step 2 Evaluation
Each chromosome is decoded and evaluated according to the
given object function. The fitness value, fᵢ, reflects the degree of
success chromosome cᵢ can achieve in its environment.

x̄ᵢ = D(cᵢ)
fᵢ = F(x̄ᵢ)   (4)
Step 3 Selection
Chromosomes are stochastically picked to form the population for
the next generation based on their fitness values. The selection is
done by roulette wheel selection with replacement as follows:

Pr(cᵢ is selected) = fᵢ^SF / Σⱼ₌₁ᴺ fⱼ^SF   (5)
The selection factor, SF, controls the discrimination between su-
perior and inferior chromosomes by reshaping the landscape of
the fitness function. As a result, better chromosomes will have
more copies in the new population, mimicking the process of nat-
ural selection. In some applications, the best chromosome found
is always retained in the next generation to ensure its genetic ma-
terial remains in the gene pool.
Step 4 Crossover
Pairs of chromosomes in the newly generated population are sub-
ject to a crossover (or swap) operation with probability P_C, called
the Crossover Rate. The crossover operator generates new chromo-
somes by exchanging the genetic material of a pair of chromosomes
across randomly selected sites, as depicted in Figure 3. Similar
to the process of natural breeding, the newly generated chromo-
somes can be better or worse than their parents. They will be
tested in the subsequent selection process, and only those which
are an improvement will thrive.
Figure 3. Crossover operation
Step 5 Mutation
After the crossover operation, each bit of all chromosomes is
subjected to mutation with probability P_M, called the Mutation
Rate. Mutation flips bit values and introduces new genetic mate-
rial into the gene pool. This operation is essential to avoid the en-
tire population converging to a single instance of a chromosome,
since crossover becomes ineffective in such situations. In most
applications, the mutation rate should be kept low, and it acts as a
background operator to prevent genetic algorithms from random
walking.
Step 6 Termination Checking
Genetic algorithms repeat Step 2 to Step 5 until a given termina-
tion criterion is met, such as a pre-defined number of generations
being reached, or quality improvement having stalled for a given
number of generations. Once terminated, the algorithm reports the
best chromosome it found.
Program 1 is an implementation of a genetic algorithm. Note that, for
the sake of program readability, a variable of int type is used to store
a single bit. More compact representation is possible with slightly
trickier genetic operators.
The result of applying Program 1 to the object function F₁ is reported
in Figure 4. With a population of size 10, after 20 generations, the
genetic algorithm was capable of locating a near-optimal solution at
F₁(1.9853, 1.9810) = 4.2942. Readers should be aware that, due to
the stochastic nature of genetic algorithms, the same program may
produce different results on different machines.
Figure 4. Progress of the GA program applied to test function F₁
(best objective value versus number of generations)
Although the operation of genetic algorithms is quite simple, they do
have some important characteristics providing robustness:

• They search from a population of points rather than a single point.
• They use the object function directly, not its derivative.
• They use probabilistic transition rules, not deterministic ones, to
guide the search toward promising regions.
In effect, genetic algorithms maintain a population of candidate solu-
tions and conduct stochastic searches via information selection and
exchange. It is well recognized that, with genetic algorithms, near-
optimal solutions can be obtained within justified computation cost.
However, it is difficult for genetic algorithms to pinpoint the global
optimum. In practice, a hybrid approach is recommended, incor-
porating gradient-based or local greedy optimization techniques. In
such an integration, genetic algorithms act as coarse-grain optimizers
and gradient-based methods as fine-grain ones.
The power of genetic algorithms originates from the chromosome
coding and the associated genetic operators. It is worth paying atten-
tion to these issues so that genetic algorithms can explore the search
space more efficiently. The selection factor controls the discrimi-
nation between superior and inferior chromosomes. In some appli-
cations, more sophisticated reshaping of the fitness landscape may
be required. Other selection schemes (Whitley 1993), such as rank-
based selection or tournament selection, are possible alternatives for
controlling discrimination.
Numerous variants with different application profiles have been
developed following the standard genetic algorithm. Island-model
genetic algorithms, or parallel genetic algorithms (Abramson and
Abela 1991), attempt to maintain genetic diversity by splitting a
population into several sub-populations, each of which evolves inde-
pendently and occasionally exchanges information with the others.
Multiple-objective genetic algorithms (Gao et al. 2000, Fonseca and
Fleming 1993, Fonseca and Fleming 1998) attempt to locate all near-
optimal solutions by carefully controlling the number of copies of
superior chromosomes such that the population will not be domi-
nated by the single best chromosome (Sareni and Krahenbuhl 1998).
Co-evolutionary systems (Handa et al. 2002, Bull 2001) have two
or more independently evolved populations. The object function for
each population is not static, but a dynamic function that depends on
the current states of the other populations. This architecture vividly
models interacting systems, such as prey and predator, or virus and
immune system.
3 Ant Systems
Inspired by the food-seeking behavior of real ants, ant systems, at-
tributable to Dorigo et al. (Dorigo et al. 1996), have demonstrated
themselves to be efficient and effective tools for combinatorial optimization
problems. In nature, a real ant wandering in its surrounding envi-
ronment will leave a biological trace, called pheromone, on its path.
The intensity of the pheromone left behind will bias the path-taking
decisions of subsequent ants. A shorter path will possess a higher pheromone
concentration and therefore encourage subsequent ants to follow it. As
a result, an initially irregular path from nest to food will eventually
contract to a shorter path. With appropriate abstraction and modifi-
cation, this observation has led to a number of successful computa-
tional models for combinatorial optimization.
The operation of ant systems can be illustrated by the classical Trav-
elling Salesman Problem (see Figure 5 for an example). In the TSP,
a travelling salesman looks for a route which
covers all cities with minimal total distance. Suppose there are n
cities and m ants. The entire algorithm starts with the initial pheromone
intensity set to τ₀ on all edges. In every subsequent ant system cycle,
or episode, each ant begins its trip from a randomly selected starting
city and is required to visit every city exactly once (a Hamiltonian
circuit). The experience gained in this phase is then used to update
the pheromone intensity on all edges.
The operation of ant systems is given below:
Step 1 Initialization
Initial pheromone intensities on all edges are set to τ₀.
Step 2 Walking phase
In this phase, each ant begins its trip from a randomly selected
starting city and is required to visit every city exactly once. When
an ant, the k-th ant for example, is located at city r and needs to
Figure 5. A travelling salesman problem with 12 cities
decide the next city s, the path-taking decision is made stochasti-
cally based on the following probability function:

P_k(r, s) = [τ(r, s)] · [η(r, s)]^β / Σ_{u ∈ J_k(r)} [τ(r, u)] · [η(r, u)]^β,   if s ∈ J_k(r);
P_k(r, s) = 0,   otherwise.   (6)

where τ(r, s) is the pheromone intensity on the edge between
cities r and s; the visibility η(r, s) = 1/δ(r, s) is the reciprocal of the dis-
tance δ(r, s) between cities r and s; J_k(r) is the set of unvisited cities for
the k-th ant. According to Equation 6, an ant will favour a nearer
city or a path with higher pheromone intensity. β is a parameter
used to control the relative weighting of these two factors. During
the circuit, the route made by each ant is recorded for pheromone
updating in Step 3. The best route found so far is also tracked.
Step 3 Updating phase
The experience accumulated in Step 2 is then used to modify the
pheromone intensity by the following updating rule:

τ(r, s) ← (1 − α) · τ(r, s) + Σ_{k=1}^{m} Δτ_k(r, s)   (7)

Δτ_k(r, s) = 1/L_k,   if (r, s) ∈ route made by ant k;
Δτ_k(r, s) = 0,   otherwise.

where 0 < α < 1 is a parameter modelling the evaporation of
pheromone; L_k is the length of the route made by the k-th ant;
Δτ_k(r, s) is the pheromone trace contributed by the k-th ant to
edge (r, s).
The updated pheromone intensities are then used to guide the
path-taking decisions in the next ant system cycle. It can be ex-
pected that, as the ant system cycles proceed, the pheromone in-
tensities on the edges will converge to values reflecting their po-
tential for being components of the shortest route. The higher the
intensity, the more chance of being a link in the shortest route, and
vice versa.
Step 4 Termination Checking
Ant systems repeat Step 2 to Step 3 until certain termination crite-
ria are met, such as a pre-defined number of episodes having been performed,
or the algorithm having failed to make improvements for a certain num-
ber of episodes. Once terminated, the ant system reports the shortest
route found.
Program 2 at the end of this chapter is an implementation of an ant
system. The results of applying Program 2 to the test problem in
Figure 5 are given in Figures 6 and 7. Figure 6 reports a found short-
est route of length 3.308, which is the true shortest route as validated
by exhaustive search. Figure 7 gives a snapshot of the pheromone
intensities after 20 episodes. A higher intensity is represented by a
wider edge. Notice that intensity alone cannot be used as a criterion
for judging whether a link is a constituent part of the shortest route or
not, since the shortest route relies on the cooperation of other links.
Figure 6. The shortest route found by the ant system.
A close inspection of the ant system reveals that the heavy com-
putation required may make it prohibitive in certain applications.
Ant colony systems were introduced by Dorigo et al. (Dorigo and
Gambardella 1997) to remedy this difficulty. Ant colony systems dif-
fer from the simpler ant system in the following ways:

• There is explicit control of exploration and exploitation. When an
ant is located at city r and needs to decide the next city s, there
are two modes for the path-taking decision, namely exploitation
and biased exploration. Which mode is used is governed by a
random variable 0 ≤ q ≤ 1.
Figure 7. The snapshot of pheromone intensities after 20 episodes
Exploitation Mode:

s = arg max_{u ∈ J_k(r)} [τ(r, u)] · [η(r, u)]^β,   if q ≤ q₀   (9)

• Local updating. A local updating rule is applied whenever an edge
from city r to city s is taken:

τ(r, s) ← (1 − ρ) · τ(r, s) + ρ · Δτ(r, s)   (10)

where Δτ(r, s) = (n · L_nn)⁻¹, and L_nn is a rough estimate of the
circuit length calculated using the nearest-neighbour heuristic;
4 Particle Swarm Optimization

gbest ← x̄ᵢ,   if f(x̄ᵢ) > f(gbest)   (14)

where f(x̄) is the object function subject to maximization.
Step 5 Termination Checking
The algorithm repeats Steps 2 to 4 until certain termination
conditions are met, such as a pre-defined number of iterations
being reached, or a failure to make progress for a certain number of iterations.
Once terminated, the algorithm reports gbest and f(gbest) as its
solution.
Program 3 at the end of this chapter is a straightforward implemen-
tation of the algorithm above. To experience the power of particle
swarm optimization, Program 3 is applied to the following test func-
tion, as visualized in Figure 8:

F₂(x, y) = −x sin(√|x|) − y sin(√|y|),   −500 ≤ x, y ≤ 500   (15)
where the global optimum is at F₂(−420.97, −420.97) = 837.97.

Figure 8. Object function F₂
The learning factors c₁ and c₂ are set to equal values, and a linearly
decreasing inertia weight is used according to the suggestion
of Shi and Eberhart (1999). Figure 9 reports the progress of parti-
cle swarm optimization on the test function F₂(x, y) for the first 300
iterations. At the end of 1000 iterations, F₂(−420.97, −420.96) =
837.97 is located, which is close to the global optimum.
It is worthwhile to look into the dynamics of particle swarm opti-
mization. Figure 10 presents the distribution of particles at different
iterations. There is a clear trend that particles start from their initial
positions and fly toward the global optimum.
Figure 9. Progress of PSO on object function F₂ (best object value
versus iteration)

Numerous variants have been introduced since the first particle swarm
optimization algorithm. A discrete binary version of the particle swarm opti-
mization algorithm was proposed by Kennedy and Eberhart (1997).
Shi and Eberhart (2001) applied fuzzy theory to the particle swarm op-
timization algorithm, and the concept of co-evolution has been successfully
incorporated in solving min-max problems (Shi and Krohling 2002).
Chu et al. (2003) proposed a parallel architecture with commu-
nication mechanisms for information exchange among independent
particle groups, in which solution quality can be significantly im-
proved.
5 Discussions and Conclusions
The nonstop process of evolution has successfully driven natural
species to develop effective solutions to a wide range of problems.
Figure 10. The distribution of particles at different iterations:
(a) 0th iteration, (b) 10th iteration, (c) 300th iteration, (d) 100th iteration
Genetic algorithms, ant systems, and particle swarm optimization,
all inspired by nature, have also proved themselves to be effective
solutions to optimization problems. However, readers should remem-
ber that, despite the robustness these approaches claim,
there is no panacea. As discussed in previous sections, there are con-
trol parameters involved in these meta-heuristics, and an adequate
setting of these parameters is a key point for success. In general,
some kind of trial-and-error tuning is necessary for each particular
instance of an optimization problem. In addition, these meta-heuristics
should not be considered in isolation. Prospective users should spec-
ulate on the possibility of hybrid approaches and the integration of
gradient-based methods, which are promising directions deserving
further study.
References
Abramson, D. and Abela, J. (1991), "A parallel genetic algorithm for
solving the school timetabling problem," Technical report, Divi-
sion of Information Technology, CSIRO.
Bull, L. (2001), "On coevolutionary genetic algorithms," Soft Com-
puting, vol. 5, no. 3, pp. 201-207.
Chu, S. C., Roddick, J. F., and Pan, J. S. (2003), "Parallel par-
ticle swarm optimization algorithm with communication strate-
gies," personal communication.
Chu, S. C., Roddick, J. F., Pan, J. S., and Su, C. J. (2003),
"Parallel ant colony systems," 14th International Symposium on
Methodologies for Intelligent Systems, LNCS, Springer-Verlag
(to appear).
Dorigo, M., Maniezzo, V., and Colorni, A. (1996), "The ant sys-
tem: optimization by a colony of cooperating agents," IEEE Trans.
on Systems, Man, and Cybernetics-Part B, vol. 26, no. 1, pp. 29-41.
Dorigo, M. and Gambardella, L. M. (1997), "Ant colony system: a
cooperative learning approach to the traveling salesman problem,"
IEEE Trans. on Evolutionary Computation, vol. 1, no. 1, pp. 53-66.
Fonseca, C. M. and Fleming, P. J. (1993), "Multiobjective genetic
algorithms," IEE Colloquium on Genetic Algorithms for Control
Systems Engineering, number 193/130, pp. 6/1-6/5.
Fonseca, C. M. and Fleming, P. J. (1998), "Multiobjective opti-
mization and multiple constraint handling with evolutionary al-
gorithms I: A unified formulation," IEEE Trans. on Systems, Man,
and Cybernetics-Part A, vol. 28, no. 1, pp. 26-37.
Gao, Y., Shi, L., and Yao, P. (2000), "Study on multi-objective ge-
netic algorithm," Proceedings of the Third World Congress on In-
telligent Control and Automation, pp. 646-650.
Goldberg, D. E. (1989), Genetic Algorithms in Search, Optimization
and Machine Learning, Addison-Wesley, Reading, MA.
Handa, H., Baba, M., Horiuchi, T., and Katai, O. (2002), "A novel
hybrid framework of coevolutionary GA and machine learning,"
International Journal of Computational Intelligence and Appli-
cations, vol. 2, no. 1, pp. 33-52.
Holland, J. (1975), Adaptation in Natural and Artificial Systems,
University of Michigan Press.
Kennedy, J. and Eberhart, R. (1995), "Particle swarm optimization,"
IEEE International Conference on Neural Networks, pp. 1942-
1948.
Papadimitriou, C. H. and Steiglitz, K. (1982), Combinatorial Opti-
mization: Algorithms and Complexity, Prentice Hall.
Sareni, B. and Krahenbuhl, L. (1998), “Fitness sharing and niching
methods revisited,” IEEE Trans. on Evolutionary Computation,
vol. 2, no. 3, pp. 97-106.
Shi, Y. and Eberhart, R. C. (2001), “Fuzzy adaptive particle swarm
optimization,” Proceedings of 2001 Congress on Evolutionary
Computation (CEC'2001), pp. 101-106.
Shi, Y. and Krohling, R. A. (2002), “Co-evolutionary particle swarm
optimization to solve min-max problems,” Proceedings of 2002
Congress on Evolutionary Computation (CEC’2002), vol. 2, pp.
1682-1687.
Wang, L. and Wu, Q. (2001), "Ant system algorithm for optimization
in continuous space," IEEE International Conference on Control
Applications (CCA'2001), pp. 395-400.
Whitley, D. (1993), A Genetic Algorithm Tutorial, Technical Report
CS-93-103, Department of Computer Science, Colorado State
University, Fort Collins, CO.
Willis, M. J., Hiden, H. G., Marenbach, P., McKay, B., and Mon-
tague, G. A. (1997), "Genetic programming: an introduction and
survey of applications," Proceedings of the Second International
Conference on Genetic Algorithms in Engineering Systems: Inno-
vations and Applications (GALESIA'97), pp. 314-319.
Program 1. An implementation of a genetic algorithm in C.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define MG 50    /* Maximal Number of Generations */
#define N  10    /* Population Size */
#define CL 32    /* Number of bits in each chromosome */
#define SF 2.0   /* Selection Factor */
#define CR 0.5   /* Crossover Rate */
#define MR 0.05  /* Mutation Rate */

/* Macro for random number between 0 and 1 */
#define RAND ((float)rand()/(float)(RAND_MAX+1))

int c[N][CL];    /* Population of Chromosomes */
float f[N];      /* Fitness Values of Chromosomes */
int best_c[CL];  /* Best Chromosome */
float best_f;    /* Best Fitness Value */
/* Decode Chromosome */
void decode(int chromosome[CL], float *x, float *y)
{
    int j;

    /* Decode the lower 16 bits for variable x */
    (*x)=0.0;
    for (j=0;j<16;j++)
        ...

...

    if (f[i]>best_f)
    {
        best_f=f[i];
        ...
    }

...

    while (tmpf>p[k]) k++;
    tmpf=tmpf-p[k];
    /* Chromosome k is selected */
    for (j=0;j<CL;j++)
        ...
Program 2. An implementation of an ant system in C.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define Map "map.txt"      /* file name of city map */
#define NumberOfCity 12    /* number of cities */
#define NumberOfAnt 10     /* number of ants */
#define alpha 0.2          /* pheromone decay factor */
#define beta 2.0           /* tradeoff factor between pheromone and distance */
#define tau0 0.01          /* initial intensity of pheromone */
#define EpisodeLimit 20    /* limit of episodes */
#define Route "route.txt"  /* file name for route map */

/* RAND: Macro for random number between 0 and 1 */
#define RAND ((float)rand()/(float)(RAND_MAX+1))
typedef struct {
    float x;  /* x coordinate */
    float y;  /* y coordinate */
} CityType;

typedef struct {
    int route[NumberOfCity];  /* visiting sequence of cities */
    float length;             /* length of route */
} RouteType;

CityType city[NumberOfCity];                  /* city array */
float delta[NumberOfCity][NumberOfCity];      /* distance matrix */
float eta[NumberOfCity][NumberOfCity];        /* weighted visibility matrix */
float tau[NumberOfCity][NumberOfCity];        /* pheromone intensity matrix */
RouteType BestRoute;                          /* shortest route */
RouteType ant[NumberOfAnt];                   /* ant array */
float p[NumberOfCity];                        /* path-taking probability array */
int visited[NumberOfCity];                    /* array for visiting status */
float delta_tau[NumberOfCity][NumberOfCity];  /* sum of change in tau */
void main(void)
{
    FILE* mapfpr;    /* file pointer for city map */
    int r, s;        /* indices for cities */
    int k;           /* index for ant */
    int episode;     /* index for ant system cycle */
    int step;        /* index for routing step */
    float tmp;       /* temporary variable */
    FILE* routefpr;  /* file pointer for route map */

    /* Set random seed */
    srand(1);

    /* Read city map */
    mapfpr=fopen(Map,"r");
    for (r=0;r<NumberOfCity;r++)
        fscanf(mapfpr,"%f %f",&(city[r].x),&(city[r].y));
    fclose(mapfpr);

    /* Evaluate distance matrix */
    for (r=0;r<NumberOfCity;r++)
        ...

        while (tmp>p[s]) s++;
        tmp-=p[s];
        ant[k].route[step]=s;
    }
    /* Update pheromone intensity */
    /* Reset matrix for sum of change in tau */
    for (r=0;r<NumberOfCity;r++)
        ...

    /* Write route map */
    routefpr=fopen(Route,"w");
    for (r=0;r<NumberOfCity;r++)
        ...
Program 3. An implementation of particle swarm optimization in C.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

    ...

        if (p[i].v[d]>Vmax) p[i].v[d]=Vmax;
        if (p[i].v[d]<-Vmax) p[i].v[d]=-Vmax;
        p[i].x[d]=p[i].x[d]+p[i].v[d];
        if (p[i].x[d]>500.0) p[i].x[d]=500.0;
        if (p[i].x[d]<-500.0) p[i].x[d]=-500.0;

        p[i].fitness=Schwefel(p[i].x);

        /* Update pbest */
        if (p[i].fitness>p[i].best_fitness)
        {
            for (d=0;d<...
        }

        /* Update gbest */
        if (p[i].fitness>gbest_fitness)
        {
            for (d=0;d<...
        }