
Evolutionary Computation

Genetic Algorithms
An Overview
Evolutionary Computation
Genetic Algorithms
An Overview
• They are called Evolutionary Computation (EC) algorithms because they
are inspired by the process of natural evolution. EC algorithms mimic the
principles of biological evolution, such as:
• Selection: Similar to how in nature, organisms better suited to their
environment are more likely to survive and reproduce, EC algorithms use a
"fitness" function to select the most optimal solutions from a population
of candidate solutions.
• Mutation: In nature, random mutations introduce genetic diversity. In EC,
random alterations are applied to solutions to explore new possibilities
and prevent premature convergence on suboptimal solutions.
• Crossover (Recombination): In nature, genetic information from parents
combines to produce offspring. In EC algorithms, crossover allows
solutions to exchange characteristics to generate new candidate solutions.
• Population-based Approach: EC algorithms operate on a population of
solutions, which evolve over generations. Each generation improves based
on the principles of selection, mutation, and crossover.

Evolutionary Computation
Genetic Algorithms
An Overview
• Since these algorithms draw their structure
and inspiration from evolutionary processes,
they are collectively referred to as
Evolutionary Computation (EC) algorithms.
They use nature’s way of optimizing and
adapting to problems, applying this
evolutionary framework to solve complex
computational problems.

Evolutionary Computation
Genetic Algorithms
An Overview
• Some well-known algorithms that are regarded as Evolutionary Computation (EC) algorithms:
• 1. Genetic Algorithms (GA)
• Description: One of the most popular EC algorithms, genetic
algorithms simulate natural selection where the fittest individuals
are selected for reproduction to produce the next generation.
• Applications: Optimization, scheduling, machine learning,
engineering design.
• 2. Genetic Programming (GP)
• Description: An extension of genetic algorithms where the
solutions are represented as computer programs (typically trees),
and the evolution seeks to evolve programs to solve a problem.
• Applications: Symbolic regression, program synthesis, automatic
code generation.

Evolutionary Computation
Genetic Algorithms/An Overview
• 3. Evolutionary Strategies (ES)
• Description: Focuses on optimizing real-valued parameters by evolving a
population of solutions using mutation and selection mechanisms.
Crossover is often less emphasized compared to GAs.
• Applications: Continuous optimization, control systems, machine learning.
• 4. Differential Evolution (DE)
• Description: An algorithm that focuses on optimization over continuous
spaces, using differential variations of individuals to drive the search for
optimal solutions.
• Applications: Global optimization, engineering problems, and machine
learning.
• 5. Estimation of Distribution Algorithms (EDA)
• Description: Instead of using mutation and crossover, EDAs build a
probabilistic model of the best individuals and sample from this model to
generate new candidates.
• Applications: Combinatorial optimization, machine learning, and
optimization under uncertainty.

Evolutionary Computation
Genetic Algorithms/An Overview
• 6. Particle Swarm Optimization (PSO)
• Description: Inspired by the social behavior of birds flocking or fish
schooling, PSO evolves a population of candidate solutions
(particles) by having them "fly" through the search space.
• Applications: Optimization in engineering, economics, machine
learning, neural network training.
• 7. Ant Colony Optimization (ACO)
• Description: Inspired by the behavior of ants seeking paths
between their colony and food sources, ACO is a probabilistic
technique used to solve problems by finding optimal paths.
• Applications: Traveling salesman problem, routing, scheduling.
• 8. Artificial Immune System (AIS)
• Description: Inspired by the human immune system, AIS algorithms
focus on learning and optimization by maintaining a diverse set of
solutions (similar to the immune system's diversity of antibodies).
• Applications: Anomaly detection, pattern recognition

Evolutionary Computation
Genetic Algorithms/An Overview
• Swarm Optimization algorithms are often considered part of Evolutionary
Computation (EC), even though they are inspired by collective behaviors
of social organisms (like birds, fish, or ants) rather than biological
evolution directly.
• While traditional EC algorithms (like Genetic Algorithms) are inspired by
Darwinian evolution (with selection, mutation, and reproduction), Swarm
Intelligence algorithms are inspired by the collaborative and decentralized
behaviors seen in natural swarms.
• Both Evolutionary Computation and Swarm Intelligence fall under the
broader umbrella of Nature-Inspired Algorithms and share many
similarities:
• Both work with populations of solutions.
• Both use mechanisms to iteratively improve a set of candidate solutions.
• Both are widely used for solving optimization and search problems.

Evolutionary Computation
Genetic Algorithms/An Overview
• Since Swarm Optimization algorithms share these common characteristics
and are biologically inspired, they are often grouped together with EC
algorithms, though they are technically a distinct subset of bio-inspired
computation.
• Some well-known Swarm Optimization algorithms include:
• Particle Swarm Optimization (PSO): Inspired by birds flocking or fish
schooling.
• Ant Colony Optimization (ACO): Based on how ants find the shortest path
to food.
• Artificial Bee Colony (ABC): Mimics the foraging behavior of honeybees.
• In summary, although Swarm Optimization algorithms have a different
inspiration from Evolutionary Computation algorithms, they are often
considered part of the same broader family of techniques used in bio-
inspired optimization.

Advantages of Evolutionary
Computation EC
• Conceptual Simplicity
• Broad Applicability
• Hybridization with Other Methods
• Parallelism
• Robust to Dynamic Changes
• Solves Problems that have no Solutions

Advantages of Evolutionary
Computation EC
• 1. Conceptual Simplicity
• Explanation:
• Evolutionary computation mimics the process of
natural evolution and follows simple principles such as
selection, mutation, and recombination. These
principles are easy to understand and implement.
Despite this simplicity, EC algorithms are capable of
solving complex problems effectively. It does not
require the problem to be differentiable, continuous,
or structured in a particular way, making it accessible
for a wide range of problem-solving tasks.

Advantages of Evolutionary
Computation EC
• 2. Broad Applicability
• Explanation:
• EC is applicable to a vast range of problems, including
optimization, machine learning, design, and planning.
It can handle various types of data, problem domains,
and optimization goals. Whether the problem is static
or dynamic, constrained or unconstrained, EC can be
applied without significant changes in the algorithm's
core structure. This versatility makes EC a powerful tool
in both research and practical applications.

Advantages of Evolutionary
Computation EC
• 3. Hybridization with Other Methods
• Explanation:
• EC can be effectively combined with other
optimization methods, such as gradient-based
methods or simulated annealing, to enhance
performance. Hybrid approaches often exploit
the strengths of multiple methods. For example,
EC can provide global exploration, while other
methods focus on local optimization, resulting in
more efficient and effective solutions.
Advantages of Evolutionary
Computation EC
• 4. Parallelism
• Explanation:
• Evolutionary algorithms are inherently parallel because
populations of potential solutions can be evaluated
and evolved simultaneously. This allows EC to take
advantage of modern multi-core processors and
distributed computing environments. The ability to
perform operations on multiple individuals in parallel
significantly speeds up the search for optimal solutions,
making EC a great choice for computationally
expensive problems.

Advantages of Evolutionary
Computation EC
• 5. Robust to Dynamic Changes
• Explanation:
• Many real-world problems change over time,
requiring adaptive solutions. EC is robust in
dynamic environments because its population-
based approach maintains diversity in the
solution space, allowing the algorithm to adapt to
changes. If the problem changes, EC algorithms
can adjust their search process and evolve new
solutions in response to the new conditions.
Advantages of Evolutionary
Computation EC
• 6. Solves Problems that have no Solutions
• Explanation:
• EC can be used to find approximations or good-
enough solutions to problems that don’t have
precise or analytical solutions. This is particularly
useful in complex, poorly understood, or
unstructured problem domains where traditional
methods fail. EC can explore a broad solution
space and converge on solutions that meet
certain criteria, even when the problem is ill-
defined or has no exact solution.

Advantages of Evolutionary
Computation EC

• Each of these advantages showcases why evolutionary computation is a powerful approach in tackling a wide array of computational and real-world challenges.

A gentle introduction to GAs:
 What Are Genetic Algorithms:
 A genetic algorithm is a search heuristic that is
inspired by Charles Darwin’s theory of natural
evolution. This algorithm reflects the process of
natural selection where the fittest individuals are
selected for reproduction in order to produce
offspring of the next generation.
 GA is a search procedure that uses random choice as
a tool to guide a highly exploitative search through a
coding of parameter space.
A gentle introduction to GAs:
• Notion of Natural Selection
• The process of natural selection starts with the selection of the fittest individuals from a population.
• They produce offspring which inherit the characteristics of the parents and are added to the next generation.
• If parents have better fitness, their offspring will be better than the parents and have a better chance of surviving.
• This process keeps on iterating, and at the end a generation with the fittest individuals will be found.
• This notion can be applied to a search problem: we consider a set of solutions to a problem and select the set of best ones out of them.

Five phases are considered in a genetic algorithm.
Initial population
Fitness function
Selection
Crossover
Mutation

Genetic algorithms were developed by John Holland, his colleagues, and his students at the University of Michigan.

Outline of the Basic Genetic Algorithm
• [Start] Generate a random population of n chromosomes (suitable solutions for the problem).
• [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
• Repeat until the terminating condition is satisfied:
− [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance of being selected).
− [Crossover] Cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents.
− [Mutation] Mutate the new offspring at selected position(s) in the chromosome.
− [Accepting] Generate the new population by placing the new offspring.
• Return the best solution in the current population.
Initial population
•The process begins with a set of individuals which is called
a Population. Each individual is a solution to the problem
you want to solve.
• An individual is characterized by a set of parameters
(variables) known as Genes. Genes are joined into a string
to form a Chromosome (solution).
• In a genetic algorithm, the set of genes of an individual is
represented using a string, in terms of an alphabet. Usually,
binary values are used (string of 1s and 0s). We say that we
encode the genes in a chromosome.

Fitness Function
The fitness function determines how fit an individual is (the
ability of an individual to compete with other individuals). It
gives a fitness score to each individual. The probability that
an individual will be selected for reproduction is based on its
fitness score.
Selection
•The idea of the selection phase is to select the fittest individuals and let them pass their genes to the next generation.
•Two pairs of individuals (parents) are selected based on
their fitness scores.
• Individuals with high fitness have more chance to be
selected for reproduction.

Selection pressure:
 Is the degree to which the better individuals are favored.
 The higher the selection pressure, the more the better individuals are favored.
 This selection pressure drives the GA to improve population fitness over succeeding generations. The convergence rate of a genetic algorithm is largely determined by the selection pressure.
 Higher selection pressure results in higher convergence rates. GAs are able to identify optimal or near-optimal solutions under a wide range of selection pressures.
 However, if the selection pressure is too low, the convergence rate will be slow, and the GA will unnecessarily take longer to find the optimal solution.
 If the selection pressure is too high, there is an increased chance of the GA prematurely converging to an incorrect (sub-optimal) solution.
Crossover
Crossover is the most significant phase in a genetic algorithm. For each pair of parents to be mated, a crossover point is chosen at random from within the genes.
•For example, consider the crossover point to be 3.

Crossover
•Offspring are created by exchanging the genes of parents
among themselves until the crossover point is reached.

2- Crossover:
Exchange corresponding genetic material from two parents, allowing useful genes on different parents to be combined in their offspring. The two parents may or may not be replaced in the original population for the next generation.
Type of crossover (a short Python sketch follows the list below):
•One-point crossover:

•Two-point crossover:

•Uniform crossover:
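The three crossover types above can be sketched in Python as follows (an illustrative sketch only, with arbitrary example parents; the function names are hypothetical, not from the slides):

import random

def one_point(p1, p2):
    # head from the first parent, tail from the second
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def two_point(p1, p2):
    # middle segment from the second parent, the rest from the first
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:]

def uniform(p1, p2):
    # each gene is copied at random from either parent
    return [random.choice(pair) for pair in zip(p1, p2)]

p1 = [1, 1, 0, 0, 1, 0, 1, 1]
p2 = [1, 1, 0, 1, 1, 1, 1, 1]
print(one_point(p1, p2), two_point(p1, p2), uniform(p1, p2))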

3- Mutation:
 Random mutation provides background variation and occasionally introduces beneficial material into a species' chromosomes.
 Without mutation, all the individuals in the population would eventually become the same (because of the exchange of genetic material) and there would be no progress anymore; we would be stuck in a local maximum.
 In the binary case, mutation simply inverts a bit: 0 to 1 and 1 to 0.
Mutation
•In certain new offspring formed, some of their genes can be subjected to a mutation with a low random probability. This implies that some of the bits in the bit string can be flipped.
•Mutation occurs to maintain diversity within the population and prevent premature convergence.
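A minimal sketch of this bit-flip mutation (the per-bit probability used here is an assumed illustrative value):

import random

def mutate(chromosome, p_mut=0.01):
    # flip each bit independently with a small probability
    return [1 - bit if random.random() < p_mut else bit for bit in chromosome]

print(mutate([1, 1, 0, 1, 0, 0, 1, 1], p_mut=0.2))  # a few bits may be flipped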

 Termination
•The algorithm terminates
• if the population has converged (does not produce
offspring which are significantly different from the previous
generation).
• Then it is said that the genetic algorithm has provided a set
of solutions to our problem.
 Comments
•The population has a fixed size.
• As new generations are formed, individuals with least
fitness die, providing space for new offspring.
•The sequence of phases is repeated to produce individuals
in each new generation which are better than the previous
generation.

A Pseudo-code For Simple GA:
t=0; //set the generation number to zero.
Initialize(pop(t)); //initialize the population at random.
Evaluate(pop(t)); //evaluate the fitness values.
Repeat
Selection(pop(t)); //select the better strings.
Recombination(pop(t)); //cross the better strings to produce offspring.
Mutation(pop(t)); //mutate the population.
Evaluate(pop(t)); //evaluate the new population.
t=t+1; //increment the generation counter.
Until (t > max) or (termination criterion is TRUE).
Simple Example:
Use the genetic algorithm (GA) to find the binary values of the literals
a, b, c, d, e, f, and g such that the following conjunctive normal form CNF
will be true. Describe the coding, genetic operations, and fitness
function of this problem.
(¬a ∨ c) ∧ (¬a ∨ c ∨ ¬e) ∧ (¬b ∨ c ∨ d ∨ ¬e) ∧ (a ∨ ¬b ∨ c) ∧ (¬e ∨ f) ∧ (a ∨ c ∨ f ∨ g)
Answer:
GA coding: The individuals will be represented as binary strings. We have seven literals, therefore the string consists of seven 0s and/or 1s; i.e., the 1st bit of the string represents the value of a, the 2nd bit represents the value of b, and so on.
Fitness Function (ff):
 ff(individual) = the number of true clauses in the CNF according to the bits of the individual.
Crossover: Uniform crossover will be used.
Mutation: An arbitrary bit will be complemented.
Simple Example:
(¬a ∨ c) ∧ (¬a ∨ c ∨ ¬e) ∧ (¬b ∨ c ∨ d ∨ ¬e) ∧ (a ∨ ¬b ∨ c) ∧ (¬e ∨ f) ∧ (a ∨ c ∨ f ∨ g)
Answer:
1) At the beginning, a random population consisting of four individuals is chosen:
P0 = {0000000, 1101111, 1000101, 1000111}
2) Compute the fitness for each individual:
ff(0000000)=5; ff(1101111)=4; ff(1000101)=3; ff(1000111)=4
3) Accomplish uniform crossover between (0000000) and (1101111) to obtain (0101010) and (1000101).
4) Suppose that (0000000) is selected for mutation, and bit #3 is randomly selected to be complemented, giving the new individual (0010000). Now the new population is P1 = {0010000, 0101010, 1000101, 1101111}.
5) Compute the fitness values of the P1 individuals:
ff(0010000)=6; ff(0101010)=5; ff(1000101)=3; ff(1101111)=4
 We can see that the individual (0010000) satisfies the given CNF, because its fitness = 6, i.e., all the clauses are true when we assign its bits to the literals. Hence the values of the literals are: a=0, b=0, c=1, d=0, e=0, f=0, and g=0, and the process is terminated.
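The fitness function of this example can be sketched in Python as follows; the clause encoding as (bit position, negated) pairs is just one convenient representation, not prescribed by the slides. The printed values reproduce the fitness scores used above.

# literals a..g map to bit positions 0..6; each clause lists (position, negated) pairs
CLAUSES = [
    [(0, True), (2, False)],                           # (¬a ∨ c)
    [(0, True), (2, False), (4, True)],                # (¬a ∨ c ∨ ¬e)
    [(1, True), (2, False), (3, False), (4, True)],    # (¬b ∨ c ∨ d ∨ ¬e)
    [(0, False), (1, True), (2, False)],               # (a ∨ ¬b ∨ c)
    [(4, True), (5, False)],                           # (¬e ∨ f)
    [(0, False), (2, False), (5, False), (6, False)],  # (a ∨ c ∨ f ∨ g)
]

def ff(individual):
    # individual is a bit string such as "0010000"; fitness = number of true clauses
    bits = [c == "1" for c in individual]
    return sum(any(bits[pos] != negated for pos, negated in clause) for clause in CLAUSES)

print(ff("0000000"), ff("1101111"), ff("1000101"), ff("0010000"))  # 5 4 3 6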
Issues involved

• How to create chromosomes and what type of encoding to choose?
• How to perform Crossover and Mutation, the two basic operators of GA?
• How to select parents for crossover?

Termination of Loop

• Reaching some (known/hoped-for) fitness.
• Reaching some maximum allowed number of generations.
• Reaching some minimum level of diversity.
• Reaching some specified number of generations without fitness improvement.

Binary Encoding Schemes
Binary Encoding
1. Crossover
 Single point crossover:
• one crossover point is selected,
• binary string from the beginning of the chromosome to the
crossover point is copied from the first parent,
• the rest is copied from the other parent

11001011+11011111 = 11001111

Contd…
 Two point crossover:
• two crossover points are selected,
• binary string from the beginning of the chromosome to
the first crossover point is copied from the first parent,
• the part from the first to the second crossover point is
copied from the other parent and
• the rest is copied from the first parent again

11001001 + 11011111 = 11011101

Contd…
 Arithmetic crossover: an arithmetic operation is performed to make a new offspring
11001011 + 11011111 = 11001011 (AND)
 Uniform crossover: bits are randomly copied from the first and the second parent
11101010 + 11010101 = 11010011

2. Mutation
Bit inversion: selected bits are inverted
11010011 => 11110010

Permutation Encoding
1. Crossover : Single point crossover -
• one crossover point is selected,
• the genes are copied from the first parent till the crossover
point, then
• the other parent is scanned and if the gene is not yet copied
in the offspring, it is added
• Note: there are more ways to produce the rest after
crossover point
(1 2 3 4 5 6 7 8 9) + (4 5 3 6 8 9 7 2 1) =
(1 2 3 4 5 6 8 9 7)
2. Mutation: Order changing -
• two numbers are selected and exchanged
(1 2 3 4 5 6 8 9 7) => (1 8 3 4 5 6 2 9 7)
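Both permutation operators can be sketched in Python as follows (hypothetical helper names; the crossover point of 6 matches the example above):

import random

def permutation_crossover(p1, p2, cut):
    # copy p1 up to the crossover point, then append p2's genes in order,
    # skipping any gene already present in the offspring
    head = p1[:cut]
    return head + [g for g in p2 if g not in head]

def swap_mutation(perm):
    # order changing: two positions are selected and their values exchanged
    i, j = random.sample(range(len(perm)), 2)
    child = perm[:]
    child[i], child[j] = child[j], child[i]
    return child

p1 = [1, 2, 3, 4, 5, 6, 7, 8, 9]
p2 = [4, 5, 3, 6, 8, 9, 7, 2, 1]
child = permutation_crossover(p1, p2, cut=6)
print(child)                 # [1, 2, 3, 4, 5, 6, 8, 9, 7], as in the example above
print(swap_mutation(child))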

Value Encoding
1. Crossover
All crossover methods from binary encoding can be used
2. Mutation
Adding (for real value encoding) - a small number is added to
(or subtracted from) selected values
(1.29 5.68 2.86 4.11 5.55) => (1.29 5.68 2.73 4.22 5.55)
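A sketch of this real-value mutation; the use of Gaussian noise and the step size are assumptions made for illustration:

import random

def value_mutation(values, p_mut=0.4, sigma=0.1):
    # add (or subtract) a small random number to each selected value
    return [round(v + random.gauss(0.0, sigma), 2) if random.random() < p_mut else v
            for v in values]

print(value_mutation([1.29, 5.68, 2.86, 4.11, 5.55]))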

Tree Encoding
1. Crossover
Tree crossover - one crossover point is selected in both parents,
and the parts below crossover points are exchanged to produce
new offspring
2. Mutation
Changing operator, number - selected nodes are changed

Advantages and Disadvantages of GA
• Applicable when little knowledge is encoded in the system.
• An effective way of finding a reasonable solution to a complex problem quickly.
• Good solutions to NP-complete problems can often be found efficiently in practice.
• Parallelism and easy implementation are advantages.
• However, GAs give very poor performance on some problems, as might be expected from knowledge-poor approaches.
Contd..
• There are NP-complete problems that cannot be solved algorithmically in an efficient way.
• NP stands for nondeterministic polynomial, and it means that it is possible to guess a solution and then check it in polynomial time.
• If we had some mechanism to guess a solution, then we would be able to find a solution in some reasonable (polynomial) time.
• The characteristic of NP problems is that the known algorithms are usually O(2^n), which is not usable when n is large.
• For such problems, GA works well.
• But the disadvantage of GAs is their computational time.
• They can be slower than some other methods.
• Some of the problems are listed below:
 Choosing the encoding and the fitness function can be difficult.
 GAs may have a tendency to converge towards local optima, or even arbitrary points, rather than the global optimum in many problems.
• GAs cannot effectively solve problems in which the only fitness measure is right/wrong, as there is no way to converge on the solution (consider the next slide).
• In these cases, a random search may find a solution as quickly as a GA.
• The statement highlights a limitation of Genetic Algorithms (GAs) when applied to problems where the fitness function can only assess solutions as either right or wrong, without providing any intermediate feedback or graded evaluation.
• Here's a breakdown of the key points:
• 1. Fitness Function is Right/Wrong
• - Right/Wrong Fitness Measure: In this context, the fitness function assigns a solution a binary value: either it is correct (right) or incorrect (wrong). There are no partial rewards or clues about how "close" a solution is to being correct.
• - GA's Normal Behavior: GAs typically rely on a graded fitness function, where solutions can be evaluated on a scale (e.g., how "fit" they are for solving the problem). This graded feedback helps the algorithm converge gradually by favoring better (though not yet perfect) solutions over worse ones.
• 2. No Gradual Convergence Possible
• - No Way to Converge: When the only feedback is "right or wrong," there is no intermediate information guiding the GA towards improvement. The algorithm cannot differentiate between slightly wrong solutions and completely wrong solutions. All wrong solutions are equally penalized, which makes it impossible to iteratively improve over generations.
• 3. Random Search Could Be Just as Effective
• - Random Search: In problems with only binary feedback (right/wrong), a random search may be as effective as a GA because both approaches are essentially just guessing. Since the GA has no gradual way to improve partial solutions, it operates similarly to random guessing.
• - No Benefit from Evolution: The advantage of GAs is that they usually exploit a population of partially good solutions to explore the solution space more effectively. However, if there is no information about how partial solutions are performing (no graded fitness), then the evolutionary process (selection, mutation, crossover) offers no improvement over randomly guessing until a correct solution is found.
• Example:
• Consider a problem where you need to guess a specific combination of numbers, and the only feedback you get is "correct" or "incorrect" for the entire combination, without any indication of how close your guesses are. If a GA tries random combinations but cannot evaluate partial correctness (such as whether some digits are correct), it cannot "evolve" better guesses. This would be similar to flipping a coin: you are either right or wrong, without a pathway to improve.
• Thus, without a nuanced fitness function, the GA is reduced to random searching, and it loses its evolutionary advantage.

Criteria for GA Approaches
• Completeness: Any solution should have its
encoding
• Non redundancy: Codes and solutions should
correspond one to one
• Soundness: Any code (produced by genetic
operators) should have its corresponding solution
• Characteristic perseverance: Offspring should
inherit useful characteristics from parents.

Optimization theory:
•Optimization theory studies how to describe and attain what is best, given that one knows how to measure and alter what is good or bad.
•Thus, optimization seeks to improve performance toward some optimal point or points.
Traditional methods move gingerly from point to point, using transition rules to determine the next point; in a multimodal, peaked search space this risks climbing the wrong peak.
GA differs from traditional methods in four ways:
•GAs work with a coding of the parameter set, not the parameters themselves.
•GAs search from a population of points, not a single point.
•GAs use payoff (objective function) information, not derivatives or other auxiliary knowledge.
•GAs use probabilistic transition rules, not deterministic rules.

GAs work from a rich database of points simultaneously (a population of strings), climbing many peaks in parallel; thus the probability of finding a false peak is reduced.

1-Reproduction:
Is the process in which individual strings are copied according to their objective function values; this objective function is called the fitness function.
Fitness function:
Is a measure of profit, utility, or goodness that we want to maximize.
Reproduction means that strings with a higher fitness value have a higher probability of contributing one or more offspring to the next generation.
Selection:
Selection mechanism
finds two (or more)
candidates for crossover.

There are several selection strategies possible:
1- Roulette wheel selection:
It is like a standard gambler's roulette wheel: a spinning circle divided into several equal-size sections. The croupier sets the wheel spinning and throws a marble into the bowl; after the motion of the wheel decreases, the marble comes to rest in one of the numbered sections.
In GA, a roulette wheel can be used to select individuals for further reproduction. The wheel corresponds to the fitness array, and the marble is a random unsigned integer less than the sum of all fitnesses in the population.
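A sketch of roulette wheel selection following this description: the "marble" is a random number below the total fitness, and we walk the fitness array until the accumulated fitness reaches it (an illustration, not the author's code; a float is used instead of an unsigned integer for simplicity).

import random

def roulette_wheel_select(population, fitnesses):
    # spin: pick a random point below the sum of all fitnesses,
    # then walk the wheel until the accumulated fitness reaches it
    marble = random.uniform(0, sum(fitnesses))
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= marble:
            return individual
    return population[-1]  # fallback for floating-point edge cases

pop = ["0000000", "1101111", "1000101", "1000111"]
print(roulette_wheel_select(pop, [5, 4, 3, 4]))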
2- Stochastic universal selection:
It is a less noisy version of roulette wheel selection in which N markers are placed around the roulette wheel, where N is the number of individuals in the population. N individuals are selected in a single spin of the roulette wheel, and the number of copies of each individual selected is equal to the number of markers inside the corresponding slot.
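A sketch of stochastic universal sampling under that description: a single spin places N evenly spaced markers after one random offset, and each individual is selected once per marker that lands in its fitness slot (details such as the float arithmetic are assumptions).

import random

def stochastic_universal_sampling(population, fitnesses, n):
    # one spin: a single random start, then n evenly spaced markers
    total = sum(fitnesses)
    spacing = total / n
    start = random.uniform(0, spacing)
    markers = [start + i * spacing for i in range(n)]

    selected, running, idx = [], 0.0, 0
    for individual, fit in zip(population, fitnesses):
        running += fit
        while idx < n and markers[idx] < running:
            selected.append(individual)   # one copy per marker inside this slot
            idx += 1
    return selected

pop = ["0000000", "1101111", "1000101", "1000111"]
print(stochastic_universal_sampling(pop, [5, 4, 3, 4], n=4))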
3- Tournament selection:
S individuals are taken at random, and the best of them is selected. It is possible to adjust the selection pressure by changing the tournament size S. The winner of the tournament is the individual with the highest fitness of the S tournament competitors, and it is inserted into the mating pool. The mating pool, being filled with tournament winners, has a higher average fitness than the average population fitness.
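A sketch of tournament selection with tournament size S (the size of 3 and the example population are arbitrary choices for illustration). Increasing S raises the selection pressure, as noted above.

import random

def tournament_select(population, fitnesses, s=3):
    # take s individuals at random; the fittest of them wins
    competitors = random.sample(range(len(population)), s)
    winner = max(competitors, key=lambda i: fitnesses[i])
    return population[winner]

pop = ["0000000", "1101111", "1000101", "1000111"]
fits = [5, 4, 3, 4]
mating_pool = [tournament_select(pop, fits) for _ in range(len(pop))]
print(mating_pool)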

Population size:
It is useful to increase the population size to increase the genetic material available for selection and reproduction, which makes it possible to build better individuals.
Mutation rate:
 It must be small enough not to lose useful individuals, yet big enough to provide the population with new genetic material.
 Different sources in the literature suggest values between 0.1 and 0.01,
 or 1/length(chromosome).
 There is an interesting relation between population size and mutation rate: when we lower the population size, it is useful to increase the mutation rate, and vice versa.
Crossover rate:
Normally, when two candidates are selected, they always undergo crossover.

Number of generations:
It means how many cycles we run before we terminate.
•It depends on the task's complexity: sometimes a hundred generations is enough; at other times thousands of generations are not enough.
•It is probably wise to stop when no improvement in solution quality has been made for a certain time.
•We can put a limit condition, which can be calculated taking into account the task's dimensions.
•When it is hard to determine an appropriate value for some parameter, it is a good idea to use self-adaptation for that parameter.
•This can be done by changing the mutation rate along the progress of the GA (i.e., allow mutation more often at the beginning and less often at the end, when the solution needs only fine-tuning).
•When we are too far from a plausible solution, we need to use a bigger mutation rate, so intelligent self-adaptation would be very promising.
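One simple way to realize this self-adaptation is a mutation rate that decays with the generation number, so mutation is frequent at the beginning and rare during fine-tuning. The linear schedule below is only a sketch; its 0.10 and 0.01 bounds echo the values suggested earlier, but the schedule itself is an assumption.

def mutation_rate(generation, max_generations, high=0.10, low=0.01):
    # decay linearly from a high rate at the start to a low rate at the end
    frac = generation / float(max_generations)
    return high - (high - low) * frac

for g in (0, 50, 100):
    print(g, round(mutation_rate(g, 100), 3))   # 0.1 at the start, 0.01 at the end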
GA Applications
• Control
• Design
• Scheduling
• Robotics
• Machine Learning
• Signal Processing
• Game Playing
• Combinatorial Optimization

More Specific Applications of GA

• TSP and sequence scheduling
• Finding shape of protein molecules
• Strategy planning
• Nonlinear dynamical systems - predicting, data analysis
• Designing neural networks, both architecture and weights
• Evolving LISP programs (genetic programming)

Many thanks for listening