Genetic Algorithm (GA) Optimization
GA is inspired by Darwinian evolution and natural selection. It evolves a population of candidate solutions
over generations using genetic operations like selection, crossover, and mutation.
Goal: Find optimal or near-optimal solutions to complex problems.
Problem:
Find the highest number from 0 to 99.
Step 1: Start with random numbers
Let’s say we randomly pick 5 numbers: 12, 45, 78, 23, 56. These are like our "creatures" (solutions).
Step 2: Check how good each number is
The bigger the number, the better it is (because we want the highest number). So 78 is the best one right
now.
Step 3: Mix the best ones (Crossover)
Take two good numbers (say 78 and 56) and mix their digits. For example:
• Take first digit from 78 (7) and second digit from 56 (6) → new number 76
• Take first digit from 56 (5) and second digit from 78 (8) → new number 58
Now we have two new numbers: 76 and 58.
Step 4: Add small random changes (Mutation)
Randomly change a digit a little bit. Suppose we mutate 76 by changing 6 to 9 → it becomes 79. Now our
numbers are: 79, 58.
Step 5: Repeat
Now treat these as the new "parents," pick the best, mix them, mutate, and so on. After a few rounds, you
might find 99 — the highest possible!
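This five-step toy search can be sketched in Python. The digit-mixing crossover and single-digit mutation follow the walkthrough above; the function names and the mutation rate are illustrative choices, not part of the original example.

```python
import random

def fitness(n):
    # Bigger is better: we want the highest two-digit number.
    return n

def crossover(a, b):
    # Step 3: tens digit from one parent, units digit from the other.
    return (a // 10) * 10 + (b % 10), (b // 10) * 10 + (a % 10)

def mutate(n, rate=0.3):
    # Step 4: with some probability, replace one digit at random.
    if random.random() < rate:
        if random.random() < 0.5:
            n = random.randint(0, 9) * 10 + n % 10     # new tens digit
        else:
            n = (n // 10) * 10 + random.randint(0, 9)  # new units digit
    return n

random.seed(1)
population = [12, 45, 78, 23, 56]
for generation in range(30):
    # Step 2: keep the two best; carrying them over is simple elitism.
    parents = sorted(population, key=fitness, reverse=True)[:2]
    c1, c2 = crossover(*parents)
    population = parents + [mutate(c1), mutate(c2)]

print(max(population))  # best number found so far
```

Because the two best numbers are carried over each generation, the best solution never gets worse; mutating the tens digit is what eventually lets the search climb past 79 toward 99.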
Algorithm Steps
1. Initialization
• Create an initial population of chromosomes (candidate solutions).
• Chromosomes can be binary strings, real numbers, or other encodings.
2. Fitness Evaluation
• Calculate the fitness of each chromosome using the objective function.
• Example: For minimizing f(x) = x^2, the fitness of x = 3 is f(3) = 9.
3. Selection
• Select parents for reproduction based on fitness.
• Common methods: Roulette Wheel Selection, Tournament Selection.
4. Crossover (Recombination)
• Combine genetic information from two parents to create offspring.
• Example: Single-point crossover for binary chromosomes.
5. Mutation
• Introduce random changes to offspring to maintain diversity.
• Example: Flip a bit in a binary string.
6. Replacement
• Replace the old population with the new offspring (with optional elitism).
7. Termination
• Stop when a stopping criterion is met (e.g., max generations, fitness threshold).
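The seven steps map onto a short loop. This is a minimal sketch, assuming the 4-bit binary encoding and f(x) = x^2 minimization used in the worked example that follows; the population size, rates, and function names are illustrative:

```python
import random

def fitness(bits):
    # Minimizing f(x) = x^2, so lower fitness is better.
    x = int(bits, 2)
    return x * x

def tournament(pop):
    # Step 3: the fitter of two randomly chosen individuals wins.
    a, b = random.sample(pop, 2)
    return a if fitness(a) < fitness(b) else b

def crossover(p1, p2):
    # Step 4: single-point crossover at a random position.
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(bits, rate=0.05):
    # Step 5: flip each bit independently with probability `rate`.
    return "".join(str(1 - int(b)) if random.random() < rate else b
                   for b in bits)

random.seed(0)
pop = ["{:04b}".format(random.randint(0, 15)) for _ in range(4)]  # Step 1
for generation in range(50):          # Step 7: fixed generation budget
    elite = min(pop, key=fitness)     # Step 2, kept as elitism in Step 6
    children = [elite]
    while len(children) < len(pop):
        p1, p2 = tournament(pop), tournament(pop)
        c1, c2 = crossover(p1, p2)
        children += [mutate(c1), mutate(c2)]
    pop = children[:len(pop)]         # Step 6: replacement

print(min(pop, key=fitness))  # best chromosome found
```

Elitism guarantees the best fitness never worsens between generations; in this small search the population tends toward 0000 (x = 0).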
Step-by-Step Example
Let’s minimize f(x) = x^2 with x encoded as a 4-bit binary integer (range: 0 to 15).
Step 1: Initialize Population
• 0010 (2) → Fitness = 2^2 = 4
• 1100 (12) → Fitness = 12^2 = 144
• 0111 (7) → Fitness = 7^2 = 49
• 1001 (9) → Fitness = 9^2 = 81
Step 2: Selection
Use tournament selection:
• 0010 vs. 1100 → 0010 wins.
• 0111 vs. 1001 → 0111 wins.
Selected Parents: [0010, 0111]
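The two tournaments can be reproduced directly. Since we are minimizing f(x) = x^2, the chromosome with the lower fitness wins (the helper name is illustrative):

```python
def tournament(bits_a, bits_b):
    # Decode each 4-bit chromosome and compare f(x) = x^2; lower wins.
    fa = int(bits_a, 2) ** 2
    fb = int(bits_b, 2) ** 2
    return bits_a if fa <= fb else bits_b

print(tournament("0010", "1100"))  # → 0010 (4 < 144)
print(tournament("0111", "1001"))  # → 0111 (49 < 81)
```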
Step 3: Crossover
Single-point crossover after 2nd bit:
Parent 1: 00|10 → Offspring 1: 00|11
Parent 2: 01|11 → Offspring 2: 01|10
Offspring: [0011 (3), 0110 (6)]
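The same offspring fall out of a two-line crossover helper (the name and the fixed crossover point are illustrative):

```python
def single_point_crossover(p1, p2, point):
    # Swap the tails of the two parents after `point`.
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

o1, o2 = single_point_crossover("0010", "0111", point=2)
print(o1, o2)  # → 0011 0110
```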
Step 4: Mutation
Mutation (5% probability):
• 0011 → 0010 (bit flip)
• 0110 → 0111 (bit flip)
Mutated Offspring: [0010 (2), 0111 (7)]
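Both mutations in this step happen to flip the last bit. A deterministic bit-flip helper, together with the probabilistic per-bit version a GA would actually use, can be sketched as follows (the 5% rate follows the notes; the helper names are illustrative):

```python
import random

def flip_bit(bits, i):
    # Flip the single bit at position i (0-indexed from the left).
    return bits[:i] + str(1 - int(bits[i])) + bits[i + 1:]

def mutate(bits, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return "".join(str(1 - int(b)) if random.random() < rate else b
                   for b in bits)

print(flip_bit("0011", 3))  # → 0010
print(flip_bit("0110", 3))  # → 0111
```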
Step 5: Replacement
New population:
• [0010(2), 0111(7), 0010(2), 0111(7)]
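The notes do not spell out the replacement scheme, but elitist replacement (keep the best of the old population, fill the rest with the mutated offspring) reproduces this population exactly; a sketch with illustrative names:

```python
def replace(old_pop, offspring, elite=2):
    # Rank the old population by f(x) = x^2 (lower is better),
    # keep the `elite` best, then fill with offspring.
    ranked = sorted(old_pop, key=lambda bits: int(bits, 2) ** 2)
    return ranked[:elite] + offspring[:len(old_pop) - elite]

new_pop = replace(["0010", "1100", "0111", "1001"], ["0010", "0111"])
print(new_pop)  # → ['0010', '0111', '0010', '0111']
```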
Key Parameters
Parameter        | Role                                        | Typical Value
Population Size  | Number of candidate solutions               | 50–200
Crossover Rate   | Probability of combining parents            | 60%–90%
Mutation Rate    | Probability of altering a gene              | 1%–10%
Selection Method | How parents are chosen                      | Tournament
Elitism          | Retain top solutions in the next generation | 1–10% of population
Exploration vs. Exploitation in Genetic Algorithms
Genetic Algorithms balance exploration (searching new regions) and exploitation (refining known good
solutions).
Exploration Mechanisms
• Initialization: Random generation.
• Mutation: Random alterations.
• Population Diversity: Maintain variety.
Exploitation Mechanisms
• Selection: Fitter individuals preferred.
• Crossover: Combine parent traits.
• Elitism: Copy top solutions forward.
How GA Balances Exploration and Exploitation
Component       | Role in Exploration                    | Role in Exploitation
Mutation        | Adds diversity by perturbing solutions | Minimal; mutation mainly explores
Crossover       | Limited exploration                    | Combines and exploits parent traits
Selection       | Weak pressure favors exploration       | Strong pressure favors exploitation
Population Size | Larger = more exploration              | Smaller = faster exploitation
Example: Minimizing f(x) = x^2
Iteration 1:
• Exploration: Random initial population.
• Exploitation: Selection favors x = 2 and x = 7.
Iteration 2:
• Exploitation: Crossover creates offspring (3,6).
• Exploration: Mutation changes 6 to 7.
Iteration 3:
• Exploitation: Retain x = 2 by elitism.
• Exploration: Mutation may discover x = 0.
Key Parameters Affecting Balance
• Mutation Rate: High → More exploration; Low → More exploitation.
• Crossover Rate: High → More exploitation.
• Selection Pressure: Strong → More exploitation.
• Elitism: Increases exploitation.
Advantages
• Works with discrete, continuous, and combinatorial problems.
• Less prone to getting trapped in local optima than gradient-based methods.
• Parallelizable.
Applications
• Traveling Salesman Problem (TSP)
• Neural Network Hyperparameter Tuning
• Scheduling
• Engineering Design
Comparison with PSO
Feature     | Genetic Algorithm (GA)                    | Particle Swarm Optimization (PSO)
Mechanism   | Evolution via crossover/mutation          | Social behavior via pBest/gBest
Exploration | High                                      | Moderate
Best For    | Discrete, combinatorial problems          | Continuous optimization
Parameters  | Population size, crossover/mutation rates | Inertia weight, c1, c2