Introduction to Particle
Swarm Optimization
Rajib Kumar Bhattacharjya
Department of Civil Engineering
Indian Institute of Technology Guwahati
Particle Swarm Algorithm
Inspired by the social behaviour of bird flocking and fish schooling: united we stand
Suppose a group of birds is searching for food in an area
Only one piece of food is available
The birds have no knowledge of the location of the food
But they know how far the food is from their present location
So what is the best strategy to locate the food?
The best strategy is to follow the bird nearest to the food
Particle Swarm Algorithm
[Figure: a bird moving from its current position to the next position.]

A flying bird has a position and a velocity at any time t. In search of food, the bird changes its position by adjusting its velocity. The velocity changes based on its past experience and also the feedback received from its neighbours.

This searching process can be artificially simulated for solving non-linear optimization problems. Particle swarm optimization is therefore a population-based stochastic optimization technique inspired by the social behaviour of bird flocking or fish schooling.
Particle Swarm Algorithm
Each solution is considered a bird, called a particle
All particles have a fitness value, which can be calculated using the objective function
All particles preserve their individual best performance
They also know the best performance of their group
They adjust their velocity considering their own best performance and also the best performance of the best particle
Particle Swarm Algorithm
1. Initialize the particles
2. Evaluate the fitness of each particle
3. Modify the velocities based on the previous best and global best positions
4. If the termination criteria are not met, go to the next iteration; otherwise STOP

The velocity update combines three terms: the inertia effect, the personal influence (local search, driven by each particle's previous best), and the social influence (global search, driven by the group's best).
Velocity is updated as

    V(t+1) = w·V(t) + C1·rand()·(PB − X(t)) + C2·rand()·(GB − X(t))

Position is updated as

    X(t+1) = X(t) + V(t+1)

C1 and C2 are the learning factors and w is the inertia weight; PB is the particle's personal best position and GB is the best position found by the group.
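The update rules above can be sketched as a minimal PSO in Python. This is an illustrative sketch: the parameter values (w = 0.7, C1 = C2 = 1.5, swarm size, iteration count) and the clipping of positions to the search box are assumptions, not taken from the slides.

```python
import random

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over the box `bounds` = [(lo, hi), ...] with a basic PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial positions inside the box, zero initial velocities
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    PB = [x[:] for x in X]                       # personal best positions
    pb_val = [f(x) for x in X]                   # personal best fitness values
    g = min(range(n_particles), key=lambda i: pb_val[i])
    GB, gb_val = PB[g][:], pb_val[g]             # best performance of the group

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                # V(t+1) = w*V(t) + C1*rand()*(PB - X) + C2*rand()*(GB - X)
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (PB[i][d] - X[i][d])
                           + c2 * rng.random() * (GB[d] - X[i][d]))
                # X(t+1) = X(t) + V(t+1), clipped to the search box
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
            val = f(X[i])
            if val < pb_val[i]:                  # improve personal best
                pb_val[i], PB[i] = val, X[i][:]
                if val < gb_val:                 # improve group best
                    gb_val, GB = val, X[i][:]
    return GB, gb_val
```

For example, `pso(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])` should drive the best fitness close to zero at the origin.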
Particle Swarm Algorithm
[Figure: vector diagram showing how the next position X(t+1) is obtained from the current position X(t), the velocity V, the personal best position PB (personal best performance), and the group best position GB (best performance of the group).]
Example problem
Minimize f(x, y) = (x^2 + y − 11)^2 + (x + y^2 − 7)^2
[Figure: contour plot of f(x, y) over 0 ≤ x, y ≤ 5, and convergence of the best fitness over 500 generations.]
The optimum solution is x* = 3, y* = 2
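The same scheme can be applied to the example problem above. The sketch below is self-contained and specialized to the 2-D problem; the swarm size, iteration count, and coefficient values (w = 0.7, C1 = C2 = 1.5) are assumptions for illustration.

```python
import random

def himmelblau(x, y):
    # f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

rng = random.Random(1)
n, w, c1, c2 = 30, 0.7, 1.5, 1.5          # assumed parameter values
# Initialize positions in [0, 5] x [0, 5], velocities at zero
px = [rng.uniform(0, 5) for _ in range(n)]
py = [rng.uniform(0, 5) for _ in range(n)]
vx, vy = [0.0] * n, [0.0] * n
pbx, pby = px[:], py[:]                    # personal best positions
pbf = [himmelblau(x, y) for x, y in zip(px, py)]
g = min(range(n), key=lambda i: pbf[i])
gbx, gby, gbf = pbx[g], pby[g], pbf[g]     # group best

for _ in range(500):
    for i in range(n):
        # Velocity update: inertia + personal influence + social influence
        vx[i] = w*vx[i] + c1*rng.random()*(pbx[i]-px[i]) + c2*rng.random()*(gbx-px[i])
        vy[i] = w*vy[i] + c1*rng.random()*(pby[i]-py[i]) + c2*rng.random()*(gby-py[i])
        # Position update, clipped to the 0..5 search box
        px[i] = min(max(px[i] + vx[i], 0.0), 5.0)
        py[i] = min(max(py[i] + vy[i], 0.0), 5.0)
        fi = himmelblau(px[i], py[i])
        if fi < pbf[i]:
            pbf[i], pbx[i], pby[i] = fi, px[i], py[i]
            if fi < gbf:
                gbf, gbx, gby = fi, px[i], py[i]
```

Within the box 0 ≤ x, y ≤ 5 the function has a single minimum at (3, 2) with f = 0, so the group best should converge close to that point.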
[Figure: final particle distributions for smaller and larger values of the parameters, showing how the parameter values affect the spread of the swarm over the search space.]
Examples