Experiment-1
AIM:
Implement particle swarm optimization.
INTRODUCTION:
Particle swarm optimization (PSO) is a population-based stochastic optimization
technique developed by Dr. Eberhart and Dr. Kennedy in 1995, inspired by the social
behavior of bird flocking and fish schooling.
PSO shares many similarities with evolutionary computation techniques such as Genetic
Algorithms (GA). The system is initialized with a population of random solutions and
searches for optima by updating generations. However, unlike GA, PSO has no
evolution operators such as crossover and mutation. In PSO, the potential solutions,
called particles, fly through the problem space by following the current optimum
particles. Detailed information is given in the following sections.
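The "flying" described above boils down to one update rule per particle: its velocity is pulled towards its own best-known position and the swarm's best-known position, and the position is then advanced by that velocity. A minimal one-dimensional, single-particle sketch of this step (the function name `pso_step` and the example values are illustrative assumptions, not part of the program below):

```python
from random import random

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0):
    # x: current position, v: current velocity,
    # pbest: this particle's best position so far,
    # gbest: the swarm's best position so far,
    # c1, c2: cognitive and social acceleration constants.
    v = v + c1 * random() * (pbest - x) + c2 * random() * (gbest - x)
    return x + v, v

# One step pulls the particle from x = 5.0 towards pbest = 2.0 and gbest = 0.0.
x, v = pso_step(x=5.0, v=0.0, pbest=2.0, gbest=0.0)
```

Because both attraction terms point towards positions below 5.0, the new position can never exceed the old one in this example; the random factors only scale how strong the pull is.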
Compared to GA, PSO has the advantages of being easy to implement and having few
parameters to adjust. PSO has been successfully applied in many areas: function
optimization, artificial neural network training, fuzzy system control, and other
areas where GA can be applied.
CODE:
from numpy import array
from random import random
from math import sin, sqrt
iter_max = 10000        # maximum number of iterations
pop_size = 100          # number of particles in the swarm
dimensions = 2          # dimensionality of the search space
c1 = 2                  # cognitive (personal-best) acceleration constant
c2 = 2                  # social (global-best) acceleration constant
err_crit = 0.00001      # stop once the error falls below this threshold
class Particle:
    pass                # simple record for position, velocity and bests
def f6(param):
    '''Schaffer's F6 function: global maximum of 1.0 at (0, 0)'''
    para = param[0:2]
    num = (sin(sqrt((para[0] * para[0]) + (para[1] * para[1])))) ** 2 - 0.5
    denom = (1.0 + 0.001 * ((para[0] * para[0]) + (para[1] * para[1]))) ** 2
    f6 = 0.5 - (num / denom)
    errorf6 = 1 - f6
    return f6, errorf6
# initialize the particles with random positions
particles = []
for i in range(pop_size):
    p = Particle()
    p.params = array([random() for i in range(dimensions)])
    p.fitness = 0.0
    p.v = 0.0
    p.best = p.params       # personal best starts at the initial position
    particles.append(p)
# let the first particle be the global best
gbest = particles[0]
err = 999999999
while i < iter_max:
    for p in particles:
        fitness, err = f6(p.params)
        if fitness > p.fitness:         # update the personal best
            p.fitness = fitness
            p.best = p.params
        if fitness > gbest.fitness:     # update the global best
            gbest = p
        # velocity update: pull towards the personal and global bests
        v = p.v + c1 * random() * (p.best - p.params) \
              + c2 * random() * (gbest.params - p.params)
        p.params = p.params + v
    i += 1
    if err < err_crit:
        break
    # progress bar: one '.' per 10% of the iterations
    if i % (iter_max // 10) == 0:
        print('.')
print('\nParticle Swarm Optimisation\n')
print('PARAMETERS\n', '-' * 9)
print('Population size : ', pop_size)
print('Dimensions      : ', dimensions)
print('Error Criterion : ', err_crit)
print('c1              : ', c1)
print('c2              : ', c2)
print('function        :  f6')
print('\nRESULTS\n', '-' * 7)
print('gbest fitness : ', gbest.fitness)
print('gbest params  : ', gbest.params)
print('iterations    : ', i + 1)
for p in particles:
    print('params: %s, fitness: %s, best: %s' % (p.params, p.fitness, p.best))
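As a quick sanity check on the listing above (a self-contained spot check, not part of the program), Schaffer's F6 can be evaluated directly at its known global optimum (0, 0); a swarm that converges should drive the gbest fitness towards this value:

```python
from math import sin, sqrt

# Evaluate Schaffer's F6 at the origin, where the maximum of 1.0 is attained.
x, y = 0.0, 0.0
num = sin(sqrt(x * x + y * y)) ** 2 - 0.5
denom = (1.0 + 0.001 * (x * x + y * y)) ** 2
value = 0.5 - num / denom
print(value)    # expected: 1.0
```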
OUTPUT: