
The Particle Swarm Optimization Algorithm

Nebojša Trpković
[email protected]
10th Dec 2010

Problem Definition
optimization of continuous nonlinear functions
finding the best solution in problem space
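
As an illustration (not part of the original slides), one standard continuous nonlinear benchmark of this kind is the Rastrigin function; a minimal NumPy definition:

import numpy as np

def rastrigin(x):
    """Rastrigin function: continuous, nonlinear and highly multimodal,
    with its global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))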


Example


Importance
function optimization
artificial neural network training
fuzzy system control


Existing Solutions
Ant Colony Optimization (ACO)
designed for discrete problems

Genetic Algorithms (GA)
slow convergence


Particle Swarm Optimization


In the simplest terms:
a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality


Particle Swarm Optimization


Facts:
developed by Russell C. Eberhart and James Kennedy in 1995

inspired by social behavior of bird flocking or fish schooling


similar to evolutionary techniques such as Genetic Algorithms (GA)


Particle Swarm Optimization


Benefits:
faster convergence
fewer parameters to tune
easier searching in very large problem spaces


Particle Swarm Optimization


Basic principle: let the particle swarm move towards the best position in the search space, remembering each particle's best known position and the global (swarm's) best known position


Velocity Change
x_i – position of a specific particle
p_i – particle's (personal) best known position
g – swarm's (global) best known position
v_i – particle's velocity

v_i ← v_i + φ_p·r_p·(p_i − x_i) + φ_g·r_g·(g − x_i)

(the three terms are the inertia, cognitive and social components, respectively; r_p and r_g are random numbers drawn from [0, 1] on each update, and φ_p, φ_g are the cognitive and social coefficients)


Position Change
x_i – position of a specific particle
v_i – particle's velocity

x_i ← x_i + v_i
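
A minimal sketch of these two update rules for a single particle, assuming NumPy arrays for positions and velocities; the coefficient names phi_p, phi_g and the default value 2.0 are illustrative choices, not taken from the slides:

import numpy as np

def update_particle(x_i, v_i, p_i, g, phi_p=2.0, phi_g=2.0, rng=np.random):
    """One update step for a single particle: new velocity from the
    inertia, cognitive and social terms, then move by that velocity."""
    r_p = rng.uniform(0.0, 1.0, size=x_i.shape)   # random cognitive factor
    r_g = rng.uniform(0.0, 1.0, size=x_i.shape)   # random social factor
    v_i = v_i + phi_p * r_p * (p_i - x_i) + phi_g * r_g * (g - x_i)
    x_i = x_i + v_i
    return x_i, v_i

The random factors r_p and r_g are redrawn on every update, which keeps the particles from simply moving in a straight line towards the current best positions.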


Algorithm
For each particle
    Initialize particle
End

Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the particle's best personal fitness value in history,
            set the current value as the new best personal fitness value
    End

    Choose the particle with the best fitness value of all the particles;
    if that fitness value is better than the current global best,
        set it as the new global best fitness value

    For each particle
        Calculate particle velocity according to the velocity change equation
        Update particle position according to the position change equation
    End
While maximum iterations or the minimum error criterion is not attained
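
A compact Python sketch of this loop, assuming global-best PSO, a fixed iteration count in place of the error criterion, and illustrative swarm size, bounds and coefficient values:

import numpy as np

def pso(objective, dim, n_particles=30, n_iter=200,
        bounds=(-5.12, 5.12), phi_p=2.0, phi_g=2.0, seed=0):
    """Global-best PSO following the pseudocode above."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds

    # Initialize particle positions and velocities
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = rng.uniform(-(hi - lo), hi - lo, size=(n_particles, dim))

    # Personal bests and global best
    p = x.copy()
    p_val = np.array([objective(xi) for xi in x])
    g = p[np.argmin(p_val)].copy()
    g_val = p_val.min()

    for _ in range(n_iter):
        # Velocity and position updates (the two equations above)
        r_p = rng.uniform(size=(n_particles, dim))
        r_g = rng.uniform(size=(n_particles, dim))
        v = v + phi_p * r_p * (p - x) + phi_g * r_g * (g - x)
        x = x + v

        # Update personal bests, then the global best
        vals = np.array([objective(xi) for xi in x])
        improved = vals < p_val
        p[improved] = x[improved]
        p_val[improved] = vals[improved]
        if p_val.min() < g_val:
            g_val = p_val.min()
            g = p[np.argmin(p_val)].copy()

    return g, g_val

# Example usage with the rastrigin function defined earlier:
# best_x, best_f = pso(rastrigin, dim=2)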

Single Particle


Parameter Selection
Different ways to choose parameters:

proper balance between exploration and exploitation
(avoiding premature convergence to a local optimum while still ensuring a good rate of convergence to the optimum; a common tuning sketch follows this list)

putting all attention on exploitation
(making searches in very large problem spaces possible)

automation by meta-optimization
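
One common way to tune the exploration/exploitation balance, noted here as a later extension rather than part of the original 1995 update rule on the velocity slide, is an inertia weight w applied to the inertia term and decreased over the run; a small sketch with illustrative start and end values:

def inertia_weight(iteration, n_iter, w_start=0.9, w_end=0.4):
    """Linearly decreasing inertia weight: a large w early on favors
    exploration, a small w later favors exploitation (a common PSO
    extension, not part of the original formulation)."""
    return w_start - (w_start - w_end) * iteration / max(1, n_iter - 1)

# Velocity update with the weight applied to the inertia term:
# v = w * v + phi_p * r_p * (p - x) + phi_g * r_g * (g - x)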


Avoiding Local Optima

adding a randomization factor to the velocity calculation

adding random momentum at specific iterations (see the sketch below)
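
A minimal sketch of the random-momentum idea, assuming the velocity array and random generator from the loop above; the kick interval and magnitude are illustrative:

import numpy as np

def add_random_momentum(v, iteration, rng, every=50, scale=1.0):
    """Every `every` iterations, add a random momentum term to all
    velocities so the swarm can escape a local optimum."""
    if iteration > 0 and iteration % every == 0:
        v = v + rng.normal(0.0, scale, size=v.shape)
    return v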


Swarm


Conclusion
"This algorithm belongs ideologically to that philosophical school that allows wisdom to emerge rather than trying to impose it, that emulates nature rather than trying to control it, and that seeks to make things simpler rather than more complex."

James Kennedy, Russell Eberhart



References
Wikipedia, http://www.wikipedia.org/

Swarm Intelligence, http://www.swarmintelligence.org/

Jerome Onwunalu and Louis J. Durlofsky, "Application of a particle swarm optimization algorithm for determining optimum well location and type", 2009

James Kennedy and Russell Eberhart, "Particle Swarm Optimization", 1995, http://www.engr.iupui.edu/~shi/Coference/psopap4.html

thinkfluid, "Robot Swarm driven by Particle Swarm Optimization algorithm", http://www.youtube.com/watch?v=RLIA1EKfSys
