=====================
Fuzzy Self-Tuning PSO
=====================

*Fuzzy Self-Tuning PSO* (FST-PSO) is a swarm intelligence global optimization method [1]
based on Particle Swarm Optimization [2]. FST-PSO is designed for real-valued
multi-dimensional minimization problems.

FST-PSO can be used as follows::

    from fstpso import FuzzyPSO

    # example of fitness function (sphere function: minimize the sum of x_i^2)
    def fitnessfunction(x):
        return sum(xi**2 for xi in x)

    dims = 10                                 # number of dimensions of the problem
    FP = FuzzyPSO(D=dims)
    FP.set_fitness(fitnessfunction)
    FP.set_search_space([[-30, 30]] * dims)   # definition of the search space
    result = FP.solve_with_fstpso()
    print("Best solution:", result[0])
    print("Whose fitness is:", result[1])


Basics
======

FST-PSO is a settings-free version of PSO which exploits fuzzy logic to
dynamically assign the functioning parameters to each particle in the swarm.

Specifically, during each generation, FST-PSO determines the optimal choice
for the cognitive factor, the social factor, the inertia value, the minimum
velocity, and the maximum velocity of each particle. FST-PSO also uses a
heuristic to choose the swarm size.
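
To make this concrete, the sketch below is not FST-PSO's internal code: it only
illustrates, for the standard PSO update rule, where the per-particle parameters
listed above (inertia, cognitive factor, social factor, and velocity bounds)
come into play::

    import random

    def update_particle(position, velocity, personal_best, global_best,
                        w, c_cog, c_soc, v_min, v_max):
        # w, c_cog, c_soc, v_min and v_max are plain arguments here;
        # in FST-PSO they are chosen by fuzzy rules for each particle,
        # at every generation.
        new_velocity = []
        for d in range(len(position)):
            r1, r2 = random.random(), random.random()
            v = (w * velocity[d]
                 + c_cog * r1 * (personal_best[d] - position[d])
                 + c_soc * r2 * (global_best[d] - position[d]))
            # clamp the velocity magnitude to the [v_min, v_max] range
            v = max(min(v, v_max), -v_max)
            if abs(v) < v_min:
                v = v_min if v >= 0 else -v_min
            new_velocity.append(v)
        new_position = [p + v for p, v in zip(position, new_velocity)]
        return new_position, new_velocity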

The programmer must specify:

* a custom fitness function;

* the number of dimensions of the problem;

* the boundaries of the search space for each dimension.

The programmer can also specify the maximum number of fitness evaluations.
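
For instance, the fitness function can be any Python callable that receives a
candidate solution (one coordinate per dimension) and returns the value to be
minimized. The following sketch uses only the calls shown above; the
2-dimensional Rastrigin function and its [-5.12, 5.12] bounds are just an
illustrative choice::

    import math
    from fstpso import FuzzyPSO

    def rastrigin(x):
        # x is a candidate solution with one coordinate per dimension
        return 10 * len(x) + sum(xi**2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

    dims = 2
    FP = FuzzyPSO(D=dims)
    FP.set_fitness(rastrigin)
    FP.set_search_space([[-5.12, 5.12]] * dims)  # one [lower, upper] pair per dimension
    result = FP.solve_with_fstpso()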

FST-PSO returns the best solution found along with its fitness value.


Further information
-------------------

FST-PSO has been created by M.S. Nobile, D. Besozzi, G. Pasi, G. Mauri,
R. Colombo (University of Milano-Bicocca, Italy), and P. Cazzaniga (University
of Bergamo, Italy). The source code was written by M.S. Nobile.

Further information:

[1] Nobile, Cazzaniga, Besozzi, Colombo, Mauri, Pasi, "Fuzzy Self-Tuning PSO:
A Settings-Free Algorithm for Global Optimization", Swarm and Evolutionary
Computation, 2017 (in press)

[2] Kennedy, Eberhart, "Particle Swarm Optimization", in: Proceedings of the
IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942–1948

Web page: http://www.disco.unimib.it/go/45712
