
StochOPy (STOCHastic OPtimization for PYthon) provides user-friendly routines to sample or optimize objective functions with the most popular algorithms.


Summary

Version: 1.3.2
Author: Keurfon Luu
Web site: https://github.com/keurfonluu/stochopy
Copyright: This document has been placed in the public domain.
License: StochOPy is released under the MIT License.

NOTE: StochOPy was implemented as part of my Ph.D. thesis. If you find any error or bug, or have any suggestion, please don't hesitate to contact me.

Features

StochOPy provides routines for sampling a model parameter space:

  • Pure Monte-Carlo

  • Metropolis-Hastings algorithm (a sketch of its acceptance rule is given after this list)

  • Hamiltonian (Hybrid) Monte-Carlo [1] [2]

or optimization of an objective function:

  • Differential Evolution [3]

  • Particle Swarm Optimization [4] [5]

  • Competitive Particle Swarm Optimization [6]

  • Covariance Matrix Adaptation - Evolution Strategy [7]
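
To give a flavor of what the Metropolis-Hastings sampler decides at each step, here is a generic sketch of the symmetric-proposal acceptance rule (an illustration only, not StochOPy's internal code; the energies are whatever values the objective function returns):

import numpy as np

def metropolis_accept(energy_current, energy_proposed):
    # Always accept a lower-energy model; otherwise accept with
    # probability exp(-(energy_proposed - energy_current)).
    return np.log(np.random.rand()) < energy_current - energy_proposed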

Installation

The recommended way to install StochOPy is through pip (internet required):

pip install stochopy

Otherwise, download and extract the package, then run:

python setup.py install
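
To check that the installation succeeded, you can try importing the package from the command line:

python -c "import stochopy"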

Usage

New in 1.3.0: Run StochOPy Viewer to see how the popular stochastic algorithms work, and play with their tuning parameters on several benchmark functions.

from stochopy.gui import main

main()

First, import StochOPy and define an objective function (here, the Rosenbrock function):

import numpy as np
from stochopy import MonteCarlo, Evolutionary

f = lambda x: 100*np.sum((x[1:]-x[:-1]**2)**2)+np.sum((1-x[:-1])**2)
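
As a quick sanity check: the Rosenbrock function reaches its global minimum of 0 at x = (1, ..., 1).

print(f(np.ones(2)))   # 0.0 at the global minimum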

You can define the search space boundaries if necessary:

n_dim = 2
lower = np.full(n_dim, -5.12)
upper = np.full(n_dim, 5.12)

Initialize the Monte-Carlo sampler:

max_iter = 1000
mc = MonteCarlo(f, lower=lower, upper=upper, max_iter=max_iter)

Now, you can start sampling simply by calling the sample method:

mc.sample(sampler="hamiltonian", stepsize=0.005, n_leap=20, xstart=[2., 2.])

Note that sampler can also be set to "pure" or "hastings". The sampled models and their corresponding energies are stored in:

print(mc.models)
print(mc.energy)
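
For instance, assuming mc.energy holds the energies in sampling order, the lowest-energy model can be located with plain NumPy (a post-processing sketch, not a StochOPy method):

ibest = np.argmin(mc.energy)
print("Lowest sampled energy:", mc.energy[ibest])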

Optimization is just as easy:

n_dim = 10
lower = np.full(n_dim, -5.12)
upper = np.full(n_dim, 5.12)
popsize = int(4 + np.floor(3. * np.log(n_dim)))   # standard CMA-ES default population size
ea = Evolutionary(f, lower=lower, upper=upper, popsize=popsize, max_iter=max_iter)
xopt, gfit = ea.optimize(solver="cmaes")
print(xopt)
print(gfit)
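
Since the Rosenbrock global minimum is known to be x = (1, ..., 1) with an objective value of 0, you can gauge how close the run got (a quick check, nothing StochOPy-specific):

print("Distance to the known optimum:", np.linalg.norm(np.asarray(xopt) - 1.))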

References
