mlrose_reborn: Machine Learning, Randomized Optimization and Search

Project description

mlrose: Machine Learning, Randomized Optimization and SEarch

mlrose is a Python package for applying some of the most common randomized optimization and search algorithms to a range of different optimization problems, over both discrete- and continuous-valued parameter spaces.

Project Background

mlrose was initially developed to support students of Georgia Tech's OMSCS/OMSA offering of CS 7641: Machine Learning.

It includes implementations of all randomized optimization algorithms taught in this course, as well as functionality to apply these algorithms to integer-string optimization problems, such as N-Queens and the Knapsack problem; continuous-valued optimization problems, such as the neural network weight problem; and tour optimization problems, such as the Travelling Salesperson problem. It also has the flexibility to solve user-defined optimization problems.

At the time of development, no single Python package collected all of this functionality in one location.

Main Features

Randomized Optimization Algorithms

  • Implementations of: hill climbing, randomized hill climbing, simulated annealing, genetic algorithm and (discrete) MIMIC;
  • Solve both maximization and minimization problems;
  • Define the algorithm's initial state or start from a random state;
  • Define your own simulated annealing decay schedule or use one of three pre-defined, customizable decay schedules: geometric decay, arithmetic decay or exponential decay.
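
For example, here is a minimal sketch of running simulated annealing with a geometric decay schedule on the One Max problem. It assumes the classic mlrose API; the import name may differ for this fork (e.g. mlrose_hiive), and the parameter values are illustrative only.

import numpy as np
import mlrose  # some forks are imported as mlrose_hiive instead

# One Max: maximize the number of ones in a bit string of length 10.
fitness = mlrose.OneMax()
problem = mlrose.DiscreteOpt(length=10, fitness_fn=fitness, maximize=True, max_val=2)

# Customizable geometric decay schedule for simulated annealing.
schedule = mlrose.GeomDecay(init_temp=1.0, decay=0.99, min_temp=0.001)

# Start from an explicit initial state rather than a random one.
init_state = np.zeros(10, dtype=int)

result = mlrose.simulated_annealing(problem, schedule=schedule, max_attempts=10,
                                    max_iters=1000, init_state=init_state,
                                    random_state=1)
best_state, best_fitness = result[0], result[1]  # some versions also return a fitness curve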

Problem Types

  • Solve discrete-valued (bit-string and integer-string), continuous-valued and tour optimization (travelling salesperson) problems;
  • Define your own fitness function for optimization or use a pre-defined function;
  • Pre-defined fitness functions are provided for the One Max, Flip Flop, Four Peaks, Six Peaks, Continuous Peaks, Knapsack, Travelling Salesperson, N-Queens and Max-K Color optimization problems.
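
A user-defined fitness function can be plugged in via CustomFitness. The sketch below is illustrative, again assuming the classic mlrose API; the alternating_bits function is a made-up example, not part of the package.

import numpy as np
import mlrose  # some forks are imported as mlrose_hiive instead

# Hypothetical fitness function: count the number of alternating adjacent bits.
def alternating_bits(state):
    return float(np.sum(state[:-1] != state[1:]))

fitness = mlrose.CustomFitness(alternating_bits, problem_type='discrete')
problem = mlrose.DiscreteOpt(length=12, fitness_fn=fitness, maximize=True, max_val=2)

best_state, best_fitness = mlrose.genetic_alg(problem, pop_size=200, mutation_prob=0.1,
                                              max_attempts=10, random_state=2)[:2]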

Machine Learning Weight Optimization

  • Optimize the weights of neural networks, linear regression models and logistic regression models using randomized hill climbing, simulated annealing, the genetic algorithm or gradient descent;
  • Supports classification and regression neural networks.
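
As an illustration, here is a sketch of fitting a classification neural network whose weights are tuned by simulated annealing rather than backpropagation. It assumes the classic scikit-learn-style mlrose.NeuralNetwork interface and uses the scikit-learn breast cancer dataset purely as example data.

import mlrose  # some forks are imported as mlrose_hiive instead
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=3)

# Scale the features; randomized optimizers are sensitive to feature magnitudes.
scaler = MinMaxScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Fit the network weights with simulated annealing instead of gradient descent.
nn = mlrose.NeuralNetwork(hidden_nodes=[10], activation='relu',
                          algorithm='simulated_annealing', max_iters=1000,
                          bias=True, is_classifier=True, learning_rate=0.1,
                          early_stopping=True, max_attempts=100, random_state=3)
nn.fit(X_train, y_train)
y_pred = nn.predict(X_test)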

Installation

mlrose was written in Python 3 and requires NumPy, SciPy and Scikit-Learn (sklearn).

The latest version can be installed using pip:

pip install -e git+git://github.com/hiive/mlrose#egg=mlrose-hiive

Documentation

The official mlrose documentation can be found here.

A Jupyter notebook containing the examples used in the documentation is also available here.

Licensing, Authors, Acknowledgements

mlrose was written by Genevieve Hayes and is distributed under the 3-Clause BSD license.

You can cite mlrose in research publications and reports as follows:

Hayes, G. (2019). mlrose: Machine Learning, Randomized Optimization and SEarch package for Python. https://github.com/gkhayes/mlrose. Accessed: day month year.

You can cite this fork in a similar way, but please be sure to reference the original work. Thanks to David S. Park for the MIMIC enhancements (from https://github.com/parkds/mlrose).

BibTeX entry:

@misc{Hayes19,
  author       = {Hayes, G},
  title        = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python}},
  year         = 2019,
  howpublished = {\url{https://github.com/gkhayes/mlrose}},
  note         = {Accessed: day month year}
}
