Machine Learning Experiment Hyperparameter Optimization
Lightweight Hyperparameter Optimization 🚀
Simple and intuitive hyperparameter optimization API for your Machine Learning Experiments (MLE). It includes simple grid and random search as well as sequential model-based optimization (SMBO) and a set of more unorthodox search algorithms (multi-objective optimization via nevergrad and a coordinate-wise search). Portable hyperparameter spaces are available for real-, integer- and categorical-valued variables. The search strategies assume that the underlying objective is minimized (multiply your objective by -1 if this is not the case). For a quickstart, check out the notebook blog.
The API 🎮
```python
from mle_hyperopt import RandomSearch

# Instantiate random search class
strategy = RandomSearch(
    real={"lrate": {"begin": 0.1, "end": 0.5, "prior": "log-uniform"}},
    integer={"batch_size": {"begin": 32, "end": 128, "prior": "uniform"}},
    categorical={"arch": ["mlp", "cnn"]},
)

# Simple ask - eval - tell API
configs = strategy.ask(5)
values = [train_network(**c) for c in configs]
strategy.tell(configs, values)
```
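To make the ask-eval-tell pattern concrete, here is a minimal sketch in pure Python of what such a strategy does conceptually. `ToyRandomSearch` and its dummy objective are hypothetical illustrations, not the library's implementation:

```python
import random

class ToyRandomSearch:
    """Toy ask-eval-tell random search over one real-valued range (illustrative only)."""

    def __init__(self, begin, end):
        self.begin, self.end = begin, end
        self.log = []  # stores (config, value) pairs

    def ask(self, num):
        # Propose num configurations sampled uniformly from [begin, end]
        return [{"lrate": random.uniform(self.begin, self.end)} for _ in range(num)]

    def tell(self, configs, values):
        # Record the evaluation results so they can be queried later
        self.log.extend(zip(configs, values))

strategy = ToyRandomSearch(0.1, 0.5)
configs = strategy.ask(5)
values = [(c["lrate"] - 0.3) ** 2 for c in configs]  # dummy objective to minimize
strategy.tell(configs, values)
```

The key design point is the decoupling: `ask` only proposes candidates, the user evaluates them however they like, and `tell` feeds the results back.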
Implemented Search Types 🔭
| Search Type | Description | `search_config` |
|---|---|---|
| `GridSearch` | Search over list of discrete values | - |
| `RandomSearch` | Random search over variable ranges | `refine_after`, `refine_top_k` |
| `SMBOSearch` | Sequential model-based optimization | `base_estimator`, `acq_function`, `n_initial_points` |
| `CoordinateSearch` | Coordinate-wise optimization with defaults | `order`, `defaults` |
| `NevergradSearch` | Multi-objective nevergrad wrapper | `optimizer`, `budget_size`, `num_workers` |
Variable Types & Hyperparameter Spaces 🌍
| Variable | Type | Space Specification |
|---|---|---|
| `real` | Real-valued | `Dict`: `begin`, `end`, `prior`/`bins` (grid) |
| `integer` | Integer-valued | `Dict`: `begin`, `end`, `prior`/`bins` (grid) |
| `categorical` | Categorical | `List`: values to search over |
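A plain-Python sketch of how a configuration could be drawn from such a space specification (the `sample_config` helper is hypothetical and ignores the `prior`/`bins` options for brevity):

```python
import random

def sample_config(real=None, integer=None, categorical=None):
    """Draw one configuration from a space spec (illustrative sketch, uniform only)."""
    config = {}
    for name, spec in (real or {}).items():
        config[name] = random.uniform(spec["begin"], spec["end"])
    for name, spec in (integer or {}).items():
        config[name] = random.randint(spec["begin"], spec["end"])
    for name, options in (categorical or {}).items():
        config[name] = random.choice(options)
    return config

config = sample_config(
    real={"lrate": {"begin": 0.1, "end": 0.5}},
    integer={"batch_size": {"begin": 32, "end": 128}},
    categorical={"arch": ["mlp", "cnn"]},
)
```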
Installation ⏳
A PyPI installation is available via:

```
pip install mle-hyperopt
```

Alternatively, you can clone this repository and install it manually:

```
git clone https://github.com/RobertTLange/mle-hyperopt.git
cd mle-hyperopt
pip install -e .
```
Further Options 🚴
Saving & Reloading Logs 🏪
```python
# Storing & reloading of results from .json
strategy.save("search_log.json")
strategy = RandomSearch(..., reload_path="search_log.json")

# Or manually add info after class instantiation
strategy = RandomSearch(...)
strategy.load("search_log.json")
```
Search Decorator 🧶
```python
from mle_hyperopt import hyperopt

@hyperopt(strategy_type="grid",
          num_search_iters=25,
          real={"x": {"begin": 0., "end": 0.5, "bins": 5},
                "y": {"begin": 0., "end": 0.5, "bins": 5}})
def circle(config):
    distance = abs(config["x"] ** 2 + config["y"] ** 2)
    return distance

strategy = circle()
```
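Under the hood, a decorator like this wraps the objective in a search loop. The following self-contained sketch shows the idea for the grid case; `toy_grid_hyperopt` is a hypothetical stand-in, not the library's decorator:

```python
import itertools

def toy_grid_hyperopt(**space):
    """Decorator sketch: evaluate the wrapped objective on a full grid (illustrative)."""
    def decorator(objective):
        def run():
            names = list(space)
            # Build an evenly spaced grid of `bins` points per variable
            grids = []
            for name in names:
                spec = space[name]
                step = (spec["end"] - spec["begin"]) / (spec["bins"] - 1)
                grids.append([spec["begin"] + i * step for i in range(spec["bins"])])
            # Evaluate every combination and log (config, value) pairs
            log = []
            for combo in itertools.product(*grids):
                config = dict(zip(names, combo))
                log.append((config, objective(config)))
            return log
        return run
    return decorator

@toy_grid_hyperopt(x={"begin": 0.0, "end": 0.5, "bins": 5},
                   y={"begin": 0.0, "end": 0.5, "bins": 5})
def circle(config):
    return config["x"] ** 2 + config["y"] ** 2

log = circle()  # 5 x 5 grid -> 25 (config, value) pairs
```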
Storing Configuration Files 📑
```python
# Store 2 proposed configurations - eval_0.yaml, eval_1.yaml
strategy.ask(2, store=True)

# Store with explicit configuration filenames - conf_0.yaml, conf_1.yaml
strategy.ask(2, store=True, config_fnames=["conf_0.yaml", "conf_1.yaml"])
```
Retrieving Top Performers & Visualizing Results 📉
```python
# Get the top k best performing configurations
strategy.get_best(top_k=4)
```
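Since the strategies minimize the objective, retrieving top performers amounts to sorting the logged (config, value) pairs by value. A minimal sketch (the `get_top_k` helper is hypothetical, not the library's method):

```python
def get_top_k(log, top_k):
    """Return the top_k (config, value) pairs with the lowest objective values."""
    return sorted(log, key=lambda pair: pair[1])[:top_k]

log = [({"lrate": 0.2}, 0.5), ({"lrate": 0.3}, 0.1), ({"lrate": 0.4}, 0.9)]
best = get_top_k(log, top_k=2)  # pairs with values 0.1 and 0.5
```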
Refining the Search Space of Your Strategy 🪓
```python
# Refine the search space after 5 iterations based on the top 2 configurations
strategy = RandomSearch(
    real={"lrate": {"begin": 0.1, "end": 0.5, "prior": "uniform"}},
    integer={"batch_size": {"begin": 1, "end": 5, "prior": "log-uniform"}},
    categorical={"arch": ["mlp", "cnn"]},
    search_config={"refine_after": 5, "refine_top_k": 2},
)
```
Development & Milestones for Next Release
You can run the test suite via `python -m pytest -vv tests/`. If you find a bug or are missing your favourite feature, feel free to contact me @RobertTLange or create an issue 🤗. Here are some features I want to implement for the next release:
- Add min vs max objective option to choose at strategy init
- Add text to notebook + visualization for what is implemented
- Allow space refinement for other strategies
Hashes for mle_hyperopt-0.0.1-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7f8cd50f14cdeb9bd7f3cc07493cd423eedc697172e9f3220d992e99ed69f4d3 |
| MD5 | 03c6ee257cb90352f90d2a76c8dcca6f |
| BLAKE2b-256 | 0552ed55910f173f8c615091b7be86f97bab3815ad676b09a840ad6d717741eb |