A hyperparameter optimizer with distributed hardware at its heart
Project description
shadho - Scalable Hardware-Aware Distributed Hyperparameter Optimizer

shadho is a framework for distributed hyperparameter optimization developed for machine/deep learning applications.
- Website/Documentation: https://shadho.readthedocs.io
- Bug Reports: https://github.com/jeffkinnison/shadho/issues
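To give a sense of the API, here is a minimal driver-script sketch modeled on the quickstart in the documentation linked above. Treat the `Shadho` class, the `spaces` helpers, and the constructor signature as assumptions to verify against the current docs:

```python
import math

# Assumed API, modeled on the shadho quickstart: a `Shadho` driver class
# and a `spaces` module of search-space helpers.
from shadho import Shadho, spaces

# Objective function: each distributed trial receives one sampled
# `params` dict and returns a scalar loss to minimize.
def objective(params):
    return math.sin(params['x'])

# Search space: draw x uniformly from [0, 2*pi).
space = {'x': spaces.uniform(0.0, 2.0 * math.pi)}

if __name__ == '__main__':
    # 'sin_demo' is a hypothetical experiment name; timeout is the
    # search budget in seconds.
    opt = Shadho('sin_demo', objective, space, timeout=60)
    opt.run()
```

Trials are farmed out to Work Queue workers that connect to this driver, which is why the installation steps below build Work Queue.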
Installation
Note: The post-install step may look like it hangs, but it is just compiling Work Queue behind the scenes and may take a few minutes.
$ pip install shadho
$ python -m shadho.installers.workqueue
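To confirm that the post-install step finished successfully, a quick import check of the Python bindings it builds is enough. This sketch assumes the bindings are exposed under the usual CCTools module name `work_queue`:

```python
# Sanity check: the shadho post-install step compiles CCTools' Work Queue
# and its Python bindings. A clean import means the build succeeded.
import work_queue

print('Work Queue bindings located at:', work_queue.__file__)
```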
Installing on a Shared System
The owner of the shared installation should follow the steps above. Any other user can then install with
$ pip install shadho
$ python -m shadho.installers.workqueue --prefix <path to shared install>
Dependencies
- numpy
- scipy
- pyrameter
- Work Queue (built and installed by setup.py)
Project details
Release history
Download files
Download the file for your platform.
Source Distribution: shadho-0.2.post4.tar.gz (22.5 kB)

Built Distribution: shadho-0.2.post4-py3-none-any.whl (29.1 kB)
Hashes for shadho-0.2.post4-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | eb05e1d915c3e432291b03c1b1a2d99140bb8d70b4c71cb19c207c68b9b58bc9 |
| MD5 | 1c7edc019d82e98590d40e3aecbb6c87 |
| BLAKE2b-256 | a3249cf20ac1d1cff7dd652dd13165f2f7a8ad5f0b745c34522c10fc8c01b05e |