Hyperparameter optimizer with distributed hardware at its heart
shadho - Scalable Hardware-Aware Distributed Hyperparameter Optimizer
shadho is a framework for distributed hyperparameter optimization developed for machine/deep learning applications.
- Website/Documentation: https://shadho.readthedocs.io
- Bug Reports: https://github.com/jeffkinnison/shadho/issues
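As a quick sketch of the intended workflow, the example below defines a one-dimensional search space and runs a local optimization. It follows the usage pattern described in the project documentation, but the exact Shadho constructor arguments and spaces helpers are assumptions here and should be checked against the installed version and https://shadho.readthedocs.io.

import math

from shadho import Shadho, spaces

# Objective function: receives a dict of sampled hyperparameter values
# and returns the loss to minimize.
def objective(params):
    return math.sin(params['x'])

if __name__ == '__main__':
    # Search space: sample 'x' uniformly from [0, pi].
    space = {'x': spaces.uniform(0.0, math.pi)}

    # Create the optimizer and search for 60 seconds.
    # NOTE: constructor arguments are assumed from the docs and may differ.
    opt = Shadho('sin_example', objective, space, timeout=60)
    opt.run()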
Installation
Note: The post-install step may look like it hangs, but it is just compiling Work Queue behind the scenes and may take a few minutes.
$ pip install shadho
$ python -m shadho.installers.workqueue
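A quick way to sanity-check the post-install step is to import the packages from Python; this is a hypothetical check that assumes the Work Queue bindings (the work_queue module from CCTools) were installed on the default path:
$ python -c "import shadho, work_queue"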
Installing on a Shared System
The owner of the shared installation should follow the steps above. Other users can then install against that shared build with
$ pip install shadho
$ python -m shadho.installers.workqueue --prefix <path to shared install>
Dependencies
- numpy
- scipy
- pyrameter
- Work Queue (Built and installed by setup.py)