
Neural Pipeline Search helps deep learning experts find the best neural pipeline.


Neural Pipeline Search (NePS)


Welcome to NePS, a powerful and flexible Python library for hyperparameter optimization (HPO) and neural architecture search (NAS). Its primary goal: make HPO adoption practical for deep learning practitioners!

NePS houses recently published as well as more well-established algorithms, all of which can run massively in parallel on any distributed setup, together with tools to analyze runs, restart runs, etc.

Take a look at our documentation and continue through this README for instructions on how to use NePS!

Key Features

In addition to the common features offered by traditional HPO and NAS libraries, NePS stands out with the following key features:

  1. Hyperparameter Optimization (HPO) With Prior Knowledge:

  2. Neural Architecture Search (NAS) With Context-free Grammar Search Spaces:

  3. Easy Parallelization and Resumption of Runs:

    • NePS makes it easy to parallelize optimization tasks, both on individual computers and in distributed computing environments. It also lets users conveniently resume these optimization tasks later, ensuring a seamless and efficient workflow for long-running experiments.
  4. Seamless User Code Integration:

    • NePS's modular design ensures flexibility and extensibility. Integrate NePS effortlessly into existing machine learning workflows.
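As an illustration of point 3, workers coordinate through the files in root_directory, so parallelizing amounts to launching the same script several times. The sketch below assumes a hypothetical script optimize.py that calls neps.run with a shared root_directory (the script name and path are placeholders, not part of NePS):

```shell
# Launch three workers on one machine; each runs the same optimize.py,
# which calls neps.run(..., root_directory="results/my_run").
python optimize.py &
python optimize.py &
python optimize.py &
wait

# Re-running the script later with the same root_directory picks the
# optimization back up from the stored state.
python optimize.py
```

The same pattern extends to distributed setups: start one worker per node, all pointing at the same root_directory on a shared filesystem.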

Getting Started

1. Installation

Using pip:

pip install neural-pipeline-search

Note: As indicated by the v0.x.x version number, NePS is early-stage code and APIs might change in the future.

2. Basic Usage

Using neps always follows the same pattern:

  1. Define a run_pipeline function capable of evaluating different architectural and/or hyperparameter configurations for your problem.
  2. Define a search space named pipeline_space of those parameters, e.g. via a dictionary.
  3. Call neps.run to optimize run_pipeline over pipeline_space.

In code, the usage pattern can look like this:

import neps
import logging


# 1. Define a function that accepts hyperparameters and computes the validation error
def run_pipeline(
    hyperparameter_a: float, hyperparameter_b: int, architecture_parameter: str
) -> dict:
    # Create your model
    model = MyModel(architecture_parameter)

    # Train and evaluate the model with your training pipeline
    validation_error, training_error = train_and_eval(
        model, hyperparameter_a, hyperparameter_b
    )

    return {  # dict or float(validation error)
        "loss": validation_error,
        "info_dict": {
            "training_error": training_error
            # + Other metrics
        },
    }


# 2. Define a search space of parameters; use the same names for the parameters as in run_pipeline
pipeline_space = dict(
    hyperparameter_b=neps.IntegerParameter(
        lower=1, upper=42, is_fidelity=True
    ),  # Mark 'is_fidelity' as true for a multi-fidelity approach.
    hyperparameter_a=neps.FloatParameter(
        lower=0.001, upper=0.1, log=True
    ),  # If True, the search space is sampled in log space.
    architecture_parameter=neps.CategoricalParameter(
        ["option_a", "option_b", "option_c"]
    ),
)

if __name__ == "__main__":
    # 3. Run the NePS optimization
    logging.basicConfig(level=logging.INFO)
    neps.run(
        run_pipeline=run_pipeline,
        pipeline_space=pipeline_space,
        root_directory="path/to/save/results",  # Replace with the actual path.
        max_evaluations_total=100,
        searcher="hyperband",  # Optional; specifies the search strategy,
        # otherwise NePS decides based on your data.
    )
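The log=True flag in the example marks hyperparameter_a for log-scale sampling. As a rough illustration of what log-uniform sampling means in general (a stdlib sketch, not NePS's internal implementation):

```python
import math
import random


def sample_log_uniform(lower: float, upper: float, rng: random.Random) -> float:
    """Sample uniformly in log space, then map back with exp.

    Small values (e.g. learning rates near 0.001) are drawn as often as
    large ones (near 0.1), which plain uniform sampling would not do.
    """
    return math.exp(rng.uniform(math.log(lower), math.log(upper)))


rng = random.Random(0)
samples = [sample_log_uniform(0.001, 0.1, rng) for _ in range(10_000)]

# Roughly half the samples fall below the geometric mean of the bounds
# (0.01), whereas under plain uniform sampling almost all would land above it.
below = sum(s < 0.01 for s in samples)
```

Log-scale sampling is the natural choice for ranges spanning several orders of magnitude, such as learning rates, where the interesting variation is multiplicative rather than additive.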

Examples

Discover how NePS works through the practical examples in our documentation.

Documentation

For more details and features, please have a look at our documentation.

Analysing runs

See our documentation on analysing runs.

Contributing

Please see the documentation for contributors.

Citations

Please consider citing us if you use our tool!

Refer to our documentation on citations.

Alternatives

NePS does not cover your use-case? Have a look at some alternatives.

