
An implementation of the Adan optimization algorithm for optax.

Project description

optax-adan

An implementation of the Adan optimizer for optax, based on the paper Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models.
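
For reference, a sketch of the Adan update rule as given in the paper (notation follows the paper; the default hyperparameter values exposed by optax-adan are not stated in this README and may differ):

m_k = (1 - \beta_1) m_{k-1} + \beta_1 g_k
v_k = (1 - \beta_2) v_{k-1} + \beta_2 (g_k - g_{k-1})
n_k = (1 - \beta_3) n_{k-1} + \beta_3 \left[ g_k + (1 - \beta_2)(g_k - g_{k-1}) \right]^2
\theta_{k+1} = (1 + \lambda \eta)^{-1} \left[ \theta_k - \frac{\eta}{\sqrt{n_k} + \epsilon} \odot \left( m_k + (1 - \beta_2) v_k \right) \right]

Here g_k is the gradient at step k, \eta the learning rate, \lambda the weight decay, and \beta_1, \beta_2, \beta_3 the momentum coefficients for the three moment estimates.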

A Colab notebook with a usage example can be found here.

How to use:

Install the package:

python3 -m pip install optax-adan

Import the optimizer:

from optax_adan import adan

Use it as you would use any other optimizer from optax:

import optax

# init
optimizer = adan(learning_rate=0.01)
optimizer_state = optimizer.init(params)
# step
grads = grad_func(params)
updates, optimizer_state = optimizer.update(grads, optimizer_state, params)
params = optax.apply_updates(params, updates)
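
For a self-contained illustration, here is a minimal end-to-end sketch that minimizes a toy quadratic with jax.grad. The loss function, parameter shape, step count, and learning rate are illustrative assumptions; only the adan(learning_rate=...) entry point comes from this README.

import jax
import jax.numpy as jnp
import optax
from optax_adan import adan

# Toy quadratic loss with minimum at 3.0 in every coordinate (illustrative).
def loss_fn(params):
    return jnp.sum((params - 3.0) ** 2)

params = jnp.zeros(4)
optimizer = adan(learning_rate=0.01)
optimizer_state = optimizer.init(params)

for step in range(100):
    grads = jax.grad(loss_fn)(params)
    updates, optimizer_state = optimizer.update(grads, optimizer_state, params)
    params = optax.apply_updates(params, updates)

print(params)  # each coordinate should move toward 3.0

Because adan returns a standard optax GradientTransformation, it also composes with the rest of the optax toolbox (e.g. optax.chain or learning-rate schedules) in the usual way.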

Download files


Source Distribution

optax-adan-0.1.5.tar.gz (7.7 kB)

