
A pure-Python neural network library

Project description

MyNN is a simple, NumPy-centric neural network library built on top of MyGrad. It provides convenient wrappers for functionality such as

  • Neural network layers (e.g. convolutional, dense, batch normalization, dropout)

  • Weight initialization functions (e.g. Glorot, He, uniform, normal)

  • Neural network activation functions (e.g. elu, glu, tanh, sigmoid)

  • Common loss functions (e.g. cross-entropy, KL-divergence, Huber loss)

  • Optimization algorithms (e.g. sgd, adadelta, adam, rmsprop)
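To make these categories concrete, here is a minimal, pure-NumPy sketch of the kinds of building blocks listed above: a dense layer with He initialization, a relu activation, a softmax cross-entropy loss, and a plain SGD update. This is an illustration of the concepts only, not MyNN's actual API; all function names here are hypothetical.

```python
# Illustrative sketch of the building blocks MyNN wraps.
# NOTE: these are hand-rolled NumPy stand-ins, not MyNN's API.
import numpy as np

rng = np.random.default_rng(0)

def he_init(fan_in, fan_out):
    # He-normal initialization: std = sqrt(2 / fan_in)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def relu(x):
    # Elementwise rectified linear activation
    return np.maximum(0.0, x)

def softmax_cross_entropy(scores, labels):
    # Numerically stable log-softmax, then mean negative log-likelihood
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# One dense layer: scores = relu(x @ W + b)
W = he_init(4, 3)
b = np.zeros(3)

x = rng.normal(size=(8, 4))          # batch of 8 samples, 4 features
labels = rng.integers(0, 3, size=8)  # 3 classes

scores = relu(x @ W + b)
loss = softmax_cross_entropy(scores, labels)

# A plain SGD step, given gradients dW and db (zeros here, since this
# sketch does not compute gradients -- that is MyGrad's job in MyNN):
lr = 0.1
dW, db = np.zeros_like(W), np.zeros_like(b)
W -= lr * dW
b -= lr * db
```

In MyNN itself, the backward pass comes for free: the layers operate on MyGrad tensors, so calling backpropagation on the loss populates the gradients that the optimizer consumes.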

MyNN comes complete with several examples to ramp you up to being a fluent user of the library. It was written as an extension to MyGrad to support rapid prototyping of neural networks with minimal dependencies, to provide a clean and well-documented codebase, and to serve as a learning tool.


Download files

Download the file for your platform.

Source Distribution

mynn-0.9.4.tar.gz (31.6 kB)

Built Distributions

mynn-0.9.4-py3.9.egg (53.4 kB)

mynn-0.9.4-py3-none-any.whl (24.1 kB)
