
Hebel 0.03-dev

GPU-Accelerated Deep Learning Library in Python

Latest Version: 0.02.1


Hebel is a library for deep learning with neural networks in Python using GPU acceleration with CUDA through PyCUDA. It implements the most important types of neural network models and offers a variety of different activation functions and training methods such as momentum, Nesterov momentum, dropout, and early stopping.
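The last of those training methods, early stopping, halts training when the validation error stops improving. A minimal sketch in plain Python (this is not Hebel's trainer API; all names here are made up for illustration):

```python
def train_with_early_stopping(step, validate, patience=5, max_epochs=100):
    """Stop once validation error fails to improve for `patience` epochs."""
    best_error, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        step()              # one epoch of gradient updates
        error = validate()  # error on held-out validation data
        if error < best_error:
            best_error, best_epoch = error, epoch
        elif epoch - best_epoch >= patience:
            break           # no improvement for `patience` epochs
    return best_error

# Toy run: validation error improves for a few epochs, then degrades
errors = iter([0.9, 0.5, 0.4, 0.41, 0.42, 0.43, 0.44, 0.45, 0.46, 0.47])
result = train_with_early_stopping(lambda: None, lambda: next(errors))
```

The loop returns the best validation error seen (0.4 in the toy run) rather than the final one, which is the point of the technique.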


Right now, Hebel implements feed-forward neural networks for classification and regression on one or multiple tasks. Other models, such as autoencoders, convolutional neural nets, and restricted Boltzmann machines, are planned for the future.

Hebel implements dropout as well as L1 and L2 weight decay for regularization.
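Both regularizers are conceptually simple. A minimal NumPy sketch of inverted dropout and L1/L2 weight decay (this is not Hebel's GPU implementation; the function names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop=0.5):
    """Randomly zero units with probability p_drop (inverted dropout)."""
    mask = rng.random(activations.shape) >= p_drop
    # Rescale kept units so the expected activation is unchanged at test time
    return activations * mask / (1.0 - p_drop)

def decay_gradient(grad, weights, l1=0.0, l2=0.0):
    """Add L1 and L2 penalty terms to a weight gradient."""
    return grad + l1 * np.sign(weights) + l2 * weights

a = dropout(np.ones((4, 3)), p_drop=0.5)
g = decay_gradient(np.zeros((4, 3)), np.ones((4, 3)), l1=0.1, l2=0.01)
```

With inverted dropout, surviving activations are scaled by 1/(1 - p_drop) during training, so no rescaling is needed at test time.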


Hebel implements stochastic gradient descent (SGD) with regular and Nesterov momentum.
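The two momentum variants differ only in where the gradient is evaluated: classical momentum uses the current parameters, while Nesterov momentum uses a look-ahead point. A sketch on a 1-D quadratic in plain NumPy-style Python (not Hebel's API; the hyperparameters are arbitrary):

```python
def sgd_momentum(grad, w, v, lr=0.1, mu=0.9):
    """Classical momentum: gradient taken at the current parameters w."""
    v = mu * v - lr * grad(w)
    return w + v, v

def sgd_nesterov(grad, w, v, lr=0.1, mu=0.9):
    """Nesterov momentum: gradient taken at the look-ahead point w + mu*v."""
    v = mu * v - lr * grad(w + mu * v)
    return w + v, v

grad = lambda w: 2.0 * w  # gradient of f(w) = w**2, minimum at w = 0
w, v = 5.0, 0.0
for _ in range(100):
    w, v = sgd_nesterov(grad, w, v)
# w is now very close to the minimum at 0
```

Evaluating the gradient at the look-ahead point gives the update a partial correction before it is applied, which typically damps the oscillations of classical momentum.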


Currently, Hebel runs on Linux and Windows; it will probably run on Mac OS X as well, but this is untested.


Dependencies

  • PyCUDA
  • numpy
  • PyYAML
  • skdata (only for the MNIST example)

Getting started

Study the yaml configuration files in examples/ and run

python train_model.py examples/mnist_neural_net_shallow.yml

The script will create a directory in examples/mnist where the models and logs are saved.

Read the Getting started guide for more information.

Documentation (coming slowly)


Maintained by Hannes Bretschneider. If you are using Hebel, please let me know whether you find it useful, and file a GitHub issue if you find any bugs or have feature requests.

What’s with the name?

Hebel is the German word for lever, one of the oldest tools that humans use. As Archimedes put it: “Give me a lever long enough and a fulcrum on which to place it, and I shall move the world.”
