
Easily train your own text-generating neural network of any size and complexity

Project description

Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code, or quickly train on a text using a pretrained model.
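
For example, fine-tuning the included pretrained model on a text file takes only a few lines. The sketch below is based on textgenrnn's documented API; the file name hacker_news_2000.txt is just a placeholder for your own dataset.

    from textgenrnn import textgenrnn

    # Instantiate with the bundled pretrained weights.
    textgen = textgenrnn()

    # Fine-tune the pretrained model on your own text file (placeholder path).
    textgen.train_from_file('hacker_news_2000.txt', num_epochs=1)

    # Print a few generated samples.
    textgen.generate(3)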

  • A modern neural network architecture which utilizes new techniques such as attention-weighting and skip-embedding to accelerate training and improve model quality.
  • Able to train on and generate text at either the character-level or word-level.
  • Able to configure RNN size, the number of RNN layers, and whether to use bidirectional RNNs (see the configuration sketch after this list).
  • Able to train on any generic input text file, including large files.
  • Able to train models on a GPU and then use them with a CPU.
  • Able to utilize a powerful CuDNN implementation of RNNs when trained on the GPU, which massively speeds up training time compared to standard LSTM implementations.
  • Able to train the model using contextual labels, allowing it to learn faster and produce better results in some cases.
  • Able to generate text interactively for customized stories.
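
The configuration, GPU-to-CPU portability, and interactive-generation points above can be combined roughly as follows. This is a hedged sketch based on textgenrnn's documented keyword arguments; my_model, my_corpus.txt, and the hyperparameter values are placeholders, not defaults.

    from textgenrnn import textgenrnn

    # 'my_model' becomes the prefix for the saved weights/vocab/config files.
    textgen = textgenrnn(name='my_model')

    # Train a new word-level model from scratch; rnn_size, rnn_layers, and
    # rnn_bidirectional control the architecture described above.
    textgen.train_from_file(
        'my_corpus.txt',
        new_model=True,
        word_level=True,
        rnn_size=128,
        rnn_layers=3,
        rnn_bidirectional=True,
        num_epochs=10)

    # A model trained on a GPU can later be reloaded on a CPU-only machine
    # from the files written during training.
    textgen_reloaded = textgenrnn(
        weights_path='my_model_weights.hdf5',
        vocab_path='my_model_vocab.json',
        config_path='my_model_config.json')

    # Interactive generation lets you steer the output token by token.
    textgen_reloaded.generate(interactive=True, top_n=5)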

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

textgenrnn-2.0.0.tar.gz (1.7 MB)
