
TensorFlow-compatible Transformer layers and models.


# maximal

See the [Official Documentation site](https://ivanbongiorni.github.io/maximal/)

Current version: 1.2.1

A TensorFlow-compatible Python library that provides models and layers to implement custom Transformer neural networks.

Built on TensorFlow 2.

<a href="url"><img src="https://github.com/IvanBongiorni/maximal/blob/main/utils/maximal_stablediffusion_00.png" align="center"></a> <br> Logo generated by Stable Diffusion 2.1 <br>

# Installation

Installation is straightforward:

`pip install maximal`

# How to use it?

maximal is commonly imported as:

```python
import maximal as ml
from maximal.layers import TransformerLayer, GPTLayer
```

and its layers can be used in a tf.keras model like any other Keras layer.
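As a quick illustration, here is a minimal sketch of a small autoregressive model built with maximal layers and the Keras Functional API. The constructor arguments shown (depth, heads, and the embedding parameters) are assumptions made for this example and may differ from the library's actual signatures; see the Official Documentation for the exact API.

```python
import tensorflow as tf
from maximal.layers import PositionalEmbedding, GPTLayer

# Hypothetical hyperparameters; names and values are illustrative only.
VOCAB_SIZE = 10_000   # token vocabulary size
SEQ_LEN = 128         # fixed input sequence length
DEPTH = 256           # model/embedding dimension (assumed argument name)
HEADS = 4             # number of attention heads (assumed argument name)

# A small GPT-style language model via the Keras Functional API.
inputs = tf.keras.Input(shape=(SEQ_LEN,), dtype=tf.int32)
x = PositionalEmbedding(SEQ_LEN, VOCAB_SIZE, DEPTH)(inputs)   # token + position embeddings
x = GPTLayer(depth=DEPTH, heads=HEADS)(x)                     # causal Transformer block
outputs = tf.keras.layers.Dense(VOCAB_SIZE)(x)                # next-token logits

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```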

# Documentation

An [Official Website](https://ivanbongiorni.github.io/maximal/) is now available with documentation and tutorials.

The PyPI page is available [here](https://pypi.org/project/maximal/1.0/).

# Elements

In layers.py:

  • SelfAttention: keras.Layer, computes Scaled Dot-Product Attention (a reference sketch of this computation follows the list below).

  • MultiHeadSelfAttention: keras.Layer, a concatenation of SelfAttention layers, projected back to the original input shape through a linear transformation.

  • PositionalEmbedding: keras.Layer, implements the double Embedding layers used in the Transformer literature, for tokens and positions. Positional encoding is learned through a tf.keras.layers.Embedding() layer, instead of the deterministic positional encoding of the original paper.

  • ImageEmbedding: keras.Layer, implements the double Embedding layers used as input to Vision Transformers, for image patches and positions.

  • TransformerLayer: keras.Layer, a single Transformer Encoder block. It can be used inside any Sequential() model in Keras.

  • GPTLayer: keras.Layer, a GPT block. Similar to TransformerLayer, but with a causal Attention mechanism. It can be used inside any Sequential() model in Keras.
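For reference, the computation behind SelfAttention (and, with a causal mask, GPTLayer) is the standard Scaled Dot-Product Attention, softmax(QKᵀ / √d_k)·V. The sketch below is a plain-TensorFlow illustration of that formula, not maximal's own implementation:

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, causal_mask=None):
    """Standard Scaled Dot-Product Attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.math.sqrt(d_k)   # (..., seq_q, seq_k)
    if causal_mask is not None:
        scores += (1.0 - causal_mask) * -1e9   # hide future positions, as in causal attention
    weights = tf.nn.softmax(scores, axis=-1)   # attention weights sum to 1 over the keys
    return tf.matmul(weights, v)               # (..., seq_q, d_v)

# Toy usage: a batch of 2 sequences, length 5, dimension 8.
q = k = v = tf.random.normal((2, 5, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 5, 8)
```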

In schedules.py:

  • OriginalTransformerSchedule: keras.Layer, implements the learning rate schedule of the original Transformer paper. It is taken from this [official TensorFlow tutorial](https://www.tensorflow.org/text/tutorials/transformer).
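The schedule from the original paper is lrate = d_model^-0.5 · min(step^-0.5, step · warmup_steps^-1.5). As a sketch, the snippet below implements that formula as a generic tf.keras LearningRateSchedule in the style of the linked tutorial; it is not necessarily how maximal's OriginalTransformerSchedule is structured internally:

```python
import tensorflow as tf

class WarmupSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):
    """lrate = d_model**-0.5 * min(step**-0.5, step * warmup_steps**-1.5)"""
    def __init__(self, d_model, warmup_steps=4000):
        super().__init__()
        self.d_model = tf.cast(d_model, tf.float32)
        self.warmup_steps = warmup_steps

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        arg1 = tf.math.rsqrt(step)                  # inverse-sqrt decay after warmup
        arg2 = step * (self.warmup_steps ** -1.5)   # linear warmup
        return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)

# Usage with the Adam hyperparameters from the original paper:
learning_rate = WarmupSchedule(d_model=256)
optimizer = tf.keras.optimizers.Adam(learning_rate, beta_1=0.9, beta_2=0.98, epsilon=1e-9)
```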

# Requirements

```
h5py
numpy
tensorflow >= 2.0
```

# Author

Ivan Bongiorni. [LinkedIn](https://www.linkedin.com/in/ivan-bongiorni-b8a583164/)

# License

2020 Ivan Bongiorni

This repository is licensed under the MIT license. See LICENCE.txt for further details.
