
Kattention

This package implements different attention mechanisms as tf.keras layers.

Setup

pip install kattention

Usage

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense, Softmax
from kattention.layers import Transformer

SEQUENCE_LENGTH = 4
EMBEDDING_SIZE = 300
CLASSES_TO_PREDICT = 5
ATT_HEADS = 2

# Stack two Transformer layers, then flatten and project to class scores.
model = Sequential()
model.add(Transformer(attention_heads=ATT_HEADS, input_shape=(SEQUENCE_LENGTH, EMBEDDING_SIZE)))
model.add(Transformer(attention_heads=ATT_HEADS))
model.add(Flatten())
model.add(Dense(CLASSES_TO_PREDICT))
model.add(Softmax())

# model.summary() prints the summary itself and returns None,
# so it should not be wrapped in print().
model.summary()
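
As a quick sanity check, the model built above can be compiled and run on random input matching the declared shapes. The optimizer, loss, and batch size below are illustrative choices for this sketch, not requirements of the package:

import numpy as np

# Illustrative setup only: any optimizer/loss suited to the task would work.
model.compile(optimizer="adam", loss="categorical_crossentropy")

# A random batch of 8 sequences, shaped (batch, SEQUENCE_LENGTH, EMBEDDING_SIZE).
dummy_batch = np.random.rand(8, SEQUENCE_LENGTH, EMBEDDING_SIZE).astype("float32")

predictions = model.predict(dummy_batch)
print(predictions.shape)  # expected: (8, CLASSES_TO_PREDICT) == (8, 5)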

