Building attention mechanisms and Transformer models from scratch (alias: ATF).
Attention mechanisms and Transformers
- The goal of this repository is to host basic architecture and model training code for different attention mechanisms and Transformer architectures.
- At the moment, I'm more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I'll only train the models on small datasets.
Installation
- Using pip to install from PyPI

```bash
pip install Attention-and-Transformers
```

- Using pip to install the latest version from GitHub

```bash
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
```

- Cloning the repository and installing locally

```bash
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install
```
Test Installation
```bash
python load_test.py
```
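Once installed, the layers should behave like ordinary Keras layers. The import path, class name, and constructor arguments in the smoke test below are assumptions for illustration, not the confirmed public API; `load_test.py` remains the authoritative check:

```python
import tensorflow as tf

# NOTE: module path, class name, and arguments are assumed for illustration;
# consult the repository for the actual public API.
from Attention_and_Transformers.ViT import MultiHeadSelfAttention

layer = MultiHeadSelfAttention(num_heads=2, embedding_dim=64)
tokens = tf.random.normal((1, 196, 64))  # (batch, num_patches, embedding_dim)
print(layer(tokens).shape)               # expected: (1, 196, 64)
```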
Attention Mechanisms
| No. | Mechanism | Paper |
| --- | --- | --- |
| 1 | Multi-head Self Attention | Attention Is All You Need |
| 2 | Multi-head Self Attention 2D | MobileViT v1 |
| 3 | Separable Self Attention | MobileViT v2 |
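As a reference for what the first mechanism computes, here is a minimal, self-contained TensorFlow/Keras sketch of scaled dot-product multi-head self-attention from "Attention Is All You Need". It illustrates the mechanism itself, not this package's implementation; names and defaults are placeholders:

```python
import tensorflow as tf

class MultiHeadSelfAttention(tf.keras.layers.Layer):
    """Scaled dot-product multi-head self-attention (Vaswani et al., 2017)."""

    def __init__(self, embedding_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embedding_dim % num_heads == 0, "embedding_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embedding_dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = tf.keras.layers.Dense(3 * embedding_dim, use_bias=False)
        self.out_proj = tf.keras.layers.Dense(embedding_dim)

    def call(self, x):
        # x: (batch, num_tokens, embedding_dim)
        batch, tokens = tf.shape(x)[0], tf.shape(x)[1]
        # Project once, then split into per-head queries, keys, and values.
        qkv = tf.reshape(self.qkv(x), (batch, tokens, 3, self.num_heads, self.head_dim))
        q, k, v = tf.unstack(tf.transpose(qkv, (2, 0, 3, 1, 4)))  # each: (batch, heads, tokens, head_dim)
        # Attention weights: softmax(QK^T / sqrt(head_dim)) over the key axis.
        weights = tf.nn.softmax(tf.matmul(q, k, transpose_b=True) * self.scale, axis=-1)
        out = tf.matmul(weights, v)  # (batch, heads, tokens, head_dim)
        # Merge the heads back into a single embedding per token.
        out = tf.reshape(tf.transpose(out, (0, 2, 1, 3)), (batch, tokens, self.num_heads * self.head_dim))
        return self.out_proj(out)


# Quick shape check: 2 sequences of 16 tokens, 64-dim embeddings.
layer = MultiHeadSelfAttention(embedding_dim=64, num_heads=4)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)
```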
Transformer Models
| No. | Models | Paper |
| --- | --- | --- |
| 1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-V1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-V2 (under development) | Separable Self-attention for Mobile Vision Transformers |
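MobileViT-V2's distinguishing component is separable self-attention, which replaces the quadratic token-to-token attention matrix with learned per-token context scores, so the cost grows linearly with the number of tokens. Below is an illustrative TensorFlow/Keras sketch of that idea as described in the paper, not this repository's implementation:

```python
import tensorflow as tf

class SeparableSelfAttention(tf.keras.layers.Layer):
    """Linear-complexity separable self-attention (MobileViT-v2 paper)."""

    def __init__(self, embedding_dim=64, **kwargs):
        super().__init__(**kwargs)
        self.embedding_dim = embedding_dim
        # One projection yields the context scores (width 1), keys, and values.
        self.qkv = tf.keras.layers.Dense(1 + 2 * embedding_dim, use_bias=False)
        self.out_proj = tf.keras.layers.Dense(embedding_dim)

    def call(self, x):
        # x: (batch, num_tokens, embedding_dim)
        scores, key, value = tf.split(
            self.qkv(x), [1, self.embedding_dim, self.embedding_dim], axis=-1
        )
        # Context scores: one scalar per token, softmaxed over the token axis.
        context_scores = tf.nn.softmax(scores, axis=1)                         # (batch, tokens, 1)
        # Global context vector: score-weighted sum of the keys.
        context = tf.reduce_sum(key * context_scores, axis=1, keepdims=True)  # (batch, 1, dim)
        # Broadcast the shared context over every (ReLU-gated) value token.
        return self.out_proj(tf.nn.relu(value) * context)                     # (batch, tokens, dim)


# Quick shape check.
layer = SeparableSelfAttention(embedding_dim=64)
print(layer(tf.random.normal((2, 16, 64))).shape)  # (2, 16, 64)
```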