Fast Sentence Transformers
This repository contains code to run sentence-transformers much faster using tools like quantization and ONNX. Just run your model much faster while using less memory. There is not much to it!
Install
pip install fast-sentence-transformers
Or, for GPU support:
pip install fast-sentence-transformers[gpu]
Quickstart
from fast_sentence_transformers import FastSentenceTransformer as SentenceTransformer
# use any sentence-transformer
encoder = SentenceTransformer("all-MiniLM-L6-v2", device="cpu", quantize=True)
encoder.encode("Hello hello, hey, hello hello")
encoder.encode(["Life is too short to eat bad food!"] * 2)
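The embeddings can then be compared with the usual similarity math. A minimal sketch of cosine similarity, assuming `encode` returns NumPy arrays as in sentence-transformers (the toy vectors below stand in for real `encoder.encode(...)` output):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy vectors standing in for encoder.encode(...) output.
emb_a = np.array([1.0, 0.0, 1.0])
emb_b = np.array([1.0, 1.0, 0.0])
print(cosine_similarity(emb_a, emb_b))  # 0.5
```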
Benchmark
Indicative benchmark of CPU usage with the smallest and largest models from sentence-transformers. Note: ONNX does not yet support quantization on GPU.
model | Type | default | ONNX | ONNX+quantized | ONNX+GPU
---|---|---|---|---|---
paraphrase-albert-small-v2 | memory | 1x | 1x | 1x | 1x
paraphrase-albert-small-v2 | speed | 1x | 2x | 5x | 20x
paraphrase-multilingual-mpnet-base-v2 | memory | 1x | 1x | 4x | 4x
paraphrase-multilingual-mpnet-base-v2 | speed | 1x | 2x | 5x | 20x
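The roughly 4x memory reduction in the quantized column comes from storing float32 weights as int8 plus a scale factor. A minimal sketch of the idea (symmetric per-tensor quantization; an illustration, not the library's actual implementation):

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8 plus a scale: 4x smaller than float32."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.01], dtype=np.float32)
q, s = quantize_int8(w)
# q holds int8 values; dequantize(q, s) approximates w within one scale step.
```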
Shout-Out
This package heavily leans on sentence-transformers and txtai.