Fast and Customizable Tokenizers

Project description

Tokenizers

A fast and easy-to-use implementation of today's most used tokenizers.

This API is still being stabilized. Breaking changes may be introduced frequently in the coming days/weeks, so use at your own risk.

Installation

With pip:

pip install tokenizers

From sources:

To use this method, you need the Rust nightly toolchain installed (the commands below pin nightly-2019-11-01).

# Install with:
curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain nightly-2019-11-01 -y
export PATH="$HOME/.cargo/bin:$PATH"

# Or select the right toolchain:
rustup default nightly-2019-11-01

Once Rust is installed and the right toolchain is active, you can build and install the bindings:

git clone https://github.com/huggingface/tokenizers
cd tokenizers/bindings/python

# Create a virtual env (you can use yours as well)
python -m venv .env
source .env/bin/activate

# Install `tokenizers` in the current virtual env
pip install maturin
maturin develop --release
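
`maturin develop` compiles the Rust extension and installs it into the active virtual environment; a plain import is a quick sanity check that the build worked:

# A successful import means the native extension compiled and loaded
import tokenizers
print(tokenizers.__file__)  # should point inside the virtual env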

Usage

Use a pre-trained tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders

# Load a BPE Model
vocab = "./path/to/vocab.json"
merges = "./path/to/merges.txt"
bpe = models.BPE.from_files(vocab, merges)

# Initialize a tokenizer
tokenizer = Tokenizer(bpe)

# Customize pre-tokenization and decoding
tokenizer.with_pre_tokenizer(pre_tokenizers.ByteLevel.new(True))
tokenizer.with_decoder(decoders.ByteLevel.new())

# And then encode:
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)

# Or tokenize multiple sentences at once:
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)
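
The exact attributes of the returned Encoding objects have changed across releases. Assuming an Encoding exposes `ids` and `tokens` the way later releases do (an assumption, not guaranteed for this early version), inspecting a result looks roughly like this:

# Sketch: inspecting an Encoding, assuming `.ids` and `.tokens` attributes
# (names taken from later releases of the library; they may differ here)
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded.ids)     # integer token ids
print(encoded.tokens)  # the corresponding string tokens

# With the ByteLevel decoder configured above, ids can be mapped back to
# text; `decode` is likewise assumed to mirror later releases of the API
print(tokenizer.decode(encoded.ids))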

Train a new tokenizer

from tokenizers import Tokenizer, models, pre_tokenizers, decoders, trainers

# Initialize a tokenizer
tokenizer = Tokenizer(models.BPE.empty())

# Customize pre-tokenization and decoding
tokenizer.with_pre_tokenizer(pre_tokenizers.ByteLevel.new(True))
tokenizer.with_decoder(decoders.ByteLevel.new())

# And then train
trainer = trainers.BpeTrainer.new(vocab_size=20000, min_frequency=2)
tokenizer.train(trainer, [
	"./path/to/dataset/1.txt",
	"./path/to/dataset/2.txt",
	"./path/to/dataset/3.txt"
])

# Now we can encode
encoded = tokenizer.encode("I can feel the magic, can you?")
print(encoded)
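
The freshly trained tokenizer behaves like the pre-trained one above, so batch encoding works the same way:

# The trained tokenizer supports the same calls as a pre-trained one
encoded = tokenizer.encode_batch([
	"I can feel the magic, can you?",
	"The quick brown fox jumps over the lazy dog"
])
print(encoded)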

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tokenizers-0.0.9.tar.gz (34.9 kB): Source

Built Distributions

tokenizers-0.0.9-cp38-cp38-win_amd64.whl (777.7 kB): CPython 3.8, Windows x86-64
tokenizers-0.0.9-cp38-cp38-manylinux1_x86_64.whl (6.2 MB): CPython 3.8, manylinux1 x86-64
tokenizers-0.0.9-cp38-cp38-macosx_10_13_x86_64.whl (853.2 kB): CPython 3.8, macOS 10.13+ x86-64
tokenizers-0.0.9-cp37-cp37m-win_amd64.whl (777.2 kB): CPython 3.7m, Windows x86-64
tokenizers-0.0.9-cp37-cp37m-manylinux1_x86_64.whl (4.6 MB): CPython 3.7m, manylinux1 x86-64
tokenizers-0.0.9-cp37-cp37m-macosx_10_13_x86_64.whl (853.5 kB): CPython 3.7m, macOS 10.13+ x86-64
tokenizers-0.0.9-cp36-cp36m-win_amd64.whl (777.6 kB): CPython 3.6m, Windows x86-64
tokenizers-0.0.9-cp36-cp36m-manylinux1_x86_64.whl (3.1 MB): CPython 3.6m, manylinux1 x86-64
tokenizers-0.0.9-cp36-cp36m-macosx_10_13_x86_64.whl (853.8 kB): CPython 3.6m, macOS 10.13+ x86-64
tokenizers-0.0.9-cp35-cp35m-win_amd64.whl (777.6 kB): CPython 3.5m, Windows x86-64
tokenizers-0.0.9-cp35-cp35m-manylinux1_x86_64.whl (1.5 MB): CPython 3.5m, manylinux1 x86-64
tokenizers-0.0.9-cp35-cp35m-macosx_10_13_x86_64.whl (853.7 kB): CPython 3.5m, macOS 10.13+ x86-64
