# NLP Models

A repository for building transformer-based NLP models.
## Installation

### Install from PyPI

```
pip install nlp-models
```

### Install from source

```
git clone git@github.com:minggnim/nlp-models.git
cd nlp-models
pip install -r requirements.txt
```
## Quantized Llama2 models on consumer CPUs

### Run the chat application on CPU

- Streamlit UI

  ```
  cd apps
  streamlit run chat.py
  ```

- Command line

  ```
  llm_app chat -s 'hi there'
  ```

### Run the Q&A application on CPU

- Streamlit UI

  ```
  cd apps
  streamlit run qa.py
  ```
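Both front ends drive a conversational loop against the quantized model. Below is a minimal sketch of such a loop; `generate_reply` is a hypothetical placeholder for the model-inference call, since the repository's actual API is not shown here.

```python
# Sketch of a chat loop like the one the apps above might implement.
# `generate_reply` is a hypothetical stand-in for the quantized-model call.
def generate_reply(history: list[tuple[str, str]], user_msg: str) -> str:
    # Placeholder: echo-style reply. A real implementation would run the
    # quantized Llama2 model over the conversation history.
    return f"You said: {user_msg}"

def chat_turn(history: list[tuple[str, str]], user_msg: str) -> str:
    """Run one exchange and record it in the conversation history."""
    reply = generate_reply(history, user_msg)
    history.append((user_msg, reply))
    return reply

if __name__ == "__main__":
    history: list[tuple[str, str]] = []
    print(chat_turn(history, "hi there"))
```

Keeping the history as a list of (user, assistant) pairs makes it easy to feed prior turns back into the prompt on the next call.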
## Models

- `bert_classifier`: a wrapper package around BERT-based classification models
- `multi_task_model`: an implementation of a multi-task model built on encoder models
- GPT-2
- Falcon 7B
- Quantized Llama2 models
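To illustrate what quantization buys on a consumer CPU: the idea is to store weights as small integers plus a scale factor instead of 32-bit floats. The sketch below shows simple symmetric int8 quantization; the repository most likely uses a GGML/GGUF-style scheme, so this is an illustrative simplification, not its actual format.

```python
# Illustrative symmetric int8 quantization: floats are mapped to int8
# values with a single per-tensor scale, cutting memory roughly 4x
# versus float32 at the cost of some precision.

def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Map float weights to int8 values with one per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize_int8(q: list[int], scale: float) -> list[float]:
    """Recover approximate float weights from the int8 values."""
    return [qi * scale for qi in q]

if __name__ == "__main__":
    weights = [0.5, -1.27, 0.03, 1.0]
    q, scale = quantize_int8(weights)
    print(q, dequantize_int8(q, scale))
```

Per-tensor scaling is the simplest variant; practical schemes quantize in small blocks so one outlier weight does not inflate the scale for the whole tensor.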
## Download files

- Source distribution: `nlp-models-4.2.0.tar.gz` (15.6 kB)
- Built distribution: `nlp_models-4.2.0-py3-none-any.whl` (19.1 kB)
### Hashes for `nlp_models-4.2.0-py3-none-any.whl`

| Algorithm | Hash digest |
|---|---|
| SHA256 | 8067cd1b9cc716230c1ee52916e9761308c12b98bb846bce78e71f533882ad51 |
| MD5 | 99278a4094470f157345bfdc5e2ad44b |
| BLAKE2b-256 | c37d8fc6e6e57df7a058360addfcb5d165172d86bd8a4620a0b86211214502a3 |