
Toolkit for fine-tuning and evaluating transformer-based language models

Project description

FARM

(Framework for Adapting Representation Models)


What is it?

FARM makes cutting-edge transfer learning for NLP simple. Building upon transformers, FARM is a home for all species of pretrained language models (e.g. BERT) that can be adapted to different domain languages or downstream tasks. With FARM you can easily create SOTA NLP models for tasks like document classification, NER or question answering. The standardized interfaces for language models and prediction heads allow flexible extension by researchers and easy application for practitioners. Additional experiment tracking and visualizations support you along the way to adapting a SOTA model to your own NLP problem and getting to a fast proof of concept.

Core features

  • Easy adaptation of language models (e.g. BERT) to your own use case

  • Fast integration of custom datasets via Processor class

  • Modular design of language model and prediction heads

  • Switch between heads or just combine them for multitask learning

  • Smooth upgrading to new language models

  • Powerful experiment tracking & execution

  • Simple deployment and visualization to showcase your model

  • Tasks: Question Answering, LM Domain Adaptation, NER, (Multilabel) Doc Classification


Installation

Recommended (because of active development):

git clone https://github.com/deepset-ai/FARM.git
cd FARM
pip install -r requirements.txt
pip install --editable .

If problems occur, please do a git pull. Since the package is installed with the --editable flag, pulled changes take effect immediately without reinstalling.

From PyPI:

pip install farm

Basic Usage

1. Train a downstream model

FARM offers two modes for model training:

Option 1: Run experiment(s) from config

Code snippet: https://raw.githubusercontent.com/deepset-ai/FARM/master/docs/img/code_snippet_experiment.png

Use cases: Training your first model, hyperparameter optimization, evaluating a language model on multiple downstream tasks.
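
A minimal sketch of this mode, assuming FARM's experiment utilities (the config path below is a placeholder for one of the JSON configs shipped in the repository):

from farm.experiment import load_experiments, run_experiment

# Each entry in the config file becomes one experiment run.
experiments = load_experiments("experiments/text_classification/my_config.json")
for args in experiments:
    run_experiment(args)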

Option 2: Stick together your own building blocks

Code snippet: https://raw.githubusercontent.com/deepset-ai/FARM/master/docs/img/code_snippet_building_blocks.png

Use cases: Custom datasets, language models, prediction heads …
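
For orientation, here is a rough sketch of the building-block mode for text classification, loosely following FARM's example scripts. Class names, signatures and the data path are assumptions and may differ between FARM versions:

from farm.data_handler.data_silo import DataSilo
from farm.data_handler.processor import GNADProcessor
from farm.modeling.adaptive_model import AdaptiveModel
from farm.modeling.language_model import Bert
from farm.modeling.optimization import initialize_optimizer
from farm.modeling.prediction_head import TextClassificationHead
from farm.modeling.tokenization import BertTokenizer
from farm.train import Trainer
from farm.utils import initialize_device_settings

device, n_gpu = initialize_device_settings(use_cuda=True)

# Block 1: tokenizer + Processor convert raw files into PyTorch datasets,
# which the DataSilo wraps into train/dev/test loaders.
tokenizer = BertTokenizer.from_pretrained("bert-base-german-cased", do_lower_case=False)
processor = GNADProcessor(tokenizer=tokenizer, max_seq_len=128, data_dir="../data/gnad")
data_silo = DataSilo(processor=processor, batch_size=32)

# Block 2: language model + prediction head = AdaptiveModel.
language_model = Bert.load("bert-base-german-cased")
prediction_head = TextClassificationHead(layer_dims=[768, len(processor.label_list)])
model = AdaptiveModel(
    language_model=language_model,
    prediction_heads=[prediction_head],
    embeds_dropout_prob=0.1,
    lm_output_types=["per_sequence"],
    device=device,
)

# Block 3: optimizer + Trainer run the training loop.
optimizer, warmup_linear = initialize_optimizer(
    model=model,
    learning_rate=2e-5,
    warmup_proportion=0.1,
    n_batches=len(data_silo.loaders["train"]),
    n_epochs=1,
)
trainer = Trainer(
    optimizer=optimizer,
    data_silo=data_silo,
    epochs=1,
    n_gpu=n_gpu,
    warmup_linear=warmup_linear,
    device=device,
)
model = trainer.train(model)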

Metrics and parameters of your model training are automatically logged via MLflow. We provide a public MLflow server for testing and learning purposes. Check it out to see your own experiment results! Just be aware: we delete all experiments on a regular schedule to ensure decent server performance for everybody!
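
Hooking your run up to that server looks roughly like this; MLFlowLogger lives in farm.utils, and the experiment and run names below are placeholders:

from farm.utils import MLFlowLogger

# Point FARM's logging at the public MLflow server (or your own tracking URI).
ml_logger = MLFlowLogger(tracking_uri="https://public-mlflow.deepset.ai/")
ml_logger.init_experiment(experiment_name="my_first_farm_model", run_name="run_1")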

2. Run Inference (API + UI)

FARM Inference UI

One Docker container exposes a REST API (localhost:5000) and another runs a simple demo UI (localhost:3000). You can use each of them individually and mount your own models. Check out the docs for details.
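
If you prefer to stay in Python instead of going through the containers, FARM also ships an Inferencer for saved models. A sketch assuming a newer-style API (the loading call and method names have changed across versions, so check the docs of your release):

from farm.infer import Inferencer

# "save/my_model" stands for a directory written by model.save() / processor.save().
model = Inferencer.load("save/my_model")
result = model.inference_from_dicts(dicts=[{"text": "FARM is easy to adapt."}])
print(result)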

Core concepts

Model

AdaptiveModel = Language Model + Prediction Head(s)

With this modular approach you can easily add prediction heads (multitask learning) and reuse them across different types of language models. (Learn more)

Diagram: https://raw.githubusercontent.com/deepset-ai/FARM/master/docs/img/adaptive_model_no_bg_small.jpg
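
To make the multitask idea concrete, the sketch below puts a per-sequence head (document classification) and a per-token head (NER) on one language model. The head classes and kwargs are assumptions borrowed from FARM's examples:

import torch
from farm.modeling.adaptive_model import AdaptiveModel
from farm.modeling.language_model import Bert
from farm.modeling.prediction_head import TextClassificationHead, TokenClassificationHead

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
language_model = Bert.load("bert-base-cased")

model = AdaptiveModel(
    language_model=language_model,
    prediction_heads=[
        TextClassificationHead(layer_dims=[768, 2]),    # per-sequence task
        TokenClassificationHead(layer_dims=[768, 9]),   # per-token task, e.g. NER tags
    ],
    embeds_dropout_prob=0.1,
    lm_output_types=["per_sequence", "per_token"],      # one output type per head
    device=device,
)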

Data Processing

Custom datasets can be loaded by customizing the Processor, which converts "raw data" into PyTorch Datasets. Much of the heavy lifting is then handled behind the scenes to make it fast and simple to debug. (Learn more)

Diagram: https://raw.githubusercontent.com/deepset-ai/FARM/master/docs/img/data_silo_no_bg_small.jpg
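
If none of the shipped processors fits, the docs describe subclassing Processor and implementing a few hooks. A rough skeleton, with hook names taken from FARM's data handling docs (they may differ in your version):

from farm.data_handler.processor import Processor

class MyCustomProcessor(Processor):
    def file_to_dicts(self, file):
        # Parse one raw file into a list of dicts, e.g. [{"text": ..., "label": ...}].
        ...

    def _dict_to_samples(self, dict, **kwargs):
        # Tokenize one dict and wrap it into one or more Sample objects.
        ...

    def _sample_to_features(self, sample):
        # Vectorize a Sample into the feature tensors the model consumes.
        ...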

Upcoming features

  • More pretrained models: RoBERTa, XLNet …

  • Improved functionality for the question answering task

  • Additional visualizations and statistics to explore and debug your model

  • SOTA adaptation strategies (Adapter Modules, Discriminative Fine-tuning …)

  • Enabling large scale deployment for production

Acknowledgements

  • FARM is built upon parts of the great transformers repository from Hugging Face. It utilizes their implementations of the BERT model and tokenizer.

  • The original BERT model and paper were published by Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova.

Citation

As of now there is no published paper on FARM. If you want to use or cite our framework, please include a link to this repository. If you are working with the German BERT model, you can link to our blog post describing its training details and performance.



