🏡 Home for all species of language models. Fast & easy transfer learning for NLP



FARM (Framework for Adapting Representation Models)


What is it?

FARM makes cutting-edge transfer learning for NLP simple. It is a home for all species of pretrained language models (e.g. BERT) that can be adapted to different downstream tasks. The aim is to make it simple to perform document classification, NER and question answering, for example, on top of one and the same language model. Standardized interfaces for language models and prediction heads allow flexible extension by researchers and easy adaptation by practitioners. Built-in experiment tracking and visualizations support you in adapting a SOTA model to your own NLP problem and reaching a proof of concept quickly.

Core features

  • Easy adaptation of language models (e.g. BERT) to your own use case
  • Fast integration of custom datasets via Processor class
  • Modular design of language model and prediction heads
  • Switch between heads or just combine them for multitask learning
  • Smooth upgrading to new language models
  • Powerful experiment tracking & execution
  • Simple deployment and visualization to showcase your model
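
The modular design behind several of these points can be pictured with a small, framework-free sketch in plain Python. The class names below are purely illustrative and are NOT FARM's actual API; they only mirror the idea of a shared language-model body feeding one or several interchangeable prediction heads:

```python
# Illustrative sketch of the "body + interchangeable heads" design.
# These classes are NOT FARM's API; they only mirror the idea that a
# shared language model can feed one or several prediction heads.

class ToyLanguageModel:
    """Stands in for a pretrained encoder (e.g. BERT)."""
    def encode(self, text):
        # Real models return dense vectors; we fake a tiny "embedding":
        # [character count, whitespace-token count].
        return [len(text), text.count(" ") + 1]

class ClassificationHead:
    """Turns the shared representation into a class label."""
    def predict(self, embedding):
        return "long_text" if embedding[0] > 20 else "short_text"

class TokenCountHead:
    """A second head on the same representation (multitask setup)."""
    def predict(self, embedding):
        return embedding[1]

class AdaptiveToyModel:
    """Combines one body with any number of heads."""
    def __init__(self, body, heads):
        self.body = body
        self.heads = heads

    def predict(self, text):
        embedding = self.body.encode(text)
        return [head.predict(embedding) for head in self.heads]

model = AdaptiveToyModel(ToyLanguageModel(),
                         [ClassificationHead(), TokenCountHead()])
print(model.predict("FARM makes transfer learning simple"))
# → ['long_text', 5]
```

Swapping the list of heads is all it takes to move between single-task and multitask setups, which is the idea behind FARM's combinable prediction heads.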



Installation

Recommended (because of active development):

git clone
pip install -r requirements.txt
pip install --editable .

If problems occur, please do a git pull first. Thanks to the --editable flag, local code changes take effect immediately without reinstalling.

From PyPI:

pip install farm

Basic Usage

1. Train a downstream model

FARM offers two modes for model training:

Option 1: Run experiment(s) from config

Use cases: training your first model, hyperparameter optimization, or evaluating a language model on multiple downstream tasks.
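
A config-driven experiment might look roughly like the following. This is a hypothetical sketch: the keys, values and file layout are assumptions for illustration, not FARM's exact schema, so consult the example configs in the repository for the real format:

```json
{
  "general":  { "output_dir": "saved_models/my_experiment", "seed": 42 },
  "task":     { "type": "text_classification", "data_dir": "data/my_dataset" },
  "model":    { "language_model": "bert-base-cased" },
  "training": { "epochs": 2, "batch_size": 32, "learning_rate": 2e-5 }
}
```

Keeping all hyperparameters in one file like this is what makes it easy to sweep settings or rerun the same experiment on several downstream tasks.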

Option 2: Stick together your own building blocks

Use cases: custom datasets, custom language models, custom prediction heads ...
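
The idea of integrating a custom dataset via a single Processor-style class can be sketched in plain Python. This is illustrative only; FARM's actual Processor interface differs and lives in the library. It only shows the pattern of centralizing the raw-data-to-model-input conversion in one swappable component:

```python
# Illustrative sketch of a Processor-style class for custom datasets.
# NOT FARM's API: it only demonstrates keeping the conversion from
# raw data to model-ready samples in one place.

class CsvClassificationProcessor:
    """Turns 'text<TAB>label' lines into feature dictionaries."""
    def __init__(self, label_list, max_len=32):
        self.label_list = label_list
        self.max_len = max_len

    def process_line(self, line):
        text, label = line.rstrip("\n").split("\t")
        tokens = text.lower().split()[: self.max_len]
        return {
            "tokens": tokens,
            "label_id": self.label_list.index(label),
        }

processor = CsvClassificationProcessor(label_list=["negative", "positive"])
sample = processor.process_line("FARM is easy to use\tpositive")
print(sample)
# → {'tokens': ['farm', 'is', 'easy', 'to', 'use'], 'label_id': 1}
```

Because only the processor knows about the file format, switching datasets means swapping (or subclassing) this one class while the rest of the training pipeline stays unchanged.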

Metrics and parameters of your model training are automatically logged via MLflow. We provide a public MLflow server for testing and learning purposes. Check it out to see your own experiment results! Just be aware: we delete all experiments on a regular schedule to keep server performance decent for everybody.

2. Run Inference (API + UI)

FARM Inference UI

One docker container exposes a REST API (localhost:5000) and another runs a simple demo UI (localhost:3000). You can use each of them individually and mount your own models. Check out the docs for details.
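
A client talks to the REST API with plain JSON over HTTP. The snippet below builds such a request with the standard library; the route ("/inference") and field name ("texts") are guesses for illustration, not FARM's documented contract, so check the docs for the real schema:

```python
import json
from urllib import request

# Hypothetical request body for the inference API on localhost:5000.
# The "texts" field and the /inference route are illustrative
# assumptions, not FARM's documented schema.
payload = {"texts": ["FARM is a home for all species of language models."]}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    "http://localhost:5000/inference",   # hypothetical route
    data=body,
    headers={"Content-Type": "application/json"},
)

# To actually send it (requires the API container to be running):
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

The demo UI on localhost:3000 is just a convenience frontend over the same API, so anything it shows can also be scripted this way.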

Upcoming features

  • More pretrained models (XLNet, XLM ...)
  • SOTA adaptation strategies (Adapter Modules, Discriminative Fine-tuning ...)
  • Enabling large scale deployment for production
  • Additional Visualizations and statistics to explore and debug your model