Thinc: A refreshing functional take on deep learning, compatible with your favorite libraries

From the makers of spaCy, Prodigy and FastAPI

Thinc is a lightweight deep learning library that offers an elegant, type-checked, functional-programming API for composing models, with support for layers defined in other frameworks such as PyTorch, TensorFlow and MXNet. You can use Thinc as an interface layer, a standalone toolkit or a flexible way to develop new models. Previous versions of Thinc have been running quietly in production in thousands of companies, via both spaCy and Prodigy. We wrote the new version to let users compose, configure and deploy custom models built with their favorite framework.
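For example, wrapping a PyTorch layer so it can be composed with native Thinc layers looks roughly like this (a minimal sketch using PyTorchWrapper from thinc.api; the torch.nn.Linear module and the layer sizes are arbitrary choices for illustration, and PyTorch must be installed):

import torch.nn
from thinc.api import PyTorchWrapper, Relu, chain

# Wrap an arbitrary torch.nn.Module so Thinc can drive it like any other layer
wrapped = PyTorchWrapper(torch.nn.Linear(16, 8))

# Compose the wrapped layer with a native Thinc layer
model = chain(wrapped, Relu(nO=4))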


🔥 Features

  • Type-check your model definitions with custom types and mypy plugin.
  • Wrap PyTorch, TensorFlow and MXNet models for use in your network.
  • Concise functional-programming approach to model definition, using composition rather than inheritance (see the sketch after this list).
  • Optional custom infix notation via operator overloading.
  • Integrated config system to describe trees of objects and hyperparameters.
  • Choice of extensible backends.
  • Read more →
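A minimal sketch of the composition and operator-overloading points above (layer and combinator names come from thinc.api; the sizes and the zero-filled sample data are made up for illustration):

from thinc.api import Model, chain, Relu, Softmax
import numpy

# Compose layers by function composition rather than subclassing
model = chain(Relu(nO=64), Relu(nO=64), Softmax())

# Missing dimensions are inferred from sample data at initialization time
X = numpy.zeros((8, 16), dtype="f")
Y = numpy.zeros((8, 10), dtype="f")
model.initialize(X=X, Y=Y)

# The same network using the optional infix notation
with Model.define_operators({">>": chain}):
    model2 = Relu(nO=64) >> Relu(nO=64) >> Softmax()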

🚀 Quickstart

Thinc is compatible with Python 3.6+ and runs on Linux, macOS and Windows. The latest releases with binary wheels are available from pip. Before you install Thinc and its dependencies, make sure that your pip, setuptools and wheel are up to date. For the most recent releases, pip 19.3 or newer is recommended.

pip install -U pip setuptools wheel
pip install thinc --pre

See the extended installation docs for details on optional dependencies for different backends and GPU. You might also want to set up static type checking to take advantage of Thinc's type system.
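As a rough sketch, enabling the mypy plugin usually means adding it to your mypy configuration, for example in mypy.ini or setup.cfg; see the type checking docs for the exact setup in your version:

# sketch only; consult the Thinc type-checking docs for current details
[mypy]
plugins = thinc.mypy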

⚠️ If you have installed PyTorch and you are using Python 3.7+, uninstall the package dataclasses with pip uninstall dataclasses, since it may have been installed by PyTorch and is incompatible with Python 3.7+.

📓 Selected examples and notebooks

Also see the /examples directory and usage documentation for more examples. Most examples are Jupyter notebooks – to launch them on Google Colab (with GPU support!) click on the button next to the notebook name.

intro_to_thinc (Open in Colab) – Everything you need to know to get started. Composing and training a model on the MNIST data, using config files, registering custom functions and wrapping PyTorch, TensorFlow and MXNet models.
transformers_tagger_bert (Open in Colab) – How to use Thinc, transformers and PyTorch to train a part-of-speech tagger. From model definition and config to the training loop.
pos_tagger_basic_cnn (Open in Colab) – Implementing and training a basic CNN part-of-speech tagger without external dependencies, using different levels of Thinc's config system.
parallel_training_ray (Open in Colab) – How to set up synchronous and asynchronous parameter server training with Thinc and Ray.

View more →
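Stripped to its essentials, a Thinc training step like the ones in those notebooks looks roughly like the following sketch (Adam, begin_update and finish_update are part of thinc.api and the Model API; the toy data, layer sizes and squared-error gradient are placeholders for illustration):

from thinc.api import Adam, chain, Relu, Softmax
import numpy

# Toy model and data purely for illustration
X = numpy.zeros((8, 16), dtype="f")
Y = numpy.zeros((8, 10), dtype="f")
model = chain(Relu(nO=64), Softmax())
model.initialize(X=X, Y=Y)

optimizer = Adam(0.001)
for epoch in range(10):
    Yh, backprop = model.begin_update(X)  # forward pass, returns output and a backprop callback
    backprop(Yh - Y)                      # pass in the gradient of the loss w.r.t. the output
    model.finish_update(optimizer)        # apply the accumulated gradients with the optimizer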

📖 Documentation & usage guides

Introduction – Everything you need to know.
Concept & Design – Thinc's conceptual model and how it works.
Defining and using models – How to compose models and update state.
Configuration system – Thinc's config system and function registry.
Integrating PyTorch, TensorFlow & MXNet – Interoperability with machine learning frameworks.
Layers API – Weights layers, transforms, combinators and wrappers.
Type Checking – Type-check your model definitions and more.

🗺 What's where

thinc.api – User-facing API. All classes and functions should be imported from here.
thinc.types – Custom types and dataclasses.
thinc.model – The Model class. All Thinc models are an instance (not a subclass) of Model.
thinc.layers – The layers. Each layer is implemented in its own module.
thinc.shims – Interface for external models implemented in PyTorch, TensorFlow etc.
thinc.loss – Functions to calculate losses.
thinc.optimizers – Functions to create optimizers. Currently supports "vanilla" SGD, Adam and RAdam.
thinc.schedules – Generators for different rates, schedules, decays or series.
thinc.backends – Backends for numpy and cupy.
thinc.config – Config parsing and validation and function registry system.
thinc.util – Utilities and helper functions.
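To give a flavor of thinc.config together with the function registry, here is a hedged sketch; the registered layer name "Relu.v1" and registry.resolve follow the config docs, but the exact API may differ between versions:

from thinc.api import Config, registry

# assumes "Relu.v1" is a registered layer name; check the config docs for your version
CONFIG = """
[model]
@layers = "Relu.v1"
nO = 10
"""

config = Config().from_str(CONFIG)   # parse the string into nested sections
resolved = registry.resolve(config)  # build the objects the config describes
model = resolved["model"]            # a Thinc Relu layer with nO=10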

🐍 Development notes

Thinc uses black for auto-formatting, flake8 for linting and mypy for type checking. All code is written to be compatible with Python 3.6+, with type hints wherever possible. See the type reference for more details on Thinc's custom types.

๐Ÿ‘ทโ€โ™€๏ธ Building Thinc from source

Building Thinc from source requires the full dependencies listed in requirements.txt to be installed. You'll also need a compiler to build the C extensions.

git clone https://github.com/explosion/thinc
cd thinc
python -m venv .env
source .env/bin/activate
pip install -U pip setuptools wheel
pip install -r requirements.txt
pip install --no-build-isolation .

Alternatively, install in editable mode:

pip install -r requirements.txt
pip install --no-build-isolation --editable .

Or by setting PYTHONPATH:

export PYTHONPATH=`pwd`
pip install -r requirements.txt
python setup.py build_ext --inplace

🚦 Running tests

Thinc comes with an extensive test suite. The following should all pass and not report any warnings or errors:

python -m pytest thinc    # test suite
python -m mypy thinc      # type checks
python -m flake8 thinc    # linting

To view test coverage, you can run python -m pytest thinc --cov=thinc. We aim for 100% test coverage. This doesn't mean that we meticulously write tests for every single line – we ignore blocks that are not relevant or difficult to test and make sure that the tests execute all code paths.
