
Deep Transfer Learning for MXNet


Website | Documentation | Contribution Guide

What is Xfer?

Xfer is a library that allows quick and easy transfer of knowledge [1, 2, 3] stored in deep neural networks implemented in MXNet. Xfer can be used with data of arbitrary numeric format, and can be applied to the common cases of image or text data.

Xfer can be used as a pipeline that spans from extracting features to training a repurposer. The repurposer is then an object that carries out predictions in the target task.

You can also use individual components of Xfer as part of your own pipeline. For example, you can use the feature extractor to extract features from deep neural networks, or use ModelHandler, which allows for quick building and modification of neural networks even if you are not an MXNet expert.
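The extract-features-then-repurpose pattern described above can be sketched in plain Python. This is an illustrative toy, not Xfer's API: the names (`SimpleRepurposer`, `extract_features`) are hypothetical, and the "model" is just a function standing in for a neural-network feature extractor.

```python
# Illustrative sketch of the repurposing pattern (hypothetical names, not Xfer's API).

def extract_features(model, inputs):
    # Stand-in for a neural-network feature extractor: 'model' is
    # simply a function that maps a raw input to a feature vector.
    return [model(x) for x in inputs]

class SimpleRepurposer:
    """Toy nearest-centroid classifier trained on extracted features."""

    def __init__(self, model):
        self.model = model
        self.centroids = {}

    def repurpose(self, inputs, labels):
        # Group feature vectors by label and average them into centroids.
        groups = {}
        for feat, lab in zip(extract_features(self.model, inputs), labels):
            groups.setdefault(lab, []).append(feat)
        self.centroids = {
            lab: [sum(col) / len(feats) for col in zip(*feats)]
            for lab, feats in groups.items()
        }

    def predict_label(self, inputs):
        # Assign each input to the label of the nearest centroid
        # (squared Euclidean distance in feature space).
        def nearest(feat):
            return min(
                self.centroids,
                key=lambda lab: sum(
                    (a - b) ** 2 for a, b in zip(feat, self.centroids[lab])
                ),
            )
        return [nearest(f) for f in extract_features(self.model, inputs)]
```

A real repurposer would replace the centroid classifier with, say, logistic regression or a Gaussian process, but the shape of the pipeline (construct with a source model, `repurpose` on labelled target data, then `predict_label`) is the same.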

Why should I use Xfer?

  • Resource efficiency: you don't have to train big neural networks from scratch.
  • Data efficiency: by transferring knowledge, you can classify complex data even if you have very few labels.
  • Easy access to neural networks: you don't need to be an ML ninja in order to leverage the power of neural networks. With Xfer you can easily re-use them or even modify existing architectures and create your own solution.
  • Utilities for feature extraction from neural networks.
  • Rapid prototyping: ModelHandler allows you to easily modify a neural network architecture, e.g. by providing one-liners for adding / removing / freezing layers.
  • Uncertainty modeling: with the Bayesian neural network (BNN) or Gaussian process (GP) repurposers, you can obtain uncertainty in the repurposer's predictions.
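To see why predictive uncertainty matters: a probabilistic repurposer returns a class distribution rather than a single label, and the spread of that distribution flags predictions you should not trust. A plain-Python illustration of one common uncertainty measure (this is a generic sketch, not Xfer code):

```python
import math

def predictive_entropy(probs):
    # Entropy (in nats) of a predicted class distribution.
    # Higher entropy means the prediction is less certain.
    return -sum(p * math.log(p) for p in probs if p > 0)

# A confident prediction concentrates mass on one class (low entropy) ...
confident = [0.98, 0.01, 0.01]
# ... while a near-uniform prediction has high entropy.
uncertain = [0.34, 0.33, 0.33]
```

In practice you might route high-entropy predictions to a human reviewer or to a fallback model.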

Minimal demo

After defining an MXNet source model and data iterators for your target task, you can perform transfer learning with just 3 lines of code:

repurposer = xfer.LrRepurposer(source_model, feature_layer_names=['fc7'])
repurposer.repurpose(train_iterator)
predictions = repurposer.predict_label(test_iterator)

Getting Started


  • Dependencies: Primary dependencies are MXNet >= 1.2 and GPy >= 1.9.5. See the requirements files in the repository for the full list.

  • Supported architectures / versions: Python 3.6+ on macOS and Amazon Linux.

    Also tested on Python 3.4 and 3.5. Note that for these Python versions, NumPy must be pre-installed for GPy (a dependency of Xfer) to work.

  • Install with pip:
pip install xfer-ml
  • Install from source: To install Xfer from source, after cloning the repository run the following from the top-level directory:
pip install .

To confirm installation, run:

>>> import xfer
>>> xfer.__version__

Confirm that the version returned matches the expected package version number.


Contributing

Have a look at our contributing guide; thanks for your interest!

Points of contact for Xfer are: Jordan Massiah, Keerthana Elango, Pablo G. Moreno, Nikos Aletras, Andreas Damianou, Sebastian Flennerhag


License

Xfer is licensed under the Apache 2.0 License.
