TFLibrary is an open-source toolkit built on TensorFlow, designed for modularity, flexibility, and extensibility. It is under active development and used extensively in my own research.
- `Models`: simple wrapper classes around models.
- `Modules.attention`: attention modules that take multiple inputs and compute attended features; currently includes matrix cross attention and BiDAF-style attention.
- `Modules.embedding`: embedding modules that take texts or text IDs and produce dense representations; currently includes matrix embedding, ELMo embedding, and GloVe embedding.
- `Modules.encoders`: encoder modules that encode dense representations of input texts; currently includes an LSTM encoder and a BiDAF-style encoder.
- `Modules.transformer`: simplified / decomposed versions of the Transformer from the Tensor2Tensor library; currently includes a Transformer encoder.
- `Tuner`: hyper-parameter tuner supporting grid search as well as Bayesian Optimization, with support for parallel computation across multiple GPUs.
- `utils`: a collection of utility functions.
- `utils.TrainingManager`: a lightweight helper class for monitoring training progress and signaling early stopping.
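The library's actual attention API is not shown here, so as an illustration only, here is a minimal NumPy sketch of the idea behind matrix cross attention: score every query position against every key position, softmax the scores row-wise, and return a weighted sum of key vectors. The function name `cross_attention` and its shapes are assumptions, not TFLibrary's interface.

```python
import numpy as np

def cross_attention(queries, keys):
    """Minimal dot-product cross attention between two sequences.

    queries: [m, d], keys: [n, d]. Returns attended features [m, d]:
    each query position receives a weighted sum of the key vectors.
    """
    # similarity matrix [m, n], scaled by sqrt(d) for stability
    scores = queries @ keys.T / np.sqrt(queries.shape[-1])
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ keys

q = np.random.randn(3, 8)
k = np.random.randn(5, 8)
out = cross_attention(q, k)
print(out.shape)  # (3, 8)
```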
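To make the grid-search side of the tuner concrete, here is a hedged, self-contained sketch of exhaustive search over a parameter grid; the function `grid_search` and the `space`/`evaluate` arguments are illustrative names, not TFLibrary's `Tuner` interface.

```python
import itertools

def grid_search(space, evaluate):
    """Evaluate every combination of parameters in `space`.

    space: dict mapping parameter name -> list of candidate values.
    evaluate: callable(params_dict) -> score (higher is better).
    Returns (best_params, best_score).
    """
    names = sorted(space)
    best_params, best_score = None, float("-inf")
    # Cartesian product over all candidate values, in a fixed name order
    for values in itertools.product(*(space[n] for n in names)):
        params = dict(zip(names, values))
        score = evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

space = {"lr": [0.1, 0.01], "units": [32, 64]}
best, score = grid_search(
    space, lambda p: -abs(p["lr"] - 0.01) + p["units"] / 64)
print(best)  # {'lr': 0.01, 'units': 64}
```

Bayesian Optimization replaces the exhaustive loop with a surrogate model that proposes the next point to evaluate, which matters when each evaluation is a full training run.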
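The early-stopping role of `TrainingManager` can be sketched as follows. This is a generic patience-based monitor under assumed semantics (lower metric is better); the class name `EarlyStopping` and its methods are hypothetical, not TFLibrary's API.

```python
class EarlyStopping:
    """Track a validation metric and signal when to stop training.

    Signals stop after `patience` consecutive evaluations without an
    improvement of at least `min_delta` (assumes lower is better).
    """
    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_steps = 0

    def should_stop(self, value):
        if value < self.best - self.min_delta:
            # improvement: record it and reset the counter
            self.best = value
            self.bad_steps = 0
        else:
            self.bad_steps += 1
        return self.bad_steps >= self.patience

stopper = EarlyStopping(patience=2)
losses = [1.0, 0.8, 0.9, 0.85]
signals = [stopper.should_stop(l) for l in losses]
print(signals)  # [False, False, False, True]
```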
```shell
pip install -e .
pip install -r REQUIREMENTS.txt
```
```shell
# using `--no-cache` to avoid accidentally missing changes in the parent tensorflow image
docker build --no-cache -t tf-library:`git rev-parse --abbrev-ref HEAD` .
```
- Moved `utils/` to its own directory.
- Moved the tuner to `Tuner/`, and changed both the interface and implementation of `Tuner` to support Bayesian-Optimization-based tuning and more flexible extension.