
Releases: EricFillion/happy-transformer

Version 3.0.0

07 Aug 03:02
b127826

New Features:

  • DeepSpeed is now supported for fine-tuning.
  • Apple's MPS backend is now automatically used for both fine-tuning and inference if detected.
  • Evaluation data can now be used during fine-tuning to track progress.
  • WandB can now be used to log the results from fine-tuning.
  • CSV files are now supported for training and evaluating text generation and word prediction models, making it easy to isolate individual cases (see the sketch after this list).
  • Models can now be pushed to Hugging Face's Hub with a single command.
  • Models can now be saved periodically during training.
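
Taken together, a 3.0 fine-tuning run might look roughly like the sketch below. The eval_filepath keyword and the exact GENTrainArgs fields are assumptions inferred from the notes above, not guaranteed names, so check the documentation for the released API.

```python
# Sketch only: fine-tuning a text generation model with the 3.0 API.
# The eval_filepath keyword and the exact GENTrainArgs fields are assumptions
# based on the notes above; check the documentation for the released names.
from happytransformer import HappyGeneration, GENTrainArgs

happy_gen = HappyGeneration("GPT2", "gpt2")

# Dataclass args replace the old dictionary args in 3.0.
args = GENTrainArgs(num_train_epochs=1)

# train.csv / eval.csv: one training case per row (CSV support is new in 3.0).
happy_gen.train("train.csv", args=args, eval_filepath="eval.csv")

happy_gen.save("model/")
```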

Breaking changes:

  • Preprocessed data is now saved in Hugging Face's Dataset format rather than in JSON format.
  • Dictionary inputs for the args parameter for training and evaluating are no longer supported.
  • Removed the adam_beta1, adam_beta2, adam_epsilon and max_grad_norm learning parameters.
  • Replaced save_preprocessed_data and save_preprocessed_data_path with a single parameter called save_path. Likewise, load_preprocessed_data and load_preprocessed_data_path were replaced by load_path (see the migration sketch after this list).
  • Removed the preprocessing_processes parameter for training and evaluating.
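
For the argument changes, a rough before/after migration sketch is below; save_path comes from the notes above, while the other field names are assumptions.

```python
# Sketch only: migrating a 2.x word prediction training call to 3.0.
# save_path comes from the notes above; the other field names are assumptions.
from happytransformer import HappyWordPrediction, WPTrainArgs

happy_wp = HappyWordPrediction("DISTILBERT", "distilbert-base-uncased")

# 2.x style (no longer supported):
# happy_wp.train("train.txt", args={"learning_rate": 1e-5, "num_train_epochs": 1})

# 3.0 style: dataclass args, with preprocessed data cached via save_path.
args = WPTrainArgs(
    learning_rate=1e-5,
    num_train_epochs=1,
    save_path="preprocessed/",  # replaces save_preprocessed_data(_path)
)
happy_wp.train("train.csv", args=args)
```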

Updated Save Strategy

06 Feb 06:58
7f0449c

Contains the fix from #280, ensuring that models are not saved during training.

Half Precision Training and Encoding Format

19 Nov 05:50
8a75ed0
  • You can now set the training parameter "fp16" to enable half precision training, which has the potential to decrease training time and memory consumption (see the sketch below). #257
  • Set the encoding format to UTF-8 for HappyTextClassification and HappyQuestionAnswering. #265
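
A minimal sketch of enabling half precision training; apart from fp16, the field names used here are assumptions rather than names confirmed by this release.

```python
# Sketch only: enabling the new fp16 training parameter described above.
# Field names besides fp16 are assumptions; fp16 needs a CUDA-capable GPU.
from happytransformer import HappyTextClassification, TCTrainArgs

happy_tc = HappyTextClassification("DISTILBERT", "distilbert-base-uncased", num_labels=2)
args = TCTrainArgs(num_train_epochs=1, fp16=True)
happy_tc.train("train.csv", args=args)  # CSV with "text" and "label" columns
```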

Fixed use_auth_token not being passed to the HappyTransformer class

30 Oct 18:17
349686b
  1. Contains the fix from #268.
  2. Added text-to-text articles to the docs to provide additional examples.

Added Support For Private Models

29 Oct 23:03
f70fec7

Allows users to use an authentication token to access their private models on Hugging Face's Model Hub.

See #266
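
A minimal sketch of loading a private model; the use_auth_token keyword matches the follow-up fix noted above, and the model name is a placeholder.

```python
# Sketch only: loading a private model with an authentication token.
# "your-username/private-gpt2" is a placeholder model name; the token is read
# from an environment variable rather than hard-coded.
import os

from happytransformer import HappyGeneration

happy_gen = HappyGeneration(
    "GPT2",
    "your-username/private-gpt2",
    use_auth_token=os.environ["HF_TOKEN"],
)
```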

Added Support to Prevent Bad Words/Phrases

11 Sep 16:05
68ef679
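
The release title indicates that generation settings can now block specified words or phrases. A minimal sketch, assuming the setting is exposed as bad_words in GENSettings:

```python
# Sketch only: blocking words/phrases during text generation.
# The bad_words parameter name is an assumption based on the release title;
# check the GENSettings documentation for the exact field.
from happytransformer import HappyGeneration, GENSettings

happy_gen = HappyGeneration("GPT2", "gpt2")
args = GENSettings(bad_words=["unwanted phrase", "forbidden"], max_length=40)
output = happy_gen.generate_text("Once upon a time", args=args)
print(output.text)
```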

2.3.0: Text-to-Text Fine-Tuning

15 Aug 03:29
da9929b

You can now fine-tune text-to-text models like T5! Common applications of text-to-text fine-tuning include translation, summarization and grammar correction. Examples of these three applications can be found in the repository's examples folder, and new documentation is available on the website.
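
A minimal sketch of what a fine-tuning call might look like; the CSV layout and exact TTTrainArgs fields are assumptions, so follow the examples folder and documentation mentioned above for the exact format.

```python
# Sketch only: fine-tuning T5 for grammar correction with HappyTextToText.
# The CSV layout and exact TTTrainArgs fields are assumptions; see the examples
# folder and documentation mentioned above for the exact format.
from happytransformer import HappyTextToText, TTTrainArgs

happy_tt = HappyTextToText("T5", "t5-small")
args = TTTrainArgs(num_train_epochs=1)
happy_tt.train("train.csv", args=args)  # e.g. one input/target pair per row
```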

2.2.5: Increase tqdm version

21 Jul 17:01
7907a1c

Text-to-text!

17 Jun 19:37
abbcbfc
  • Added text-to-text functionality (see the sketch below).
  • Fixed a bug with word prediction when using ALBERT: the model would sometimes predict a blank string, causing an exception during post-processing. The same preventative code was also added to RoBERTa's post-processing method.

See #236.
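
A minimal sketch of the new text-to-text functionality; the TTSettings fields shown are assumptions, so adjust them to the documented names.

```python
# Sketch only: generating text with the new HappyTextToText class.
# The TTSettings fields shown are assumptions; adjust to the documented names.
from happytransformer import HappyTextToText, TTSettings

happy_tt = HappyTextToText("T5", "t5-base")
beam_settings = TTSettings(num_beams=5, min_length=1, max_length=20)
result = happy_tt.generate_text(
    "translate English to French: Hello, how are you?",
    args=beam_settings,
)
print(result.text)
```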

Added Top-p To Text Generation Settings

12 Jun 22:20
89202b2
Merged pull request #235 from EricFillion/ef/fix-gen-setting-table, which fixed the text generation settings table.
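
A minimal sketch of using top-p (nucleus) sampling in text generation; treat the exact GENSettings field names here as assumptions, since the release notes do not spell them out.

```python
# Sketch only: nucleus (top-p) sampling for text generation.
# do_sample and top_p follow the GENSettings dataclass as understood here;
# treat the exact names as assumptions.
from happytransformer import HappyGeneration, GENSettings

happy_gen = HappyGeneration("GPT2", "gpt2")
args = GENSettings(do_sample=True, top_p=0.9, max_length=40)
output = happy_gen.generate_text("Artificial intelligence is", args=args)
print(output.text)
```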