Releases: tunib-ai/parallelformers

v1.2.7

27 Jul 19:51
c0e4747
  • Fix multiprocessing bugs reported in #32
  • Add support for OPT models (#30)
  • Add a guide for multiprocessing errors to README.md (see the sketch below)
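
The multiprocessing guide comes down to Python's standard rule for process-spawning code: the entry point must be guarded. A minimal sketch, assuming the README recommends the usual `if __name__ == "__main__"` guard (the model name and GPU count here are illustrative):

```python
from transformers import AutoModelForCausalLM
from parallelformers import parallelize

# Parallelformers launches worker processes, so on spawn-based platforms
# the parallelization call must run under an entry-point guard; otherwise
# each worker re-imports the script and spawns workers of its own.
if __name__ == "__main__":
    model = AutoModelForCausalLM.from_pretrained("gpt2")  # illustrative model
    parallelize(model, num_gpus=2, fp16=True)
```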

v1.2.4

29 Dec 21:48

Fix GPU over-allocation by deallocating memory before the forward pass

v1.2.3

29 Dec 11:47

Remove redundant operations

v1.2.2

28 Dec 19:31
  • [#17] Fix performance issue with random sampling

v1.2

17 Dec 09:29

[#16] Remove assertion that forced CPU usage

v1.1

06 Dec 00:06
  • [#4] Add GPTJ
  • [#14] Add MegatronBert

v1.0.1

19 Jul 11:51
33808f1

Issues

  • [#4] Backward compatibility patch
  • [#5] Bug in AlbertModel

Patches

  • [#7] Backward compatibility patch
    • Support transformers from 4.2.0 onward
  • [#6] Fix bug in AlbertModel

v1.0

18 Jul 00:09

  • Parallelformers, based on Megatron-LM, is designed to make model parallelization easier.
  • You can parallelize various models in HuggingFace Transformers on multiple GPUs with a single line of code (see the sketch below).
  • Currently, Parallelformers only supports inference. Training features are NOT included.
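
A minimal end-to-end sketch of that single-line usage, based on the `parallelize` entry point from the project README (model name, GPU count, and prompt are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from parallelformers import parallelize

if __name__ == "__main__":
    name = "EleutherAI/gpt-neo-1.3B"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)

    # The single line: shard the model across two GPUs for inference.
    parallelize(model, num_gpus=2, fp16=True)

    # Inputs can stay on CPU; per the project docs, device placement is
    # handled automatically after parallelization.
    inputs = tokenizer("Parallelformers makes model parallelism", return_tensors="pt")
    outputs = model.generate(**inputs, max_length=30)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```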