
Mariomeissner's Lightning-Hydra-Transformers Template

Python · PyTorch Lightning · Config: Hydra · Code style: black

An opinionated but flexible template to kickstart your transformers research project 🚀⚡🔥
Click on "Use this template" to initialize a new repository.

Suggestions are always welcome!

About this Template

This template is a fork of the fantastic lightning-hydra-template, modified to work smoothly with Hugging Face Transformers and Hugging Face Datasets. It contains an example of how to train BERT on the MNLI text classification task.
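As in the original lightning-hydra-template, experiments are typically defined by composing Hydra configs. A hypothetical experiment override for a BERT-on-MNLI run might look like the sketch below (the file path, group names, and keys are illustrative assumptions, not taken from this repository):

```yaml
# configs/experiment/mnli_bert.yaml (hypothetical path and keys)
# @package _global_

defaults:
  - override /datamodule: mnli          # assumed datamodule config group
  - override /model: bert_classifier    # assumed model config group

model:
  pretrained_model_name: bert-base-uncased
  num_labels: 3        # MNLI labels: entailment / neutral / contradiction
  lr: 2e-5

trainer:
  max_epochs: 3
```

Such a config would then be selected from the command line, e.g. `python train.py experiment=mnli_bert`, with any key overridable as `trainer.max_epochs=5`.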

This template is very similar to the Lightning team's lightning-transformers, with the difference that the template in this repository is simple, lightweight, and very close to the original and beloved lightning-hydra-template mentioned above. lightning-transformers can be a useful reference when implementing a complex use case, but I personally found it valuable to keep things simple, which is why I use this template instead.

Have a look at the original repository for a complete README with all the details.

Who I Recommend this Template To

Everyone! I think the research community in general can benefit from a more structured approach like this. However, a basic understanding of PyTorch Lightning and Hydra is necessary for comfortable use, so make sure to check out those projects before getting your hands dirty here.
