sovit-123/lm_sft

Various LMs/LLMs below 3B parameters (for now) trained using SFT (Supervised Fine-Tuning) for several downstream tasks such as the following; an illustrative training sketch is shown after the list:

  • Instruction following (general instruction fine-tuning)
  • Question answering
  • Summarization
  • Headline generation
  • Sentiment analysis
  • Text classification
  • Language translation
  • Code generation
  • And more coming...
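
For context, a run like this is typically driven through Hugging Face's trl library (or an equivalent Trainer loop). The sketch below is illustrative only: the model name (Qwen/Qwen2.5-0.5B), the dataset (tatsu-lab/alpaca), and all hyperparameters are placeholder assumptions, not necessarily what this repository uses; see the repository's own scripts for the actual entry points.

```python
# Hedged SFT sketch with Hugging Face trl -- placeholders throughout,
# not this repository's actual configuration.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Assumption: an instruction dataset exposing a plain "text" column,
# which SFTTrainer can consume directly.
train_dataset = load_dataset("tatsu-lab/alpaca", split="train")

config = SFTConfig(
    output_dir="outputs/sft_run",      # where checkpoints are written
    per_device_train_batch_size=4,
    num_train_epochs=1,
    learning_rate=2e-5,
    logging_steps=50,
)

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-0.5B",         # any causal LM under ~3B parameters
    args=config,
    train_dataset=train_dataset,
)
trainer.train()
```

The same pattern covers the other tasks in the list above; only the dataset and its prompt formatting change.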

Setup

  • Create a Conda environment:
conda create -n env_name python=3.11
  • Install packaging:
pip install packaging
  • Install PyTorch (you can install the version of your choice from the official site; the command below installs 2.2.0 with CUDA 12.1):
conda install pytorch==2.2.0 pytorch-cuda=12.1 -c pytorch -c nvidia
  • Install the rest of the requirements:
pip install -r requirements.txt
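
After the install, a quick sanity check (a minimal sketch; it only assumes the PyTorch build from the step above) confirms that PyTorch imports and can see a GPU:

```python
import torch

# Version should match what was installed above (e.g. 2.2.0).
print(torch.__version__)

# True if a CUDA-capable GPU is visible; False on CPU-only machines,
# which is still fine for smaller CPU runs.
print(torch.cuda.is_available())
```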
