
Finetuning with weights in bfloat16 #100

Merged · 8 commits · Apr 6, 2023
Commits on Apr 5, 2023

  1. bfloat

    awaelchli committed Apr 5, 2023
    ffe5222
  2. update readme

    awaelchli committed Apr 5, 2023
    f66beb6
  3. typo

    awaelchli committed Apr 5, 2023
    93eb81e
  4. apply to lora

    awaelchli committed Apr 5, 2023
    7baf70f
  5. Update README.md

    Co-authored-by: Luca Antiga <luca@lightning.ai>
    awaelchli and lantiga authored Apr 5, 2023
    8bb4c5c
  6. 6fc8692
  7. typo

    awaelchli committed Apr 5, 2023
    180d1bf
  8. 7f0f2f7
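The commit messages ("bfloat", "apply to lora") suggest the change casts model weights to bfloat16 before finetuning. A minimal sketch of that idea in PyTorch, using a hypothetical stand-in model rather than the repository's actual architecture:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in model; the PR itself targets the repository's
# own model and its LoRA finetuning script.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Cast all parameters and buffers to bfloat16, halving weight memory
# relative to float32 while keeping float32's exponent range.
model = model.to(dtype=torch.bfloat16)

# Inputs must match the parameter dtype.
x = torch.randn(8, 16, dtype=torch.bfloat16)

# Reduce the loss in float32 for a more stable scalar, then backprop;
# gradients are produced in the parameters' dtype (bfloat16).
loss = model(x).float().mean()
loss.backward()
assert next(model.parameters()).grad.dtype == torch.bfloat16
```

Because bfloat16 keeps the same exponent range as float32, this kind of cast typically needs no loss scaling, unlike float16 mixed precision.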