Int8 fusing #1

Merged: 1 commit merged into main from hotfix/docs-n-lora-fusing on Nov 16, 2023

Conversation

BobaZooba (Owner)

Description

This change makes it possible to fuse LoRA adapters into bitsandbytes int8 base models, which allows 7B models to be fused in Colab.
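
For reference, the sketch below shows the general pattern this enables with the Hugging Face stack (transformers + peft + bitsandbytes): load the base model in int8, attach a trained LoRA adapter, and merge the adapter weights into the quantized base. It is only an illustration of the idea, not the xllm code from this PR; the model name and adapter path are placeholders, and merging into 8-bit layers needs a sufficiently recent peft release.

```python
# Illustrative sketch only: fuse a LoRA adapter into an int8 (bitsandbytes) base model.
# The model name and adapter path below are placeholders, not part of this PR.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

base_model_name = "meta-llama/Llama-2-7b-hf"  # placeholder 7B base model
adapter_path = "path/to/lora-adapter"         # placeholder trained LoRA adapter

# Load the base model quantized to int8 with bitsandbytes.
model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
    torch_dtype=torch.float16,
)

# Attach the trained LoRA adapter on top of the int8 base model.
model = PeftModel.from_pretrained(model, adapter_path)

# Merge (fuse) the LoRA weights into the base weights.
# Note: merging into 8-bit layers requires a recent peft version;
# older releases only supported merging into full-precision weights.
model = model.merge_and_unload()
```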

Type of Change

  • 📚 Examples / docs / tutorials / dependencies update
  • 🔧 Bug fix (non-breaking change which fixes an issue)
  • Improvement (non-breaking change which improves an existing feature)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to change)
  • 🔐 Security fix

Checklist

  • I've updated the code style using make codestyle.
  • I've written tests for all new methods and classes that I created.


codecov bot commented Nov 16, 2023

Welcome to Codecov 🎉

Once merged to your default branch, Codecov will compare your coverage reports and display the results in this comment.

Thanks for integrating Codecov - We've got you covered ☂️

@BobaZooba BobaZooba merged commit ac58cb3 into main Nov 16, 2023
6 checks passed
@BobaZooba BobaZooba deleted the hotfix/docs-n-lora-fusing branch November 17, 2023 07:54