Conversation

@alexrs
Contributor

@alexrs alexrs commented May 15, 2023

What

While reading the README, I noticed a link that is supposed to point to detailed instructions on how to use LLM.int8. However, the link currently points to the HF Transformers repo instead.

Fix

I assume the blog post the README intends to reference is "A Gentle Introduction to 8-bit Matrix Multiplication for transformers at scale using Hugging Face Transformers, Accelerate and bitsandbytes", which is also linked in the "Resources" section.

If my assumption is wrong, please feel free to close the PR!

@anakin87

anakin87 commented Jun 26, 2023

👍

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

@github-actions github-actions bot closed this Dec 30, 2023
@TimDettmers
Collaborator

Good catch, thank you!

@TimDettmers TimDettmers reopened this Jan 1, 2024
@TimDettmers TimDettmers merged commit ef4b079 into bitsandbytes-foundation:main Jan 1, 2024