Error with SFT of LLaVA-Next #1785
Labels: vlm (Related to Visual Language Model)
Comments
Hi, sorry for the delay. Can you double-check the command? When I run it, I get
Also, please share the versions of trl, transformers, and torch.
I have double-checked the command, code, and output. Versions are as follows:
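As an aside, one quick way to collect the three versions the maintainer asked for is a small stdlib-only snippet (the function name is illustrative, not from the repo):

```python
from importlib import metadata

def report_versions(packages=("trl", "transformers", "torch")):
    """Return the installed version of each package, or 'not installed'."""
    versions = {}
    for name in packages:
        try:
            versions[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            versions[name] = "not installed"
    return versions

if __name__ == "__main__":
    for name, ver in report_versions().items():
        print(f"{name}=={ver}")
```

Pasting its output into the issue avoids ambiguity about which environment produced the error.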
I'm trying to instruction-tune LLaVA-NeXT models following the llava_vsft.py example shared for LLaVA-1.5.
The run keeps failing on an 8xH100 VM with the following error:
The full code and error stack trace are available in this gist.
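For context, the central trick in the llava_vsft.py data collator is masking padded positions in the labels so they are ignored by the language-modeling loss. A minimal list-based sketch of that pattern (function name illustrative; the real collator operates on tensors returned by the processor):

```python
IGNORE_INDEX = -100  # label value ignored by the cross-entropy loss in transformers

def mask_pad_labels(input_ids, pad_token_id):
    """Copy input_ids into labels, replacing padding with IGNORE_INDEX
    so padded positions do not contribute to the loss."""
    return [tok if tok != pad_token_id else IGNORE_INDEX for tok in input_ids]
```

When adapting the example to LLaVA-NeXT, this labels logic should carry over unchanged; what differs is the processor/model pair used to build the batch.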