Add kyutai stt #38909
Conversation
@ArthurZucker commented below on the last changes that require your validation 🤗
```diff
 # Update: to extend _keep_in_fp32_modules flag feature, it can also be used to force modules that should stay in fp32
 if model._keep_in_fp32_modules is not None and (
-    torch_dtype == torch.float16 or getattr(hf_quantizer, "use_keep_in_fp32_modules", False)
+    torch_dtype == torch.float16
+    or torch_dtype == torch.bfloat16
+    or getattr(hf_quantizer, "use_keep_in_fp32_modules", False)
```
As discussed offline, let's extend `_keep_in_fp32_modules` for more intuitive functioning.
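To illustrate the pattern behind `_keep_in_fp32_modules`: after a model is cast to fp16/bf16, parameters whose names match one of the listed module names are forced back to float32. The sketch below shows only the name-matching half of that logic, with a hypothetical helper name (`modules_to_upcast`) that is not the actual transformers implementation:

```python
# Hypothetical sketch of the _keep_in_fp32_modules matching pattern.
# A parameter stays in fp32 if any dotted component of its name matches
# one of the listed module names.

def modules_to_upcast(param_names, keep_in_fp32_modules):
    """Return the parameter names that should be kept in float32."""
    return [
        name
        for name in param_names
        if any(module in name.split(".") for module in keep_in_fp32_modules)
    ]

params = ["encoder.layer_norm.weight", "encoder.attn.q_proj.weight", "lm_head.weight"]
print(modules_to_upcast(params, ["layer_norm"]))  # ['encoder.layer_norm.weight']
```

With the change above, this upcasting path is taken not only for fp16 and quantized loads but also for bf16.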
```diff
@@ -619,7 +619,7 @@ def augmented_dependencies_for_class_node(
     "processing",
     "image_processing",
     "video_processing",
     "feature_extractor",
```
Discussed offline with @Cyrilvallez: the correct convention for file naming, for now, is `feature_extraction`. I do agree that `feature_extractor` sounds better; nonetheless, let's keep it as it is for now, for consistency.
Yes, let's go!
What does this PR do?
Adds Kyutai's new STT model 🚀