ValueError: Received incompatible devices for pjitted computation #1
Hey @wimjan123! The issue originates because we now load Flax weights on CPU by default in Transformers: huggingface/transformers#15295

Currently, the easiest workaround is to comment out the following lines: https://github.com/huggingface/transformers/blob/fe1f5a639d93c9272856c670cff3b0e1a10d5b2b/src/transformers/modeling_flax_utils.py#L836-L838

This will let the Flax weights default to the accelerator device you have available. This should be fixed by default by the time the repo is announced!
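If you'd rather not edit the transformers source, a similar effect can be sketched by transferring the CPU-loaded parameters onto the accelerator after loading. The helper below is illustrative only: the `put_on_accelerator` name and the assumption that the weights are a plain pytree of arrays are mine, not part of either library.

```python
# Sketch: move a Flax params pytree from CPU onto the default accelerator.
# Assumes the weights are an arbitrary pytree of JAX arrays; the helper
# name is hypothetical, not a transformers or JAX API.
import jax


def put_on_accelerator(params, device=None):
    """Transfer every leaf array of `params` to `device` (default: first device)."""
    device = device or jax.devices()[0]
    return jax.tree_util.tree_map(lambda leaf: jax.device_put(leaf, device), params)
```

On a CPU-only machine `jax.devices()[0]` is the CPU, so the call is a harmless no-op there; on a GPU/TPU host it moves the weights where pjit expects them.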
Everything seems to work great, and the speed is absolutely crazy: 50 minutes of audio transcribed in 30 seconds. I know this is still a WIP, but if I can give one suggestion: maybe add a way to export the output as txt and srt files? Awesome repo and a game-changer for Whisper in terms of speed.
Awesome, glad to hear that @wimjan123! Let me know if you encounter any other issues! The repo has only really been tested so far with my personal experimenting. We're otherwise more or less ready for release though. I'm getting quite similar numbers to you in my benchmark tests on a GPU: https://github.com/sanchit-gandhi/whisper-jax#benchmarks (note that this does not include the time to load the audio file, which is the same for all three repos but can be a significant proportion of the overall transcription time).

Thanks for the tip! I'll look into how we could export the output as txt/srt files. Currently, the easiest way is to write to a file manually:

pred_str = pipeline(...)
with open("output.txt", "w") as text_file:
    text_file.write(pred_str)
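On the srt suggestion: until something is built in, a small helper can format timestamped output. This is a sketch assuming the pipeline is run with timestamps enabled and returns chunks shaped like `{"text": ..., "timestamp": (start_seconds, end_seconds)}`; that return shape, and the `chunks_to_srt` name, are assumptions on my part and may differ between versions.

```python
# Sketch: render timestamped transcription chunks as the text of an
# .srt subtitle file. Chunk shape is an assumption, not a documented API.

def _srt_time(seconds):
    """Format a time in seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


def chunks_to_srt(chunks):
    """Join chunks into numbered SRT blocks separated by blank lines."""
    blocks = []
    for i, chunk in enumerate(chunks, start=1):
        start, end = chunk["timestamp"]
        blocks.append(
            f"{i}\n{_srt_time(start)} --> {_srt_time(end)}\n{chunk['text'].strip()}\n"
        )
    return "\n".join(blocks)
```

The result can then be written out the same way as the txt example, e.g. `open("output.srt", "w").write(chunks_to_srt(chunks))`.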
Awesome repo! I have one question, though: whenever I try running this code on my own TPU v4-8, I get the following error:
Any idea how I can fix it?