BetterTransformer is default after transformers 4.36 #76

Conversation

kamalojasv181

BetterTransformer is the default in transformers 4.36, so the explicit conversion now causes an error. Removing it.

@sanchit-gandhi
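For context, a minimal sketch of the change, assuming the pseudo-labelling script converted the model explicitly with `model.to_bettertransformer()`; the checkpoint and surrounding code are illustrative:

```python
import torch
from transformers import AutoModelForSpeechSeq2Seq

# Illustrative checkpoint; the real script loads whatever is passed on the command line.
model = AutoModelForSpeechSeq2Seq.from_pretrained(
    "distil-whisper/distil-large-v2", torch_dtype=torch.float16
)

# Removed by this PR: since transformers 4.36 the SDPA attention path is enabled by
# default, so the explicit BetterTransformer conversion is redundant and now errors.
# model = model.to_bettertransformer()
```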

guynich added a commit to guynich/distil-whisper that referenced this pull request Feb 12, 2024
* Create run_pseudo_labelling.sh

https://github.com/huggingface/distil-whisper/tree/main/training#1-pseudo-labelling

* Change path to python file.

* Change to float16 (was bfloat16)

My workstation threw `ValueError: bf16 mixed precision requires PyTorch >= 1.10 and a supported device.` even though my torch version is `1.13.1+cu117`, so the device is presumably the issue (see the sketch after this list).

* Update run_pseudo_labelling.py

Copied from huggingface#76

* Try reducing the number of workers.

Workaround for `RuntimeError: One of the subprocesses has abruptly died during map operation. To debug the error, disable multiprocessing.`

* Add comment for PR huggingface#76

* Reduce dataloader_num_workers to 8

A value of 16 caused a freeze (see the sketch below).
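For reference, a minimal sketch of the two workarounds described in the commit messages above, assuming a standard `Seq2SeqTrainingArguments` / `datasets` setup. Everything except the `bf16`/`fp16` flags, `dataloader_num_workers`, and `num_proc` is illustrative:

```python
import torch
from datasets import Dataset
from transformers import Seq2SeqTrainingArguments

# Precision fallback: use bf16 only when the GPU actually supports it,
# otherwise drop to fp16 (avoids the ValueError quoted above).
use_bf16 = torch.cuda.is_available() and torch.cuda.is_bf16_supported()
use_fp16 = torch.cuda.is_available() and not use_bf16

training_args = Seq2SeqTrainingArguments(
    output_dir="./pseudo_labelled",   # illustrative
    per_device_eval_batch_size=16,    # illustrative
    bf16=use_bf16,
    fp16=use_fp16,
    dataloader_num_workers=8,         # 16 froze on this workstation
)

# The "one of the subprocesses has abruptly died during map operation" error comes
# from multiprocessed `Dataset.map`; lowering `num_proc` (or omitting it to disable
# multiprocessing entirely) is the workaround referenced above.
dummy = Dataset.from_dict({"text": ["a", "b", "c", "d"]})
dummy = dummy.map(lambda example: example, num_proc=2)
```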
@sanchit-gandhi
Collaborator

Thanks for the PR @kamalojasv181 - resolved in #101 with a big batch of updates!
