Thanks for your impressive work. Can you share how to implement pretraining code?
Thank you for your interest in our work! We have no plans to release pre-training code at this time. You can implement your own pre-training script by referring to the Hugging Face transformers data collator code: https://github.com/huggingface/transformers/blob/v4.19.2/src/transformers/data/data_collator.py#L748
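For anyone looking for a starting point, here is a minimal masked-language-modeling pre-training sketch built around the collator module linked above. This is not the authors' script: the checkpoint (`bert-base-uncased`), the `wikitext` example dataset, and all hyperparameters are placeholder assumptions that you should swap for your own setup.

```python
# Minimal MLM pre-training sketch (placeholder checkpoint/dataset/hyperparameters).
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

checkpoint = "bert-base-uncased"  # assumption: replace with the backbone you want to pre-train
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Any text corpus works; wikitext-2 is used here only as a small example.
raw = load_dataset("wikitext", "wikitext-2-raw-v1")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# DataCollatorForLanguageModeling lives in the linked data_collator.py;
# it applies dynamic masking and builds the MLM labels on the fly.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="mlm-pretraining",
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```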