
Errors with both the verify-installation command and the final recipe #27

Closed
tatami-galaxy opened this issue Jun 25, 2023 · 2 comments

Comments

@tatami-galaxy

After cloning and installing, this command:

python pretrain.py name=test arch=bert-base train=bert-base data=sanity-check-2 dryrun=True impl.microbatch_size=2

produces "In 'cfg_pretrain': Could not find 'arch/bert-base'". If I replace the arch argument with train/hf-bert-tiny I get :

"FileNotFoundError: Directory /root/cramming/outputs/data/sanity-check-2_BPEx32768_aa4b98dc480e637aa82f59461e1b1729 not found"

If I try the final recipe: python pretrain.py name=amp_b8192_cb_o4_final arch=crammed-bert train=bert-o4 data=pile-readymade

I get "RuntimeError: Unexpected optimization option max_autotune_gemm"

@JonasGeiping
Owner

JonasGeiping commented Jun 25, 2023

Hi! Thanks for trying the new version. Sorry, the installation command still referred to the older version; this is fixed now.

For the dataset, you should have been warned that impl.forbid_dataset_preprocessing=True is set, so no new dataset is generated. Just in case, I've now flipped that flag to a default of False.
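On an older checkout you should also be able to flip it per run as an extra Hydra-style override, as in the commands above. A sketch, appended to the sanity-check command (use whichever arch name is valid in your checkout in place of bert-base):

python pretrain.py name=test arch=bert-base train=bert-base data=sanity-check-2 dryrun=True impl.microbatch_size=2 impl.forbid_dataset_preprocessing=False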

Finally, for Unexpected optimization option max_autotune_gemm, what PyTorch version do you have? This variant of the inductor should have been merged by now, but maybe it is still only in the nightlies.
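You can check your installed version with, e.g.:

python -c "import torch; print(torch.__version__)"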

In any case, you can disable this setting via impl._inductor_vars=null.
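For the final recipe from above, that would look like (a sketch, just appending the override to the original command):

python pretrain.py name=amp_b8192_cb_o4_final arch=crammed-bert train=bert-o4 data=pile-readymade impl._inductor_vars=null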

JonasGeiping added a commit that referenced this issue Jun 25, 2023
@tatami-galaxy
Author

tatami-galaxy commented Jun 26, 2023

Thanks, it works now 👍🏽
