Hi,
Thanks for the repo. Is there a way to train the model directly on ERA5, without the pre-training step? When I run the command
python -u src/climax/global_forecast/train.py --config configs/global_forecast_climax.yaml --trainer.strategy=ddp --trainer.devices=1 --trainer.max_epochs=50 --data.root_dir=process_data --data.predict_range=24 --data.out_variables=['z_500','t_850','t2m','u10'] --data.batch_size=16 --model.pretrained_path="" --model.lr=5e-7 --model.beta_1="0.9" --model.beta_2="0.99" --model.weight_decay=1e-5
it automatically starts pre-training. Is there a flag somewhere to disable pre-training and train directly on ERA5?
Regards, Yogesh
If you run this script, it should start training on ERA5 without any pretraining.
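For reference, here is the same command broken onto multiple lines; the assumption (not confirmed anywhere else in this thread) is that the empty `--model.pretrained_path=""` is what controls this, so no checkpoint is loaded and training starts from scratch on the ERA5 data under `--data.root_dir`:

```bash
# A minimal sketch, assuming an empty --model.pretrained_path skips loading any
# pretrained checkpoint, so the run trains from scratch directly on the ERA5
# data under --data.root_dir (flags taken from the command above).
python -u src/climax/global_forecast/train.py \
    --config configs/global_forecast_climax.yaml \
    --trainer.strategy=ddp --trainer.devices=1 --trainer.max_epochs=50 \
    --data.root_dir=process_data --data.predict_range=24 \
    --data.out_variables=['z_500','t_850','t2m','u10'] --data.batch_size=16 \
    --model.pretrained_path="" \
    --model.lr=5e-7 --model.beta_1="0.9" --model.beta_2="0.99" --model.weight_decay=1e-5
```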
Hi @yogeshverma1998, is this solved? If yes, please close this issue.
Hi, thanks for the reply. It's solved now.