
Changed config parser priority. #47

Closed
wants to merge 2 commits into from

Conversation

mauriciodev
Contributor

Config priority:
command line > --config method.py > default from parser.py

First parser run to get the parameters needed to load f{args.method}.py.
If a --config file was provided, it is loaded; otherwise, we try to find one for the method.
This method's default values are then pushed into the parser, and a second parser run produces the final parameters.

I also moved the default values back to parser.py, as there was no longer a need for the dict to stay in a separate file.

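The two-pass flow described above can be sketched with plain argparse. This is only an illustration of the pattern, not OpenSTL's actual code; the option names, the `configs/{method}.py` fallback path, and the `load_config_defaults` helper are all hypothetical.

```python
import argparse
import importlib.util

def load_config_defaults(path):
    """Hypothetical helper: load a Python config file and return its
    top-level variables as a dict of default values."""
    spec = importlib.util.spec_from_file_location("cfg", path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return {k: v for k, v in vars(mod).items() if not k.startswith("_")}

def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--method", default="SimVP")
    parser.add_argument("--config", "-c", default=None)
    parser.add_argument("--early_stop_epoch", type=int, default=-1)

    # First pass: only discover which config file to load.
    args, _ = parser.parse_known_args(argv)
    cfg_path = args.config or f"configs/{args.method}.py"  # illustrative fallback

    # Push the config file's values into the parser as new defaults ...
    parser.set_defaults(**load_config_defaults(cfg_path))

    # ... then parse again: explicit command-line flags still win,
    # giving: command line > config file > parser defaults.
    return parser.parse_args(argv)
```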
@Lupin1998
Collaborator

Hi, @mauriciodev,

We'd like to express our appreciation for your valuable contributions to OpenSTL. Your efforts will significantly enhance the project's quality. However, we have already addressed the conflict between the config file and the arg parser when adjusting important training hyperparameters (e.g., the learning rate). Therefore, we may not accept your pull request. Thanks again for your efforts!

@mauriciodev
Contributor Author

Hi, @Lupin1998. Thank you very much for giving this subject your attention. I will try to explain exactly what I was trying to fix with this pull request.

Since the previous change, the example notebook stopped working, because we no longer had a default value for the learning rate scheduler. This was easily fixed by adding the "onecycle" parameter to the config. But that led me to look at the changes made, and I understand that we now have a dict of default values hardcoded in tools/train.py. That's actually a problem, because the Colab notebook does not use tools/train.py, and the default values were already defined in parser.py.

Then I tried to test the EarlyStopping procedure. I added "early_stop_epoch = 10" to the config file (configs/mmnist/simvp/SimVP_gSTA.py):
tools/train.py -d mmnist -c configs/mmnist/simvp/SimVP_gSTA.py --ex_name mmnist_simvp_gsta

This doesn't work as expected, because "-c" is applied after the command-line arguments are parsed: the config is updated with the parsed values, defaults included. So the default "-1" was already in place and could not be overwritten by the value provided in SimVP_gSTA.py.
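The failure mode described here can be reproduced in a few lines. This is a minimal sketch of the merge order as I understand it from the discussion, not OpenSTL's actual update logic:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--early_stop_epoch", type=int, default=-1)
args = parser.parse_args([])       # user passed nothing on the command line

config = {"early_stop_epoch": 10}  # value set in SimVP_gSTA.py

# Parsed args are applied on top of the config file, so the parser's
# default -1 silently overrides the file's 10 even though the user
# never typed --early_stop_epoch.
config.update(vars(args))
# config["early_stop_epoch"] is now -1, not 10
```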

Currently, the parameters given on the command line, in SimVP_gSTA.py, and in train.py have different priorities: "epoch" has a different priority than "early_stop_epoch", for example. I tried to make them follow a single sequence:

command line > --config SimVP_gSTA.py > default from parser.py

To do that, I realized that the command line must be the last source to update the config. But since the parser has its own default values, we should feed the SimVP_gSTA.py modifications to the parser as defaults. Thus we first get the config file from the command line, read its default values, and after that we parse the command line again using those defaults.

@Lupin1998
Collaborator

Hello, @mauriciodev. The problem you mentioned is quite important, and we have tried to solve it in the latest commit. We created default_parser() to provide the default values and update the config after filling in the values from the given config file. Once again, we express our great gratitude for your contributions to OpenSTL, and it would be appreciated if you could run the latest version to check whether this issue is really solved.
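One way the approach described here could look in code: parse the CLI, load the config file, and only fill in defaults for keys that neither source set. The `default_parser()` body and `assemble_config` helper below are hypothetical stand-ins, not the actual OpenSTL implementation.

```python
def default_parser():
    """Hypothetical stand-in for OpenSTL's default_parser():
    a single dict holding the fallback value for every option."""
    return {"epoch": 200, "early_stop_epoch": -1, "lr": 1e-3}

def assemble_config(cli_args, file_config):
    """Start from the config file, let explicitly given CLI flags
    (non-None values) override it, and only then fill anything
    still missing from the defaults."""
    config = dict(file_config)
    config.update({k: v for k, v in cli_args.items() if v is not None})
    for key, value in default_parser().items():
        config.setdefault(key, value)
    return config
```

With this order, a value such as early_stop_epoch = 10 in the config file survives unless the user overrides it on the command line, which matches the priority mauriciodev asked for.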

@mauriciodev mauriciodev deleted the Parser-priority branch September 25, 2023 18:50