Fix to "Tuner change trainer and optimizer configs" >> Change batch_size in tuner #449
@manujosephv We talked some time ago in PR #387, and you said that changing batch_size is important and that you want to keep this feature. I did not understand how to do it, so you committed the change to my fork to allow changing batch_size in the tuner. I have now tested it, but it is not working. :)
First, it is necessary to remove the ValueError:

But even after that change it still does not work. I tried to train a Node model with batch sizes [64, 2048, 4096]: 64 fits in memory, while 2048 and 4096 were supposed to cause OOM. That did not happen; all three runs trained normally using the batch size configured in TrainerConfig.
Can you check this for me? Otherwise, it is possible to do it the way I did (which you said is slower), or to remove this feature from the tuner.
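For reference, here is a minimal sketch of one failure mode that would produce exactly this behavior: the tuner overrides batch_size in the config, but the dataloader was already built from the original TrainerConfig, so every trial silently trains with the base batch size. The names (`build_dataloader`, `tune_buggy`, `tune_fixed`) are hypothetical, not pytorch-tabular internals; this is just an illustration of the ordering bug, under the assumption that something like it is happening in the tuner.

```python
# Hypothetical sketch, NOT pytorch-tabular code: shows how a batch_size
# override can be silently ignored if the dataloader is built from the
# base config before the trial's override is applied.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class TrainerConfig:
    batch_size: int = 64


def build_dataloader(config: TrainerConfig) -> dict:
    # Stand-in for the real dataloader: records the batch size
    # it was constructed with.
    return {"batch_size": config.batch_size}


def tune_buggy(config: TrainerConfig, trial_batch_size: int) -> int:
    # BUG: the dataloader is created from the base config *before*
    # the trial's batch_size override is applied, so the override
    # never reaches the dataloader.
    loader = build_dataloader(config)
    config = replace(config, batch_size=trial_batch_size)  # too late
    return loader["batch_size"]


def tune_fixed(config: TrainerConfig, trial_batch_size: int) -> int:
    # FIX: apply the override first, then build the dataloader from
    # the updated config.
    config = replace(config, batch_size=trial_batch_size)
    loader = build_dataloader(config)
    return loader["batch_size"]


if __name__ == "__main__":
    base = TrainerConfig(batch_size=64)
    for trial in (64, 2048, 4096):
        # Buggy path always reports 64; fixed path reports the trial value.
        print(trial, tune_buggy(base, trial), tune_fixed(base, trial))
```

Under this hypothesis, the buggy path would explain why 2048 and 4096 trained without OOM: the effective batch size never changed from the 64 configured in TrainerConfig.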
📚 Documentation preview 📚: https://pytorch-tabular--449.org.readthedocs.build/en/449/