Translate augmentation diverges #9
Hi @HosseinSheikhi, thanks for bringing this to our attention. That shouldn't be the case, so there may have been an issue with the translate PR. @WendyShang can you check if your PR was incorporated correctly? Maybe it's using different hyperparams?
Hi @HosseinSheikhi, the plot of my run: [plot image removed]
Thank you both for your prompt reply. The PyTorch version is 1.6.0. The command I ran:
CUDA_VISIBLE_DEVICES=0 python train.py
@HosseinSheikhi For Cheetah Run, could you please try the following and let me know how the training curves look:
I will update you, but just to check: pre_transform_image_size should not be greater than image_size?
Ah, this may be the source of your issue. In your run you have --pre_transform_image_size 100 --image_size 84, but we need pre_transform_image_size < image_size, since the translate aug shifts the initially rendered image (which has size pre_transform_image_size) within a larger container (which has size image_size). This may be why your runs are blowing up.
Note: for crop it's the other way around, image_size < pre_transform_image_size, since you're cropping a smaller image out of the initially rendered one.
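The two size relations above can be illustrated with a minimal NumPy sketch. This is not the repo's actual implementation, just an assumption-labeled illustration: `random_translate` places the rendered image at a random offset inside a larger canvas (so rendered size < output size), while `random_crop` cuts a random patch out of the rendered image (so output size < rendered size).

```python
import numpy as np

def random_translate(imgs, out_size):
    # Translate aug sketch: each (C, H, W) image is pasted at a random
    # position inside a zero-filled out_size x out_size canvas,
    # so we need H, W <= out_size (e.g. render 100, container 108).
    n, c, h, w = imgs.shape
    assert h <= out_size and w <= out_size, "rendered image must fit in container"
    out = np.zeros((n, c, out_size, out_size), dtype=imgs.dtype)
    for i, img in enumerate(imgs):
        top = np.random.randint(0, out_size - h + 1)
        left = np.random.randint(0, out_size - w + 1)
        out[i, :, top:top + h, left:left + w] = img
    return out

def random_crop(imgs, out_size):
    # Crop aug sketch: a random out_size x out_size patch is cut from
    # each rendered image, so we need out_size <= H, W
    # (e.g. render 100, crop 84).
    n, c, h, w = imgs.shape
    assert out_size <= h and out_size <= w, "crop must be smaller than image"
    out = np.empty((n, c, out_size, out_size), dtype=imgs.dtype)
    for i, img in enumerate(imgs):
        top = np.random.randint(0, h - out_size + 1)
        left = np.random.randint(0, w - out_size + 1)
        out[i] = img[:, top:top + out_size, left:left + out_size]
    return out
```

With pre_transform_image_size=100, translate expects a larger image_size (e.g. 108) while crop expects a smaller one (e.g. 84); passing 84 to translate trips the first assertion, which matches the misconfiguration in this thread.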
Yes, that was the reason; it's converging now. Thanks!
Hello,
I wonder if I have to do any fine-tuning to get results with the translate augmentation. It always diverges! I have tested Cartpole, Walker, and Cheetah.
In the following figures, the diverging run is translate.