
Hyper-parameter for SYNTHIA dataset #14

Closed
dlsrbgg33 opened this issue Jan 3, 2020 · 4 comments
Comments

@dlsrbgg33

Hi, sorry for another question, and thank you again for your work.

Could you share the hyper-parameters you used for the SYNTHIA dataset to produce the paper results, specifically "init_src_port" and "Input_size", which are commented as being "for GTA" in the code?

Thank you, and sorry for interrupting again.

@yzou2
Owner

yzou2 commented Jan 3, 2020

You can try the following hyper-parameters.

INIT_SRC_PORT = 0.02
TRAIN_SCALE_SRC = '0.8,1.2'
TRAIN_SCALE_TGT = '0.6,1.5'
BATCH_SIZE = 2
INPUT_SIZE = '600,900'
RANDSEED = 0
LEARNING_RATE = 5e-5
POWER = 0.0
MOMENTUM = 0.9
WEIGHT_DECAY = 0.0005
NUM_ROUNDS = 5
EPR = 2
SRC_SAMPLING_POLICY = 'r'
KC_POLICY = 'cb'
KC_VALUE = 'conf'
INIT_TGT_PORT = 0.2
MAX_TGT_PORT = 0.5
TGT_PORT_STEP = 0.05
MAX_SRC_PORT = 0.06
SRC_PORT_STEP = 0.0025
MRSRC = 0.0
MINE_PORT = 1e-3
RARE_CLS_NUM = 3
MINE_CHANCE = 0.8
TEST_IMAGE_SIZE = '1024,2048'
TEST_SCALE = 0.9
DS_RATE = 4

You can also try our pre-trained model at
https://www.dropbox.com/s/etan97kk5v38qaa/synthia_src.pth?dl=0
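
In case it helps, a minimal sketch (not the repo's exact code) of inspecting the downloaded checkpoint with PyTorch; the file name synthia_src.pth is just taken from the Dropbox link above.

```python
# Minimal sketch, assuming PyTorch is installed and the checkpoint from the
# Dropbox link above has been saved as 'synthia_src.pth'.
import torch

# map_location='cpu' lets you inspect the weights without a GPU
state = torch.load('synthia_src.pth', map_location='cpu')

# The file may be a raw state_dict or a wrapper dict; print a few top-level
# keys to see which it is before loading it into a model.
if isinstance(state, dict):
    for key in list(state.keys())[:10]:
        print(key)
```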

@dlsrbgg33
Author

Thank you for sharing the information. Did you also change the image mean and std for SYNTHIA?

@yzou2
Owner

yzou2 commented Jan 6, 2020

I did not specifically change the mean and std for SYNTHIA. These hyperparameters are the same as those of ImageNet.
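
For reference, a minimal sketch of the standard ImageNet statistics mentioned above, written in torchvision style; whether the repo's loader normalizes RGB inputs this way or subtracts a Caffe-style BGR mean instead depends on its data-loading code, so treat the exact convention as an assumption and check the loader before reusing it.

```python
# Minimal sketch, assuming a torchvision-style pipeline with inputs in [0, 1].
# These are the standard ImageNet channel statistics (RGB order).
import torchvision.transforms as T

imagenet_normalize = T.Normalize(
    mean=[0.485, 0.456, 0.406],  # ImageNet channel means
    std=[0.229, 0.224, 0.225],   # ImageNet channel standard deviations
)
```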

@dlsrbgg33
Author

Thank you very much.
