
Requests on pretrain code and experimental settings for other datasets #4

Closed
JoonHo-Jang opened this issue Aug 3, 2023 · 2 comments · Fixed by #6
Comments

@JoonHo-Jang

Dear authors,

I have a few requests, which I have also already sent to you by email.

  1. Could you release the pretraining code for the datasets other than CIFAR10?
    • It would be great if you could provide the pretraining code and the corresponding experimental settings, such as epochs, architecture, and learning rate, for Office-Home and PACS.
  2. Could you release 'parameter.py' for each model/dataset under the standard settings?
    • By "standard settings" I mean the settings used in Table 2, including the pretraining code (and settings) for each dataset.

My requests are solely aimed at reproducing the results in Table 2 of your paper.

I hope these requests do not trouble you too much.

Thank you.
Best,

@MarcellusZhao MarcellusZhao added the enhancement New feature or request label Aug 3, 2023
@MarcellusZhao MarcellusZhao self-assigned this Aug 3, 2023
MarcellusZhao added a commit that referenced this issue Aug 6, 2023
Provide an improved pretraining script as requested by #4
@MarcellusZhao
Collaborator

Hey JoonHo,

As per your request, I have released an improved pretraining script that supports pretraining on all of the benchmark datasets that appear in our paper, except ImageNet. You can find it in \pretrain.
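
For a rough idea of how it can be driven, a minimal sketch follows; treat the entry-point path pretrain/main.py, the flag names, and the values below as placeholder assumptions rather than the script's exact interface:

```python
# Hypothetical driver loop for the released pretraining script.
# The entry-point path, flag names, and values are assumptions;
# consult the files in \pretrain for the real interface.
import subprocess

for dataset in ["cifar10", "officehome", "pacs"]:  # benchmark datasets from the paper
    subprocess.run(
        [
            "python", "pretrain/main.py",  # assumed entry point
            "--dataset", dataset,
            "--arch", "resnet18",          # assumed backbone architecture
            "--epochs", "200",             # assumed number of pretraining epochs
            "--lr", "0.1",                 # assumed learning rate
        ],
        check=True,  # stop immediately if any pretraining run fails
    )
```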

I am currently unable to access the server where I stored the experiment logs due to an unexpected VPN issue, so I cannot release the parameters you requested just yet. I will post them as soon as the VPN issue is resolved.

@MarcellusZhao
Collaborator

Hey JoonHo,

We have released in exps the set of experimental setups we used to create Table 2 of the paper. The part that matters most is the common hyperparameters relevant to each adaptation process, such as lr and n_train_steps, as described in our paper.
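
For orientation, one such setup might look roughly like the sketch below; only the key names lr and n_train_steps come from the description above, while the remaining keys and all values are illustrative assumptions:

```python
# Hypothetical shape of a single setup in exps. Only the names `lr` and
# `n_train_steps` are taken from the comment above; every other key and
# every value here is an assumption for illustration.
adaptation_config = {
    "dataset": "pacs",      # assumed: which benchmark to adapt on
    "lr": 1e-4,             # common hyperparameter (value assumed)
    "n_train_steps": 1000,  # common hyperparameter (value assumed)
    "seed": 0,              # assumed: fixed seed for reproducibility
}
```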

These setup scripts are designed to run within an experiment pipeline based on tmux. We have also provided this pipeline in our codebase; the updated README documentation explains how to use it.

Hope this information helps!
