
About fine-tune on small-scale datasets #3

Closed
sha310139 opened this issue Dec 6, 2021 · 2 comments

@sha310139
Hi,

Thanks for sharing this amazing work.
I'm interested in how to fine-tune on small-scale datasets.
Can you share fine-tuning commands and config files? It will be very helpful for me.

Thanks.


VideoNetworks commented Jan 24, 2022

The training command and config are similar to those used for training on K400.

cmd = ("python -u main_ddp_shift_egaze.py "
       "--multiprocessing-distributed --world-size 1 --rank 0 "
       "--dist-url tcp://127.0.0.1:23677 "
       "--name k400 "
       "--resume2 path-to-k400-weights "
       "--cfg ./config/custom/egaze/egaze_vit_config_D_TK_8x8_Div4.yml")
os.system(cmd)

In main_ddp_shift_egaze.py, an extra --resume2 argument is inserted before the --resume argument.
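A minimal sketch of what that argparse change might look like, assuming the script follows the usual PyTorch DDP training-script layout (the argument names besides --resume2 and --resume, and the help strings, are illustrative, not taken from the repository):

```python
import argparse

# Hypothetical sketch: add --resume2 (path to pretrained K400 weights used to
# initialize the model for fine-tuning) just before the usual --resume argument
# (checkpoint path for resuming an interrupted run).
parser = argparse.ArgumentParser(description="Fine-tune on a small-scale dataset")
parser.add_argument("--resume2", default="", type=str, metavar="PATH",
                    help="path to pretrained K400 weights to initialize from")
parser.add_argument("--resume", default="", type=str, metavar="PATH",
                    help="path to latest checkpoint to resume training from")

# Example: parse the flag as passed in the fine-tuning command above.
args = parser.parse_args(["--resume2", "path-to-k400-weights"])
print(args.resume2)
```

Keeping the two flags separate lets the script load pretrained weights without also restoring the optimizer state and epoch counter, which is what --resume typically does.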

[Screenshot from 2022-01-24 showing the --resume2 argument added to the script]

@sha310139 (Author)

Thanks for your reply!
I will try to modify and train it.
