Questions about training details of CoCLR #14
Comments
Hi,
Hi! How much accuracy does FlowMining in Cycle 1 achieve? I would be grateful if you could share your experience!
Hi,
Thank you so much!
Thanks to you, I am doing well. May I ask one more question? After completing Cycle 1, I proceeded to Cycle 2, where I ran into a problem. First, in the Cycle1-FlowMining process, my best result is at epoch 4 or 5 — is your best result in Cycle1-FlowMining also around epoch 4 or 5? I used that checkpoint to proceed with Cycle2-RGBMining, and after finishing Cycle 2 I ran the downstream evaluation. Second, when proceeding to Cycle 2, should I set start_epoch=101 and epoch=200?
Hi, I'm trying to replicate your results at the alternation stage. I am now using the two init models you provided (both ~400 epochs). I have two questions.
1). According to your paper, "At the alternation stage, on UCF101 the model is trained for two cycles, where each cycle includes 200 epochs, i.e. RGB and Flow networks are each trained for 100 epochs". Does that mean I need to run main_coclr.py four times, each time for 100 epochs, using the two newest pretrained models from the previous run?
2). If so, what lr do you use in each of the four 100-epoch runs of the alternation stage? I also checked the CoCLR pretrained models you provided; at epoch 182 and epoch 109 the lr is 1e-4. Does that mean I need to train the second cycle with a larger lr, e.g. 1e-2, and decay it down to 1e-4?
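For what it's worth, the schedule being asked about in 2) can be sketched generically. The snippet below is a minimal, hypothetical per-cycle step decay, not the repo's confirmed settings: the milestones (60, 80), base lr (1e-2), and 100-epoch cycle length are all assumptions for illustration. It restarts each 100-epoch half-cycle at the larger lr and decays by 10x at fixed milestones, so a checkpoint near the end of a half-cycle would sit at 1e-4, consistent with the observation above.

```python
def stepped_lr(epoch, cycle_len=100, base_lr=1e-2, gamma=0.1, milestones=(60, 80)):
    """Hypothetical lr schedule for one alternation half-cycle.

    All defaults are illustrative assumptions, not CoCLR's actual values.
    The lr resets to base_lr at the start of every half-cycle and is
    multiplied by gamma at each milestone within that half-cycle.
    """
    local_epoch = epoch % cycle_len  # position inside the current half-cycle
    lr = base_lr
    for m in milestones:
        if local_epoch >= m:
            lr *= gamma
    return lr
```

Under these assumed milestones, epoch 101 (the start of a new half-cycle) would train at 1e-2 again, while an epoch late in a half-cycle (e.g. 185) would be at 1e-4 after two 10x decays.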
Best Regards,
Yuqi