
Number of Training Epoch and Time #5

Closed
HenryPengZou opened this issue Dec 11, 2021 · 1 comment


HenryPengZou commented Dec 11, 2021

Thanks for your great work!

I notice that the number of training epochs is 6k and 8k for the two datasets, respectively. Those are quite large numbers. Why does training take so many epochs, and do you have an ablation study on the effect of the number of training epochs?

Also, could you please share your training times?

Thanks!

@zhang-can (Owner)

Hi @HenryPengZou

Sorry for the confusion. The "epoch" in our code actually refers to an iteration (one mini-batch step), not a full pass over the dataset; please refer to this config file.
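
For anyone hitting the same point of confusion, here is a minimal sketch of what an iteration-counted training loop looks like. It is not the actual CoLA code, and every name in it (NUM_ITERS, make_batch, the toy model) is hypothetical; it only illustrates that 6k/8k "epochs" here mean 6k/8k gradient updates rather than 6k/8k passes over the data.

```python
import torch

# Hypothetical stand-ins; the real CoLA config and model differ.
NUM_ITERS = 6000  # labeled "epochs" in the config, but each is one batch

model = torch.nn.Linear(2048, 20)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def make_batch(batch_size=16):
    """Hypothetical loader returning one random mini-batch."""
    x = torch.randn(batch_size, 2048)
    y = torch.randint(0, 20, (batch_size,))
    return x, y

for step in range(NUM_ITERS):  # counts optimizer steps, not dataset passes
    x, y = make_batch()
    loss = torch.nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```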

The training of CoLA is very fast. On a single NVIDIA GeForce GTX 1080 GPU, the whole training process takes less than 1 hour.

Best wishes,
Can
