
About Different Optimizer #49

Open

DREAMXFAR opened this issue Nov 2, 2020 · 0 comments

@DREAMXFAR
Hello, I have been following this work for some time and appreciate your excellent results. I have a few questions to ask you, and I would be grateful for any clarification you can give.

I noticed that your training code also supports the Adam optimizer. Are there any differences in performance between the two optimizers? Is one better than the other when the same learning rates are used for the different layers?
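For context, here is a minimal sketch of the kind of comparison I mean, assuming the project uses PyTorch; the model and learning-rate values below are hypothetical, not taken from this repo:

```python
import torch
import torch.nn as nn

# Hypothetical two-layer model standing in for the project's network.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.Conv2d(16, 1, 3))

# Per-layer parameter groups with their own learning rates
# (values are illustrative only).
param_groups = [
    {"params": model[0].parameters(), "lr": 1e-2},
    {"params": model[1].parameters(), "lr": 1e-3},
]

# The same groups can be handed to either optimizer, so swapping the
# optimizer changes only the update rule, not the per-layer lr settings.
sgd = torch.optim.SGD(param_groups, lr=1e-2, momentum=0.9)
adam = torch.optim.Adam(param_groups)
```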
