
SGD with decay #111

sumitpai opened this issue Jul 1, 2019 · 0 comments


sumitpai commented Jul 1, 2019

Background and Context
The SGD optimizer currently doesn't perform well on the tested datasets, mainly due to its fixed learning rate.
Add support for the standard tricks that can boost training performance (e.g. SGD with warm restarts, decay at fixed intervals, etc.)
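The two schedules mentioned above can be sketched as plain learning-rate functions. This is only an illustrative sketch (the function names and signatures are hypothetical, not AmpliGraph's actual API): step decay multiplies the rate by a constant factor at fixed epoch intervals, while warm restarts cosine-anneal the rate within a cycle and reset it at each cycle boundary.

```python
import math

def step_decay_lr(base_lr, epoch, drop_every, decay_rate):
    """Constant-interval decay: multiply the LR by decay_rate
    every `drop_every` epochs."""
    return base_lr * (decay_rate ** (epoch // drop_every))

def warm_restart_lr(base_lr, min_lr, epoch, cycle_len):
    """Cosine annealing with warm restarts: the LR decays from
    base_lr toward min_lr over a cycle, then resets to base_lr."""
    t = epoch % cycle_len  # position within the current cycle
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t / cycle_len))
```

A training loop would query one of these functions each epoch and feed the result to the optimizer; in TensorFlow 1.x (which AmpliGraph 1.x targets) that is typically done via a learning-rate placeholder.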

@sumitpai sumitpai self-assigned this Jul 1, 2019

@sumitpai sumitpai added this to To do in AmpliGraph 1.1 via automation Jul 1, 2019

@sumitpai sumitpai added this to the 1.1 milestone Jul 1, 2019

sumitpai added a commit that referenced this issue Jul 1, 2019
Created separate file for optimizers. Implemented issue #111 - sgd with warm restarts and constant interval decay
sumitpai added a commit that referenced this issue Jul 3, 2019
Merged branch feature/74. Closes issue #111 (sgd with decay), #110 (Adapters for data loading), #74 (Speedup eval using DB), #61 (support for millions of entities using lazy loading of variables)

@sumitpai sumitpai moved this from To do to In progress in AmpliGraph 1.1 Jul 4, 2019

@sumitpai sumitpai moved this from In progress to Done in AmpliGraph 1.1 Jul 18, 2019

@sumitpai sumitpai closed this Jul 22, 2019

1 participant