Make tutorial for enabling different learning rates #1183
Comments
I'm hesitant about this because the built-in pipeline is only supposed to cover the most simple use cases. Every addition makes it more difficult to maintain, to document, and to learn. Further, I don't see any obvious, simple way to configure this from a high level. As an alternative, it's possible to roll your own pipeline that does exactly what you want. I'd suggest checking out https://pykeen.readthedocs.io/en/stable/tutorial/first_steps.html#beyond-the-pipeline on how to roll your own pipeline.
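To make the "roll your own" route concrete: it means constructing the model, optimizer, and training loop yourself instead of calling `pipeline()`, which puts the optimizer entirely under your control. A minimal sketch of such a loop, using a plain PyTorch stand-in model so it stays self-contained (the model and data here are illustrative, not PyKEEN's API):

```python
import torch
from torch import nn

torch.manual_seed(0)

# stand-in for a KGE model; any nn.Module works the same way
model = nn.Linear(4, 1)
# because you build the optimizer yourself, you can configure it however you like
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# toy data in place of a triples factory / batch loader
x = torch.randn(8, 4)
y = torch.randn(8, 1)

initial_loss = loss_fn(model(x), y).item()
for _ in range(20):  # your own training loop instead of pipeline()
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
final_loss = loss_fn(model(x), y).item()
```

Once the loop is yours, swapping in a per-parameter-group optimizer is a one-line change at construction time.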
Thank you @cthoyt, I understand. I wanted to figure out if there was an alternative, because we are already using so many useful parts of the built-in pipeline. I'll give it a try; please feel free to close this issue if you think it won't be discussed any further.
@dfdazac if you create a minimal working example, we would love to include it in the documentation. Do you think you could do the following:
You could make your own RST document in https://github.com/pykeen/pykeen/tree/master/docs/source/tutorial in a PR that includes this.
I might be late to the party, but another option (still quite hacky) would be to create a custom subclass of the optimizer that sets up the parameter groups itself, and register it with the resolver:

```python
from collections.abc import Iterable

from pykeen.optimizers import optimizer_resolver
from torch import nn, optim


class ModifiedSGD(optim.SGD):
    def __init__(
        self,
        params: Iterable[nn.Parameter],
        custom_lrs: list[tuple[list[nn.Parameter], float]],
        **kwargs,
    ):
        # ids of all parameters that receive a custom learning rate
        custom_param_ids = {id(p) for custom_params, _ in custom_lrs for p in custom_params}
        # all remaining parameters fall back to the default learning rate
        default_params = [p for p in params if id(p) not in custom_param_ids]
        super().__init__(
            params=[
                {"params": default_params},
                *({"params": custom_params, "lr": custom_lr} for custom_params, custom_lr in custom_lrs),
            ],
            **kwargs,
        )


optimizer_resolver.register(ModifiedSGD)
```

You can now use this optimizer with the pipeline:

```python
from pykeen.pipeline import pipeline

pipeline(
    optimizer=ModifiedSGD,
    optimizer_kwargs=dict(
        custom_lrs=[(list(model.classifier.parameters()), 1e-3)],
        lr=1e-2,
        momentum=0.9,
    ),
    # ... further pipeline arguments
)
```
Problem Statement
First of all, thanks for the great work with the library!
It would be very useful to be able to specify different learning rates. Right now, when running a pipeline, an instance of the optimizer is created by passing all parameters in the model:
pykeen/src/pykeen/pipeline/api.py
Lines 1035 to 1039 in 313055e
However, in some cases we might also want to apply per-parameter options, for example different learning rates for different modules, as PyTorch optimizers support via parameter groups.
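For reference, "per-parameter options" here refers to PyTorch's optimizer parameter groups, where each group can override defaults such as the learning rate. A self-contained sketch (the two modules are just for illustration):

```python
import torch
from torch import nn

embeddings = nn.Embedding(10, 4)  # e.g. entity representations
head = nn.Linear(4, 1)            # e.g. a custom classification head

optimizer = torch.optim.SGD(
    [
        {"params": embeddings.parameters()},        # uses the default lr below
        {"params": head.parameters(), "lr": 1e-3},  # overrides it per group
    ],
    lr=1e-1,
    momentum=0.9,
)
print([group["lr"] for group in optimizer.param_groups])  # -> [0.1, 0.001]
```

This is exactly the structure the pipeline currently cannot express, since it passes `model.get_grad_params()` as a single flat group.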
Describe the solution you'd like
A possible solution could be an optional dictionary passed when creating the pipeline, e.g. `optimizer_params`. If it's not provided, the pipeline would default to the behavior above; otherwise, the user could choose different learning rates for modules in a custom model.

Describe alternatives you've considered
I tried getting access to the optimizer via a TrainingCallback, and I considered modifying the learning rate for different modules in the `pre_step` method. The problem is that at this point the optimizer has already been initialized and has been assigned its Parameters, which are difficult to map back to the original modules.
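One possible workaround for that mapping problem (a sketch on a plain PyTorch model, not tested against PyKEEN's callback machinery): parameter identity survives optimizer construction, so `id()` can map an already-built optimizer's parameters back to a chosen module, and the flat group can be split into per-module groups after the fact, e.g. before the first step:

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 1))
# as in the pipeline: one flat group over all parameters
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

# recover the module-to-parameter mapping via parameter identity
target_ids = {id(p) for p in model[1].parameters()}

# split the single flat group into a default group and a custom-lr group
# (mutating param_groups directly is hacky, in the spirit of the above)
flat = optimizer.param_groups.pop(0)
default = [p for p in flat["params"] if id(p) not in target_ids]
special = [p for p in flat["params"] if id(p) in target_ids]
optimizer.add_param_group({**flat, "params": default})
optimizer.add_param_group({**flat, "params": special, "lr": 1e-3})

print([group["lr"] for group in optimizer.param_groups])  # -> [0.01, 0.001]
```

This avoids touching the optimizer's construction, at the cost of relying on `param_groups` internals.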