
[Feature Request] Ordered recursive instantiation #1997

Closed
jbaczek opened this issue Jan 26, 2022 · 3 comments

@jbaczek
Contributor

jbaczek commented Jan 26, 2022

🚀 Feature Request

During recursive instantiation, Hydra assumes that nested objects are independent of one another. I need a way to pass fields between them during instantiation.

Motivation

I'm building a PyTorch training class that I want to control with Hydra using the least amount of code possible. PyTorch requires objects to be instantiated in a specific order, so that each more specific component is aware of the previous one. For example, my trainer class has an init function with the signature def __init__(self, model, dataset, optimizer, scheduler, ...). The optimizer takes model.parameters() as an argument, and the scheduler in turn takes the optimizer as its argument. So far, the best solution I have found is to instantiate the model and optimizer outside of Hydra and then pass them to hydra.utils.instantiate as additional args, like this:

# Instantiate in dependency order, threading earlier objects into later calls.
model = hydra.utils.instantiate(config.model)
optimizer = hydra.utils.instantiate(config.optimizer, params=model.parameters())
trainer = hydra.utils.instantiate(config.trainer, model=model, optimizer=optimizer, scheduler={'optimizer': optimizer})

Pitch

I'd like recursive instantiation to take such dependencies between objects into account. Preferably, this would be expressed entirely in the config file. For example:

trainer:
    _target_: train.Trainer
    model:
        _target_: net.Model
        arg1: 1
        arg2: 2
    optimizer:
        _target_: torch.optim.SGD
        lr: 1e-3
        _dependency_: model.parameters()@params
    scheduler:
        _target_: torch.optim.lr_scheduler.StepLR
        _dependency_: optimizer@optimizer
        step_size: 100
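
In other words, instantiation would be equivalent to roughly the following (a hypothetical sketch; _dependency_ is not an existing Hydra key, and the x@y syntax here is meant as "pass x as keyword argument y"):

import torch
import net
import train

# Resolved in dependency order, as declared by the _dependency_ entries above.
model = net.Model(arg1=1, arg2=2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)               # model.parameters()@params
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=100)  # optimizer@optimizer
trainer = train.Trainer(model=model, optimizer=optimizer, scheduler=scheduler)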

Additional context

Also, the PyTorch documentation states that the optimizer should be constructed after moving the model to the correct device, which would require calling a function between model instantiation and optimizer instantiation. https://pytorch.org/docs/stable/optim.html#constructing-it
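
That is, something like the following (the device handling here is illustrative, not part of the proposal above):

import torch
import hydra

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
# The model must be on its target device before the optimizer sees its parameters.
model = hydra.utils.instantiate(config.model).to(device)
optimizer = hydra.utils.instantiate(config.optimizer, params=model.parameters())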

@jbaczek jbaczek added the enhancement label Jan 26, 2022
@jieru-hu
Contributor

Thanks for the request @jbaczek. We will take a look at this. In the meantime, we've had similar requests; maybe take a look at #1283?

@jbaczek
Contributor Author

jbaczek commented Jan 27, 2022

Yup, this is exactly what I needed! Thanks! Closing.

@jbaczek jbaczek closed this as completed Jan 27, 2022
@jieru-hu
Contributor

@jbaczek cool! This feature has not been released yet and is only available on the main branch for now; it will be out with Hydra 1.2.
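
For reference, with the _partial_ flag added for #1283, the workaround from the motivation section would look roughly like this (a sketch against the API on main):

optimizer:
    _target_: torch.optim.SGD
    _partial_: true
    lr: 1e-3

model = hydra.utils.instantiate(config.model)
# With _partial_: true, instantiate returns functools.partial(torch.optim.SGD, lr=1e-3),
# so the model's parameters can be supplied at call time:
optimizer = hydra.utils.instantiate(config.optimizer)(params=model.parameters())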
