This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

Lazy modules #1039

Closed

Conversation

bethebunny (Contributor)

Summary:
Add lazy modules to PyText. These are modules that can infer some of their dimensions from their first forward pass.

The main tool introduced is `Lazy`, a Module that wraps any other Module and holds the arguments that will be used to construct the wrapped Module on its first forward pass. Any of these arguments that are `Infer` objects are replaced by calling the `Infer` object's callback on the forward-pass input.
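
As a rough illustration only (not the PR's actual implementation), the mechanism could be sketched like this; everything beyond the names `Lazy`, `Infer`, `Infer.dimension`, and `Lazy.partial` taken from this summary is an assumption:

```
import torch.nn as nn

class Infer:
    # Placeholder argument: resolve_fn maps the first forward-pass input
    # to the concrete value used when constructing the wrapped module.
    def __init__(self, resolve_fn):
        self.resolve_fn = resolve_fn

    @classmethod
    def dimension(cls, dim):
        # Shorthand for "use this dimension of the input".
        return cls(lambda input: input.size(dim))

class Lazy(nn.Module):
    # Wraps a module class plus constructor args; the real module is only
    # built on the first forward pass, once Infer args can be resolved.
    def __init__(self, module_class, *args):
        super().__init__()
        self._module_class = module_class
        self._args = args
        self._resolved = None

    @classmethod
    def partial(cls, module_class, *args):
        # Pre-bind leading args, e.g. Lazy.partial(nn.Linear, Infer.dimension(-1)).
        return lambda *rest: cls(module_class, *args, *rest)

    def forward(self, input):
        if self._resolved is None:
            args = [a.resolve_fn(input) if isinstance(a, Infer) else a
                    for a in self._args]
            self._resolved = self._module_class(*args)
        return self._resolved(input)
```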

For instance, `Lazy(nn.Linear, Infer(lambda input: input.size(-1)), 4)` would take its `in_features` dimension from the last dimension of the input to its forward pass. This can be simplified to `Lazy(nn.Linear, Infer.dimension(-1), 4)`, or a partial can be created, for instance `LazyLinear = Lazy.partial(nn.Linear, Infer.dimension(-1)); LazyLinear(4)`.
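
Written out on separate lines (assuming `Lazy` and `Infer` are in scope, whether imported from the PR's new lazy-modules file or taken from the sketch above), the three equivalent spellings are:

```
import torch.nn as nn

# Explicit form: infer in_features from the last dimension of the forward input.
ll = Lazy(nn.Linear, Infer(lambda input: input.size(-1)), 4)

# Shorthand for the common "use a dimension of the input" case.
ll = Lazy(nn.Linear, Infer.dimension(-1), 4)

# Reusable partial, applied like an ordinary constructor.
LazyLinear = Lazy.partial(nn.Linear, Infer.dimension(-1))
ll = LazyLinear(4)
```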

Finally, these `Lazy` objects explicitly refuse to be used as ordinary modules; they must instead be replaced by calling `init_lazy_modules` on your model before training. For instance,

```
ll = lazy.Linear(4)
seq = nn.Sequential(ll)
final = init_lazy_modules(seq, torch.rand(1, 2))
```

`final` will be a full `nn.Module` graph with no lazy components; all of them will be "resolved" and replaced with their true module types.
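
Continuing that example (a hedged illustration of the expected result, not code from the PR): since the sample input's last dimension is 2, the resolved layer should be a real `nn.Linear` with `in_features=2` and the explicitly passed `out_features=4`.

```
resolved = final[0]                     # module that replaced the lazy wrapper
assert isinstance(resolved, nn.Linear)
assert resolved.in_features == 2        # inferred from torch.rand(1, 2)
assert resolved.out_features == 4       # passed explicitly
```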

Differential Revision: D17837321

@facebook-github-bot added the CLA Signed label on Oct 9, 2019
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D17837321


bethebunny added a commit to bethebunny/pytext-1 that referenced this pull request Oct 10, 2019
bethebunny added a commit to bethebunny/pytext-1 that referenced this pull request Oct 11, 2019
@facebook-github-bot (Contributor)

This pull request has been merged in e926fee.
