
Type annotate transformer/modules.py #3545

Merged
merged 2 commits into master from transformer-type-annotations on Mar 24, 2021

Conversation

spencerp
Contributor

Patch description
Problem:

  • When using these modules, it's not immediately obvious what types of args they expect and return. You need to read through the implementation or add breakpoints to figure this out, which wastes time.

Solution:

  • Type annotate __init__ and forward functions that aren't already annotated.

There shouldn't be any functional changes here; these annotations are essentially comments with very loose static checking.
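For illustration, here is a minimal, hypothetical sketch (invented names, not the actual ParlAI code) of the style of __init__/forward annotations being added:

from typing import Optional

import torch
import torch.nn as nn


class ExampleFeedForward(nn.Module):
    """Hypothetical module illustrating the style of annotations added."""

    def __init__(self, embedding_size: int, ffn_size: int, dropout: float = 0.0) -> None:
        super().__init__()
        self.linear = nn.Linear(embedding_size, ffn_size)
        self.dropout = nn.Dropout(p=dropout)

    def forward(
        self, tensor: torch.Tensor, mask: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        # Annotations only document the expected argument/return types;
        # runtime behavior is unchanged.
        out = self.dropout(self.linear(tensor))
        if mask is not None:
            out = out * mask.unsqueeze(-1).type_as(out)
        return out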

Testing steps
Ran a sample transformer training run to sanity check:

λ parlai train_model --task twitter --model-file /tmp/tr_twitter --model transformer/ranker --batchsize 16 --validation-every-n-secs 3600 --candidates batch --eval-candidates batch --data-parallel True

CircleCI.

@@ -230,51 +237,6 @@ def create_position_codes(n_pos, dim, out):
out[:, 1::2] = torch.FloatTensor(np.cos(position_enc)).type_as(out)


class TransformerResponseWrapper(nn.Module):
spencerp (Contributor, Author)


I had to move these down so TransformerEncoder is declared before using it as a type. If we want to avoid moving these, I could just wrap the type in quotes "TransformerEncoder", but that disables static checking of these.
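As a minimal sketch (hypothetical names, not the actual ParlAI classes) of the ordering issue and the quoted-string workaround described above:

import torch.nn as nn


class ResponseWrapper(nn.Module):
    # Writing `encoder: Encoder` unquoted here would raise NameError, because
    # Encoder is defined further down and annotations are evaluated eagerly by default.
    def __init__(self, encoder: "Encoder") -> None:
        super().__init__()
        self.encoder = encoder


class Encoder(nn.Module):
    pass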

Contributor


Ah the secret to that is to add at the top:

from __future__ import annotations

This causes Python to defer evaluation of annotations (PEP 563), so forward references like this work without quotes.
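A rough sketch of how that plays out (the signature below is hypothetical; only the class names come from this file):

# With postponed evaluation of annotations (PEP 563), annotations are stored as
# strings and never evaluated at runtime, so a class can be referenced in an
# annotation before it is defined.
from __future__ import annotations

import torch.nn as nn


class TransformerResponseWrapper(nn.Module):
    def __init__(self, transformer: TransformerEncoder) -> None:  # hypothetical signature
        super().__init__()
        self.transformer = transformer


class TransformerEncoder(nn.Module):
    pass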

spencerp (Contributor, Author)


This ended up being resolved anyway by the breaking up of modules.py.

@stephenroller stephenroller (Contributor) left a comment


thx! suggesting we do the future annotations.

@spencerp spencerp merged commit f11c077 into master Mar 24, 2021
@spencerp spencerp deleted the transformer-type-annotations branch March 24, 2021 05:07