
Initialise self._alpha in TransformerConv for torch.jit.script Compatibility #3538

Merged
merged 1 commit into pyg-team:master on Nov 20, 2021

Conversation

RobMcH
Contributor

@RobMcH RobMcH commented Nov 19, 2021

Currently, the attribute self._alpha is not initialised in the layer, causing torch.jit.script to crash when trying to script a model that uses one of these layers. The fix is simple: initialise all attributes in the layer for torch.jit.script compatibility.

Initialise all attributes in layer for torch.jit.script compatibility.
Member

@rusty1s rusty1s left a comment


Thank you!

@rusty1s rusty1s merged commit 70eb788 into pyg-team:master Nov 20, 2021