
Add FT-Transformer model #12

Merged: 14 commits into master from ft-transformer on Aug 18, 2023
Conversation

weihua916 (Contributor):

No description provided.

@github-actions bot added the nn label on Aug 18, 2023
Resolved (outdated) review threads on: torch_frame/nn/__init__.py, torch_frame/nn/conv/ft_transformer_conv.py (×6), torch_frame/nn/decoder/cls_decoder.py
torch_frame/nn/conv/ft_transformer_conv.py (review context):

```python
self,
channels: int,
# Arguments for Transformer
num_layers: int = 4,
```
Member:

QQ: Does the paper use 4 encoder layers for one layer of FTTransformerConv? This number seems a little bit large to me.

Contributor (author):

Good point. I did not check the paper; I will set it according to the paper.

Contributor (author):

Read the original paper, and set num_layers to 3 and nhead to 8.
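
A minimal sketch of what a block with these defaults could look like in plain PyTorch; the class body and forward signature here are illustrative assumptions, not the merged code:

```python
import torch
from torch import nn


class FTTransformerConv(nn.Module):
    """Sketch of a Transformer block over column embeddings, with defaults
    following the FT-Transformer paper: 3 encoder layers, 8 attention heads."""
    def __init__(self, channels: int, num_layers: int = 3, nhead: int = 8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=channels, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape [batch_size, num_cols, channels]; self-attention is
        # applied across the column dimension.
        return self.encoder(x)
```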

Member:

I see. Another QQ: are num_layers=3 and nhead=8 for one layer of the FT-Transformer, or for the full FT-Transformer?

Contributor (author):

For the full FT-Transformer.

Contributor (author):

That makes sense. I just renamed it to FTTransformer.
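
Continuing the sketch above, a quick usage example; the tensor shapes are illustrative assumptions:

```python
import torch

# Assumes the FTTransformerConv sketch defined earlier in this thread.
x = torch.randn(32, 10, 64)              # [batch_size, num_cols, channels]
model = FTTransformerConv(channels=64)   # paper defaults: num_layers=3, nhead=8
out = model(x)
print(out.shape)                         # torch.Size([32, 10, 64])
```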

Member:

Thanks!

Member:

BTW, should we create another folder such as nn/models (like we have in PyG) instead of putting it into nn/conv? cc @rusty1s

Contributor (author):

I am open. For now, I think we can just treat it as conv and revisit this later.
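
For reference, a hypothetical layout of the kind being proposed, mirroring PyG's nn/models split (the models/ folder is an assumption; it did not exist at the time of this PR):

```text
torch_frame/nn/
├── conv/     # per-layer building blocks, e.g. ft_transformer_conv.py
├── decoder/  # output heads, e.g. cls_decoder.py
└── models/   # full architectures (proposed), e.g. ft_transformer.py
```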

Member:

I see, sounds good.

@weihua916 merged commit 36ffe8d into master on Aug 18, 2023
9 checks passed
@weihua916 deleted the ft-transformer branch on Aug 18, 2023 at 21:01
3 participants