[QST] How to change categorical embeddings size? #515

Closed
antklen opened this issue Nov 7, 2022 · 2 comments

antklen commented Nov 7, 2022

Hi! I can't figure out how to control the size of the embeddings for categorical features. I create embeddings with TabularSequenceFeatures.from_schema and get embeddings of size 64. How can I change this value?

rnyak commented Nov 29, 2022

@antklen you can change the args here:

https://github.com/NVIDIA-Merlin/Transformers4Rec/blob/main/transformers4rec/torch/features/embedding.py#L105

The reason you get 64 is that the default embedding_dim_default is set to 64. You can change that value. In addition, you can set embedding_dims (Optional[Dict[str, int]]) as a dictionary whose keys are feature names and whose values are embedding dims.
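A minimal sketch of both options, assuming these kwargs are forwarded from TabularSequenceFeatures.from_schema down to EmbeddingFeatures.from_schema (the schema path and feature names below are illustrative):

```python
import transformers4rec.torch as tr
from merlin_standard_lib import Schema

# Load the dataset schema (path is illustrative)
schema = Schema().from_proto_text("schema.pbtxt")

inputs = tr.TabularSequenceFeatures.from_schema(
    schema,
    max_sequence_length=20,
    # New default size for every categorical embedding table
    embedding_dim_default=128,
    # Per-feature overrides: keys are feature names, values are embedding dims
    embedding_dims={"item_id": 256, "category": 64},
    aggregation="concat",  # concat tolerates mixed embedding sizes
)
```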

Let us know if you have further questions.

antklen commented Jan 13, 2023

@rnyak Thank you, this is a complete answer to my question.
