
Set feature importance tensor size according to embedding dimensions (TabNetRegressor) #94

Conversation

martinsotir

What kind of change does this PR introduce?

Fix tensor shape issues when cat_emb_dim>1 in TabNetRegressor.

This issue was fixed for TabNetClassifier in #48, but it was not fixed for the regressor (probably an omission).

Does this PR introduce a breaking change?

Not that I know of. input_dim should be equal to network.post_embed_dim when cat_emb_dim=1.
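The relationship between the two widths can be sketched as follows. This is not the library's actual code; the helper and its arguments (`input_dim`, `cat_idxs`, `cat_emb_dim`) are hypothetical, though the names mirror pytorch-tabnet parameters:

```python
def post_embed_dim(input_dim, cat_idxs, cat_emb_dim):
    """Width of the feature matrix after categorical columns are embedded.

    Each categorical column is replaced by `cat_emb_dim` embedding columns;
    continuous columns keep width 1.
    """
    n_cat = len(cat_idxs)
    n_continuous = input_dim - n_cat
    return n_continuous + n_cat * cat_emb_dim

# With cat_emb_dim=1, the post-embedding width equals the raw input width:
assert post_embed_dim(10, [0, 3], 1) == 10

# With cat_emb_dim=4, 10 raw columns become 8 + 2*4 = 16 embedded columns,
# so a feature-importance tensor sized with input_dim would be too small:
assert post_embed_dim(10, [0, 3], 4) == 16
```

This is why the bug is invisible with `cat_emb_dim=1` and only surfaces once an embedding dimension exceeds 1.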

What needs to be documented once your changes are merged?

Nothing more.

Closing issues

closes #49

…in TabNetRegressor.train_epoch

Fix tensor shape issues when `cat_emb_dim>1` with `TabNetRegressor`.
This issue was fixed for TabNetClassifier in dreamquark-ai#48 but not for the regressor.
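A minimal sketch of the kind of change involved, assuming the regressor accumulates per-feature attention masks into a pre-allocated importance array (function and variable names here are hypothetical, not the library's API):

```python
import numpy as np

def allocate_feature_importances(input_dim, post_embed_dim):
    # Before the fix, the array was sized with the raw input width:
    #     return np.zeros(input_dim)
    # After the fix, it is sized with the post-embedding width, so it
    # matches the masks produced by the network:
    return np.zeros(post_embed_dim)

# Example: 10 raw columns, embedded into 16 columns (cat_emb_dim > 1).
masks = np.random.rand(32, 16)  # batch of attention masks over embedded columns
fi = allocate_feature_importances(10, 16)
fi += masks.sum(axis=0)  # would raise a broadcast error if fi had shape (10,)
assert fi.shape == (16,)
```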
@Optimox
Collaborator

Optimox commented Apr 13, 2020

Oh, thank you. Sorry, I did not see your PR; I had spotted this omission as well (and opened a PR of my own, but will review yours instead).

@Optimox
Collaborator

Optimox commented Apr 14, 2020

I merged my own PR #96 instead, since it also changes the notebooks so that this bug will surface when running them if it ever comes back.

Thanks!

Successfully merging this pull request may close these issues.

Embedding dims does not work for cat_emb_dim > 1