
Don't skip normalization in TabNet during inference on a single row. #2299

Merged

Conversation

dantreiman
Contributor

Should fix #2290

The problem is that TabNet gives different outputs when predicting a single row versus predicting the same row inside a batch.

The cause is this check in normalization_modules.py:32:

        if batch_size != 1:
            return self.bn(inputs)
        return inputs

and also tabnet_modules.py:111:

        if batch_size != 1:
            # Skip batch normalization if the batch size is 1.
            features = self.batch_norm(features)  # [b_s, i_s]
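In both places, batch normalization is skipped outright whenever the batch has one row, even at inference time, so a single-row prediction never gets normalized while the same row inside a larger batch does. The fix merged here makes the skip apply only during training (where batch statistics for a single row are degenerate) and always applies batch norm at inference, where the layer's running statistics are used. Below is a hedged sketch of that idea, not Ludwig's actual code; the class name `SingleRowSafeBatchNorm` and its interface are hypothetical:

```python
import torch
import torch.nn as nn


class SingleRowSafeBatchNorm(nn.Module):
    """Illustrative wrapper: skip batch norm only when *training* on a
    batch of one, but always apply it at inference time, where the
    running mean/variance are used instead of batch statistics."""

    def __init__(self, num_features: int):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features)

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        batch_size = inputs.shape[0]
        if self.training and batch_size == 1:
            # Batch statistics are undefined for a single training row.
            return inputs
        # In eval mode, bn normalizes with running statistics, so one
        # row produces the same output alone as it does inside a batch.
        return self.bn(inputs)


module = SingleRowSafeBatchNorm(4)
module.eval()
batch = torch.randn(8, 4)
single = batch[:1]
# Single-row inference now matches the first row of the batch inference.
assert torch.allclose(module(batch)[:1], module(single), atol=1e-6)
```

The key observation is that `nn.BatchNorm1d` is perfectly well defined on a batch of one in eval mode, because it no longer depends on per-batch statistics; the skip is only needed to avoid degenerate (or erroring) batch statistics during training.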

@dantreiman force-pushed the daniel/tabnet_inference_on_single_rows branch from a0df706 to f35deb7 on July 21, 2022 22:24

github-actions bot commented Jul 21, 2022

Unit Test Results

6 files ±0  6 suites ±0  2h 38m 54s ⏱️ +47s
2 936 tests ±0: 2 887 ✔️ passed ±0, 49 💤 skipped ±0, 0 failed ±0
8 808 runs ±0: 8 625 ✔️ passed ±0, 183 💤 skipped ±0, 0 failed ±0

Results for commit 82bcaf5. ± Comparison against base commit 7f39a3c.


@tgaddair tgaddair merged commit 06594a5 into ludwig-ai:master Jul 22, 2022
@dantreiman dantreiman deleted the daniel/tabnet_inference_on_single_rows branch July 23, 2022 02:36