
TSNE sets kl_divergence wrong #6507

Closed
ssaeger opened this Issue Mar 8, 2016 · 2 comments

ssaeger commented Mar 8, 2016

In commit 6bf63f6, `error` was replaced by `kl_divergence`, but line 850 was missed, so the final `kl_divergence` value is actually a stale value from before the final optimization phase. This is especially a problem because `kl_divergence` is accessible to users in the current version, as described in #6477.

The verbose output also shows that the error/`kl_divergence` value from iteration 100 is reported as the final value after iteration 200:

[t-SNE] Iteration 25: error = 8.0242269, gradient norm = 0.0014062
[t-SNE] Iteration 50: error = 7.9942951, gradient norm = 0.0014150
[t-SNE] Iteration 75: error = 7.9053034, gradient norm = 0.0014402
[t-SNE] Iteration 100: error = 7.8706691, gradient norm = 0.0014515
[t-SNE] KL divergence after 100 iterations with early exaggeration: 7.870669
[t-SNE] Iteration 125: error = 0.5025748, gradient norm = 0.0009721
[t-SNE] Iteration 150: error = 0.4773589, gradient norm = 0.0009048
[t-SNE] Iteration 175: error = 0.4710594, gradient norm = 0.0008880
[t-SNE] Iteration 200: error = 0.4693623, gradient norm = 0.0008834
[t-SNE] Error after 200 iterations: 7.870669
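The stale-value pattern behind this can be sketched as follows. This is a toy illustration, not the actual scikit-learn code; all function and variable names here are hypothetical:

```python
def gradient_descent(objective, params, n_iter):
    """Minimal stand-in optimizer: returns (params, error at the last step)."""
    error = objective(params)
    for _ in range(n_iter):
        params = [0.9 * p for p in params]  # dummy update rule
        error = objective(params)
    return params, error

def kl_like(params):
    """Stand-in for the KL divergence objective."""
    return sum(p * p for p in params)

def buggy_final_error():
    params = [1.0, 2.0]
    # Phase 1: early exaggeration
    params, kl_divergence = gradient_descent(kl_like, params, 100)
    # Phase 2: final optimization -- but the returned error is bound to a
    # different name, so kl_divergence keeps the phase-1 value (the bug)
    params, error = gradient_descent(kl_like, params, 100)
    return kl_divergence  # stale: misses the final optimization phase

def fixed_final_error():
    params = [1.0, 2.0]
    params, kl_divergence = gradient_descent(kl_like, params, 100)
    # Assigning the phase-2 result back to the same name fixes the report
    params, kl_divergence = gradient_descent(kl_like, params, 100)
    return kl_divergence

print(buggy_final_error() > fixed_final_error())  # prints True: stale value is larger
```

This matches the log above: the reported final value (7.870669) is the phase-1 error, while the true final error is around 0.469.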

If this is confirmed to be a bug, I will create a PR for it.

AlexanderFabisch commented Mar 21, 2016

Yes, that seems to be a bug. It was introduced during the merge of master and the Barnes-Hut SNE branch.

ssaeger commented Mar 21, 2016

Ok, then I will create a PR to fix it. :)
