
TSNE sets kl_divergence wrong #6507

Closed
ssaeger opened this issue Mar 8, 2016 · 2 comments · Fixed by #8634

Comments


ssaeger commented Mar 8, 2016

In commit 6bf63f6, error was replaced by kl_divergence, but line 850 was missed, so the final kl_divergence value is actually a stale value from before the final optimization step. This is especially a problem because kl_divergence is user-accessible in the current version, as described in #6477.

The verbose output also shows that the error/kl_divergence value from iteration 100 is kept as the final value after iteration 200:

[t-SNE] Iteration 25: error = 8.0242269, gradient norm = 0.0014062
[t-SNE] Iteration 50: error = 7.9942951, gradient norm = 0.0014150
[t-SNE] Iteration 75: error = 7.9053034, gradient norm = 0.0014402
[t-SNE] Iteration 100: error = 7.8706691, gradient norm = 0.0014515
[t-SNE] KL divergence after 100 iterations with early exaggeration: 7.870669
[t-SNE] Iteration 125: error = 0.5025748, gradient norm = 0.0009721
[t-SNE] Iteration 150: error = 0.4773589, gradient norm = 0.0009048
[t-SNE] Iteration 175: error = 0.4710594, gradient norm = 0.0008880
[t-SNE] Iteration 200: error = 0.4693623, gradient norm = 0.0008834
[t-SNE] Error after 200 iterations: 7.870669
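
A minimal reproduction sketch, assuming an affected scikit-learn version (around 0.17); the data below is arbitrary and only serves to drive the optimizer:

import numpy as np
from sklearn.manifold import TSNE

# Arbitrary random data, only for illustration.
rng = np.random.RandomState(0)
X = rng.randn(100, 20)

tsne = TSNE(n_components=2, random_state=0, verbose=2)
tsne.fit_transform(X)

# On affected versions this prints the value reported after early
# exaggeration (iteration 100), not the final error from the last iteration.
print(tsne.kl_divergence_)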

If this is confirmed to be a bug, I will create a PR for it.

@AlexanderFabisch
Member

Yes, that seems to be an error. It was introduced during the merge of master and Barnes-Hut SNE.


ssaeger commented Mar 21, 2016

OK, then I will create a PR to fix it. :)
