[MRG+1] Issue #8173 - Passing n_neighbors to compute MI #8181
What does this implement/fix? Explain your changes.
Any other comments?
A single test has been added for a classification case.
I did it and then had second thoughts.
I think checking that the MI changes with n_neighbors would be fine?…
On 18 January 2017 at 02:24, Olivier Grisel commented on this pull request (#8181 (review)), in sklearn/feature_selection/tests/test_mutual_info.py:

>       assert_array_equal(np.argsort(-mi), [2, 0, 1])
> +     assert_allclose(mi, [0.06987399, 0.03197151, 0.21946924], rtol=1e-6)
> +     mi_7 = mutual_info_classif(X, y, discrete_features=, n_neighbors=7,
> +                                random_state=0)
> +     assert_allclose(mi_7, [0.0735522, 0.0343685, 0.2194692], rtol=1e-5)
>
> I don't really like tests that hardcode numerical values that are true only for randomly generated data with a fixed seed. Is there a better way to test the impact of n_neighbors without hardcoding the values?
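One way the suggested check could be sketched, without hardcoding MI values: assert that changing n_neighbors changes the continuous-feature estimates while leaving the feature ranking stable. This is a hypothetical test (the data here is made up; only the public mutual_info_classif signature is assumed), not the test actually merged in this PR:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Hypothetical data: the third feature determines the label,
# the first two are pure noise.
rng = np.random.RandomState(0)
X = rng.rand(250, 3)
y = (X[:, 2] > 0.5).astype(int)

mi_3 = mutual_info_classif(X, y, discrete_features=False,
                           n_neighbors=3, random_state=0)
mi_7 = mutual_info_classif(X, y, discrete_features=False,
                           n_neighbors=7, random_state=0)

# n_neighbors changes the kNN density estimate behind the MI
# computation, so the estimates should not be identical...
assert not np.allclose(mi_3, mi_7)
# ...while the informative feature should rank highest either way.
assert np.argmax(mi_3) == np.argmax(mi_7) == 2
```

This avoids seed-dependent magic numbers: it only asserts a qualitative property (n_neighbors is actually used, ranking is preserved) rather than exact estimates.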