About KNN in SCAN #4
Hi @zhunzhong07, Yes, you're correct. We sample uniformly from the K nearest neighbors during training, so it is highly likely that the anchor sees a different neighbor in the next epoch. If you train long enough, this should have the same effect as Eq. 2. After all, it is not practical to include every neighbor of every sample in a forward pass, since this does not scale well with K. Hope this helps.
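The sampling strategy described above can be sketched as a small dataset wrapper. This is a minimal illustration, not the repository's exact `NeighborsDataset` class; it assumes the K nearest-neighbor indices have already been mined and stored as an (N, K) array:

```python
import numpy as np

class NeighborsDataset:
    """Sketch: pair each anchor with one neighbor drawn uniformly from its
    K nearest neighbors. Over many epochs an anchor therefore sees all of
    its neighbors in expectation, approximating a loss over the full set."""

    def __init__(self, dataset, knn_indices):
        # knn_indices: (N, K) array of precomputed nearest-neighbor indices
        self.dataset = dataset
        self.knn_indices = np.asarray(knn_indices)

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, index):
        anchor = self.dataset[index]
        # Uniformly sample one of the K neighbors of this anchor.
        neighbor_index = np.random.choice(self.knn_indices[index])
        neighbor = self.dataset[neighbor_index]
        return {'anchor': anchor, 'neighbor': neighbor}
```

Each call to `__getitem__` draws a fresh neighbor, which is what makes the single-neighbor loss match the full-neighborhood loss in expectation.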
Hi @wvangansbeke, Thanks for your quick reply. I have another question: in your code, I find that the indices of the neighbors are only computed once, after the self-supervised learning step. Why not re-compute the neighbors after each epoch of SCAN? Would this improve the results?
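For concreteness, re-computing the neighbors after each epoch would mean re-running a mining step like the following over the current features. This is a hypothetical helper (the function name and signature are illustrative, not the repository's API), assuming features are compared by cosine similarity:

```python
import numpy as np

def mine_nearest_neighbors(features, k=20):
    """Sketch: return, for each sample, the indices of its k nearest
    neighbors under cosine similarity, excluding the sample itself.
    Calling this after each epoch would refresh the neighbor sets as
    the learned representations change."""
    # L2-normalize so the dot product equals cosine similarity.
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    similarity = features @ features.T        # (N, N) cosine similarities
    # Sort each row in descending similarity; column 0 is the sample
    # itself (similarity 1.0), so drop it and keep the next k columns.
    order = np.argsort(-similarity, axis=1)
    return order[:, 1:k + 1]
```

The cost is an extra O(N^2) similarity computation (or an approximate index such as faiss) per epoch, which is presumably why it is done only once in the released code.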
Yes, a good point. I never tried it exactly like that (although I did try something similar), and it does make sense. However, I'm not sure the representations are going to be much better at that point. I also think it would be difficult to exploit the self-labeling step as we currently do: that step basically readjusts the decision boundary between classes and updates the representations based on the prototypes of each class.
OK. Thanks for your reply!
Hi, thanks for sharing your great work! I have a concern about the KNN in SCAN training.
In Eq. 2 of your paper, you calculate the loss by maximizing the similarities between each anchor and all of its K nearest neighbors. However, in your code, it seems that you only maximize the similarity between each anchor and one randomly selected neighbor from its K nearest neighbors, as in the dataloader below.
Unsupervised-Classification/data/custom_dataset.py
Lines 71 to 72 in 69aed1c
I am not sure if my understanding is correct.
Thanks.