
KNN with k=1 or 9? #1

Closed

ChengYiBin opened this issue Jul 4, 2020 · 6 comments

@ChengYiBin

Is this the setting for the NTU RGB+D 60 dataset?

neigh = KNeighborsClassifier(n_neighbors=9, metric='cosine')

(from this line of the code)

@DragonLiu1995
Collaborator

For Cross-View, we used n_neighbors=1, but for Cross-Subject we actually tried a set of values (1, 3, 5, 7, 9). This code snippet was cleaned up from the Jupyter notebook we used for the Cross-Subject experiments.
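
For concreteness, below is a minimal, self-contained sketch of the sweep described above, using scikit-learn's `KNeighborsClassifier` with the cosine metric. The feature arrays are random placeholders standing in for encoder features on NTU RGB+D 60; variable names and shapes are illustrative, not taken from the repository.

```python
# Illustrative k sweep for KNN evaluation with cosine similarity.
# Random data stands in for features extracted by the pretrained encoder.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
train_feats = rng.normal(size=(1000, 256))      # placeholder encoder features
train_labels = rng.integers(0, 60, size=1000)   # 60 action classes in NTU RGB+D 60
test_feats = rng.normal(size=(200, 256))
test_labels = rng.integers(0, 60, size=200)

# Cross-View used k=1; for Cross-Subject a small sweep over k was tried.
for k in [1, 3, 5, 7, 9]:
    neigh = KNeighborsClassifier(n_neighbors=k, metric='cosine')
    neigh.fit(train_feats, train_labels)
    print(f"k={k}: accuracy={neigh.score(test_feats, test_labels):.4f}")
```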

@ChengYiBin
Author

So for Cross-Subject, k=9 is the best hyperparameter for action recognition, and it is used for all Cross-Subject evaluations, right?

@DragonLiu1995
Collaborator

Yes.

@ChengYiBin
Author

Sorry to bother you again. Regarding the experimental results in the paper: for the unsupervised methods compared with this work, I wonder whether you re-implemented them according to the original papers (i.e., fine-tuning a final FC layer), or also evaluated them with KNN classification?
Thanks!

@DragonLiu1995
Collaborator

Except for LongT GAN, which we implemented ourselves, the other results are taken from the original papers.

@ChengYiBin
Author

Thank you for the reply :)
