About loss function #2
Hi,
Thank you for your reply. Besides, I think the author's code differs somewhat from his paper, too. Just look at this: those are the two points of my confusion. Anyway, I think your cross-entropy loss is more reasonable and outperforms the one in the author's code.
Thanks for your question.
@cyvius96 Hi, I have some problems below:
https://github.com/cyvius96/prototypical-network-pytorch/blob/a3f8f1e1afd7fcb8cab64ba89268a80790761f88/train.py#L74
@cyvius96 I tested this code
Of course, since it is 30-way 1-shot, and there are 30 prototypes (each with dim 1600) for 30 classes. |
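For concreteness, here is a minimal sketch of why a 30-way 1-shot episode yields 30 prototypes of dimension 1600. It uses plain Python lists with zero placeholders instead of the repo's PyTorch tensors and real encoder outputs, so it only illustrates the shapes:

```python
# Hypothetical sketch (plain Python lists; the repo uses PyTorch tensors)
# of why a 30-way 1-shot episode produces 30 prototypes of dimension 1600.
n_way, n_shot, dim = 30, 1, 1600  # 30 classes, 1 support example each

# Support features: n_way * n_shot embedded examples, each 1600-dim.
# Zeros stand in for encoder outputs.
support = [[0.0] * dim for _ in range(n_way * n_shot)]

# The prototype of class k is the mean of its support embeddings;
# with 1 shot it is simply that single embedding.
prototypes = [
    [sum(support[k * n_shot + s][d] for s in range(n_shot)) / n_shot
     for d in range(dim)]
    for k in range(n_way)
]

print(len(prototypes), len(prototypes[0]))  # 30 1600
```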
@cyvius96 Yeah, you are right! Thank you!

You are welcome.
@cyvius96 Hi, can you help me check the formula?
@cyvius96 Thank you for your kindness.
Dear Chen,
I noticed that you used a cross-entropy loss function in your code, but the author of the paper minimized the distance between query and support data that share the same label, and used that as the loss function. Here's his code: https://github.com/jakesnell/prototypical-networks
So, do you have any doubts about the author's loss function?
Thank you!
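As a side note, the two formulations are algebraically the same function: cross-entropy applied to negative-distance logits computes exactly -log softmax(logits)[label], which is the paper's negative log-probability loss (Snell et al., Eq. 2). A minimal pure-Python sketch with hypothetical small shapes (5-way, 3 queries, 4-dim features instead of 30-way, 1600-dim) checks the two numerically:

```python
import math
import random

random.seed(0)

# Hypothetical small episode: 5-way, 3 queries, 4-dim features
# (the repo uses 30-way / 1600-dim; the algebra is identical).
n_way, n_query, dim = 5, 3, 4
prototypes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_way)]
queries = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_query)]
labels = [random.randrange(n_way) for _ in range(n_query)]

# Logits = negative squared Euclidean distance to each prototype,
# exactly what train.py passes to F.cross_entropy.
def neg_sq_dist(q, p):
    return -sum((a - b) ** 2 for a, b in zip(q, p))

logit_rows = [[neg_sq_dist(q, p) for p in prototypes] for q in queries]

# Loss A: cross-entropy over the logits, i.e. -log_softmax at the true class.
def log_softmax(row):
    m = max(row)
    lse = m + math.log(sum(math.exp(x - m) for x in row))
    return [x - lse for x in row]

loss_ce = -sum(log_softmax(row)[y] for row, y in zip(logit_rows, labels)) / n_query

# Loss B: the paper's loss, -log p(y = k | x) with p a softmax over
# negative distances (Snell et al., Eq. 2), computed directly.
def prob(row, k):
    exps = [math.exp(x) for x in row]
    return exps[k] / sum(exps)

loss_paper = -sum(math.log(prob(row, y)) for row, y in zip(logit_rows, labels)) / n_query

print(abs(loss_ce - loss_paper) < 1e-9)  # True: the two losses coincide
```

So any performance difference between the two codebases would have to come from other details (optimizer, episode construction, encoder), not from the loss itself.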