Per-class accuracy and loss tracking at test-time? #2444

Closed
hyqneuron opened this issue May 12, 2015 · 4 comments

@hyqneuron

This concerns class imbalance. In datasets with highly imbalanced classes:
(1) it helps to keep track of per-class accuracy and loss instead of only the overall accuracy/loss;
(2) it may also help to use class-specific learning rates, as discussed in this G+ post:

Is there an existing implementation for these?
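
For reference, a minimal NumPy sketch of the per-class bookkeeping being asked about (this is not part of Caffe; the helper name and the (N, C) probability layout are assumptions):

```python
import numpy as np

def per_class_metrics(probs, labels):
    """Per-class accuracy and mean log-loss (hypothetical helper, not a Caffe API).

    probs:  (N, C) array of predicted class probabilities.
    labels: (N,)   array of integer ground-truth labels in [0, C).
    Returns two length-C arrays; entries are NaN for classes absent from labels.
    """
    num_classes = probs.shape[1]
    preds = probs.argmax(axis=1)
    acc = np.full(num_classes, np.nan)
    loss = np.full(num_classes, np.nan)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            acc[c] = (preds[mask] == c).mean()
            # clip probabilities to avoid log(0)
            loss[c] = -np.log(np.clip(probs[mask, c], 1e-12, None)).mean()
    return acc, loss
```

Called as `acc, loss = per_class_metrics(softmax_outputs, true_labels)` on held-out predictions, it gives the per-class breakdown that a single overall accuracy/loss number hides.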

@hyqneuron
Author

I have another related question: how can I dynamically create top blobs? Basically, I'm writing an accuracy layer that tracks per-class accuracy, and I need one named output per class, so I'm creating one top blob per class dynamically in my new layer's constructor.

Currently I directly modify the layer's LayerParam (which is const). I wonder if there's a cleaner way to do this?
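
One way to get a named output per class without mutating the layer's parameters is to declare the tops in the net prototxt and only fill them in the layer; a rough sketch assuming Caffe's pycaffe Python layer interface (the layer name here is hypothetical, not an existing Caffe layer):

```python
import caffe
import numpy as np

class PerClassAccuracyLayer(caffe.Layer):
    """Hypothetical Python layer: one scalar top per class.

    The number of tops is fixed by the number of `top:` fields declared for
    this layer in the net prototxt, so nothing has to be created in the
    constructor or written back into the layer's parameters.
    """

    def setup(self, bottom, top):
        # bottom[0]: class scores (N, C); bottom[1]: ground-truth labels (N,)
        if len(bottom) != 2:
            raise Exception("Expected two bottoms: scores and labels.")
        if len(top) != bottom[0].data.shape[1]:
            raise Exception("Declare one top per class in the prototxt.")

    def reshape(self, bottom, top):
        for t in top:
            t.reshape(1)  # each top holds a single scalar

    def forward(self, bottom, top):
        preds = bottom[0].data.argmax(axis=1).flatten()
        labels = bottom[1].data.astype(int).flatten()
        for c in range(len(top)):
            mask = labels == c
            # per-class accuracy; 0 if the class did not occur in this batch
            top[c].data[...] = float((preds[mask] == c).mean()) if mask.any() else 0.0

    def backward(self, top, propagate_down, bottom):
        pass  # accuracy is not differentiable; nothing to propagate
```

On the prototxt side this would be a `type: "Python"` layer with one `top:` field per class and a `python_param` naming the module and layer; whether that is less awkward than editing the LayerParameter depends on how many classes there are.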

@n-zhang added the JL label May 12, 2015
@longjon (Contributor) commented May 13, 2015

Please ask usage questions on the caffe-users list. Thanks!

@longjon closed this as completed May 13, 2015
@hyqneuron
Author

The first post was a feature proposal... nvm.

@cbelth commented Oct 20, 2016

#2935
