
Per Class Accuracy During Training/Validation #506

Open

choosehappy opened this issue Jan 13, 2016 · 4 comments

@choosehappy

Caffe supports providing two top output blobs on the final accuracy layer; when you do, it also displays a per-class accuracy during the testing phase.

Is it possible to get something in DIGITS which displays this information?

related issue: BVLC/caffe#2935

@lukeyeager
Member

Cool, I didn't notice that PR go through. I'll add it to the TODOs.

@lukeyeager
Member

If you just add a second top to your accuracy layer:

layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  top: "accuracies"  # <- new
  include { stage: "val" }
}

then Caffe will print out per-class accuracies:

I1031 10:09:40.894326 13817 solver.cpp:429]     Test net output #0: accuracies = 0.940928
I1031 10:09:40.894353 13817 solver.cpp:429]     Test net output #1: accuracies = 0.960338
I1031 10:09:40.894357 13817 solver.cpp:429]     Test net output #2: accuracies = 0.953798
I1031 10:09:40.894361 13817 solver.cpp:429]     Test net output #3: accuracies = 0.941139
I1031 10:09:40.894363 13817 solver.cpp:429]     Test net output #4: accuracies = 0.942194
I1031 10:09:40.894366 13817 solver.cpp:429]     Test net output #5: accuracies = 0.951356
I1031 10:09:40.894369 13817 solver.cpp:429]     Test net output #6: accuracies = 0.955576
I1031 10:09:40.894373 13817 solver.cpp:429]     Test net output #7: accuracies = 0.953587
I1031 10:09:40.894376 13817 solver.cpp:429]     Test net output #8: accuracies = 0.941983
I1031 10:09:40.894379 13817 solver.cpp:429]     Test net output #9: accuracies = 0.930154
I1031 10:09:40.894382 13817 solver.cpp:429]     Test net output #10: accuracy = 0.97943
I1031 10:09:40.894388 13817 solver.cpp:429]     Test net output #11: loss = 0.0860136 (* 1 = 0.0860136 loss)

Unfortunately, DIGITS doesn't know how to interpret this kind of output.

I'll leave this open.
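Until DIGITS learns to interpret that output, one workaround is to scrape the solver log yourself. Below is a minimal sketch of such a parser; `parse_per_class_accuracies` is a hypothetical helper (not part of DIGITS or Caffe), and it assumes the log lines look exactly like the excerpt above:

```python
import re

# Matches lines like:
#   "... Test net output #3: accuracies = 0.941139"
# capturing the output index, the blob name, and the value.
LINE_RE = re.compile(r"Test net output #(\d+): (\w+) = ([0-9.]+)")

def parse_per_class_accuracies(log_lines):
    """Return {class_index: accuracy} for outputs named 'accuracies'.

    Hypothetical helper: only keeps the per-class 'accuracies' blob
    and skips the overall 'accuracy' and 'loss' outputs.
    """
    per_class = {}
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group(2) == "accuracies":
            per_class[int(m.group(1))] = float(m.group(3))
    return per_class

log = [
    "I1031 ... solver.cpp:429]     Test net output #0: accuracies = 0.940928",
    "I1031 ... solver.cpp:429]     Test net output #1: accuracies = 0.960338",
    "I1031 ... solver.cpp:429]     Test net output #10: accuracy = 0.97943",
]
print(parse_per_class_accuracies(log))  # {0: 0.940928, 1: 0.960338}
```

Note this maps output index to accuracy; with a single `accuracies` top the index should line up with the class label, but that is an assumption worth checking against your own network's output ordering.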

@tsungjenh

@lukeyeager did you get this working? I'm still stuck on how to actually display the per-class accuracies the way you did. Can you help?

@Adamzs

Adamzs commented Nov 7, 2017

In PyCaffe, you can set your accuracy layer to something like:

n.accur, n.accur_by_class = L.Accuracy(n.fc8, n.label, include=dict(phase=caffe.TEST), ntop=2)

...where n is my net name from n = caffe.NetSpec(), and fc8 is my last fully connected layer. Setting ntop=2 provides a second output from the accuracy layer (which I called accur_by_class in this example).
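For intuition, the per-class figure that second top reports is simply the fraction of examples of each ground-truth class that were predicted correctly (i.e., per-class recall). A plain-Python sketch of that metric (illustrative only, not Caffe's actual implementation) looks like:

```python
from collections import defaultdict

def per_class_accuracy(preds, labels):
    """Fraction of examples of each true class predicted correctly.

    `preds` are argmax class predictions; `labels` are ground-truth
    class indices. Returns {class_index: accuracy}.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for p, y in zip(preds, labels):
        total[y] += 1
        if p == y:
            correct[y] += 1
    return {c: correct[c] / total[c] for c in total}

preds = [0, 0, 1, 1, 2, 2]
labels = [0, 1, 1, 1, 2, 0]
# class 0: 1 of 2 correct, class 1: 2 of 3, class 2: 1 of 1
print(per_class_accuracy(preds, labels))
```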
