
Can you explain how the code calculates the FROC score? #61

Closed
alibalapour opened this issue Dec 31, 2021 · 2 comments

@alibalapour

I want to plot and calculate the FROC for 3 WSIs (Test_001, Test_065, and Test_079 from Camelyon16), but the result is NaN. I can't understand how you calculate the FROC.
We have a CSV file with the coordinates and probabilities of the tumors predicted by the model, and a ground-truth mask. Does the ground-truth mask have to have particular properties?
What are the values of the ground-truth mask, 0 or 1?
What is ITC_labels in Evaluation_FROC.py?
How do you generate a ground-truth mask of a WSI from the annotations? Is there a particular way, or is it the same as the one provided by ASAP?
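
(For reference, the FROC score in the CAMELYON16 evaluation is defined as the average sensitivity at 0.25, 0.5, 1, 2, 4, and 8 average false positives per whole-slide image. Below is a minimal sketch of that final averaging step only; the function name and its inputs are hypothetical and not taken from Evaluation_FROC.py, which first has to build the per-threshold sensitivities and false-positive counts from the detection CSVs and the lesion masks.)

```python
import numpy as np

def froc_score(avg_fps_per_slide, sensitivity):
    """Average sensitivity at 1/4, 1/2, 1, 2, 4 and 8 false positives per WSI.

    avg_fps_per_slide / sensitivity: one value per probability threshold,
    e.g. obtained by sweeping a threshold over the detection CSVs.
    """
    avg_fps_per_slide = np.asarray(avg_fps_per_slide, dtype=float)
    sensitivity = np.asarray(sensitivity, dtype=float)
    # np.interp needs increasing x values, so sort the operating points by FP rate.
    order = np.argsort(avg_fps_per_slide)
    eval_points = [0.25, 0.5, 1, 2, 4, 8]
    sens_at_points = np.interp(eval_points,
                               avg_fps_per_slide[order],
                               sensitivity[order])
    return float(np.mean(sens_at_points))

# Made-up operating points, purely for illustration:
print(froc_score([0.1, 0.3, 0.7, 1.5, 3.0, 6.0, 10.0],
                 [0.40, 0.55, 0.65, 0.75, 0.82, 0.88, 0.92]))
```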

@tree1rain

Open the tumor slide image with ASAP.
Load the annotation file in .xml format.
Save.
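
(If you prefer to script this rather than use the ASAP GUI, the same conversion can be done with ASAP's Python bindings, the multiresolutionimageinterface module that ships with ASAP. A sketch, assuming the bindings are on your PYTHONPATH and using the _0/_1/_2 group-to-label mapping described in the next comment; the file paths are placeholders:)

```python
import multiresolutionimageinterface as mir

reader = mir.MultiResolutionImageReader()
mr_image = reader.open('Test_001.tif')          # the WSI (placeholder path)

# Load the CAMELYON16 annotation XML.
annotation_list = mir.AnnotationList()
xml_repository = mir.XmlRepository(annotation_list)
xml_repository.setSource('Test_001.xml')        # placeholder path
xml_repository.load()

# Rasterize the annotations into a multiresolution .tif mask with the same
# dimensions and spacing as the slide. Groups _0 and _1 are tumor (label 1),
# _2 is the exclusion group (label 0).
label_map = {'_0': 1, '_1': 1, '_2': 0}
conversion_order = ['_0', '_1', '_2']
annotation_mask = mir.AnnotationToMask()
annotation_mask.convert(annotation_list, 'Test_001_mask.tif',
                        mr_image.getDimensions(), mr_image.getSpacing(),
                        label_map, conversion_order)
```

The resulting mask is binary (tumor pixels = 1, everything else = 0), which is what Evaluation_FROC.py expects when it reads channel 0 of the mask.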

@tree1rain

tree1rain commented Feb 22, 2023

When saving from ASAP to a *.tif mask, assign labels to the annotation groups: (tumor: 1), (Exclusion: 0), (_0: 1), (_1: 1), (_2: 0).
The line filled_image = nd.morphology.binary_fill_holes(binary) fills the holes, so there is no difference between (Exclusion: 0) and (Exclusion: 1).
In Evaluation_FROC.py, computeEvaluationMask builds the evaluation mask starting from distance = nd.distance_transform_edt(255 - pixelarray[:,:,0]*255).
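
(To put those two lines in context: computeEvaluationMask in the public CAMELYON16 Evaluation_FROC.py turns the ground-truth mask into a labeled lesion map, one integer label per lesion, and computeITCList then collects the labels of lesions small enough to count as isolated tumor cells (ITC), which the FROC ignores. A sketch of both steps, with the 75 µm expansion and 275 µm ITC cutoff taken from that script's comments; treat it as an illustration, not the exact code. Here resolution is the level-0 pixel size in µm, 0.243 for CAMELYON16.)

```python
import numpy as np
import openslide
import scipy.ndimage as nd
from skimage import measure

def compute_evaluation_mask(mask_path, resolution, level):
    """Label the tumor lesions in a ground-truth mask (sketch of computeEvaluationMask)."""
    slide = openslide.open_slide(mask_path)
    dims = slide.level_dimensions[level]
    pixelarray = np.array(slide.read_region((0, 0), level, dims))
    # Distance of every non-tumor pixel to the nearest tumor pixel
    # (tumor pixels are 1 in channel 0, so 255 - value*255 is 0 on tumor).
    distance = nd.distance_transform_edt(255 - pixelarray[:, :, 0] * 255)
    threshold = 75 / (resolution * pow(2, level) * 2)   # 75 µm expansion, in pixels
    binary = distance < threshold
    # Fill holes, so Exclusion regions inside a lesion do not split it
    # (which is why writing Exclusion as 0 or 1 makes no difference).
    filled_image = nd.binary_fill_holes(binary)
    # One integer label (1, 2, 3, ...) per connected lesion.
    return measure.label(filled_image, connectivity=2)

def compute_itc_labels(evaluation_mask, resolution, level):
    """Labels of lesions treated as ITC (sketch of computeITCList).

    A lesion counts as ITC when its longest diameter is below 200 µm; with the
    75 µm expansion above, that becomes a 275 µm cutoff on the major axis.
    """
    itc_cutoff = 275 / (resolution * pow(2, level))
    props = measure.regionprops(evaluation_mask)
    return [i + 1 for i, p in enumerate(props) if p.major_axis_length < itc_cutoff]
```

During the FROC computation, a detection whose coordinates fall on a non-ITC label counts as a hit for that lesion, a detection that lands on label 0 is a false positive, and detections on lesions in the ITC list are skipped; ITC_labels is simply that list of ignored lesion labels.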
