
Training dataset Dice score lower than validation dataset #6

Open
Shawn0099 opened this issue Apr 19, 2020 · 1 comment

Comments

@Shawn0099

Hi,
I trained your network for about 100 epochs. My training loss is lower than my validation loss, but the Dice coefficient for the training dataset is lower than for the validation dataset. I also drew boxplots using evaluate.py, and my validation result is almost the same as the result in your paper, but the training dataset result is worse. Is there anything wrong with my settings?
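(For reference, a minimal sketch of how a Dice coefficient can be computed on binary masks with NumPy; the function name and smoothing term here are illustrative assumptions, not the repository's exact implementation:)

```python
import numpy as np

def dice_coefficient(pred, truth, smooth=1e-6):
    """Dice = 2*|A∩B| / (|A| + |B|) on binary masks.

    `smooth` (an assumed epsilon) avoids division by zero
    when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + smooth) / (pred.sum() + truth.sum() + smooth)

# Two overlapping 2x2 masks: intersection 1, sizes 2 and 2 -> Dice ≈ 0.5
a = np.array([[1, 1], [0, 0]])
b = np.array([[1, 0], [1, 0]])
print(round(float(dice_coefficient(a, b)), 3))  # → 0.5
```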
Another question is about calculating the Hausdorff distance. I've calculated sensitivity and specificity, and my results are close to yours, but my Hausdorff distance is much higher than normal (about 20). I used SimpleITK.HausdorffDistanceImageFilter. Could you tell me how you do this?
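(As a cross-check, a symmetric Hausdorff distance over the foreground voxel coordinates can be sketched with SciPy's `directed_hausdorff`; this is an assumed alternative to the SimpleITK filter mentioned above, not what the thread participants used:)

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(mask_a, mask_b):
    """Symmetric Hausdorff distance between two binary masks,
    computed on the coordinates of their foreground voxels."""
    pts_a = np.argwhere(np.asarray(mask_a, dtype=bool))
    pts_b = np.argwhere(np.asarray(mask_b, dtype=bool))
    # directed_hausdorff returns (distance, index_a, index_b);
    # the symmetric distance is the max of the two directions.
    d_ab = directed_hausdorff(pts_a, pts_b)[0]
    d_ba = directed_hausdorff(pts_b, pts_a)[0]
    return max(d_ab, d_ba)

a = np.zeros((5, 5)); a[1, 1] = 1
b = np.zeros((5, 5)); b[1, 1] = 1; b[4, 1] = 1
print(hausdorff(a, b))  # → 3.0
```

Note that the BraTS leaderboard reports the 95th-percentile Hausdorff distance (HD95) rather than the maximum, which can make a plain Hausdorff distance look much larger than the published numbers.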
Many thanks!

@woodywff
Owner

woodywff commented Apr 19, 2020

The second answer comes first :-) I didn't calculate the four metrics (Dice, Hausdorff distance, sensitivity, and specificity) myself; CBICA's evaluation system takes care of that. You just need to sign up, log in, and upload your prediction results, and you'll get the metrics back.
brats_2019/demo_task1/draw_evaluation.py plots the figures based on the downloaded .csv files.
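(As a hypothetical sketch of the kind of processing such a script performs before plotting, the downloaded CSV can be summarized per region; the column names `Dice_WT`, `Dice_TC`, and `Dice_ET` are assumptions, not necessarily what CBICA's CSV actually uses:)

```python
import csv
import io
import statistics

def median_dice(csv_text, columns=("Dice_WT", "Dice_TC", "Dice_ET")):
    """Return the median of each Dice column in a results CSV.

    The column names are assumed for illustration; a real CSV
    from the evaluation system may label them differently.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    values = {c: [] for c in columns}
    for row in reader:
        for c in columns:
            values[c].append(float(row[c]))
    return {c: statistics.median(v) for c, v in values.items()}

sample = """Label,Dice_WT,Dice_TC,Dice_ET
sub1,0.90,0.85,0.80
sub2,0.92,0.70,0.75
sub3,0.88,0.80,0.70
"""
print(median_dice(sample))  # → {'Dice_WT': 0.9, 'Dice_TC': 0.8, 'Dice_ET': 0.75}
```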

When you say training and validation dataset, how many subjects have you taken into account?
The final training is on the whole 335 subjects, and the validation is on the 125 subjects.
Also note that I described two patching strategies in the article, and the network was trained for 100 epochs with each of them.
That's all that comes to mind right now. Good luck! Keep in touch :-)
