Inference with another database #6

Closed
HamidFsian opened this issue Jan 16, 2023 · 4 comments

@HamidFsian

Hi,

I was trying to test your network and your weights on another database, but it fails each time.
Note that the size of each volume (364, 704, 704) is bigger than the one you provided. Do you have any idea why it fails each time?

You will find attached an example from the database that I am trying to work with. Thank you.

test

Best Regards,
Hamid FSIAN

@LucaLumetti LucaLumetti self-assigned this Feb 20, 2023
@LucaLumetti
Collaborator

Hi @HamidFsian, sorry for the late reply, I missed the notification. Could you please give a better description of "it fails each time"? Does the code crash with a specific error, or do you get really low metrics?

@HamidFsian
Author

Hi @LucaLumetti ,
Thanks for your response.
In fact, I wanted to test your work on another database, which contains CBCT volumes of the whole skull, and each time I pass a whole volume to the network, it gives me low metrics.

Do you think pre-processing is required, where I crop the volume and keep only the lower jaw for the test?

Please note that I changed the batch_size from 6 to 1, because I don't have enough compute for the original setting.

Thank you,
Hamid

@HamidFsian HamidFsian closed this as not planned Feb 20, 2023
@HamidFsian HamidFsian reopened this Feb 20, 2023
@LucaLumetti
Collaborator

@HamidFsian, since you are using the whole skull, it might be that the network misclassifies voxels in skull regions it never saw during training. I would have to check the segmentation map it outputs to understand this better.
To solve this issue, you could probably do a little fine-tuning on your dataset.
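For reference, a minimal fine-tuning sketch in PyTorch might look like the following. The stand-in model, the checkpoint path, the tensor shapes, and the loss are all assumptions for illustration, not the repository's actual code:

```python
# Hedged sketch: fine-tuning a pretrained 3D segmentation network in PyTorch.
# The model below is a stand-in Conv3d stack; substitute the repository's
# actual architecture and released checkpoint.
import torch
import torch.nn as nn

# Stand-in for the real network; replace with the repo's model class.
model = nn.Sequential(
    nn.Conv3d(1, 8, 3, padding=1), nn.ReLU(),
    nn.Conv3d(8, 2, 3, padding=1),            # 2 classes: background / canal
)

# Load the released weights, then continue training at a low learning rate.
# state = torch.load("pretrained.pth", map_location="cpu")  # hypothetical path
# model.load_state_dict(state)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # small LR for fine-tuning
criterion = nn.CrossEntropyLoss()

# One illustrative step on a random patch standing in for an annotated CBCT crop.
x = torch.randn(1, 1, 64, 64, 64)             # (batch, channel, D, H, W)
y = torch.randint(0, 2, (1, 64, 64, 64))      # voxel-wise class labels

model.train()
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```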

@prittt

prittt commented Feb 20, 2023

I agree with @LucaLumetti, fine-tuning is the best option. If you have no annotations on your volumes, you can use our annotation tool. If annotating new CBCTs is not an option, you can still crop your data before feeding the network, as in the sketch below.
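As an illustration of that cropping step, assuming the scan is loaded as a NumPy array with the (364, 704, 704) shape mentioned above; the crop bounds here are made-up and would need to be chosen per scan:

```python
# Hedged sketch: cropping a whole-skull CBCT volume to the lower-jaw region
# before inference. The index bounds are illustrative placeholders.
import numpy as np

volume = np.zeros((364, 704, 704), dtype=np.float32)  # stand-in for a loaded scan

# Keep roughly the lower portion along the axial (z) axis and the central
# region in-plane; adjust these bounds to each scan's anatomy.
z0, z1 = 200, 364
y0, y1 = 150, 550
x0, x1 = 150, 550
cropped = volume[z0:z1, y0:y1, x0:x1]
print(cropped.shape)  # (164, 400, 400)
```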

@prittt prittt closed this as completed Feb 20, 2023