
Paired multi-modal data? #10

Closed
mayurmallya opened this issue May 7, 2021 · 1 comment
@mayurmallya

Hi there,

Thanks for the wonderful dataset!

I was wondering if there are any paired images in this dataset. By paired images (x_i, y_i), I mean images that belong to two different modalities (in this case, Modality X and Modality Y), come from the same patient, and are therefore mapped to the same class labels.

I see in the paper that OrganMNIST Axial, Coronal, and Sagittal come from the same source and share the same set of labels. Do these three subsets contain paired images, and if so, is the pairing information included (i.e., which axial image corresponds to which coronal and sagittal images)?

Thank you.


duducheng commented May 7, 2021

Hi @mayurmallya ,

Thank you for your interest ;)

Sorry to disappoint you. MedMNIST contains no multi-modal paired images, as multi-modality is not the primary focus of this study.

I suggest looking for other academic datasets, such as cardiac MRIs or brain MRIs (e.g., BRATS). Note that the axial/coronal/sagittal (ACS) views of a 3D volume are not truly multi-modal; you could generate such views from any 3D dataset, e.g., LIDC or our own RibFrac dataset. In fact, we are also extending MedMNIST with 3D images, which may be released around June/July 2021.
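As a rough illustration of the point above, here is a minimal NumPy sketch of extracting the three ACS views from a 3D volume by slicing its central planes. The `(depth, height, width)` axis order and the function name `extract_acs_views` are assumptions for illustration, not part of MedMNIST's API.

```python
import numpy as np

def extract_acs_views(volume):
    """Extract the central axial, coronal, and sagittal slices
    from a 3D volume, assuming (depth, height, width) axis order."""
    d, h, w = volume.shape
    axial = volume[d // 2, :, :]     # plane perpendicular to the depth axis
    coronal = volume[:, h // 2, :]   # plane perpendicular to the height axis
    sagittal = volume[:, :, w // 2]  # plane perpendicular to the width axis
    return axial, coronal, sagittal

# Usage with a synthetic volume
vol = np.arange(2 * 3 * 4).reshape(2, 3, 4)
a, c, s = extract_acs_views(vol)
print(a.shape, c.shape, s.shape)  # → (3, 4) (2, 4) (2, 3)
```

Since all three views are cut from the same volume, they are intrinsically "paired" per patient, but they remain a single imaging modality rather than multi-modal data.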

Good luck!
Jiancheng Yang
