Use voxel2mesh in another dataset #11

Closed · afoix opened this issue on Mar 26, 2023 · 1 comment

afoix (Contributor) commented on Mar 26, 2023

Hello,

I am interested in trying voxel2mesh on different datasets. I have been looking for instructions on how to interface with the tool, but so far I have ended up reading through the high-level Python files in the repo to get a sense of how to do it instead.

So far, my understanding is that an object needs to be provided in the cfg.data_obj field (config.py).

Two examples are provided: Chaos and Hippocampus.
Reading a bit further, in the data/ folder, we find definitions for those classes as well as a data.py.
Although this is not explicit, it looks like the class DatasetAndSupport (https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/data.py#L30-L40) is a blueprint for the expected interface (probably intended as an abstract class to be inherited from, like Hippocampus does https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/hippocampus.py#L50).
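To summarise my current guess in code, here is a rough skeleton of what I imagine a new dataset class would look like. The method names are taken from DatasetAndSupport and Hippocampus, but the signatures and return values are my assumptions and not verified against the repo:

```python
# Rough skeleton of a custom dataset class; method names follow DatasetAndSupport
# and Hippocampus, but the signatures/return values are my guesses, not verified.
from data.data import DatasetAndSupport


class MyDataset(DatasetAndSupport):
    def pre_process_dataset(self, cfg):
        # Read the raw images/labels once and serialise them (e.g. to a pickle file).
        raise NotImplementedError

    def quick_load_data(self, cfg):
        # Load the pre-processed data and return the train/validation splits.
        raise NotImplementedError
```

One would then, I assume, point cfg.data_obj at an instance of this class in config.py, the same way the Chaos/Hippocampus examples appear to be wired in.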

Reading a bit further yet, it looks like main.py expects quick_load_data (https://github.com/cvlab-epfl/voxel2mesh/blob/master/main.py#L76) to be defined, which in turn seems to expect that a pickle file exists (https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/hippocampus.py#L58, and presumably something similar in chaos). This pickle file appears to be generated by running pre_process_dataset (https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/hippocampus.py#L66), which I would guess is the only truly dataset-specific chunk of code...?

I would like to avoid having to reverse-engineer the shape of the files in hippocampus and chaos in order to infer what lands in the Sample objects, which seem to be the things that ultimately get serialised into the pickle file. Could you provide information on the expected format, and how one would go about producing it from a set of images?

Thank you for your help.

udaranga3001 (Collaborator) commented

> Reading a bit further, in the data/ folder, we find definitions for those classes as well as a data.py.
> Although this is not explicit, it looks like the class DatasetAndSupport (https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/data.py#L30-L40) is a blueprint for the expected interface (probably intended as an abstract class to be inherited from, like Hippocampus does https://github.com/cvlab-epfl/voxel2mesh/blob/master/data/hippocampus.py#L50).

Correct!

The pickle file stores the pre-processed data. The idea is to use data_preprocess.py to pre-process the data and save it as a pickle file. So, for a given dataset, you run data_preprocess.py once, and it will call the pre_process_dataset function in the Hippocampus/Chaos/ABC class. Then, when you run experiments with main.py, it will call the quick_load_data function in the Hippocampus/Chaos/ABC class.
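Roughly, the flow looks like this. The pickle path and the structure of the data dict below are just placeholders for illustration; check the Hippocampus class for the exact format and attribute names:

```python
import os
import pickle


class MyDataset:
    def pre_process_dataset(self, cfg):
        # Step 1: run once via data_preprocess.py. Read the raw volumes/labels,
        # build the train/validation structures, and cache them on disk.
        data = {'train': ..., 'validation': ...}  # dataset-specific loading goes here
        with open(os.path.join(cfg.dataset_path, 'pre_computed_data.pickle'), 'wb') as f:
            pickle.dump(data, f)

    def quick_load_data(self, cfg):
        # Step 2: called from main.py for every experiment; just reload the cache.
        with open(os.path.join(cfg.dataset_path, 'pre_computed_data.pickle'), 'rb') as f:
            return pickle.load(f)
```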

Let me know if it's still not clear.
