Error while training the model using ShapeNetCore.v2 #7

Closed
amarmhrjn opened this issue Oct 14, 2021 · 6 comments

@amarmhrjn

Hello,

I downloaded the "ShapeNetCore.v2" from https://github.com/AnTao97/PointCloudDatasets, as mentioned in the readme.
I tried to train the model with the following configuration:

```
python train.py -t '/home/username/datasets/shapenetcorev2_hdf5_2048' -v '/home/username/datasets/shapenetcorev2_hdf5_2048/val' -c 1 -m airplane-merger.pt -d gpu
```

I get the following error message:
```
Traceback (most recent call last):
  File "/home/projects/SkeletonMerger/train.py", line 79, in <module>
    x, xl = all_h5(DATASET, True, True, subclasses=(ns.subclass,), sample=None)  # n x 2048 x 3
  File "/home/projects/SkeletonMerger/merger/data_flower.py", line 36, in all_h5
    xy = tuple(lazy)
  File "/home/projects/SkeletonMerger/merger/data_flower.py", line 33, in <lambda>
    lazy = map(lambda x: load_h5(x, normalize, include_label),
  File "/home/projects/SkeletonMerger/merger/data_flower.py", line 14, in load_h5
    f = h5py.File(h5_filename, 'r')
  File "/home/anaconda3/envs/pytorchworkshop/lib/python3.9/site-packages/h5py/_hl/files.py", line 455, in __init__
    fid = make_fid(name, mode, userblock_size,
  File "/home/anaconda3/envs/pytorchworkshop/lib/python3.9/site-packages/h5py/_hl/files.py", line 199, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 100, in h5py.h5f.open
OSError: Unable to open file (file signature not found)
```

It seems there is some issue with reading the .h5 files. Please let me know what might be wrong.

@eliphatfs
Owner

The error message comes from the internals of h5py rather than the code of Skeleton Merger. The most likely cause is that the downloaded file is corrupted. You may try downloading it again.
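
As a quick sanity check (a minimal sketch; the path is a placeholder for one of your downloaded files), you can ask h5py whether the file carries a valid HDF5 signature:

```python
import h5py

# Placeholder path: point this at one of the downloaded dataset files.
path = '/home/username/datasets/shapenetcorev2_hdf5_2048/train0.h5'

# h5py.is_hdf5 checks the HDF5 file signature; it returns False for
# truncated, corrupted, or non-HDF5 files.
print(h5py.is_hdf5(path))
```

If this prints False for a freshly downloaded file, the download itself is likely at fault.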

@amarmhrjn
Author

I downloaded the dataset again, but the issue persists. I have also restarted the machine to see if that helps, but no progress.
Any suggestions?

@eliphatfs
Owner

Perhaps there is something wrong with the file enumeration; you could try placing the target .h5 files in a separate folder without the metadata files (e.g. the .json files). A sketch of this is shown below.
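
For example, something along these lines (a minimal sketch; the paths are placeholders):

```python
import shutil
from pathlib import Path

# Placeholder paths; adjust to your setup.
src = Path('/home/username/datasets/shapenetcorev2_hdf5_2048')
dst = Path('/home/username/datasets/shapenetcorev2_h5_only')
dst.mkdir(parents=True, exist_ok=True)

# Copy only the .h5 files, leaving behind the .json/.txt metadata
# that the loader would otherwise pick up during enumeration.
for f in src.glob('*.h5'):
    shutil.copy(f, dst / f.name)
```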

@amarmhrjn
Author

It turns out that the main issue was the presence of files other than .h5 files, such as .json and .txt files, in the same directory. The script was reading the .json and .txt files as well (including any inside subdirectories), hence the error.
When I extracted the "ShapeNetCore.v2" zip file, the extracted directory contained all of these files together.
I moved the .h5 files into a separate directory and am now able to train and generate the keypoints (stored in merger_prediction.npz) for the keypointnet pcd files.
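
An alternative to moving the files would be to filter by extension during enumeration, roughly like this (a sketch; the actual enumeration in data_flower.py may be structured differently):

```python
import os

# Hypothetical helper that mirrors the loader's enumeration but keeps
# only .h5 files, so .json/.txt metadata files are skipped.
def list_h5_files(dataset_dir):
    return sorted(
        os.path.join(dataset_dir, name)
        for name in os.listdir(dataset_dir)
        if name.endswith('.h5')
    )
```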

By the way, which tool/code are you using to visualize the keypoints stored in merger_prediction.npz?
I am working on reading the npz file, loading the data, and plotting it with matplotlib, though I am still learning.
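
This is roughly what I have so far (the array names inside the npz are guesses, so I print data.files first to check what is actually stored):

```python
import numpy as np
import matplotlib.pyplot as plt

# Inspect the stored array names before assuming anything.
data = np.load('merger_prediction.npz')
print(data.files)

arr = data[data.files[0]]               # first stored array
pts = arr[0] if arr.ndim == 3 else arr  # take one sample if batched

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.scatter(pts[:, 0], pts[:, 1], pts[:, 2], c='red', s=20)
plt.show()
```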

@eliphatfs
Owner

The visualization in the paper is done with https://github.com/eliphatfs/PointCloudVisualizer.
Since it is external (and Unity is not freeware), the related scripts are not included here. To use the visualizer, you have to export the data into a JSON file containing the relevant information; the specs can be found at that repository. It shouldn't be hard to combine the original points and keypoints from the present code files into the visualization JSON file.
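
A rough shape of such an export (the field names "points" and "keypoints" here are placeholders; the actual JSON schema is defined in the PointCloudVisualizer repository):

```python
import json
import numpy as np

# Load predictions and inspect the stored array names first.
data = np.load('merger_prediction.npz')
print(data.files)

# Placeholder field names -- consult the PointCloudVisualizer spec
# for the JSON schema it actually expects.
payload = {
    'points': data[data.files[0]][0].tolist(),
    'keypoints': data[data.files[1]][0].tolist(),
}

with open('visualization.json', 'w') as f:
    json.dump(payload, f)
```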

@eliphatfs
Owner

Since the related problem has been resolved, I would like to close the issue now.
