Dear erikwijmans,

Thank you very much for your great code!
I have a question about how the data is organized in the '.h5' files.
I ran your code 'python -m pointnet2.train.train_sem_seg', and the data was automatically downloaded and unzipped.
When I open a '.h5' file, the data dimension is (1000, 4096, 9). It seems that each '.h5' file contains 1000 point clouds, and each point cloud (item?) has 4096 points with xyz coordinates plus 6 other features.
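In case it helps, here is a minimal sketch of how I inspect one of the downloaded files with h5py (the filename is just an example from the indoor3d_sem_seg_hdf5_data folder, and my reading of the 9 channels as xyz, rgb, and room-normalized xyz is an assumption, not something I verified in the code):

```python
import h5py

# Example filename (assumption): one of the files in indoor3d_sem_seg_hdf5_data/
with h5py.File("ply_data_all_0.h5", "r") as f:
    print(list(f.keys()))      # typically ['data', 'label']
    print(f["data"].shape)     # (1000, 4096, 9): per-point features
    print(f["label"].shape)    # (1000, 4096): per-point class labels
```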
During training, the dataloader gets 'items' (point clouds) and samples 'num_points' from each one for training.
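My understanding of that step, as a rough sketch (the class and variable names here are mine for illustration, not the repo's exact code):

```python
import numpy as np
import torch
from torch.utils.data import Dataset

class SemSegDataset(Dataset):
    """Sketch of a dataset that samples num_points from each stored 4096-point item."""

    def __init__(self, points, labels, num_points=2048):
        self.points = points        # (N, 4096, 9) array loaded from the .h5 files
        self.labels = labels        # (N, 4096) per-point labels
        self.num_points = num_points  # assumes num_points <= 4096

    def __len__(self):
        return len(self.points)

    def __getitem__(self, idx):
        # Randomly pick num_points of the 4096 points in this item.
        choice = np.random.choice(self.points.shape[1], self.num_points, replace=False)
        pts = torch.from_numpy(self.points[idx, choice]).float()
        lbl = torch.from_numpy(self.labels[idx, choice]).long()
        return pts, lbl
```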
I extracted several point clouds (4096, 3) and visualized them. It seems each small point cloud was not randomly sampled from a larger point cloud. I'm wondering how to properly split a large point cloud into small pieces so that each piece has exactly 4096 points.
If I split a point cloud into tiles, I cannot guarantee that each tile has exactly 4096 points.
If I just randomly select 4096 points from the whole scene, geometric features are very difficult to preserve, yet here the object geometry is well preserved.
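For reference, I believe the original PointNet S3DIS preprocessing (room2blocks in indoor3d_util.py of charlesq34/pointnet) handles exactly this: the room is tiled into blocks on the ground plane, and each block is then resampled to exactly 4096 points, sampling with replacement when a block has fewer, so the local geometry inside each block stays intact. A rough sketch of that idea (function name and default values are mine, not copied from that repo):

```python
import numpy as np

def split_into_blocks(points, num_points=4096, block_size=1.0, stride=1.0):
    """Tile a room point cloud (N, C) with xyz in the first 3 columns into
    block_size x block_size blocks on the xy plane, then resample every block
    to exactly num_points points. Sketch only -- the reference preprocessing
    also drops nearly empty blocks and appends normalized-coordinate channels."""
    xyz = points[:, :3]
    mins, maxs = xyz.min(0), xyz.max(0)
    blocks = []
    for x0 in np.arange(mins[0], maxs[0], stride):
        for y0 in np.arange(mins[1], maxs[1], stride):
            mask = (
                (xyz[:, 0] >= x0) & (xyz[:, 0] < x0 + block_size)
                & (xyz[:, 1] >= y0) & (xyz[:, 1] < y0 + block_size)
            )
            idx = np.where(mask)[0]
            if idx.size == 0:
                continue
            # Sample exactly num_points indices; replace=True pads small blocks
            # by duplicating points instead of leaving them short.
            choice = np.random.choice(idx, num_points, replace=idx.size < num_points)
            blocks.append(points[choice])
    return np.stack(blocks) if blocks else np.empty((0, num_points, points.shape[1]))
```

That would explain why each (4096, 3) item still looks like a coherent chunk of a room rather than a sparse random subsample of the whole scene.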