Questions about custom datasets #6

eddienewton opened this issue Aug 28, 2022 · 6 comments

@eddienewton

Hi,

I'm trying to evaluate some custom point clouds and have a couple of questions.

  • In generate_mesh.py, the generate() function has a TODO to make it compatible with open scenes. If I don't have a ground truth and simply want to create a mesh from a point cloud, will this affect me?
  • In generate_mesh.py, the graph_cut() function mentions using a custom version of PyGCO to avoid conversions to int32. How large does a scene have to be before the int32 overflow becomes a problem?
  • Also, in mesh-tools there is a function occ2mesh(). Is this usable if the generate_mesh functions cannot handle large scenes?

Thanks in advance and great research!

@raphaelsulzer (Owner) commented Aug 29, 2022

Hi and thanks for your interest in the work.

In generate_mesh.py, the generate() function has a TODO to make it compatible with open scenes. If I don't have a ground truth and simply want to create a mesh from a point cloud, will this affect me?

It depends on whether your point cloud depicts a closed or an open object/scene, i.e. whether the object/scene is boundary-free or has boundaries. If you specifically want to reconstruct open scenes, it may be better to use the procedure with occ2mesh explained below.

In generate_mesh.py, the graph_cut() function mentions using a custom version of PyGCO to avoid conversions to int32. How large does a scene have to be before the int32 overflow becomes a problem?

It depends on the number of tetrahedra/cells (maybe >10M or so), i.e. indirectly on the number of points in your point cloud. In any case, PyGCO will warn you if there is an overflow.
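
If you want a rough feeling for when this becomes critical, here is a back-of-the-envelope sketch in Python (the scale factor and the exact quantity that overflows are assumptions on my side; the real check happens inside PyGCO):

import numpy as np

# Back-of-the-envelope check, not the exact PyGCO logic.
# ASSUMPTIONS: the float scores are multiplied by some scale factor before the
# int32 cast, and the risky quantity is the summed integer energy over all
# cells; both the factor and the exact failure point are guesses here.
INT32_MAX = np.iinfo(np.int32).max  # 2147483647

def summed_energy_fits_int32(n_cells, max_score, scale):
    # True if n_cells * (max_score * scale) stays below the int32 limit.
    return n_cells * max_score * scale < INT32_MAX

# Example: 10M cells, scores in [0, 1], hypothetical scale factor of 1000
print(summed_energy_fits_int32(10_000_000, 1.0, 1000.0))  # False -> overflow risk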

Also, in mesh-tools there is a function occ2mesh(). Is this usable if the generate_mesh functions cannot handle large scenes?

Yes, exactly. Additionally, because I do not convert the occupancies to int32 in this program, the result can sometimes be better. If you want to use it, you need to export the occupancy predictions as an .npz file. You can do this by adding export: ["prediction"] to the inference section of your config.yaml file. The .npz file can then be used to create the mesh with the occ2mesh tool from mesh-tools.
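
If you want to double-check the exported file before running occ2mesh, you can inspect it with numpy (the array names inside the archive depend on the export code, so this snippet only lists whatever is there):

import numpy as np

# Inspect the exported prediction file before passing it to occ2mesh.
# NOTE: the array names inside the archive depend on the export code,
# so we just list them together with their shape and dtype.
pred = np.load("prediction/out.npz")
for name in pred.files:
    arr = pred[name]
    print(name, arr.shape, arr.dtype)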

Hope this helps.

@eddienewton (Author)

Hi @raphaelsulzer, thanks for the insight.

I got feat to work with the colmap input, and was able to run dgnn.

When I ran dgnn, I had to disable the 'mesh' export option because the number of labels was not correct. I assume this is because my scan is an open scene.

Since I used colmap as the feat input, does that mean I need to feed in the colmap directory to occ2mesh? There is no tool to convert colmap to an npz, correct?

Thanks again.

@eddienewton (Author)

Hi @raphaelsulzer,

What procedure did you use to convert the ETH3D data to a mesh? The eth3d.yaml lists only "predictions" under the inference export option. Was occ2mesh used to create the mesh in this case?

I saw this in the readme regarding the ETH3D dataset.

[comment]: <> (### ETH3D dataset)
[comment]: <> (Reconstruct all training scenes of the ETH3D [3] dataset from the MVS point clouds used in the paper.)
[comment]: <> (1. Download and unzip the dataset in the `data` folder)
[comment]: <> (cd data)
[comment]: <> (bash download_eth3d.sh)
[comment]: <> (2. Reconstruct the meshes)
[comment]: <> (python run.py -i --config configs/eth3d.yaml)
[comment]: <> (To evaluate the results you can e.g. sample points on the meshes and use the [multi-view-evaluation]&#40;https://github.com/ETH3D/multi-view-evaluation&#41; tool provided by the ETH3D dataset authors.)

Thanks in advance again.

@eddienewton (Author)

Hi @raphaelsulzer,

I went ahead and was able to get feat to run with the following command:
./feat -w /project/dense -i . -o out -s colmap --adt 0.1

I then updated my config.yaml to the following:

inference:
  dataset: custom
  classes: null
  shapes_per_conf_per_class: 1
  files: null
  scan_confs: -1
  batch_size: 1024
  per_layer: 0
  has_label: 0
  model: best
  graph_cut: true
  fix_orientation: true
  metrics: []
  export: ["prediction"]

I ran DGNN using the following command:
python run.py -i -c configs/custom.yaml

Finally, I ran occ2mesh using the following command:
./occ2mesh -w /working_dir -i ./ -p prediction/out.npz -s colmap --adt 0.1 -e m --icomp -1

I had to manually set "try_to_make_manifold" to 1 to get a mesh back. This is the output from occ2mesh. The mesh has issues (it is incomplete and the wrong faces are shown). Am I missing a step?

Adaptive Delaunay triangulation...
	-1556957 input points
	-with 0.1 spacing
	-7079 output points
	-89174 finite facets
	-44522(+260) cells
	-done in 2s

Load prediction score...
	-from /project/dense/dgnn/prediction/out.npz
	-read 44782 cell scores.

Label cells without optimization by taking max score...
	-infinite cells will be labelled outside
mpOptions.try_to_close : 0
mpOptions.try_to_make_manifold : 1
mpOptions.number_of_components_to_keep : 1
mpOptions.factor_for_removing_large_faces : 305891131
export mesh

Create a manifold and closed mesh...
	-is manifold: 1
	-try to make manifold
	-has been made manifold: 1
	-number of components: 0
	-erased components: 0
	-is closed: 1
	-number of vertices: 1556957
	-number of edges: 0
	-number of faces: 0
	-done in 0s

Export mesh...
	-to ./_mesh.ply

-----OCC2MESH FINISHED in 9s -----
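
For reference, the missing faces can also be confirmed directly from the exported file; this small Python script just parses the PLY header (the path is the one from the log above):

# Count the elements declared in the PLY header of the exported mesh.
# Works for ASCII and binary PLY files, since the header itself is always ASCII.
def ply_element_counts(path):
    counts = {}
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="ignore").strip()
            if line.startswith("element"):
                _, name, num = line.split()
                counts[name] = int(num)
            elif line == "end_header":
                break
    return counts

print(ply_element_counts("./_mesh.ply"))
# e.g. {'vertex': 1556957, 'face': 0} -> all points kept, but no faces at all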

Thanks

@raphaelsulzer (Owner) commented Sep 6, 2022

Hi,

Great that you got mesh-tools to run from a colmap source. I haven't used that in a long time. Did you have to change a lot?

I can see that you do not use a graph cut optimization for creating the final mesh. Maybe adding one fixes some of your issues. You can try something like this:
./occ2mesh -w /working_dir -i ./ -p prediction/out.npz -s colmap --adt 0.1 -e m --icomp -1 --gco angle-0.5

Everything else looks good!

@eddienewton (Author)

Hi @raphaelsulzer,

Thanks for the graph cut suggestion. I'll try that soon.

Regarding the changes, they were minimal, and the CMake changes were just hacks. They were:

  • In CMakeLists.txt, I added src/IO/colmapIO.cpp to add_executable for FEAT and OCC2MESH
  • Commented out //#include <util/vectorArithmetic.h> in colmapInput.cpp
  • Uncommented lines 59 to 63 in feat.cpp
  • In colmapIO.h, changed void readColmapFiles(dirHolder dir, dataHolder& data); to void readColmapFiles(dirHolder& dir, dataHolder& data);
  • In fileIO.cpp, changed line 334 to CGAL::read_ply(in,import_mesh);
  • In learningIO.cpp line 453, changed the signature to int importPrediction(dirHolder dir, dataHolder& data, runningOptions options){

I am also attaching colmapInput.cpp and occ2mesh.cpp:
changes.zip

I didn't see the Eigen issue you mentioned when building Colmap.

Thanks.
