
Planar Reconstruction #1452

Merged · 5 commits · Apr 22, 2022

Conversation

@pierotofy (Member) commented Apr 22, 2022

This PR adds support for really fast planar reconstructions (e.g. agricultural fields).

Requirements:

  • Constant flight altitude
  • Nadir-only camera shots
  • Single pass lawnmower pattern
  • Planar or mostly planar terrain
  • Single camera (no multi-camera support at this moment). Multispectral images are fine.

Then to obtain an orthophoto really quickly, one can pass:

--sfm-algorithm planar --matcher-neighbors 4 --fast-orthophoto

If one needs a full 3D model from the mostly planar scene, one can omit the --fast-orthophoto flag and a full 3D reconstruction will still take place.
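For context, a complete invocation might look like the sketch below, assuming the standard opendronemap/odm docker image; the dataset path and project name are placeholders.

```shell
# Hypothetical full run; /datasets/project is a placeholder directory
# containing an images/ subfolder with the nadir shots.
docker run -ti --rm -v /datasets:/datasets opendronemap/odm \
    --project-path /datasets project \
    --sfm-algorithm planar \
    --matcher-neighbors 4 \
    --fast-orthophoto
```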

Experimental! 💥

@pierotofy (Member Author)

[screenshots]

@pierotofy (Member Author) commented Apr 22, 2022

Sheffield park (78 images, not 100% planar):

Incremental reconstruction time:

real	3m6.405s
user	4m15.729s
sys	0m15.910s

Planar reconstruction time:

real	2m14.561s
user	3m16.540s
sys	0m15.792s

@pierotofy (Member Author) commented Apr 22, 2022

Sunset park (69 images, mostly planar):

Incremental:

real	4m0.041s
user	7m2.639s
sys	0m18.828s

Planar:

real	0m56.732s
user	1m45.183s
sys	0m3.751s

@pierotofy (Member Author)

Brighton beach (18 images, not 100% planar):

Incremental:

real	0m27.320s
user	0m45.607s
sys	0m3.248s

Planar:

real	0m18.697s
user	0m31.375s
sys	0m1.966s

@pierotofy (Member Author)

Agremo dataset (195 images, AG field):

Incremental:

real	7m41.867s
user	10m25.661s
sys	0m36.215s

Planar:

real	4m2.456s
user	6m3.528s
sys	0m26.396s
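Across the four datasets reported in this thread, the wall-clock ("real") speedups of planar over incremental work out as follows; a quick shell sketch using the timings above, converted to seconds:

```shell
# Planar vs. incremental speedup (wall-clock "real" time, in seconds),
# using the timings reported above for each dataset.
speedup() { awk -v a="$1" -v b="$2" 'BEGIN { printf "%.1fx\n", a/b }'; }

speedup 186.405 134.561   # Sheffield park  -> 1.4x
speedup 240.041  56.732   # Sunset park     -> 4.2x
speedup  27.320  18.697   # Brighton beach  -> 1.5x
speedup 461.867 242.456   # Agremo dataset  -> 1.9x
```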

@pierotofy (Member Author)

I expect the numbers to be even better on machines with lots of cores, as this algorithm has less lock contention during multi-threaded operations.

@pierotofy (Member Author) commented Apr 22, 2022

Future improvements could include a more robust outlier filter for degenerate camera poses by looking at points that fall outside of the main plane.

@pierotofy pierotofy merged commit f389ab2 into OpenDroneMap:master Apr 22, 2022
@Saijin-Naib (Contributor)

Did you experience increased memory consumption/swapping with planar?

I'm testing it on my Old_Orchard dataset against defaults and noticed that with planar all 32 GB of RAM has been consumed and I'm currently pushing 30 GB of swap, which wasn't even touched with incremental/defaults.

@pierotofy (Member Author)

It shouldn't take more memory during the reconstruction step, but perhaps the reconstruction ended up degenerate and some other issue caused out of memory problems. Do you know at which step in the process the memory usage went up?

@Saijin-Naib (Contributor)

Partway through the OpenMVS phase.

Options:

auto-boundary: true, dsm: true, sfm-algorithm: planar

Dataset imported via Cloud Import - GitHub:
https://github.com/Saijin-Naib/sUAS_Photogrammetry_Suite_Test_Data/tree/trunk/datasets/OldOrchard_2017-07-22

Node:

node-odm-1 (manual)

Versions:

CONTAINER ID   IMAGE                        COMMAND                  CREATED        STATUS        PORTS                                         NAMES
ac7d7ee9f0a4   opendronemap/webodm_webapp   "/bin/bash -c 'chmod…"   21 hours ago   Up 12 hours   0.0.0.0:8000->8000/tcp, :::8000->8000/tcp     webapp
1f099b4e659a   opendronemap/webodm_webapp   "/bin/bash -c '/webo…"   21 hours ago   Up 12 hours                                                 worker
71289546ee79   opendronemap/nodeodm         "/usr/bin/node /var/…"   21 hours ago   Up 12 hours   0.0.0.0:49154->3000/tcp, :::49154->3000/tcp   webodm_node-odm_1
b3c5c593a7e5   redis                        "docker-entrypoint.s…"   21 hours ago   Up 12 hours   6379/tcp                                      broker
d32b63bd8590   opendronemap/webodm_db       "docker-entrypoint.s…"   21 hours ago   Up 12 hours   0.0.0.0:49153->5432/tcp, :::49153->5432/tcp   db

@pierotofy (Member Author) commented Apr 24, 2022

Correction: it might take more memory during opensfm reconstruct; I'm currently looking at possible memory bottlenecks. Try passing --max-concurrency 1 for the time being.

@pierotofy (Member Author) commented Apr 25, 2022

I was able to get a reconstruction with defaults and max-concurrency 1. Also this was done on CPU (no GPU processing).

[screenshot]

@Saijin-Naib (Contributor)

Did you run with --max-concurrency 1? You didn't observe any crazy RAM/SWAP usage?

@pierotofy (Member Author)

Not with concurrency at 1. With more than 1, memory usage does go a bit crazy (working on that...)

@pierotofy (Member Author)

Memory usage should be vastly improved with #1455

@Saijin-Naib (Contributor)

Vastly improved is an understatement :)

Didn't even come close to touching swap.

Massive improvement in processing times as a result!
[screenshot]

@smathermather (Contributor)

Starting small in my tests. Running a 2300 image dataset through... 😄

@smathermather (Contributor)

And a 430 image dataset... Ok, I've got a few worth trying.

@smathermather (Contributor)

Dang, that's super parallel...
[screen recording]

@smathermather (Contributor)

[screenshot]

@pierotofy (Member Author) commented Apr 29, 2022

But was the end result good also? 😄

@smathermather (Contributor) commented Apr 29, 2022

> But was the end result good also?

Yes! Not without artifacts, but that's to be expected. Where things are super smooth, though, it looks like very classic orthos:

[screenshots]

@smathermather (Contributor)

1855 images processed in 02:23:53. Not too shabby, for corridor-based mapping that ODM often struggles with. The 2000+ image set (more square than long and skinny) is taking a very long time at the orthophoto stage, however.

@smathermather (Contributor)

Ok, 2305 images, and I can't decide if the results are more Dali or Escher for the orthophoto:
[screenshot]

But the why is quite apparent from the point cloud:
[animated point cloud view]

That could be why the odm_orthophoto step took so long, too. That's one complex mesh to render.

Rerunning with the following, as this is a lovely RTK dataset with 4cm accuracy data:
fast-orthophoto: true, gps-accuracy: .08, matcher-neighbors: 4, sfm-algorithm: planar

Although this accuracy is set in the GPSXYAccuracy and GPSZAccuracy tags, so maybe my fix is a fool's errand.

@pierotofy (Member Author)

Is the terrain actually planar in this dataset? I would imagine that as an area gets wider the planar assumption starts to break down more and more (and bundle adjustment can no longer optimize a solution).

@smathermather (Contributor)

Yeah, that's the violated assumption for this large an area, for sure, but I was hoping that at a sufficiently large scale, that might be less of an issue:
[screenshots]

@pierotofy (Member Author)

It would be interesting to see if combining this with split-merge could handle it better.

@smathermather (Contributor)

Oooh. On it.

@smathermather (Contributor) commented May 3, 2022

> It would be interesting to see if combining this with split-merge could handle it better.

It's much closer. Something weird happened, but it's nearly complete and quite good overall:
[screenshots]

Also, it's not quite done yet, but it's cropping the orthophoto (poor GDAL...) and here are the times as they stand now:

Incremental: [screenshot]

Field: [screenshot]

@smathermather (Contributor)

Final time: 14:51:40. Not too shabby. I'll try turning down the size of the submodels. That missing bit is weird, but it's too late tonight to investigate.

@smathermather (Contributor) commented May 3, 2022

> That missing bit is weird, but it's too late tonight to investigate.

The reason for those gaps is most likely the default split-overlap setting of 150m:
[screenshot]

Now rerunning with 320m of overlap, and just for giggles, I've got the split size down to 200 so I can use more cores.
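In flag form, that rerun corresponds to something like the sketch below; the docker wrapper and paths are placeholders, and the split values are the ones described above.

```shell
# Planar SfM combined with split-merge: ~200-image submodels
# with 320 m of overlap between them.
docker run -ti --rm -v /datasets:/datasets opendronemap/odm \
    --project-path /datasets project \
    --sfm-algorithm planar \
    --split 200 \
    --split-overlap 320
```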

@smathermather (Contributor)

That will quite do:
[screenshots]

It is missing the southwest corner. I haven't checked yet, but I have seen that before when cutlines fail to converge. I am re-running with a 300 image split, as that will reduce the number of submodels and thus hopefully the probability of missing submodels. But the gap in the middle is now filled, and the data look quite good. The combination of planar reconstruction and split-merge seems quite suitable even for hilly areas.

@nextechit commented May 5, 2022

I have run my first test on a dataset of 1600 images taken with a P4RTK. It processed much quicker (9:28 vs 5:49), but according to the report there is a GPS error of 427m, whereas without planar the GPS error was 0.01. What could be causing this? I'll try running it again without fast-orthophoto and see if there's any difference.

Settings used:
fast-orthophoto: true, feature-quality: medium, matcher-neighbors: 4, orthophoto-resolution: 2, sfm-algorithm: planar

[screenshot]

@smathermather (Contributor)

> What could be causing this?

My guess: the estimates of error haven't been redesigned for this approach, so they might be meaningless. But I didn't write any of the code, so this is pure hunch.

@smathermather (Contributor)

> I am re-running at a 300 image split as that will reduce the number of submodels and thus hopefully reduce the probability of missing submodels.

The gap is still filled:
[screenshot]

@kikislater (Member) commented May 9, 2022

2700 images from an eBee S110 NIR, 128 threads (AMD EPYC 7662), 128 GB RAM allocated =>

[screenshot]
5h22

Output:
[screenshot]

Some artifacts in one corner:
[screenshot]

This corner is really bad; maybe removing some pictures will help:
[screenshot]

Otherwise pretty good and usable

@smathermather (Contributor)

You might try setting split to some multi-hundred image value, say 300-400.

@kikislater (Member)

Works pretty well! But it took 2x the time of the standard parameters.

[screenshots]

@pierotofy (Member Author)

I would expect that if you set max-concurrency to 1. The algorithm is heavily parallelized and will be fast only if you let it run in parallel.

@kikislater (Member)

I think it was slowed down a bit by split: 400 rather than by max-concurrency, no?

@kikislater (Member)

I tested on multiple datasets, such as marine ones. A split of ~300 is needed in all my tests to get proper results.

@smathermather (Contributor)

> I think it was slowed down a bit by split: 400 rather than by max-concurrency, no?

No: one of the major enhancements here is maximizing concurrency. So if you constrain it to a single thread, you remove the major performance improvement.

@smathermather (Contributor)

> I tested on multiple datasets, such as marine ones. A split of ~300 is needed in all my tests to get proper results.

I'm finding something quite similar over cities: somewhere between 200-400 seems to be a sweet spot for assuming planarity of input data, even for a moderately hilly city.

@nextechit commented May 19, 2022

[screenshot]

Took a hell of a lot longer to complete with planar. This is on an agricultural field that is "relatively flat". Will retry with different settings.
