
Commit

Add a dataset downloader script
vork committed May 10, 2021
1 parent f7ce432 commit 52b0549
Showing 2 changed files with 45 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -1,6 +1,6 @@
# NeRD: Neural Reflectance Decomposition from Image Collections

### [Project Page](https://markboss.me/publication/2021-nerd/) | [Video](https://youtu.be/JL-qMTXw9VU) | [Paper](https://arxiv.org/abs/2012.03918)
### [Project Page](https://markboss.me/publication/2021-nerd/) | [Video](https://youtu.be/JL-qMTXw9VU) | [Paper](https://arxiv.org/abs/2012.03918) | [Dataset](download_datasets.py)

Implementation for NeRD, a novel method that decomposes multiple images into shape, BRDF, and illumination.
<br><br>
@@ -13,7 +13,7 @@ Implementation for NeRD. A novel method which decomposes multiple images into sh

## Datasets

Soon
All datasets are released in separate git repositories. We have created a [download script](download_datasets.py) which automatically fetches all datasets and downloads them to a specified folder.
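Assuming Python 3 and git are available, the folder layout the script produces can be sketched as follows (the root folder name here is a placeholder, not mandated by the script):

```python
import os

# Hypothetical root passed on the command line:
#   python download_datasets.py ./nerd_datasets
root = "./nerd_datasets"

# Clones are grouped by variant under the root, e.g.
# ./nerd_datasets/synthetic/Globe and ./nerd_datasets/real_world/Gnome.
for variant, name in [("synthetic", "Globe"), ("real_world", "Gnome")]:
    print(os.path.join(root, variant, name))
```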


## Citation
43 changes: 43 additions & 0 deletions download_datasets.py
@@ -0,0 +1,43 @@
import argparse
import os
import subprocess

# Synthetic scenes.
datasets_synthetic = [
    ("Globe", "https://github.com/vork/globe_syn_photogrammetry.git"),
    ("Car", "https://github.com/vork/car_syn_photogrammetry.git"),
    ("Chair", "https://github.com/vork/chair_syn_photogrammetry.git"),
]

# Real-world captures.
datasets_real_world = [
    ("Gnome", "https://github.com/vork/gnomes-photogrammetry.git"),
    ("GoldCape", "https://github.com/vork/moldGoldCape.git"),
    ("EthiopianHead", "https://github.com/vork/ethiopianHead.git"),
    ("StatueOfLiberty", "https://github.com/vork/StatueOfLiberty-Photogrammetry.git"),
    ("MotherChild", "https://github.com/vork/mother_child-photogrammetry.git"),
]


def parse_args():
    parser = argparse.ArgumentParser(description="Download all NeRD datasets.")
    parser.add_argument("dataset_root", help="The root location of the datasets.")
    return parser.parse_args()


def main(args):
    dataset_root = args.dataset_root

    variants = [("synthetic", datasets_synthetic), ("real_world", datasets_real_world)]

    for name, data in variants:
        # Group the clones by variant, e.g. <root>/synthetic/Globe.
        dataset_variant_root = os.path.join(dataset_root, name)
        os.makedirs(dataset_variant_root, exist_ok=True)

        for dname, durl in data:
            dataset_path = os.path.join(dataset_variant_root, dname)

            # Skip datasets that were already cloned; passing the command as an
            # argument list avoids shell-quoting issues, and check=True raises
            # if a clone fails.
            if not os.path.exists(dataset_path):
                subprocess.run(["git", "clone", durl, dataset_path], check=True)


if __name__ == "__main__":
    main(parse_args())
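For reference, the per-dataset clone step boils down to building a `git clone <url> <dest>` command. A minimal sketch of that construction is below (the destination path is hypothetical; the URL is the Globe dataset repository from the script):

```python
import os

def build_clone_command(url, dest):
    # Argument-list form keeps the URL and path as single arguments,
    # with no shell quoting required.
    return ["git", "clone", url, dest]

cmd = build_clone_command(
    "https://github.com/vork/globe_syn_photogrammetry.git",
    os.path.join("nerd_datasets", "synthetic", "Globe"),
)
print(" ".join(cmd))
```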
