Dissertation Code Base

"Inverse Procedural Modeling: from Sketches to Buildings"

Quick Start (Docker)

A Docker image is available: run inference, load the predicted parameters, and render, all in one button click (a consolidated command sketch follows this list):

  • docker pull registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0

    • the image is 12 to 17 GB in size; it may be possible to shrink it in the future
  • docker run --rm -d -p 8502:8502 registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0, or use the corresponding image id: docker run --rm -d -p 8502:8502 <image id>

  • go to http://localhost:8502/ for the Streamlit-based web page

    • upload a sketch image (e.g. ./sample.png)
    • click Inference & Load Param & Render
    • wait for the rendered image and predicted params to show up; the process may consume about 10 GB of RAM (CPU-only for now; a GPU version may come with an NVIDIA container soon)
  • stopping the docker container:

    • docker ps to find the container id
    • docker stop <container id>
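
Putting the steps above together, a minimal shell session might look like the following. The image tag is taken from this list; the container id placeholder must be filled in from the docker ps output.

```bash
# Pull the published image (12 to 17 GB download).
docker pull registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0

# Run detached, exposing the Streamlit app on port 8502.
docker run --rm -d -p 8502:8502 \
    registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0

# Open http://localhost:8502/ in a browser, upload a sketch (e.g. ./sample.png),
# and click "Inference & Load Param & Render".

# When done: find the container id, then stop the container.
docker ps
docker stop <container id>
```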

Project materials and usage documentation:

Preliminary:

  • install the conda environment from environment.yml (a setup sketch follows this list)
  • install Geometry Scripts if you wish to experiment with node tree generation from Python
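
Assuming a standard conda installation, environment setup might look like the sketch below. The environment name is defined by the "name:" field inside environment.yml, so the name used here (building_dag) is only a placeholder.

```bash
# Create the environment from the repository's spec file;
# the actual environment name comes from the "name:" field in environment.yml.
conda env create -f environment.yml

# Activate it (replace building_dag with the name from environment.yml).
conda activate building_dag
```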

Shape program code:

  • basic_building.py, building_mass.py, building4distortion.py, ledge.py, roof.py, shape_primitives.py, window_with_ledge.py

Dataset generation code:

  • dataset_counter.py, dataset_gen.py, distortion.py, generate_dataset.py, merge_dataset.py, paramgen.py, paramload.py, params.py, render.py

Neural network code:

  • nn_*.py

Evaluation code:

  • average_performances.py, nn_acc.py, performance.py

User Interface code:

  • ui_*.py

Blend files:

  • dataset.blend for generating synthetic datasets
  • interface.blend for user interface
  • distortion.blend for rendering distorted sketches
  • dataset_distortion.blend for generating distortion datasets

Output files:

  • ./models/*: model training outputs, including checkpoints, loss records, meta info for backup, loss-curve visualizations, and a notes file
  • ./datasets/*: dataset directory, containing generated DAGDataset(s) and DAGDatasetDistorted(s)
  • in the working directory: results*.yml contains model test outputs, performance*.yml contains model evaluation results, and performance*.pdf visualizes the evaluation results
  • ./inference/*: captured sketch and model output files

Pipelines:

  • Generating a dataset: run dataset_gen.py, or generate_dataset.py with command-line args. Usage: python generate_dataset.py batch_num sample_num varying_params_num device distortion; e.g. python generate_dataset.py 10 10 5 0 0 (see the annotated sketch after this list).
  • Neural network training: run nn_driver.py; modify the config in code as needed.
  • User Interface: open interface.blend with Blender 3.2, go to Scripting and run ui_interface.py; the panel should appear under the tool section. For testing without a PyTorch installation and model weight files, switch the import from ui_external_inference.py to ui_mock_inference.py. Click the pencil icon to draw with Blender's annotation tool, and toggle and adjust the camera view as needed. To run inference, make sure proper model weights exist at ./models/EncDecModel.pth with a corresponding meta file at ./models/meta.yml, and that the ./inference folder has been created (see the sketch after this list).
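
The sketch below expands the two command-line steps above. The meanings of the dataset-generation arguments are inferred from their names in the usage line, so treat those comments as assumptions rather than documented semantics.

```bash
# Dataset generation; positional args per the usage line above:
#   batch_num           number of batches to generate (assumed)
#   sample_num          samples per batch (assumed)
#   varying_params_num  number of parameters varied across samples (assumed)
#   device              device selector, 0 in the example (assumed)
#   distortion          whether to generate the distorted variant, 0/1 (assumed)
python generate_dataset.py 10 10 5 0 0

# Inference prerequisites for the Blender UI: weights, meta file, output folder.
mkdir -p inference
ls ./models/EncDecModel.pth ./models/meta.yml  # both files must be present
```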

Data:

  • All data used for training and testing is generated by our own dataset generation pipeline.
