
A Similarity Measure for Material Appearance

Links: Project page · Paper · Conference · Journal


Abstract

We present a model to measure the similarity in appearance between different materials, which correlates with human similarity judgments. We first create a database of 9,000 rendered images depicting objects with varying materials, shape and illumination. We then gather data on perceived similarity from crowdsourced experiments; our analysis of over 114,840 answers suggests that indeed a shared perception of appearance similarity exists. We feed this data to a deep learning architecture with a novel loss function, which learns a feature space for materials that correlates with such perceived appearance similarity. Our evaluation shows that our model outperforms existing metrics. Last, we demonstrate several applications enabled by our metric, including appearance-based search for material suggestions, database visualization, clustering and summarization, and gamut mapping.

Setting it up

Note that this code has been tested with Python 3.7.

Dependencies

First, clone the project and install its dependencies:

# clone project   
git clone https://github.com/mlagunas/material-appearance-similarity.git   

cd material-appearance-similarity 
pip install scipy numpy matplotlib umap-learn Pillow
# install pytorch/torchvision (https://pytorch.org)

Then, download the pretrained model weights (the scripts below expect them at data/model_best.pth.tar).

How to run the code

Training a new model

Make sure that you have downloaded all the training images. Also, make sure that you have the users' judgements on material similarity: these are two JSON files inside ./data, namely answers_processed_train.json and answers_processed_test.json. Finally, make sure that you have the uncropped images of each material with the Havran geometry (./data/havran1_ennis_298x298_LDR).

Then, set those arguments in the training script and run it:

python train.py --train-dir data/split_dataset --test-dir data/havran1_ennis_298x298_LDR

Using the default values in the script, the trained model yields an agreement of 81.99% with users' answers.
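As a rough illustration of what this agreement number measures (a sketch under our own assumptions, not the repository's evaluation code), each crowdsourced answer can be read as a triplet (reference, chosen, rejected); the model agrees with an answer when the chosen material lies closer to the reference in the learned feature space:

```python
import numpy as np

def agreement(embs, answers):
    """Fraction of triplet answers (ref, chosen, rejected) for which the
    chosen material is closer to the reference in the embedding space.
    `embs` maps material name -> feature vector (hypothetical layout)."""
    hits = 0
    for ref, chosen, rejected in answers:
        d_chosen = np.linalg.norm(embs[ref] - embs[chosen])
        d_rejected = np.linalg.norm(embs[ref] - embs[rejected])
        hits += d_chosen < d_rejected
    return hits / len(answers)

# toy example with 1-D "features"
embs = {'a': np.array([0.0]), 'b': np.array([1.0]), 'c': np.array([5.0])}
print(agreement(embs, [('a', 'b', 'c'), ('a', 'c', 'b')]))  # 0.5
```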

Getting image feature vectors

Next, we obtain the feature vectors for a set of images. First, modify the paths inside get_embs.py:

...
weights_path = 'data/model_best.pth.tar'
imgs_path = 'data/havran1_stpeters_256x256_LDR'
embs_path = 'data/embs.mat' # we will store the obtained feature vectors in this path
...

Then, get the feature vectors for the downloaded images

python3 get_embs.py    
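The exact contents of embs.mat depend on get_embs.py; as a sketch (the variable names and dimensions here are assumptions, not the script's actual keys), feature vectors can be written to and read back from a .mat file with scipy:

```python
import numpy as np
from scipy.io import loadmat, savemat

# stand-in features: 4 images, 128-D vectors (the real ones come from the network)
embs = np.random.rand(4, 128).astype(np.float32)
names = ['nickel', 'chrome', 'teflon', 'brass']  # hypothetical material names

savemat('embs.mat', {'embs': embs, 'names': names})

data = loadmat('embs.mat')
print(data['embs'].shape)  # (4, 128)
```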

Get similar images

We can obtain the images most similar to a given reference using the previously computed feature vectors. First, set the paths and necessary variables in plot_similar.py. This will store the five images most similar to the reference, according to our metric, in the path data/nickel.

...
embs_path = 'data/embs.mat'  # .mat file with the embeddings
n_close_elems = 5  # number of close elements to find
reference_img = 'data/havran1_stpeters_256x256_LDR/nickel.png'
do_unit_norm = False # normalize feature vectors to have unit norm
...
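plot_similar.py handles the image loading and saving; the core lookup amounts to sorting by distance in the embedding space. A minimal sketch with random stand-in vectors (the optional normalization mirrors the do_unit_norm flag; this is our own illustration, not the script's code):

```python
import numpy as np

def most_similar(embs, ref_idx, n_close_elems=5, do_unit_norm=False):
    """Indices of the n_close_elems embeddings closest to embs[ref_idx]."""
    if do_unit_norm:
        embs = embs / np.linalg.norm(embs, axis=1, keepdims=True)
    dists = np.linalg.norm(embs - embs[ref_idx], axis=1)
    order = np.argsort(dists)
    return order[order != ref_idx][:n_close_elems]  # drop the reference itself

rng = np.random.default_rng(0)
embs = rng.random((20, 128))  # stand-in for the vectors in embs.mat
print(most_similar(embs, ref_idx=3))
```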

Generate UMAP plot

We can visualize the feature vectors generated for the images using a dimensionality-reduction algorithm such as UMAP. First, we set the path to the feature vectors inside plot_umap.py:

...
embs_path = 'data/embs.mat'
do_unit_norm = False # normalize feature vectors to have unit norm
...

To generate the plot we run:

python3 plot_umap.py

Citation

If you found this code useful, please cite our work as:

@article{lagunas2019similarity,
    author = {Lagunas, Manuel and Malpica, Sandra and Serrano, Ana and
    Garces, Elena and Gutierrez, Diego and Masia, Belen},
    title = {A Similarity Measure for Material Appearance},
    journal = {ACM Transactions on Graphics (SIGGRAPH 2019)},
    volume = {38},
    number = {4},
    year = {2019}
}
