This repository contains the code and data needed to reproduce the results reported in our article 'Does the data meet your expectations? Explaining diversity in a dataset of images', submitted to BNAIC 2020. Briefly, the idea is to explain sample representation in a dataset of greyscale images of circles and squares in terms of intuitive aspects such as size, position and pixel brightness.
- Make sure to install the dependencies listed in requirements.txt (a typical install-and-run sequence is sketched after this list)
- We used an Nvidia GTX 1080 Ti GPU, but it may be possible to replicate the results without a GPU
- To run the full experiment from scratch, run `python run_test.py --test_config=config/test_draw_128.py`. This will prompt the user to place the source data files from Quick, Draw! with Google (https://quickdraw.withgoogle.com/data)
- To use pre-trained models and previously collected data, download the draw_128 directory from here and place it under SpecCheck/_tests. Additionally, untar draw_128/qd_shapes.tar and draw_128/sd_shapes.tar to draw_128/qd_shapes and draw_128/sd_shapes respectively (see the extraction sketch below). After this, you can execute run_test.py as in the previous step.
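
For convenience, the from-scratch route can be summarised as follows. The only assumption beyond the steps above is a Python 3 environment with pip available; the run command itself is the one given in the list.

```bash
# install the Python dependencies listed in requirements.txt
pip install -r requirements.txt

# run the full experiment from scratch; this will prompt for the
# Quick, Draw! with Google source data files
python run_test.py --test_config=config/test_draw_128.py
```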
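
For the pre-trained route, the extraction step might look like the sketch below. The location of SpecCheck/_tests relative to your working directory and the internal layout of the archives are assumptions, so adjust the paths if your checkout differs.

```bash
# assuming the downloaded draw_128 directory has been placed under SpecCheck/_tests
cd SpecCheck/_tests/draw_128

# create the target directories and extract each archive into them
mkdir -p qd_shapes sd_shapes
tar -xf qd_shapes.tar -C qd_shapes
tar -xf sd_shapes.tar -C sd_shapes
```

After extraction, run run_test.py with the same config file as in the from-scratch step.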