conda env create -f environment.yml
run `jupyter notebook train.ipynb`. With this option, it is possible to monitor the training graph over time and choose an appropriate learning rate. For more detail about the training process and the selection of hyperparameters, please check the training note.
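The learning-rate selection mentioned above is typically done with a learning-rate range test: train for a few iterations while the learning rate grows exponentially and watch where the loss starts to diverge. A minimal sketch of that idea, using a toy model and synthetic data in place of the notebook's real setup (all names here are illustrative, not the notebook's actual code):

```python
import torch
import torch.nn as nn

def lr_range_test(model, criterion, data, lr_start=1e-6, lr_end=1.0, steps=50):
    """Train briefly while the learning rate grows exponentially,
    recording (lr, loss) pairs; pick an lr just before the loss blows up."""
    lrs, losses = [], []
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_start)
    gamma = (lr_end / lr_start) ** (1.0 / (steps - 1))
    x, y = data
    for step in range(steps):
        lr = lr_start * gamma ** step
        for group in optimizer.param_groups:
            group["lr"] = lr
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
        lrs.append(lr)
        losses.append(loss.item())
    return lrs, losses

# Toy regression problem standing in for the real training data.
torch.manual_seed(0)
model = nn.Linear(4, 1)
x, y = torch.randn(32, 4), torch.randn(32, 1)
lrs, losses = lr_range_test(model, nn.MSELoss(), (x, y))
```

Plotting `losses` against `lrs` on a log axis gives the familiar learning-rate finder curve.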
this code performs the following stages to find similar images for each clothing image:
- extract a feature vector for every image
- build a k-nearest-neighbor tree from all the extracted features
- for each image, query the tree for the K nearest neighbor features
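The three stages above can be sketched as follows. Random vectors stand in for the CNN features extracted from each image, and the variable names are illustrative, not the script's actual ones:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Stage 1: one feature vector per image (here: random 512-d stand-ins,
# unit-normalized as is common for retrieval features).
features = rng.normal(size=(1000, 512))
features /= np.linalg.norm(features, axis=1, keepdims=True)

# Stage 2: build a nearest-neighbor tree over all features.
k = 5
index = NearestNeighbors(n_neighbors=k + 1, algorithm="ball_tree").fit(features)

# Stage 3: query the tree for each image. The closest hit is the image
# itself (distance 0), so ask for k+1 neighbors and drop the first column.
distances, indices = index.kneighbors(features)
neighbors = indices[:, 1:]          # shape (n_images, k)
```

Each row of `neighbors` holds the indices of the k most similar images for the corresponding query image.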
python measure_sim.py -i PATH_TO_DEEP_FASHION_DATASET --output_dir DIR_FOR_KNN_RESULT
below are several sample results. The top-left image is the query image and the remaining ones are the retrieved images.
- Missing labels: DeepFashion labels one clothing category per image, but images with a model often contain more than one type of clothing. This can make it hard for the model to learn discriminative features for each category. Below are several samples that I picked from the dataset.
step 1:
find the learning rate. In the graph below, the learning rate of
freeze the ResNet encoder and fit the head of the network for 5 cycles. Below are the loss graphs over the 5 cycles.
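The freeze-and-fit step can be sketched in plain PyTorch. A tiny generic encoder/head pair stands in for the actual ResNet model from the notebook; only the head's parameters receive updates:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
encoder = nn.Sequential(nn.Linear(8, 16), nn.ReLU())   # stand-in for the ResNet encoder
head = nn.Linear(16, 3)                                # classification head
model = nn.Sequential(encoder, head)

# Freeze the encoder: its parameters get no gradients and are never updated.
for p in encoder.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
criterion = nn.CrossEntropyLoss()

x, y = torch.randn(64, 8), torch.randint(0, 3, (64,))
before_enc = encoder[0].weight.clone()
before_head = head.weight.clone()
for _ in range(5):                  # "5 cycles" in miniature
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()
```

After this loop the encoder weights are bit-for-bit unchanged while the head has moved, which is exactly what "freeze the encoder and fit the head" means.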
step 2:
unfreeze the whole network and, again, find the new learning rate. This time, I chose the learning rate as
fine-tune the whole network for 5 more cycles.
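A common recipe for this unfreeze-and-fine-tune step is discriminative learning rates: the pretrained encoder gets a smaller learning rate than the freshly trained head. A sketch under the same toy encoder/head stand-ins as above (not the notebook's actual code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
encoder = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
head = nn.Linear(16, 3)
model = nn.Sequential(encoder, head)

# Unfreeze the whole network.
for p in model.parameters():
    p.requires_grad = True

# Per-group learning rates: small for the pretrained encoder,
# larger for the head.
optimizer = torch.optim.Adam([
    {"params": encoder.parameters(), "lr": 1e-4},
    {"params": head.parameters(), "lr": 1e-3},
])
criterion = nn.CrossEntropyLoss()

x, y = torch.randn(64, 8), torch.randint(0, 3, (64,))
before_enc = encoder[0].weight.clone()
before_head = head.weight.clone()
for _ in range(5):                  # 5 more cycles
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()
```

Now both the encoder and the head move, but the encoder only takes small steps, which protects the pretrained features.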
step 3:
train the model with a larger image size. Again, find an appropriate learning rate and keep training the head for 5 more cycles. From the loss graph, there is a hint that the validation loss starts increasing, so we stop training here and save the best model so far.
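The stopping rule described above is early stopping: watch the validation loss each cycle and stop once it keeps increasing, keeping the best checkpoint seen so far. A minimal sketch with made-up loss values (the function and values are illustrative only):

```python
def train_with_early_stop(val_losses, patience=1):
    """Return (index of the best cycle, cycle at which training stopped).

    Stops once the validation loss has failed to improve for more than
    `patience` consecutive cycles.
    """
    best_idx, best_loss, bad = 0, float("inf"), 0
    stop_at = len(val_losses) - 1
    for i, loss in enumerate(val_losses):
        if loss < best_loss:
            best_idx, best_loss, bad = i, loss, 0   # new best checkpoint
        else:
            bad += 1
            if bad > patience:                      # loss kept rising: stop
                stop_at = i
                break
    return best_idx, stop_at

# Validation loss dips, then starts rising, as described in the text.
best, stopped = train_with_early_stop([0.9, 0.7, 0.6, 0.65, 0.72], patience=1)
```

Here the best model is the one from cycle 2 (loss 0.6), and training stops at cycle 4 after two consecutive increases.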
Accuracy

Confusion matrix
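Both metrics reported above can be computed from the model's predictions with scikit-learn. The labels below are a toy stand-in for the real validation set:

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

# Illustrative ground-truth labels and model predictions.
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])

acc = accuracy_score(y_true, y_pred)    # fraction of correct predictions
cm = confusion_matrix(y_true, y_pred)   # rows: true class, columns: predicted class
```

The diagonal of `cm` counts correct predictions per class; off-diagonal entries show which categories get confused with each other, which is useful for spotting the labeling issues discussed earlier.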