TilakD/Triplet_Loss_Classification

Utilize triplet loss and mean-shift based calibration for classification. This project shows a working proof of concept for pose classification.


Utilize the learning from one model to classify different data. For example, use a model that was trained to classify fruits to classify animals, with minimal changes.

This is done using triplet loss. For example, train a model to cluster fruit images, pass animal images through the fruit-clustering model, and extract the embeddings. Then use a mean-shift step to move the animal embeddings on top of the fruit embeddings and finalize the model. This approach can be extended to transfer the learning from "n" classes to "n" new classes.

This would be useful when you are running low on training data.
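The description above does not pin down a framework or loss variant, so the following is only a minimal NumPy sketch of the standard triplet loss formulation the project builds on; the margin value, the squared-Euclidean distance, and the embedding dimension are illustrative assumptions, not values taken from this repository.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Standard triplet loss over batches of embedding vectors (batch, dim).

    Encourages each anchor to sit closer to its positive (same class/pose)
    than to its negative (different class/pose) by at least `margin`.
    """
    d_pos = np.sum((anchor - positive) ** 2, axis=1)  # squared distance anchor-positive
    d_neg = np.sum((anchor - negative) ** 2, axis=1)  # squared distance anchor-negative
    return float(np.mean(np.maximum(d_pos - d_neg + margin, 0.0)))

# Toy usage with 4 triplets of 64-dimensional embeddings.
rng = np.random.default_rng(0)
a, p, n = rng.normal(size=(3, 4, 64))
print(triplet_loss(a, p, n))
```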

Idea: (figure)

To check the concept, a pose classification model was developed, and the results are below.

Final Results:

The model was trained on 707 images of Person 1, and a total of 68 images of Person 2 were passed through the network. The embeddings were translated, and testing was done on 995 Person 2 images, of which 744 were predicted correctly (around 75%). NOTE: each person's data contains 6 poses.

Initial Steps:

  1. Create your model folder inside the experiments folder
  2. Copy the params.json file from experiments/base_model_v2/ into the folder you added in Step 1
  3. Be sure to check train_size, eval_size and image_type (see the sketch after this list)
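The exact contents of params.json are not listed in this README; only train_size, eval_size and image_type are called out above. The sketch below simply writes a hypothetical configuration so you can see where those keys live; the remaining keys and all values are placeholders to be replaced with the ones from experiments/base_model_v2/params.json.

```python
import json
from pathlib import Path

# Hypothetical params.json contents -- only train_size, eval_size and
# image_type are named in this README; everything else is a placeholder.
params = {
    "train_size": 707,       # number of training images (placeholder value)
    "eval_size": 68,         # number of evaluation images (placeholder value)
    "image_type": "png",     # file extension of the input images (placeholder)
    "learning_rate": 1e-3,   # assumed hyperparameter, not confirmed by the repo
    "batch_size": 32,        # assumed hyperparameter, not confirmed by the repo
}

model_dir = Path("experiments/my_model")   # folder created in Step 1
model_dir.mkdir(parents=True, exist_ok=True)
(model_dir / "params.json").write_text(json.dumps(params, indent=4))
```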

File structure:

├── data
│   ├── train
│   │   ├── Pose 0
│   │   │   ├── image01.png
│   │   │   ├── image02.png
│   │   │   └── ...
│   │   ├── Pose 1
│   │   ├── Pose 2
│   │   ├── ...
│   │   └── Pose 6
│   ├── test
│   │   ├── Pose 0
│   │   │   ├── image01.png
│   │   │   ├── image02.png
│   │   │   └── ...
│   │   ├── Pose 1
│   │   ├── Pose 2
│   │   ├── ...
│   │   └── Pose 6
│   └── validation
│       ├── Pose 0
│       │   ├── image01.png
│       │   ├── image02.png
│       │   └── ...
│       ├── Pose 1
│       ├── Pose 2
│       ├── ...
│       └── Pose 6
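As a concrete illustration of this layout (not code from the repo), the sketch below walks one split directory and pairs every image with the pose index parsed from its folder name; the .png extension is taken from the tree above.

```python
from pathlib import Path

def list_images(split_dir):
    """Collect (image_path, pose_label) pairs from a <split>/Pose <k>/ layout."""
    pairs = []
    for pose_dir in sorted(Path(split_dir).glob("Pose *")):
        label = int(pose_dir.name.split()[-1])  # "Pose 3" -> 3
        for img_path in sorted(pose_dir.glob("*.png")):
            pairs.append((img_path, label))
    return pairs

train_pairs = list_images("data/train")
poses = {label for _, label in train_pairs}
print(f"{len(train_pairs)} training images across {len(poses)} poses")
```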

Train:

python train.py --model_dir experiments/base_model_v2 --data_dir data/cropped_img

You will first need to create a configuration file similar to params.json. This JSON file specifies all the hyperparameters for the model. All the weights and summaries will be saved in the model_dir.

Once trained, you can visualize the embeddings by running:

python visualize_embeddings.py --model_dir experiments/base_model_v2 --data_dir data/cropped_img

And run tensorboard in the experiment directory:

tensorboard --logdir experiments/base_model_v2

Save Landmark Embeddings (For train set):

python landmark_embeddings.py --model_dir experiments/base_model_v2 --data_dir data/cropped_img/train

This will save the embeddings and labels of all landmarks.
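The file names and format written by landmark_embeddings.py are not documented in this README, so the snippet below only assumes a common convention (NumPy arrays of embeddings and integer labels) to show how the saved output could be consumed, e.g. to compute a per-pose mean embedding.

```python
import numpy as np

# Assumed output files -- adjust to whatever landmark_embeddings.py actually writes.
embeddings = np.load("experiments/base_model_v2/train_embeddings.npy")  # shape (N, dim)
labels = np.load("experiments/base_model_v2/train_labels.npy")          # shape (N,)

# Per-pose mean embedding, useful for the mean-shift translation step below.
pose_ids = np.unique(labels)
pose_means = np.stack([embeddings[labels == k].mean(axis=0) for k in pose_ids])
print(pose_means.shape)  # (num_poses, dim)
```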

Evaluate (On test set):

python evaluate.py --model_dir experiments/base_model_v2 --data_dir data/cropped_img

Clustering verification:

This can be done by opening check_p2_cluster.ipynb. The model was trained on Person 1 images, and below is the resulting cluster. (figure)

A small set of Person 2 images was passed through the trained network. All trained Person 1 embeddings were translated onto the Person 2 clusters (check this notebook), and testing was done on the whole of the Person 2 data. (figure)

Below is an image overlaying the clusters of the Person 1 and Person 2 training embeddings. (figure)

Mean Shift (translation):

Shift the train clusters on top of the clusters generated by the new person. Check this notebook. (figure)
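The translation notebook is not reproduced here; the sketch below shows one straightforward way to realise the described shift, moving the Person 1 (train) embedding cloud by the difference between cluster means. Whether the notebook applies a single global offset or a per-pose offset is an assumption on my part, so both variants are shown.

```python
import numpy as np

def translate_global(p1_emb, p2_emb):
    """Shift all Person 1 embeddings so their mean lands on the Person 2 mean."""
    return p1_emb + (p2_emb.mean(axis=0) - p1_emb.mean(axis=0))

def translate_per_pose(p1_emb, p1_labels, p2_emb, p2_labels):
    """Shift each Person 1 pose cluster onto the matching Person 2 pose cluster."""
    out = p1_emb.copy()
    for k in np.unique(p1_labels):
        shift = p2_emb[p2_labels == k].mean(axis=0) - p1_emb[p1_labels == k].mean(axis=0)
        out[p1_labels == k] += shift
    return out
```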

Testing:

Testing was done; see the Testing_P1_Train_P2_Translation_Test notebook. (figure)

Predict (use this only after the Person 1 train embeddings have been translated on top of Person 2):

python predict.py --model_dir experiments/base_model_v2 --data_dir data/cropped_img/test/image_193.jpg
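The internals of predict.py are not shown in this README. Conceptually, once the Person 1 train embeddings have been translated, a new image can be embedded and assigned the pose of the nearest translated centroid; the sketch below illustrates only that nearest-centroid step (embedding extraction is left to the trained model) and is an assumption, not the script's actual code.

```python
import numpy as np

def predict_pose(query_embedding, translated_emb, labels):
    """Classify one embedding by its nearest translated Person 1 pose centroid."""
    pose_ids = np.unique(labels)
    centroids = np.stack([translated_emb[labels == k].mean(axis=0) for k in pose_ids])
    distances = np.linalg.norm(centroids - query_embedding, axis=1)
    return int(pose_ids[np.argmin(distances)])
```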
