Project: Transfer Learning and Ensemble Fine-Tuning of VGG16 Models

Overview

This project applies transfer learning and fine-tuning techniques using VGG16 for multi-class image classification.
The convolutional base is frozen and repurposed as a feature extractor, followed by custom dense classification heads with dropout, ReLU activations, and softmax output layers.
Both two-stage (frozen→unfrozen) and single-stage fine-tuning strategies are evaluated using Adam and SGD optimizers.
Finally, an ensemble aggregates model predictions by averaging softmax probabilities across trained variants.

Methodology

  • Pretrained VGG16 (ImageNet) as a convolutional backbone.
  • Global Average Pooling → Dense(512, ReLU) → Dropout(0.5) → Dense(256, ReLU) → Dropout(0.3) → Softmax output.
  • Optimizers: Adam and SGD, each with the learning rate reduced from 1e-3 to 1e-5 for fine-tuning.
  • Early stopping (patience = 10).
  • Data augmentation via Keras ImageDataGenerator (rotation, shear, zoom, shift, flip).
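The bullets above map directly onto a Keras model. A minimal sketch, assuming a placeholder class count and 224×224 RGB inputs; `weights=None` keeps the sketch offline, whereas the project itself loads ImageNet weights:

```python
# Sketch of the frozen-base VGG16 classifier described above.
# NUM_CLASSES, the input shape, and weights=None are placeholders.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16
from tensorflow.keras.callbacks import EarlyStopping

NUM_CLASSES = 10  # placeholder

base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # stage A: frozen convolutional backbone

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(512, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Early stopping with the patience quoted above.
early_stop = EarlyStopping(monitor="val_loss", patience=10,
                           restore_best_weights=True)
```

The dense head is the only trainable part at this stage; fine-tuning later unfreezes the top convolutional layers and recompiles at a lower learning rate.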

Results

Model                        | Optimizer | Phases                        | Accuracy
VGG16 + Logistic Regression  | —         | feature extractor             | 0.878
VGG16 + Random Forest        | —         | feature extractor             | 0.890
VGG16 + SVM                  | —         | feature extractor             | 0.920
Two-Stage                    | Adam      | A (frozen) + B (10 unfrozen)  | 0.961
Single-Stage                 | Adam      | 5 unfrozen layers             | 0.958
Two-Stage                    | SGD       | A (frozen) + B (10 unfrozen)  | 0.923
Ensemble (3 variants)        | —         | softmax avg                   | 0.963
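The ensemble row averages the per-class softmax probabilities of the trained variants before taking the argmax. A minimal NumPy sketch (the probability arrays are illustrative, not the project's outputs):

```python
import numpy as np

def ensemble_predict(prob_list):
    """Average softmax probabilities from several models, then argmax.

    prob_list: list of (n_samples, n_classes) arrays, one per model.
    Returns (predicted labels, averaged probabilities).
    """
    avg = np.mean(np.stack(prob_list, axis=0), axis=0)
    return avg.argmax(axis=1), avg

# Illustrative: three models' softmax outputs for two samples, three classes.
p1 = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
p2 = np.array([[0.5, 0.4, 0.1], [0.1, 0.7, 0.2]])
p3 = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])
labels, avg = ensemble_predict([p1, p2, p3])
# labels → [0, 1]
```

Averaging probabilities (rather than majority-voting labels) lets confident models outweigh uncertain ones on a per-sample basis.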

Key Insights

  • Fine-tuning deeper convolutional layers improves adaptation to target data.
  • Two-stage training stabilizes gradients and prevents catastrophic forgetting.
  • Ensemble averaging marginally outperforms the best individual model (0.963 vs. 0.961).
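The two-stage schedule reduces to a small layer-freezing helper. The layer objects below are stand-ins for Keras layers, which expose the same `trainable` attribute; the layer count is that of the Keras VGG16 graph:

```python
def set_stage(model_layers, stage, n_unfreeze=10):
    """Stage A: freeze every layer; stage B: unfreeze only the last n_unfreeze."""
    for layer in model_layers:
        layer.trainable = False
    if stage == "B":
        for layer in model_layers[-n_unfreeze:]:
            layer.trainable = True

# Stand-in layer objects (a real Keras layer also carries .trainable).
class Layer:
    def __init__(self):
        self.trainable = True

vgg_layers = [Layer() for _ in range(19)]  # VGG16 without its top has 19 layers
set_stage(vgg_layers, "A")                 # stage A: everything frozen
set_stage(vgg_layers, "B", n_unfreeze=10)  # stage B: last 10 layers trainable
```

After switching stages, the model must be recompiled so the optimizer picks up the new set of trainable weights; this is also the natural point to drop the learning rate.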

Defensive Perspective

Robustness against adversarial attacks (FGSM, PGD, CW, DeepFool, NES, Square Attack) is also explored, together with defenses such as adversarial training and diffusion-based purification, motivated by deployment in sensitive domains (medical imaging, remote sensing).
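As a concrete illustration of the simplest attack listed, FGSM perturbs an input along the sign of the loss gradient with respect to that input. A NumPy sketch on a toy logistic model (the weights, input, and epsilon are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(x, y, w, b):
    """Binary cross-entropy for p = sigmoid(w.x + b)."""
    p = sigmoid(w @ x + b)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def fgsm(x, y, w, b, eps):
    """FGSM: x_adv = x + eps * sign(dL/dx).

    For binary cross-entropy the input gradient is dL/dx = (p - y) * w,
    so the attack nudges x in the direction that increases the loss.
    """
    p = sigmoid(w @ x + b)
    grad = (p - y) * w
    return x + eps * np.sign(grad)

w = np.array([1.0, -2.0]); b = 0.0
x = np.array([0.5, 0.2]); y = 1.0
x_adv = fgsm(x, y, w, b, eps=0.1)
# the perturbation raises the loss on (x, y) while staying within eps per pixel
```

PGD iterates this step with projection back into the epsilon-ball, which is why it is the stronger baseline in the robustness comparison above.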

Files

  • train_feature_extractors.py — generates frozen VGG16 feature maps.
  • train_finetune_variants.py — fine-tuning of Adam/SGD variants.
  • ensemble_vgg16.py — ensemble aggregation of model outputs.
  • report.pdf — full technical write-up with adversarial analysis.
