A deep learning project for classifying cloud types from images using Convolutional Neural Networks (CNN) and Transfer Learning.
- Overview
- Features
- Installation
- Quick Start
- Usage
- Project Structure
- Model Architectures
- Results
- Contributing
- License
This project implements a cloud classification system that can identify 7 different types of clouds:
- Cirriform clouds (Cirrus)
- Clear sky
- Cumulonimbus clouds
- Cumulus clouds
- High cumuliform clouds
- Stratiform clouds
- Stratocumulus clouds
The system uses state-of-the-art deep learning models including ResNet50, EfficientNet, and MobileNetV2 with transfer learning for accurate classification.
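For orientation, here is a minimal sketch of the transfer-learning setup; the layer choices, names, and hyperparameters are assumptions for illustration, not the exact code in model.py:

```python
import tensorflow as tf

NUM_CLASSES = 7          # the seven cloud categories listed above
IMG_SIZE = (224, 224)    # matches IMG_SIZE in config.py

# Illustrative sketch: an ImageNet-pretrained ResNet50 backbone with a small
# classification head; the base is frozen for the initial training phase.
base = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Training only the new head first keeps the ImageNet features intact; the base can be unfrozen later for fine-tuning.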
- Multiple Model Architectures: Support for Simple CNN, ResNet50, EfficientNet, and MobileNetV2
- Comprehensive Evaluation: Confusion matrix, classification reports, and visualization tools
- Data Augmentation: Built-in augmentation to improve model generalization
- Training Monitoring: Real-time metrics tracking with early stopping and learning rate scheduling
- Easy Inference: Simple API for single-image and batch predictions
- Visualization Tools: Automatic generation of training curves and prediction visualizations
- Python 3.7 or higher
- TensorFlow 2.10 or higher
- CUDA (optional, for GPU acceleration)
- Clone the repository:

```bash
git clone https://github.com/yourusername/cloud-classification.git
cd cloud-classification
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Verify installation:

```bash
python test_setup.py
```

Then explore the dataset:

```bash
python data_loader.py
```

This will display dataset information and visualize sample images.
```bash
# Train with ResNet50 (recommended)
python train.py --model resnet50 --epochs 50 --batch_size 32

# Evaluate the trained model
python evaluate.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --visualize

# Predict on a new image
python predict.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --image path/to/image.jpg
```

Training:

```bash
# Basic training
python train.py --model resnet50 --epochs 50

# All options
python train.py \
    --model resnet50 \
    --epochs 50 \
    --batch_size 32 \
    --lr 0.001 \
    --save_dir models
```

Available models:
- `simple`: Simple CNN from scratch
- `resnet50`: ResNet50 with transfer learning (recommended)
- `efficientnet`: EfficientNetB0 (high accuracy)
- `mobilenet`: MobileNetV2 (lightweight, fast)

Training options:
- `--no_aug`: Disable data augmentation
- `--unfreeze`: Unfreeze the base model for fine-tuning
- `--batch_size`: Batch size (default: 32)
- `--lr`: Learning rate (default: 0.001)
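The training monitoring features (early stopping, learning-rate scheduling, CSV logging) are typically wired up as Keras callbacks; a minimal sketch under that assumption, with illustrative monitor names and patience values rather than the exact train.py configuration:

```python
import tensorflow as tf

# Illustrative callbacks; values are assumptions, not the exact train.py settings.
callbacks = [
    # Stop when validation accuracy stops improving and keep the best weights
    tf.keras.callbacks.EarlyStopping(
        monitor="val_accuracy", patience=10, restore_best_weights=True),
    # Halve the learning rate when validation loss plateaus
    tf.keras.callbacks.ReduceLROnPlateau(
        monitor="val_loss", factor=0.5, patience=5, min_lr=1e-6),
    # Save the best checkpoint, as produced under models/<run>/best_model.h5
    tf.keras.callbacks.ModelCheckpoint(
        "best_model.h5", monitor="val_accuracy", save_best_only=True),
    # Log per-epoch metrics, as in training_log.csv
    tf.keras.callbacks.CSVLogger("training_log.csv"),
]

# model.fit(train_gen, validation_data=val_gen, epochs=50, callbacks=callbacks)
```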
```bash
# Basic evaluation
python evaluate.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5

# With visualizations
python evaluate.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --visualize
```

Prediction:

```bash
# Single image
python predict.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --image image.jpg

# Batch prediction
python predict.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --dir images/

# Top-K predictions
python predict.py --model models/resnet50_YYYYMMDD_HHMMSS/best_model.h5 --image image.jpg --top_k 5
```
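If you want to call a trained model from Python rather than the CLI, a minimal sketch is shown below; the preprocessing and the format of class_mapping.json are assumptions based on the files listed under the project structure, not the exact predict.py API:

```python
import json
import numpy as np
import tensorflow as tf

MODEL_DIR = "models/resnet50_YYYYMMDD_HHMMSS"  # replace with your run directory

# Load the trained model and the class-index mapping saved during training
model = tf.keras.models.load_model(f"{MODEL_DIR}/best_model.h5")
with open(f"{MODEL_DIR}/class_mapping.json") as f:
    # Assumes a {"class name": index} mapping; invert it for lookup by index
    class_names = {int(v): k for k, v in json.load(f).items()}

def predict_image(path, img_size=(224, 224)):
    """Return the predicted class name and confidence for one image."""
    img = tf.keras.utils.load_img(path, target_size=img_size)
    x = tf.keras.utils.img_to_array(img)[np.newaxis] / 255.0  # assumed [0, 1] scaling
    probs = model.predict(x, verbose=0)[0]
    idx = int(np.argmax(probs))
    return class_names[idx], float(probs[idx])

print(predict_image("image.jpg"))
```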
Dataset visualization:

```bash
# Analyze dataset
python visualize_data.py --all

# Specific analyses
python visualize_data.py --stats    # Dataset statistics
python visualize_data.py --samples  # Visualize samples
python visualize_data.py --sizes    # Image size analysis
```

Project structure:

```
cloud-classification/
├── clouds_train/              # Training dataset
│   ├── cirriform clouds/
│   ├── clear sky/
│   ├── cumulonimbus clouds/
│   ├── cumulus clouds/
│   ├── high cumuliform clouds/
│   ├── stratiform clouds/
│   └── stratocumulus clouds/
├── clouds_test/               # Test dataset
│   └── [same structure as train]
├── models/                    # Trained models (generated)
│   └── resnet50_YYYYMMDD_HHMMSS/
│       ├── best_model.h5
│       ├── final_model.h5
│       ├── class_mapping.json
│       ├── training_log.csv
│       └── training_curves.png
├── evaluation_results/        # Evaluation outputs (generated)
├── config.py                  # Configuration file
├── data_loader.py             # Data loading and preprocessing
├── model.py                   # Model architectures
├── train.py                   # Training script
├── evaluate.py                # Evaluation script
├── predict.py                 # Prediction script
├── visualize_data.py          # Data visualization tools
├── example_usage.py           # Usage examples
├── quick_start.py             # Quick start guide
├── test_setup.py              # Setup verification
├── requirements.txt           # Dependencies
└── README.md                  # This file
```
Simple CNN: a custom architecture built from scratch (see the sketch after this list), with:
- 4 convolutional blocks
- Batch normalization and dropout
- Global average pooling
- Dense layers for classification
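A rough idea of what such an architecture looks like in Keras; filter counts and dropout rates are assumptions, not the exact model.py code:

```python
import tensorflow as tf

def conv_block(filters):
    """One of the four convolutional blocks: conv + batch norm + pooling + dropout."""
    return [
        tf.keras.layers.Conv2D(filters, 3, padding="same", activation="relu"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Dropout(0.25),
    ]

# Illustrative only: four conv blocks, global average pooling, dense head.
simple_cnn = tf.keras.Sequential(
    [tf.keras.layers.InputLayer(input_shape=(224, 224, 3))]
    + conv_block(32) + conv_block(64) + conv_block(128) + conv_block(256)
    + [
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(7, activation="softmax"),  # 7 cloud classes
    ]
)
```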
ResNet50: transfer learning with an ImageNet-pretrained ResNet50 base (fine-tuning sketch after this list):
- Freeze base model option
- Fine-tuning support
- High accuracy
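Fine-tuning is usually just unfreezing the backbone and recompiling with a much lower learning rate; the sketch below is an assumption about the workflow behind `--unfreeze`, not the exact train.py code:

```python
import tensorflow as tf

# Assumes a model built as in the transfer-learning sketch in the overview:
# an ImageNet-pretrained base plus a small classification head.
model = tf.keras.models.load_model("models/resnet50_YYYYMMDD_HHMMSS/best_model.h5")
base = model.layers[0]  # assumption: the pretrained backbone is the first layer

# Unfreeze the backbone and recompile with a much lower learning rate so the
# pretrained weights are only nudged, not overwritten.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(train_gen, validation_data=val_gen, epochs=10)
```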
EfficientNetB0: an architecture optimized for accuracy and efficiency:
- Compound scaling method
- State-of-the-art performance
MobileNetV2: a lightweight model suitable for mobile and edge devices:
- Depthwise separable convolutions
- Fast inference time
After training, you'll get:
- `best_model.h5`: Best model based on validation accuracy
- `final_model.h5`: Final model after all epochs
- `class_mapping.json`: Class-to-index mapping
- `training_log.csv`: Training history
- `training_curves.png`: Visualization of training metrics
After evaluation, you'll get:

- `confusion_matrix.png`: Confusion matrix visualization
- `class_metrics.png`: Per-class precision, recall, and F1-score
- `prediction_samples.png`: Sample predictions with images
- `evaluation_report.txt`: Detailed classification report
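These artifacts correspond to standard scikit-learn metrics; a minimal sketch of how a confusion matrix and classification report are computed (illustrative placeholder labels, not the exact evaluate.py code):

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

# y_true: integer labels from the test set; y_pred: argmax over model.predict(...) probabilities.
# Placeholder arrays stand in for real predictions here.
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 1, 2, 1, 1, 0])

print(confusion_matrix(y_true, y_pred))
print(classification_report(
    y_true, y_pred,
    target_names=["cirriform clouds", "clear sky", "cumulonimbus clouds"]))
```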
Edit config.py to customize:
```python
# Image settings
IMG_SIZE = (224, 224)
BATCH_SIZE = 32
EPOCHS = 50
LEARNING_RATE = 0.001

# Model settings
MODEL_NAME = 'resnet50'
PRETRAINED = True
FREEZE_BASE = True

# Data augmentation
USE_AUGMENTATION = True
AUGMENTATION_CONFIG = {
    'rotation_range': 20,
    'width_shift_range': 0.1,
    'height_shift_range': 0.1,
    'shear_range': 0.1,
    'zoom_range': 0.1,
    'horizontal_flip': True,
    'brightness_range': [0.8, 1.2]
}
```

Training tips:

- Use Data Augmentation: Always enable augmentation to increase data diversity
- Transfer Learning: Use pretrained models (ResNet50, EfficientNet) instead of training from scratch
- Fine-tuning: After training with the base frozen, unfreeze it and fine-tune with a lower learning rate
- Hyperparameter Tuning: Experiment with different learning rates and batch sizes
- Ensemble Methods: Combine multiple models for better accuracy
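The `AUGMENTATION_CONFIG` keys above are standard Keras `ImageDataGenerator` arguments, so they can be passed straight through; a minimal sketch of how the data pipeline might consume them (an assumption about data_loader.py, not its exact code):

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Assumes these names are exposed by config.py, as shown above
from config import AUGMENTATION_CONFIG, IMG_SIZE, BATCH_SIZE

# The config keys are standard ImageDataGenerator arguments; rescaling to
# [0, 1] and the 80/20 validation split are assumptions for illustration.
train_datagen = ImageDataGenerator(rescale=1.0 / 255, validation_split=0.2,
                                   **AUGMENTATION_CONFIG)

train_gen = train_datagen.flow_from_directory(
    "clouds_train", target_size=IMG_SIZE, batch_size=BATCH_SIZE,
    class_mode="categorical", subset="training")
val_gen = train_datagen.flow_from_directory(
    "clouds_train", target_size=IMG_SIZE, batch_size=BATCH_SIZE,
    class_mode="categorical", subset="validation")
```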
Check that TensorFlow can see your GPU:

```python
import tensorflow as tf
print(tf.config.list_physical_devices('GPU'))
```

If you run out of memory:
- Reduce `batch_size` (e.g., `--batch_size 16`)
- Use a lighter model (MobileNet)
- Reduce the image size in `config.py`

If training is slow:
- Use GPU acceleration
- Increase `batch_size` if you have enough RAM
- Use data generators (already implemented)

If dependencies are missing:

```bash
pip install -r requirements.txt
```

See example_usage.py for comprehensive usage examples:
```bash
python example_usage.py
```

This interactive script guides you through:
- Data exploration
- Data loading
- Model creation
- Training
- Evaluation
- Prediction
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- TensorFlow/Keras: Deep learning framework
- ImageNet: Pretrained model weights
- Dataset: Cloud images classification dataset
Thipv0302
- GitHub: @Thipv0302
For questions or suggestions, please open an issue on GitHub.
Made with ❤️ for cloud classification
⭐ If you find this project useful, please consider giving it a star!