
LEOMMM1/Typhoon-satellite-Image-prediction-based-on-SA-ConvLstm-and-GAN


Overview

Predicts typhoon satellite images using Self-Attention ConvLSTM and GAN neural networks. The approach mainly draws on these papers:

- Self-Attention ConvLSTM for Spatiotemporal Prediction
- Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting
- Satellite Image Prediction Relying on GAN and LSTM Neural Networks

Dataset introduction

The satellite image dataset comes from Himawari-8 and includes three channels: Band 8, Band 9, and Band 10. Band 9 is used as the input data. The training and validation datasets cover four typhoon types (A, B, C, E), and the test dataset covers five types (U, V, W, X, Y). Images were taken every hour for each type. You can download the dataset here: https://drive.google.com/drive/folders/1woIEOxxCexOSoT2vBIjAcXf5YnTDcVeT?usp=drive_link
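Since the images are hourly, a model like this typically trains on sliding windows of consecutive frames. Below is a minimal sketch of how such (input, target) sequence pairs could be built from a single-channel frame stack; the function name, window lengths, and array layout are assumptions for illustration, not the repository's actual data pipeline.

```python
import numpy as np

def make_sequences(frames, in_len=6, out_len=6):
    """Slide a window over hourly frames to build (input, target) pairs.

    frames: array of shape (T, H, W) for one channel (e.g. Band 9).
    Returns inputs of shape (N, in_len, H, W) and
    targets of shape (N, out_len, H, W).
    """
    T = frames.shape[0]
    xs, ys = [], []
    for t in range(T - in_len - out_len + 1):
        xs.append(frames[t:t + in_len])                      # past frames
        ys.append(frames[t + in_len:t + in_len + out_len])   # future frames
    return np.stack(xs), np.stack(ys)
```

With 10 hourly frames and a 3-in/2-out window, this yields 6 overlapping training pairs.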

Loss function

The loss function is composed of three parts: SSIM (structural similarity), reconstruction loss (L1 loss + L2 loss), and adversarial loss. The WGAN-GP algorithm is used for the GAN training.
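The three terms above can be combined into a single generator objective. The sketch below shows one plausible weighting scheme in NumPy; the weights, function names, and the exact way SSIM is turned into a loss are assumptions for illustration (the repository's actual coefficients may differ). The adversarial term follows the WGAN convention, where the generator minimizes the negative critic score.

```python
import numpy as np

def l1_loss(pred, target):
    return np.mean(np.abs(pred - target))

def l2_loss(pred, target):
    return np.mean((pred - target) ** 2)

def generator_loss(pred, target, ssim_value, critic_score,
                   w_ssim=1.0, w_rec=1.0, w_adv=0.01):
    # SSIM is a similarity in [-1, 1], so (1 - SSIM) acts as a loss.
    ssim_loss = 1.0 - ssim_value
    # Reconstruction term: L1 + L2, as described above.
    rec = l1_loss(pred, target) + l2_loss(pred, target)
    # WGAN generator term: maximize critic score = minimize its negative.
    adv = -critic_score
    return w_ssim * ssim_loss + w_rec * rec + w_adv * adv
```

The gradient penalty of WGAN-GP applies to the critic's own loss, not to this generator objective, so it is omitted here.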

Metrics

Metrics include MSE, SSIM, PSNR (peak signal-to-noise ratio), and sharpness. The sharpness calculation follows No-Reference Image Sharpness Assessment Based on Maximum Gradient and Variability of Gradients.
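For reference, PSNR follows directly from MSE, and sharpness can be approximated from image gradients. The sketch below shows the standard PSNR formula plus a crude gradient-magnitude sharpness proxy; note the proxy is a simplification for illustration, not the cited paper's MGVG measure, which combines the maximum gradient with the variability of gradients.

```python
import numpy as np

def psnr(pred, target, data_range=1.0):
    """Peak signal-to-noise ratio in dB for images scaled to [0, data_range]."""
    mse = np.mean((pred - target) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(data_range ** 2 / mse)

def sharpness_proxy(img):
    """Mean gradient magnitude: a simple stand-in for a no-reference
    sharpness score (sharper images have stronger local gradients)."""
    gy, gx = np.gradient(img.astype(np.float64))
    return float(np.mean(np.hypot(gx, gy)))
```

Higher PSNR means the prediction is closer to the ground truth; the sharpness proxy helps detect the blurring that L2-trained predictors tend to produce.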

Run

python train_run.py
