edchengg/VAE_GAN

VAE/GAN

This is a PyTorch implementation of Autoencoding beyond pixels using a learned similarity metric (VAE/GAN).

Variational Autoencoder + Generative Adversarial Network

Encoder + Decoder/Generator + Discriminator

VAE/GAN in training


Training Algorithm

similarity

```python
class Discriminator:
    ...
    self.f4 = nn.Linear(256, 1)
    ...
    def similarity(self, x):
        ...
        return self.f4(self.dropout(h))  # no sigmoid at the end
```
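Filling in the elided layers, a fuller sketch of such a discriminator might look like the following. Layer sizes follow the table in the Neural Network section below; the LeakyReLU slope and dropout rate are assumptions, not values confirmed by the repo:

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Sketch only: 784 -> 1024 -> 512 -> 256 -> 1, as in the layer table."""
    def __init__(self):
        super().__init__()
        self.f1 = nn.Linear(784, 1024)
        self.f2 = nn.Linear(1024, 512)
        self.f3 = nn.Linear(512, 256)
        self.f4 = nn.Linear(256, 1)
        self.act = nn.LeakyReLU(0.2)      # slope is an assumption
        self.dropout = nn.Dropout(0.3)    # rate is an assumption

    def similarity(self, x):
        # Pre-sigmoid activations, used as the learned similarity features
        h = self.act(self.f1(x))
        h = self.act(self.f2(h))
        h = self.act(self.f3(h))
        return self.f4(self.dropout(h))   # no sigmoid at the end

    def forward(self, x):
        # The usual real/fake probability is just a sigmoid on top
        return torch.sigmoid(self.similarity(x))

d = Discriminator()
x = torch.randn(8, 784)          # a batch of 8 flattened MNIST images
print(d(x).shape)                # torch.Size([8, 1])
```

Keeping `similarity` pre-sigmoid matters: the reconstruction loss below compares these raw features, which would saturate if squashed through a sigmoid first.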

Loss Functions in Python code

Decoder/Generator Loss Function

```python
X_sim = discriminator.similarity(x_tilde)
X_data = discriminator.similarity(data)
rec_loss = ((X_sim - X_data) ** 2).mean()
dec_loss = GAMMA * rec_loss - dis_loss
```

Encoder Loss Function

```python
KLD = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
KLD /= BATCH_SIZE * 784
enc_loss = KLD + BETA * rec_loss
```
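As a quick sanity check of the KL term above: when `mu = 0` and `log_var = 0` the approximate posterior equals the N(0, I) prior, so the divergence is exactly zero. The latent size of 20 matches this repo; the batch size of 4 is arbitrary:

```python
import torch

mu = torch.zeros(4, 20)       # batch of 4, latent dimension 20
log_var = torch.zeros(4, 20)

KLD = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
KLD /= 4 * 784                # same normalisation as above
print(KLD.item())             # → 0.0
```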

Discriminator Loss Function

```python
recon_loss = F.binary_cross_entropy(recon_data, FAKE_LABEL)
sample_loss = F.binary_cross_entropy(sample_data, FAKE_LABEL)
real_loss = F.binary_cross_entropy(real_data, REAL_LABEL)
dis_loss = recon_loss + sample_loss + real_loss
```
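Putting the three losses together, one full training step might be sketched as below. The tiny one-layer stand-in networks, the hyperparameter values, and the reparameterisation step are illustrative assumptions only; the repo's actual networks follow the table in the Neural Network section:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
BATCH_SIZE, LATENT = 16, 20
GAMMA, BETA = 1e-3, 5.0  # hypothetical values

# Minimal stand-in networks so the step below actually runs
enc = nn.Linear(784, 2 * LATENT)                    # produces mu and log_var
dec = nn.Sequential(nn.Linear(LATENT, 784), nn.Tanh())
dis_feat = nn.Linear(784, 256)
dis_head = nn.Linear(256, 1)

def similarity(x):
    # pre-sigmoid discriminator features, as in the snippet above
    return dis_head(F.leaky_relu(dis_feat(x)))

data = torch.randn(BATCH_SIZE, 784)
REAL_LABEL = torch.ones(BATCH_SIZE, 1)
FAKE_LABEL = torch.zeros(BATCH_SIZE, 1)

# encode, reparameterise, decode
mu, log_var = enc(data).chunk(2, dim=1)
z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
x_tilde = dec(z)                                    # reconstruction
x_p = dec(torch.randn(BATCH_SIZE, LATENT))          # decoded prior sample

# discriminator loss: real vs reconstruction vs prior sample
recon_loss = F.binary_cross_entropy(torch.sigmoid(similarity(x_tilde)), FAKE_LABEL)
sample_loss = F.binary_cross_entropy(torch.sigmoid(similarity(x_p)), FAKE_LABEL)
real_loss = F.binary_cross_entropy(torch.sigmoid(similarity(data)), REAL_LABEL)
dis_loss = recon_loss + sample_loss + real_loss

# feature-space reconstruction loss shared by encoder and decoder
rec_loss = ((similarity(x_tilde) - similarity(data)) ** 2).mean()

dec_loss = GAMMA * rec_loss - dis_loss
KLD = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp()) / (BATCH_SIZE * 784)
enc_loss = KLD + BETA * rec_loss
print(dis_loss.item(), enc_loss.item())
```

In practice each of the three losses back-propagates into its own optimizer (encoder, decoder, discriminator) in turn, so the gradients of one update do not leak into the others' parameters.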

Data

MNIST

Neural Network

| Encoder | Decoder | Discriminator |
| --- | --- | --- |
| 784 * 1024, LeakyReLU | 20 * 1024, LeakyReLU | 784 * 1024, LeakyReLU |
| 1024 * 1024, LeakyReLU | 1024 * 1024, LeakyReLU | 1024 * 512, LeakyReLU |
| 1024 * 1024, LeakyReLU | 1024 * 1024, LeakyReLU | 512 * 256, LeakyReLU |
| 1024 * 20 | 1024 * 784, Tanh | 256 * 1, Sigmoid |
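Read as PyTorch modules, the encoder and decoder columns above might be sketched as follows. A two-head split into `mu` and `log_var` is a common VAE choice; whether the repo uses one 1024 * 20 layer split in half or two separate 20-unit heads is an assumption here:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """784 -> 1024 -> 1024 -> 1024, then separate mu / log_var heads of size 20."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(784, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 1024), nn.LeakyReLU(0.2),
        )
        self.mu = nn.Linear(1024, 20)
        self.log_var = nn.Linear(1024, 20)

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.log_var(h)

class Decoder(nn.Module):
    """20 -> 1024 -> 1024 -> 1024 -> 784, Tanh output as in the table."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(20, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 1024), nn.LeakyReLU(0.2),
            nn.Linear(1024, 784), nn.Tanh(),
        )

    def forward(self, z):
        return self.body(z)

mu, log_var = Encoder()(torch.randn(2, 784))
x_hat = Decoder()(mu)
print(mu.shape, x_hat.shape)  # torch.Size([2, 20]) torch.Size([2, 784])
```

The Tanh output pairs with MNIST pixels rescaled to [-1, 1], which is why the decoder ends in Tanh rather than Sigmoid.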

Loss:

[figure: training loss curves]

Generation sample (epochs 1–50):

[figure: generated MNIST samples over training]
