Example code for the paper "Generative adversarial networks for reconstructing natural images from brain activity".
Method for reconstructing images from brain activity with GANs. You need a GAN trained to reproduce the target distribution (images that look like your stimuli) and a differentiable method for perceptual feature matching (here: layer activations of a convolutional neural network).
The method uses linear regression, implemented as a neural network, to predict the latent space `z`. Losses are calculated in image space and backpropagated through the loss terms and the GAN over `z` to the weights of the linear regression layer.
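To make the training loop concrete, here is a minimal sketch of a single optimization step in the Chainer 1.x style the repository targets. Everything named here is an illustrative assumption: the toy generator and toy feature network stand in for the pretrained DCGAN from `model_dcgan_G.py` and the feature-matching CNN from `./featurematching/`, and the dimensions and loss weight are placeholders.

```python
# Minimal sketch of one training step (Chainer 1.x API, matching the pinned
# chainer==1.24). All names and sizes below are illustrative assumptions.
import numpy as np
import chainer
import chainer.functions as F
import chainer.links as L
from chainer import optimizers, Variable


class LinearZ(chainer.Chain):
    """Linear regression from voxel responses to the GAN latent z."""
    def __init__(self, n_voxels, n_z):
        super(LinearZ, self).__init__(fc=L.Linear(n_voxels, n_z))

    def __call__(self, x):
        return self.fc(x)


class ToyGenerator(chainer.Chain):
    """Stand-in for the pretrained DCGAN generator (z -> 64x64 image)."""
    def __init__(self, n_z):
        super(ToyGenerator, self).__init__(fc=L.Linear(n_z, 64 * 64))

    def __call__(self, z):
        return F.reshape(F.sigmoid(self.fc(z)), (z.data.shape[0], 1, 64, 64))


class ToyFeatures(chainer.Chain):
    """Stand-in for the feature-matching CNN (e.g. AlexNet activations)."""
    def __init__(self):
        super(ToyFeatures, self).__init__(conv=L.Convolution2D(1, 8, 5))

    def __call__(self, x):
        return F.relu(self.conv(x))


n_voxels, n_z = 2000, 50                 # hypothetical dimensions
model = LinearZ(n_voxels, n_z)
generator = ToyGenerator(n_z)            # pretrained and kept fixed in practice
features = ToyFeatures()                 # pretrained and kept fixed in practice

opt = optimizers.Adam()
opt.setup(model)                         # only the regression weights are updated

x = Variable(np.random.randn(8, n_voxels).astype(np.float32))  # brain activity
t = Variable(np.random.rand(8, 1, 64, 64).astype(np.float32))  # seen stimuli

z = model(x)                             # predict latents from brain activity
recon = generator(z)                     # decode z through the GAN
loss = F.mean_squared_error(recon, t)    # image-space loss
loss += 0.1 * F.mean_squared_error(features(recon), features(t))  # feature loss

for net in (model, generator, features):
    net.cleargrads()
loss.backward()                          # gradients flow through the GAN ...
opt.update()                             # ... but only the linear layer moves
```

Because only the linear layer is registered with the optimizer, the GAN and the feature network stay fixed even though gradients pass through them.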
... for the handwritten characters example:

- Run `train_linear_model.py`, preferably on a GPU. This will produce `./recon/finalZ.mat`, which contains the `z` predictions on your validation set (a snippet for inspecting this file follows the list).
- Run `reconstruct_from_z.py` to generate a PNG with reconstructions of the validation data in `./recon/recons.png`.
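If you want to sanity-check the saved latents before reconstructing, something like the following should work, assuming `finalZ.mat` is a standard MATLAB file; the variable names it stores are not documented here, so the snippet simply lists them:

```python
# Inspect the predicted latents written by train_linear_model.py.
# The variable names inside the file are an assumption, so list the keys.
import scipy.io as sio

contents = sio.loadmat('./recon/finalZ.mat')
print([k for k in contents if not k.startswith('__')])  # stored variable names
```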
... for your own data:

- Train a GAN for your stimulus domain (e.g. natural grayscale images of size [64 64]). During training, `z` should be drawn from a uniform distribution in [-1, 1] and normalized (see `sample_z()` in `model_dcgan_G.py`, and the sampling sketch after this list).
- Train a differentiable network for feature matching. The training code for the AlexNet used for handwritten digits can be found in `./featurematching/train_featurematching_handwritten.py`.
- Adapt some parameters in `args.py` and `train_linear_model.py` (and hopefully little of the rest). Fine-tune the weights of the loss terms on an isolated data set.
- You should then be able to just run `train_linear_model.py`.
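For reference, a minimal sketch of the latent sampling described in the first step above: draw `z` uniformly from [-1, 1], then normalize. The exact normalization used by `sample_z()` in `model_dcgan_G.py` may differ; per-sample L2 normalization is an assumption here.

```python
# Sketch of latent sampling: uniform in [-1, 1], then normalized.
# The per-sample L2 normalization is an assumption; check sample_z()
# in model_dcgan_G.py for the exact scheme.
import numpy as np

def sample_z(batch_size, n_z):
    z = np.random.uniform(-1.0, 1.0, (batch_size, n_z)).astype(np.float32)
    z /= np.linalg.norm(z, axis=1, keepdims=True)
    return z
```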
- Anaconda Python 2.7 version
- `chainer` version 1.24 (install via: `pip install chainer==1.24 --no-cache-dir -vvvv`)
- A GPU for training the feature matching network
If you publish using this code or use it in any other way, please cite:
Seeliger, K., Güçlü, U., Ambrogioni, L., Güçlütürk, Y., & van Gerven, M. A. J. (2018). Generative adversarial networks for reconstructing natural images from brain activity. NeuroImage.
In addition, please notify the corresponding author.