
Combining GAN with reverse correlation to construct personalized facial expressions

This is the official implementation of our journal paper "Combining GAN with reverse correlation to construct personalized facial expressions".

For more information, please refer to our project website: Mental-Deep-Reverse-Engineering.

[Update: 25/08/2023] The paper is available here.

[Update: 22/08/2023] The paper will be published on 25th August in PLOS ONE.

[Update: 14/08/2023] The paper was accepted by PLOS ONE.

Approach

General pipeline

Requirements

Please install PyTorch, torchvision, PsychoPy, and the other dependencies:

pip install -r requirements.txt

Description

The GAN is based on GANimation. The interface of the perceptual experiment is built with PsychoPy.

The input image and the output images are 148 × 148 pixels. There are 16 editable Action Units; for more detail, please see OpenFace.
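Since the GAN works at this fixed resolution, a photo of a different size needs to be resized first. A minimal sketch (Pillow and the file paths are assumptions, not part of this repository):

from PIL import Image

# Resize the actor's photo to the 148 x 148 resolution the GAN expects.
# "actor.bmp" and the output path are hypothetical examples.
img = Image.open("actor.bmp").convert("RGB")
img = img.resize((148, 148), Image.BICUBIC)
img.save("datasets/test/imgs/0.bmp")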

datasets/results/: All the generated images can be found in test_ganimation_30.

datasets/test/imgs/: image folder.

datasets/test/aus_openface_new560.pkl: dictionary containing the action units of each image.

datasets/test/train_ids.csv: the file listing the names of the images used for training.

datasets/test/test_ids.csv: the file listing the names of the images used for testing.

csv/: contains all the trial information.

subject/: 560 different faces generated by activating 3 out of 16 Action Units. The example subject comes from the MMI dataset.

*.txt: texts displayed during the perceptual experiments.

Please download the pretrained models from here and here, then put them in ckpts/190327_160828/.

Generating 560 different facial expressions of a specific actor

According to OpenFace, the list of editable Action Units is: AU1, AU2, AU4, AU5, AU6, AU7, AU9, AU10, AU12, AU14, AU15, AU17, AU20, AU23, AU25, AU26, AU43.

Following GANimation, you need to prepare the photo of a specific actor (e.g., your own photograph), train_ids.csv, test_ids.csv, and aus_openface.pkl.

As described in our paper, the objective is to generate facial expressions based on the actor's face and a list of AUs.

The actor's face

You need OpenFace to preprocess the actor's photo: use it to extract the intensities of the Action Units.
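For reference, OpenFace writes its results to a CSV whose AU intensity columns are named AU01_r, AU02_r, and so on. A minimal sketch for collecting them into the entry for the actor's face (the CSV path is a hypothetical example; match the resulting vector's length and ordering to the vectors used in your .pkl):

import csv

# Load the single row that OpenFace wrote for the actor's photo; keys are
# stripped because some OpenFace versions pad header names with spaces.
with open("processed/actor.csv", newline="") as f:
    row = {k.strip(): v for k, v in next(csv.DictReader(f)).items()}

# Keep the AU intensity columns ("AU01_r", "AU02_r", ...), sorted so the
# ordering is stable.
au_cols = sorted(k for k in row if k.endswith("_r"))
data = {"0": [float(row[k]) for k in au_cols]}  # "0.bmp" is the actor's face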

aus_openface.pkl

You can create a dictionary named data that records which Action Units are activated in each facial expression. For example, data['10']=[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0] means that no Action Unit in the image "10.bmp" is activated, i.e., a neutral face. data['1']=[0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0] means that Action Unit 12 in the image "1.bmp" is activated. By assigning values between 0 and 5, you can set the intensity of the corresponding AU.
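For the 560 expressions used in this work, such a dictionary can be filled programmatically. A minimal sketch, assuming a vector length of 21 as in the examples above and a hypothetical activation intensity of 2:

from itertools import combinations

NUM_AUS = 21     # length of each AU vector, matching the examples above
INTENSITY = 2.0  # hypothetical activation value (the valid range is 0 to 5)

data = {}
# One expression per choice of 3 out of the first 16 AUs: C(16, 3) = 560.
for i, triple in enumerate(combinations(range(16), 3), start=1):
    vec = [0.0] * NUM_AUS
    for au_index in triple:
        vec[au_index] = INTENSITY
    data[str(i)] = vec  # image "i.bmp" gets this AU pattern
# data['0'] (the actor's own AU intensities) must be added as well,
# as explained below.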

Then save the dictionary into aus_openface.pkl:


import pickle

# Save the AU activation dictionary to the path the GAN reads from.
with open("datasets/test/aus_openface.pkl", "wb") as output:
    pickle.dump(data, output)

Note that "0.bmp" should be the actor's face. That is to say, the intensity of each AU of the actor's photo (extracted by OpenFace) should be saved in data['0'], so that the GAN generates facial expressions based on this sample (i.e., data['0']).

train_ids.csv, test_ids.csv

Make sure the CSV files described above (train_ids.csv and test_ids.csv) already exist and that every image is listed in them; a minimal sketch for creating them follows. Then run the solver code shown after the sketch, and the GAN will generate facial expressions with the corresponding AUs activated (you can find them under output_folder).
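A minimal sketch for writing the two id files; the one-name-per-line format and the .bmp extension are assumptions, so check them against the sample CSVs shipped in datasets/test/:

# Hypothetical: list every image (0.bmp = actor, 1.bmp..560.bmp = the 560
# expressions) in both id files.
names = [f"{i}.bmp" for i in range(561)]
for path in ("datasets/test/train_ids.csv", "datasets/test/test_ids.csv"):
    with open(path, "w") as f:
        f.write("\n".join(names) + "\n")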


from options import Options
from solvers import create_solver

# Parse the run options, then launch the GANimation-based solver; the
# generated images end up under output_folder.
output_folder = "datasets/results/test_ganimation_30"
opt = Options().parse()
solver = create_solver(opt)
solver.run_solver()

In this work, we activated 3 out of the first 16 AUs, resulting in 560 different facial expressions.
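This count is simply the number of 3-element subsets of 16 AUs:

from math import comb

assert comb(16, 3) == 560  # 16! / (3! * 13!) = 560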

To perform the perceptual experiment, run:

python MDR.py

Citation

If you use this code or ideas from the paper for your research, please cite our paper:

@article{yan2023combining,
  title={Combining GAN with reverse correlation to construct personalized facial expressions},
  author={Yan, Sen and Soladie, Catherine and Aucouturier, Jean-Julien and Seguier, Renaud},
  journal={PloS one},
  year={2023},
  publisher={Public Library of Science San Francisco, CA USA}
}

Others

To optimize this process, please refer to our work IMGA.

Acknowledgement

This repository is built on GANimation and OpenFace. Sincere thanks for their wonderful work.

This project is supported by Randstad and ANR REFLETS.

