text_style_transfer

Model

The model code is in the model directory, laid out as follows; a sketch of the multi-decoder idea appears after the layout.

model
    |style_transfer 
    |    |session_multi_decoder
    |    |    |train.sh
    |    |    |test.sh
    |    |    |com.sh
    |    |    |.......
    |    |
    |    |session_auto_encoder
    |    |    |similar to session_multi_decoder
    |    | 
    |    |session_style
    |         |similar to session_multi_decoder
    |data
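
The three session directories correspond to the three model variants evaluated below (multi-decoder, auto-encoder, and style embedding). As a rough illustration of the multi-decoder variant only, here is a minimal sketch, assuming a shared encoder that produces a content representation and one GRU decoder per style; it is not the repository's actual code, and every name in it is an assumption.

# Minimal multi-decoder sketch (illustrative only, not the repo's code):
# a shared encoder yields a style-independent content state, and each
# style has its own decoder that regenerates text in that style.
import torch
import torch.nn as nn

class MultiDecoderSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, n_styles=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # One decoder per style; all decoders share the encoder and embeddings.
        self.decoders = nn.ModuleList(
            [nn.GRU(emb_dim, hid_dim, batch_first=True) for _ in range(n_styles)]
        )
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids, style):
        _, h = self.encoder(self.embed(src_ids))          # content representation
        dec_out, _ = self.decoders[style](self.embed(tgt_ids), h)
        return self.out(dec_out)                          # logits over the vocabulary

model = MultiDecoderSeq2Seq(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 12)), torch.randint(0, 10000, (4, 12)), style=1)
print(logits.shape)  # torch.Size([4, 12, 10000])

Decoding the same content representation with a different decoder index is what switches the style; session_auto_encoder and session_style swap this part for an auto-encoder and a style-embedding variant, respectively.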

Preprocess

$ cd model/style_transfer/data
$ python get_dict.py   # generate the vocabulary
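
For reference, a vocabulary-building step like this typically counts the tokens in the training text and keeps the most frequent ones; the sketch below shows the general idea, with file names that are assumptions rather than the script's actual inputs.

# Hedged sketch of vocabulary generation (illustrative; get_dict.py may differ).
from collections import Counter

def build_vocab(corpus_path, vocab_path, max_size=30000):
    counts = Counter()
    with open(corpus_path, encoding="utf-8") as f:
        for line in f:
            counts.update(line.split())          # whitespace tokenization
    with open(vocab_path, "w", encoding="utf-8") as f:
        for word, _ in counts.most_common(max_size):
            f.write(word + "\n")                 # one token per line, most frequent first

build_vocab("train.txt", "vocab.txt")            # hypothetical file names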

Train and Test

$ cd model/style_transfer/session_multi_decoder
$ ./train.sh   # train model
$ ./test.sh    # test model
$ ./com.sh     # show results in compare.txt


Evaluation

The evaluation tools are in the eval directory.

Preprocess

  • put the GloVe embeddings in eval/word_emb (see the loading sketch after this list)
  • run bash run1.sh to copy the results from the model directory to the current directory
  • test1, test2, and test3 correspond to the different modes (auto-encoder, style embedding, multi-decoder)
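
A GloVe text file stores one word per line followed by its vector; here is a minimal loading sketch, assuming a file such as glove.6B.100d.txt placed in eval/word_emb (the file name is an assumption).

# Hedged sketch: read GloVe vectors into a {word: vector} dict.
import numpy as np

def load_glove(path):
    emb = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            emb[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return emb

emb = load_glove("eval/word_emb/glove.6B.100d.txt")   # hypothetical file name
print(len(emb), next(iter(emb.values())).shape)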

Transfer Strength (Classifier)

$ python classifier data        # prepare the classifier data
$ python classifier train       # train the classifier
$ python classifier test test1  # test the classifier
                                # test1 is the test result dir
                                # results in test1/embedding/style0_classification.txt ...
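
Transfer strength is commonly measured as the fraction of transferred sentences that the classifier assigns to the target style; a small sketch of that final computation, assuming predicted labels already produced by the step above.

# Hedged sketch: transfer strength from classifier predictions.
def transfer_strength(pred_labels, target_label):
    hits = sum(1 for p in pred_labels if p == target_label)
    return hits / len(pred_labels)

print(transfer_strength([1, 1, 0, 1], target_label=1))  # 0.75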

Content reservation

$ cd eval
$ python emb_test.py test1  # test1 is the test result dir
                            # results in test1/embedding/style0_semantics.txt ...
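
Content reservation appears to be based on the similarity between embeddings of the source and transferred sentences; a simplified sketch using mean-pooled GloVe vectors and cosine similarity (the actual emb_test.py may pool the word vectors differently), reusing emb from the loading sketch above.

# Hedged sketch: cosine similarity of mean-pooled sentence embeddings.
import numpy as np

def sentence_vec(tokens, emb, dim=100):
    vecs = [emb[t] for t in tokens if t in emb]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim, dtype=np.float32)

def content_reservation(src_tokens, out_tokens, emb):
    a, b = sentence_vec(src_tokens, emb), sentence_vec(out_tokens, emb)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0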
                           

Finally, run python eval.py to print the collected results.

Example:

dir_name  model_type   transfer_strength  content_reservation  mixture
================================================================================
test1     embedding8   0.267              0.943880306299       0.208126303212
test1     embedding4   0.485              0.915346000157       0.317023657029
test1     embedding    0.593              0.896598659955       0.356930373024
...
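
The mixture column appears to combine the two scores as transfer_strength * content_reservation / (transfer_strength + content_reservation); this matches the rows above but is inferred from them rather than documented, so treat it as an observation.

# Quick check of the apparent mixture formula (an inference from the table).
ts, cr = 0.267, 0.943880306299
print(ts * cr / (ts + cr))   # ~0.2081, matching the first row's mixture value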

Acknowledgment

Thanks to Fangfang Zhang and Yixin Zhang for helping compose the data.
