StoryTrans

This repo includes the code for the paper StoryTrans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing (ACL 2023 Long Paper).

StoryTrans leverages discourse representations to capture source content information and transfer it to target styles with learnable style embeddings.

(Figure: StoryTrans overview)

Prerequisites

The prerequisites for running the code are listed in requirement.txt. Make sure the necessary environment and dependencies are set up before installing and running the code.

To install the required dependencies, use pip or conda to install the packages listed in requirement.txt.
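For example, assuming a pip-based environment (substitute conda if you prefer):

pip install -r requirement.txt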

Data and pre-trained classifier

All our constructed data are in text_style_transfer/data:

├── data
│   ├── en   # English stories, including everyday stories and Shakespeare's plays
│   └── zh   # Chinese stories, including fairy tales, LuXun (LX), and JinYong (JY)

Data example:

"text": ["Cask . Marry , before he fell downe , when he perceiu ' d the common Heard was glad he refus ' d the Crowne , he pluckt me ope his Doublet , and offer ' d them his Throat to cut : and I had beene a man of any Occupation , if I would not haue taken him at a word , I would I might goe to Hell among the Rogues , and so hee fell ."], "style": "<Sp>", "mask_word": ["taken", "Throat", "refus", "Rogues", "Heard", "Doublet", "Occupation", "fell"], "text_mask": ["Cask . Marry , before he <mask> downe , when he perceiu ' d the common <mask> was glad he <mask> ' d the Crowne , he pluckt me ope his <mask> , and offer ' d them his <mask> to cut : and I had beene a man of any <mask> , if I would not haue <mask> him at a word , I would I might goe to Hell among the <mask> , and so hee <mask> ."]

The classifier is straightforward to train: you can pre-train the style classifier yourself or download the pre-trained classifier.
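As a rough illustration of what pre-training such a classifier could look like, here is a generic sketch using Hugging Face Transformers; it is not the repository's own script, and the model name, style tags, and file path are assumptions:

import json
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

STYLES = ["<Fairy>", "<LX>", "<JY>"]  # hypothetical Chinese style tags
tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=len(STYLES))

def load_examples(path):
    # Assumes one JSON object per line with "text" (a list of sentences) and "style".
    with open(path, encoding="utf-8") as f:
        for line in f:
            ex = json.loads(line)
            yield " ".join(ex["text"]), STYLES.index(ex["style"])

texts, labels = zip(*load_examples("text_style_transfer/data/zh/train.jsonl"))
enc = tokenizer(list(texts), truncation=True, padding=True, return_tensors="pt")
loader = DataLoader(list(zip(enc["input_ids"], enc["attention_mask"],
                             torch.tensor(labels))), batch_size=8, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few epochs are usually enough for a style classifier
    for input_ids, attention_mask, y in loader:
        loss = model(input_ids=input_ids, attention_mask=attention_mask, labels=y).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()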

Quick Start

1. Training of Discourse Representation Transfer

Execute the following command to train the first stage:

sh text_style_transfer/StyTrans_tran.sh

2. Training of Content Preservation Enhancing

Execute the following command to train the second stage:

sh text_style_transfer/MaskFill_train.sh

3. Generation

Execute the following commands to generate style-transferred texts (the _zh scripts correspond to the Chinese data):

sh text_style_transfer/StyTrans_stage_1_test_zh.sh
sh text_style_transfer/MaskFill_gen_zh.sh

Citation

Please kindly cite our paper if you find the paper or the code helpful.

@inproceedings{zhu-etal-2023-storytrans,
    title = "{S}tory{T}rans: Non-Parallel Story Author-Style Transfer with Discourse Representations and Content Enhancing",
    author = "Zhu, Xuekai  and
      Guan, Jian  and
      Huang, Minlie  and
      Liu, Juan",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.acl-long.827",
    pages = "14803--14819",
}
