chenywang/unlikelihood

Neural Text deGeneration with Unlikelihood Training

TensorFlow implementation of the paper: Neural Text Generation with Unlikelihood Training
Sean Welleck*, Ilia Kulikov*, Stephen Roller, Emily Dinan, Kyunghyun Cho, Jason Weston
*Equal contribution. The order was decided by a coin flip.

python

Python 3.6

notice

Add the unlikelihood term to the standard sequence loss:

loss = seq2seq.sequence_loss(
            logits=self.decoder_logits_train,
            targets=self.decoder_targets_train,
            weights=masks,
            average_across_timesteps=True,
            average_across_batch=True
        ) + alpha * sequence_unlikelihood_loss(
            self.decoder_logits_train,
            self.decoder_targets_train,
            masks
        )

I set alpha to 1.0.
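The repo's sequence_unlikelihood_loss is not shown here, but the token-level unlikelihood term from the paper penalizes probability mass placed on previous-context tokens. A minimal NumPy sketch of that term (function name and shapes are my own, not the repo's API; a real implementation would be vectorized in TensorFlow):

```python
import numpy as np

def softmax(logits):
    # numerically stable softmax over the last axis
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def token_unlikelihood_loss(logits, targets):
    """Token-level unlikelihood term (Welleck et al., 2019), sketch only.

    logits:  [T, V] per-step vocabulary logits
    targets: [T]    gold token ids

    At each step t the negative candidates are the previous context
    tokens {targets[0..t-1]} minus the current target; the loss is
    -log(1 - p(c | x_<t)) summed over those candidates.
    """
    probs = softmax(logits)  # [T, V]
    T = targets.shape[0]
    loss = 0.0
    for t in range(1, T):
        candidates = set(targets[:t].tolist()) - {int(targets[t])}
        for c in candidates:
            # penalize probability assigned to repeated context tokens
            loss += -np.log(1.0 - probs[t, c] + 1e-9)
    return loss / T
```

In training this term is added to the ordinary cross-entropy sequence loss with the weight alpha, as in the snippet above.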

requirement

tensorflow==1.12.0
