I'm trying to reproduce the Poem BLEU-2 result in the SeqGan paper, but I couldn't find the vocabulary size used in the paper. The RankGan paper uses a different dataset of 13,123 poems and filters out words that occur fewer than 5 times. Do you know the vocabulary size used in the SeqGan paper? Thanks a lot!
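For reference, the min-count filtering described in the RankGan paper can be sketched like this (a minimal illustration with made-up toy data, not the papers' actual preprocessing code):

```python
from collections import Counter

def build_vocab(corpus, min_count=5):
    """Keep only words that occur at least min_count times in the corpus.

    corpus: list of tokenized lines (lists of strings).
    Returns the surviving vocabulary as a set.
    """
    counts = Counter(tok for line in corpus for tok in line)
    return {tok for tok, c in counts.items() if c >= min_count}

# Toy example: with min_count=2, only "moon" survives the cutoff.
toy_corpus = [["moon", "river", "moon"], ["moon", "sky"]]
vocab = build_vocab(toy_corpus, min_count=2)
print(vocab)
```

Rare words outside the vocabulary are usually mapped to a single UNK token before training, so the cutoff directly determines the final vocabulary size.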
Hi, have you reproduced the results? I tried using all the words in the training data, but got BLEU-2 ~0.394 for MLE, which is lower than reported. Also, what is the configuration of your SeqGAN model: lstm_hidden_size 32, emb_dim 32?
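Since reported scores also depend on how BLEU-2 is computed, here is a self-contained sketch of sentence-level BLEU-2 (clipped unigram/bigram precision with a brevity penalty); the papers may use a different smoothing or reference-length convention, so treat this only as a baseline for comparison:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu2(references, hypothesis):
    """Sentence-level BLEU-2: geometric mean of clipped unigram and
    bigram precision, multiplied by a brevity penalty.

    references: list of tokenized reference poems (lists of strings).
    hypothesis: one tokenized generated poem (list of strings).
    """
    log_prec = 0.0
    for n in (1, 2):
        hyp_counts = Counter(ngrams(hypothesis, n))
        # Clip each n-gram count by its maximum count in any reference.
        max_ref = Counter()
        for ref in references:
            for g, c in Counter(ngrams(ref, n)).items():
                max_ref[g] = max(max_ref[g], c)
        clipped = sum(min(c, max_ref[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        if clipped == 0:
            return 0.0
        log_prec += 0.5 * math.log(clipped / total)
    # Brevity penalty; the shortest reference is used here for simplicity
    # (standard BLEU picks the closest-length reference).
    ref_len = min(len(r) for r in references)
    bp = 1.0 if len(hypothesis) >= ref_len else math.exp(1 - ref_len / len(hypothesis))
    return bp * math.exp(log_prec)

refs = [["the", "moon", "shines", "bright"]]
hyp = ["the", "moon", "rises", "high"]
# Unigram precision 2/4, bigram precision 1/3, no length penalty:
print(bleu2(refs, hyp))  # sqrt(1/6) ~ 0.408
```

Differences in tokenization, smoothing of zero bigram counts, and whether BLEU is averaged per sentence or computed corpus-level can easily shift the second decimal place, which may explain part of the gap to the reported numbers.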