Hello, thanks for your great work! It's novel and inspiring.
I am trying to extend PPLM for some downstream applications (with a different pre-trained model and a different discriminator). I went through your code, but I am not sure what the parameters `--length` and `--grad_length` do.
From my understanding so far, `--length` drives the generation loop that decodes one token at a time, so I take it that it controls the generation length. Can you please confirm this?
If so, this condition does not seem straightforward to me: `stepsize` is set to `0` for generation steps in `[grad_length, length)`, which means no update is applied at those steps (the gradient step is effectively zero).
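Concretely, the check I mean looks roughly like this (paraphrasing from my reading of run_pplm.py, so the exact names may differ slightly):

```python
# inside the per-token decoding loop; i is the current step index
if i >= grad_length:
    current_stepsize = stepsize * 0  # effectively disables the latent update
```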
But I also noticed that the default value of `grad_length` is 10000, which is much larger than `length` (whose default value is 100). If `grad_length > length`, the above condition will always be `False` and the original `stepsize` will always be used. Therefore, I am confused about the purpose of this condition. Can you also please clarify this?
Thanks!
That is correct. `--length` controls the generation length.
`--grad_length` is the number of time steps during which the key-value pairs are updated to steer generation. After that, the step size is set to 0, but the updated latents from the previous time steps stay the same (they are not reset), so the generation is likely to retain some of the conditioning. We found that this can help prevent some degeneration of the language. See Table S19 in the Appendix: it shows how `--grad_length` can be used to help preserve fluency when generating longer passages.
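To make the interplay concrete, here is a minimal sketch of how `--length` and `--grad_length` interact in the decoding loop. This is an illustration of the mechanism, not the actual run_pplm.py code; `perturb_past` and `sample_next_token` are hypothetical stand-ins for PPLM's latent-update and sampling steps:

```python
import torch

def generate(model, context, length=100, grad_length=10000, stepsize=0.02):
    """Sketch: decode `length` tokens, perturbing the past key-values
    only for the first `grad_length` steps. With the defaults
    (grad_length=10000 > length=100), every step is perturbed."""
    past = None
    output_so_far = context  # (batch, seq_len) tensor of token ids
    for i in range(length):
        # Past grad_length, the step size becomes 0: no new gradient
        # updates are applied, but the latents perturbed at earlier
        # steps are kept (not reset), so some conditioning persists.
        current_stepsize = stepsize if i < grad_length else 0.0
        if past is not None and current_stepsize > 0:
            # hypothetical stand-in for PPLM's gradient update of the latents
            past = perturb_past(past, model, output_so_far,
                                stepsize=current_stepsize)
        logits, past = model(output_so_far[:, -1:], past=past)
        next_token = sample_next_token(logits)  # hypothetical sampler (e.g. top-k)
        output_so_far = torch.cat([output_so_far, next_token], dim=1)
    return output_so_far
```

With the defaults the `else` branch is never taken; setting `grad_length` below `length` stops the updates partway through generation, which is the setting Table S19 evaluates.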