sequence length independent generation #45
@tmphex i believe there's already support for detecting end-of-string tokens https://github.com/lucidrains/x-transformers/blob/main/x_transformers/autoregressive_wrapper.py#L45
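For context, end-of-string detection lets generation stop as soon as the model emits an EOS token, instead of always running to a fixed `seq_len`. A minimal sketch of the idea (not the x-transformers API — `next_token_fn` is a hypothetical stand-in for a model call that returns the next token id given the sequence so far):

```python
def generate_until_eos(next_token_fn, prompt, max_len, eos_token):
    # Autoregressive generation that halts early at an EOS token.
    # `next_token_fn` is a hypothetical stand-in for a model call;
    # `max_len` is only an upper bound, not a required target length.
    out = list(prompt)
    for _ in range(max_len):
        tok = next_token_fn(out)
        out.append(tok)
        if tok == eos_token:  # stop as soon as EOS is produced
            break
    return out
```

With this pattern, the caller only supplies a generous `max_len` ceiling; the actual output length is decided by the model via the EOS token.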
as for beam search, let me think about it - i saw a paper out there with a fast, optimized beam search, and maybe it's worth creating a separate repo for that
Thanks @lucidrains for pointing that out. I look forward to trying out the optimized beam search once it's available.
@tmphex ohh yes, you indeed found a bug, thank you! one other thing is that the …
@lucidrains just pinging to ask if you've gotten around to fixing the …
The newly released FastSeq (https://github.com/microsoft/fastseq) might be interesting to integrate with when you get around to optimizing generation and adding beam search support.
@lucidrains seems like …
Currently, generation requires passing a sequence length, so it can only produce sequences of that given length. But in tasks such as summarization or translation, one doesn't know the final sequence length in advance. As a workaround, I am generating candidates by passing various lengths. Also, would it be possible to add support for beam search as a generation method, in addition to the current top_p/top_k sampling?
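For reference, the requested beam search could be sketched roughly like this (a minimal sketch, not anything from x-transformers — `next_logprobs_fn` is a hypothetical stand-in for a model call that returns a dict of candidate next tokens to log-probabilities):

```python
def beam_search(next_logprobs_fn, prompt, beam_width, max_len, eos_token):
    # Minimal beam search sketch. `next_logprobs_fn` is a hypothetical
    # stand-in for a model call: given a token sequence, it returns a
    # dict mapping candidate next tokens to their log-probabilities.
    beams = [(list(prompt), 0.0)]   # (sequence, cumulative log-prob)
    finished = []                   # hypotheses that have emitted EOS
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in next_logprobs_fn(seq).items():
                candidates.append((seq + [tok], score + logp))
        # keep only the best `beam_width` expansions
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_width]:
            if seq[-1] == eos_token:
                finished.append((seq, score))  # hypothesis is complete
            else:
                beams.append((seq, score))
        if not beams:               # every surviving beam has finished
            break
    finished.extend(beams)          # include beams still open at max_len
    return max(finished, key=lambda c: c[1])[0]
```

Combined with EOS detection, this would make output length model-driven rather than caller-specified; a production version would also typically add a length-normalization term to the score so longer hypotheses aren't unfairly penalized.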