
Remove batched generation support #162

Merged
lantiga merged 2 commits into main from carmocca/remove-batch-generation-support on Apr 19, 2023

Conversation

carmocca
Contributor

Unblocks #153
Closes #49

@lantiga lantiga (Collaborator) left a comment

Looks good, thanks @carmocca!
We can always add a separate script supporting batch generation in the future.

@lantiga lantiga merged commit f9885ff into main Apr 19, 2023
@lantiga lantiga deleted the carmocca/remove-batch-generation-support branch April 19, 2023 06:34
  # if the sequence context is growing too long we must crop it at max_seq_length
- idx_cond = idx_cond if T <= max_seq_length else idx_cond[:, -max_seq_length:]
+ idx_cond = idx_cond if T <= max_seq_length else idx_cond[-max_seq_length:]

  # forward
  logits = model(idx_cond)
Contributor
The model still expects a batch, so the input tensor needs to be expanded here.
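For illustration, here is a minimal sketch of the kind of fix this comment points at: re-add a batch dimension just for the forward pass. The `model` and `idx_cond` objects below are hypothetical stand-ins that only mimic the tensor shapes from the snippet above; they are not the repository's actual code.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins mirroring the shapes in the snippet above.
vocab_size = 100
model = nn.Sequential(nn.Embedding(vocab_size, 16), nn.Linear(16, vocab_size))
idx_cond = torch.randint(0, vocab_size, (5,))  # shape (T,): a single, un-batched sequence

# The model still expects a leading batch dimension, so expand the input before
# the forward pass and drop the batch axis from the logits afterwards.
logits = model(idx_cond.unsqueeze(0))  # input (1, T) -> logits (1, T, vocab_size)
logits = logits[0]                     # back to (T, vocab_size)
```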

Collaborator

Yes. Merged too hastily, sorry about that


Successfully merging this pull request may close these issues.

Doesn't work with batched input
3 participants