
Remove max_input_length from model.encode #227

Merged 1 commit into main on Oct 30, 2023
Conversation

gsarti (Member) commented Oct 30, 2023

Description

Removes the logic that imposed a default max_length when no maximum input size is pre-specified in the model configuration. Closes #221.
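The gist of the change can be sketched as follows. This is an illustrative reconstruction, not the actual inseq code: the function names, the fallback value of 512, and the config handling are all hypothetical, chosen only to show the difference between forcing a fallback cap and truncating solely when the configuration specifies one.

```python
# Hypothetical sketch of the behavior change in this PR. Previously, encode
# fell back to a hard-coded max_length when the model config did not specify
# a maximum input size, silently truncating long prompts; after the change,
# truncation happens only when the config sets an explicit limit.

from typing import Optional


def encode_before(token_ids: list, config_max_length: Optional[int]) -> list:
    # Old behavior: impose a default cap when the config is silent.
    fallback_max_length = 512  # hypothetical hard-coded fallback
    max_length = config_max_length if config_max_length is not None else fallback_max_length
    return token_ids[:max_length]


def encode_after(token_ids: list, config_max_length: Optional[int]) -> list:
    # New behavior: only truncate when the config explicitly sets a limit.
    if config_max_length is not None:
        return token_ids[:config_max_length]
    return token_ids


long_prompt = list(range(1000))
print(len(encode_before(long_prompt, None)))  # prompt truncated to the fallback
print(len(encode_after(long_prompt, None)))   # full prompt preserved
```

This matters in particular for decoder-only models without a configured maximum input size, where the old fallback clipped long prompts (the scenario reported in the linked issue).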

@gsarti gsarti merged commit 1b52b04 into main Oct 30, 2023
4 checks passed
@gsarti gsarti deleted the drop-force-max-len branch October 30, 2023 09:48
gsarti added a commit that referenced this pull request Oct 30, 2023
* origin/main:
  Remove `max_input_length` from `model.encode` (#227)
gsarti added a commit that referenced this pull request Oct 30, 2023
* origin/main:
  Attributed behavior for contrastive step functions (#228)
  Fix command for installing pre-commit hooks. (#229)
  Remove `max_input_length` from `model.encode` (#227)
  Migrate to `ruff format` (#225)
  Remove contrast_target_prefixes from contrastive step functions (#224)
  Step functions fixes, add `in_context_pvi` (#223)
  Format fixes, add Attanasio et al. (2023) to readme
  Add Sequential IG method (#222)
  Fix LIME and Occlusion outputs (#220)
  Update citation information
  Bump dependencies
  Add end_pos for contrast_targets_alignments
  Fix dummy output viz in console
  Minor fixes
Development

Successfully merging this pull request may close these issues.

Long prompt for decoder only model