
Remove forced max seq len for llama models #250

Merged: EricLBuehler merged 1 commit into master from no_const_maxseqlen on Apr 29, 2024
Conversation

EricLBuehler (Owner)

No description provided.
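The PR title and the no_const_maxseqlen branch name suggest the change replaces a compile-time maximum sequence length constant in the llama model code with a value taken from the loaded model configuration. Below is a minimal Rust sketch of that kind of change; the names (Config, max_position_embeddings, Llama) are assumptions for illustration and are not taken from the actual mistral.rs diff.

```rust
// Hypothetical sketch only; types and field names are assumptions,
// not the real mistral.rs API.

// Before: a compile-time constant forced the context window.
// const MAX_SEQ_LEN: usize = 4096;

struct Config {
    max_position_embeddings: usize,
}

struct Llama {
    max_seq_len: usize,
}

impl Llama {
    // After: the maximum sequence length is read from the model's config
    // at load time instead of being fixed at compile time.
    fn new(cfg: &Config) -> Self {
        Self {
            max_seq_len: cfg.max_position_embeddings,
        }
    }
}

fn main() {
    let cfg = Config { max_position_embeddings: 8192 };
    let model = Llama::new(&cfg);
    println!("max sequence length: {}", model.max_seq_len);
}
```

Reading the limit from the config lets each llama checkpoint use its own context length rather than a single value baked into the binary.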


Code Metrics Report
  ───────────────────────────────────────────────────────────────────────────────
Language                 Files     Lines   Blanks  Comments     Code Complexity
───────────────────────────────────────────────────────────────────────────────
Rust                        70     23589     1555       510    21524       1309
───────────────────────────────────────────────────────────────────────────────
Total                       70     23589     1555       510    21524       1309
───────────────────────────────────────────────────────────────────────────────
Estimated Cost to Develop 77,898
Estimated Schedule Effort 11.864694 months
Estimated People Required 5.076026
───────────────────────────────────────────────────────────────────────────────
Processed 779680 bytes, 0.780 megabytes (SI)
───────────────────────────────────────────────────────────────────────────────
  

EricLBuehler merged commit 2e6324f into master on Apr 29, 2024. 11 checks passed.
EricLBuehler deleted the no_const_maxseqlen branch on April 29, 2024 at 20:22.
Labels: none · Projects: none · Linked issues: none · 1 participant