
Add Llama #199

Merged
merged 23 commits on May 2, 2023

Conversation

seanmor5
Contributor

Adds the popular Llama model. The model builds and the parameter map is correct, though the test is not passing. I'll look deeper into why the values differ tomorrow.
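For context, a minimal sketch of loading the new model through Bumblebee's existing loaders (the Hugging Face repository name below is illustrative, not taken from this PR):

# A sketch assuming Bumblebee's standard load_model/load_tokenizer API;
# the "some-org/llama-7b-hf" repo name is hypothetical.
{:ok, model_info} = Bumblebee.load_model({:hf, "some-org/llama-7b-hf"})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "some-org/llama-7b-hf"})

inputs = Bumblebee.apply_tokenizer(tokenizer, "Hello, Llama!")
outputs = Axon.predict(model_info.model, model_info.params, inputs)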

@seanmor5
Contributor Author

@jonatanklosko Seems there's something with the tokenizer we don't support?

Tokenizer(Error("data did not match any variant of untagged enum NormalizerWrapper", line: 49, column: 3))

@jonatanklosko
Member

@seanmor5 perhaps we need to bump rust tokenizers in elixir-nx/tokenizers?

@jonatanklosko
Member

@philss I tried bumping rust tokenizers to 0.13.3, but I got:

error[E0432]: unresolved import `onig`
 --> /Users/jonatanklosko/.asdf/installs/rust/nightly/registry/src/github.com-1ecc6299db9ec823/tokenizers-0.13.3/src/utils/onig.rs:3:5
  |
3 | use onig::Regex;
  |     ^^^^ help: a similar path exists: `super::onig`

error[E0433]: failed to resolve: use of undeclared crate or module `onig`
  --> /Users/jonatanklosko/.asdf/installs/rust/nightly/registry/src/github.com-1ecc6299db9ec823/tokenizers-0.13.3/src/utils/onig.rs:12:60
   |
12 |     pub fn find_iter<'r, 't>(&'r self, inside: &'t str) -> onig::FindMatches<'r, 't> {
   |                                                            ^^^^ use of undeclared crate or module `onig`

Some errors have detailed explanations: E0432, E0433.
For more information about an error, try `rustc --explain E0432`.
error: could not compile `tokenizers` due to 2 previous errors

Are there missing feature flags in tokenizers? Adding an explicit dependency on onig didn't help either, but I don't know what I'm doing :)

@seanmor5
Contributor Author

@jonatanklosko Not sure, I got the same thing

@philss

philss commented Apr 19, 2023

@jonatanklosko I think the problem is that they are declaring onig as an optional dependency here: https://github.com/huggingface/tokenizers/blob/d19bc63c6770a8ef5e59c816b653b68cca329cde/tokenizers/Cargo.toml#L44

But they require it in the code as if it were always available.
A quick fix for your bump is to declare the dependency in our crate (ex_tokenizers) with the default feature, which pulls in onig:

tokenizers = { version = "0.13.3", default-features = false, features = ["default"]}

After that, run cargo update.

@philss

philss commented Apr 19, 2023

@jonatanklosko another fix is to use the "unstable_wasm" feature instead of "default". It's what this issue suggests: huggingface/tokenizers#1104
I think it may be better for us, since it pulls in fewer dependencies 😬

tokenizers = { version = "0.13.3", default-features = false, features = ["unstable_wasm"]}

@seanmor5
Contributor Author

This one just needs a new tokenizers release now!

@philss

philss commented Apr 19, 2023

This one just needs a new tokenizers release now!

@seanmor5 I'm working on that :)

@philss

philss commented Apr 19, 2023

@seanmor5 @jonatanklosko version 0.3.2 of tokenizers was just released!
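For reference, picking up that release on the Elixir side would be a one-line dependency bump, sketched here against Bumblebee's mix.exs (the exact version constraint is an assumption, not taken from this PR):

# in deps/0 — assumed constraint targeting the 0.3.2 release mentioned above
{:tokenizers, "~> 0.3.2"}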

@jonatanklosko
Member

@philss all passing, thanks :shipit:

@feynmanliang

feynmanliang commented Apr 21, 2023

A thought on usability: how do commercial users easily discover that Llama (and its fine-tunes) are GPL v3, which is usually not license compatible, compared to the Apache 2.0 one usually expects in Elixir OSS?

@jonatanklosko
Member

A thought on usability: how do commercial users easily discover that Llama (and its fine-tunes) are GPL v3, which is usually not license compatible, compared to the Apache 2.0 one usually expects in Elixir OSS?

Fair point, though I'm not sure there's anything specific we should do. A commercial user should always check the Hugging Face repository for the license and other details. The Elixir codebase is one thing, but fetching a model from Hugging Face is clearly downloading external files and should be treated as such.

Technically, if someone trained Llama from scratch using HF Transformers, I believe it could be distributed under a more permissive license.
