Apache-2.0 / Commercial License #50
Comments
The goal of the project is to end up with weights trained from scratch, yes. The license on this repo covers the source code; we don't distribute the checkpoints from Meta. So yes, users need to obtain them through the form and can use them only for research purposes, as stated in the agreement. This is temporary until there are checkpoints trained from scratch using the Apache-licensed code.
@Rock-Anderson on top of what @awaelchli said:
note that the LLaMA weights from Meta are not GPL-licensed. They are released under a custom Meta license:
from https://ai.facebook.com/blog/large-language-model-llama-meta-ai/
Got it, thanks for the clarification, and for this repo. The README mentions "truly open-source" at the beginning and then jumps to the section where the official LLaMA weights are loaded, so I was a bit confused. Adding a note that loading the weights and converting them to Lit-LLaMA for inference is for research purposes only would help.
@lantiga Do we want to change the wording in the "use the model" section to mention "for research purposes only" as suggested? |
Yes for sure, we want there to be no misunderstandings. I’ll change it shortly, thank you @Rock-Anderson for the input |
Hey team,
Thanks for releasing the code and repo under Apache-2.0
I'm still wondering, though, how this would be truly open-source and commercialisable if we're still loading the official LLaMA weights (under a GPL license, as I understood) and converting them into Lit-LLaMA weights.
Or does it only mean that if we train from scratch using this code instead, without using the official Llama weights, then the end model could be used for commercial purposes?
Please help clarify.
TIA.