
New HFOIL 1.0 license - FAQ #744

Closed
julien-c opened this issue Jul 31, 2023 · 7 comments
@julien-c
Member

julien-c commented Jul 31, 2023

As part of moving this repo to v1.0, we recently changed the repo's license going forward, to a non-fully-open source license that has some commercial restrictions.

I apologize for the confusion this change has created for users and members of the ML community. We wish we had made this change earlier in the life of the repo. Frankly, we did not think carefully about licensing for this library from day one.

In this GH issue we would like to collate questions we've received and our replies to them. This post has been co-authored by @jeffboudier and @Narsil. 🙏


The FAQ is intended to provide guidance on whether the usage restriction in HFOIL 1.0 applies to your TGI use case.

What does the license say?

All uses are allowed under HFOIL, except distributing TGI as a hosted or managed, and paid service, where the service grants users access to any substantial set of the features or functionality of TGI (you can read the license here). We try to break down what this means for you below.

What is the intention of the HFOIL 1.0 license?

TGI started as a project to power our internal products, and we see it as a critical component of our commercial solutions.
The goal of HFOIL is to double down on our investment in TGI, while continuing to build TGI in the open and under a broadly commercially permissive license. The usage restriction is meant to prevent compute providers and API wrappers from commercializing TGI for enterprise production use cases without contributing back to TGI.

Do I need an agreement to use TGI for my use case?

You can use TGI without restrictions within any project or service that’s free to users.
You can use TGI without restrictions within any commercial project that’s used within your company (not sold to customers).
You can use TGI without restrictions within any commercial project sold to customers where you are not essentially selling a hosted or managed distribution of TGI.

To make it more specific with a few examples, you can use TGI under HFOIL 1.0 without restrictions in these example use cases:

  • My company built an “on-prem ChatGPT” internal service that’s serving an open source LLM with TGI via an API
    • The restriction does not apply because this is not a paid service.
  • My company offers a data analytics SaaS, where customers have access to a dashboard; some of the features use predictions from LLMs, which are made using TGI in a backend service
    • The restriction does not apply because the hosted paid service does not give its users access to TGI functionality.
  • My company offers a paid application where end users can chat with assistants powered by LLMs using TGI in the backend
    • The restriction does not apply because what users access is a chat application, not an LLM inference service.

In order to use TGI under HFOIL 1.0 for the following use cases, you need to reach out to set up an agreement with us:

  • My company offers a paid serverless inference service where customers can send requests to LLMs which are served by TGI - like Hugging Face Inference API
  • My company offers a paid LLM deployment service where customers can create inference endpoints deployed with TGI - like Hugging Face Inference Endpoints

What about using TGI via HF Inference Endpoints?

The HFOIL 1.0 restrictions do not apply to your usage of Hugging Face Inference Endpoints (where we are deploying TGI on your behalf).

I’m still not sure - what’s next?

If after reading the above it’s still not clear to you whether your TGI use case requires an agreement, you can email us at api-enterprise@huggingface.co with a short description of a) your paid services use case and b) where TGI fits within your stack.
If your TGI use case requires an agreement, we’ll schedule a short discussion to work out an agreement with a commercial framework that makes sense for your use case.

Motivation and Inspiration for HFOIL

TGI is not the first open source project to introduce granular licensing for commercial reasons. We are following a path similar to Elasticsearch, which introduced new licenses after third-party companies started monetizing the open-source software without contributing back to the project. If you are familiar with the Elastic License v2, HFOIL v1.0 may sound familiar, and for good reason: we took inspiration from their work.

Why not AGPL

We considered AGPL but it is much more restrictive commercially.

Will transformers or diffusers have license restrictions too?

No.
We created HFOIL (a.k.a. Hugging Face Optimized Inference License) so we can keep building TGI in the open while powering our commercial products.
New libraries/projects developed as core components of our commercial products will be directly released under HFOIL from now on to provide clarity.

@casper-hansen

casper-hansen commented Jul 31, 2023

  • My company offers a paid application where end users can chat with assistants powered by LLMs using TGI in the backend
    • The restriction does not apply because what users access is a chat application, not an LLM inference service.

Consider this flow:

  1. User presses "Generate" in client/app with some input (could be any client, e.g. a chat client, Microsoft Word plugin, document drafting interface).
  2. Backend receives the request to generate and forwards it to TGI service.
  3. Backend streams back TGI response to client/app.
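
The three-step flow above can be sketched as a minimal backend proxy. This is a hedged illustration, not code from the thread: the base URL is a hypothetical placeholder, and the `/generate` endpoint with an `{"inputs": ..., "parameters": ...}` payload follows TGI's documented HTTP API.

```python
import json
import urllib.request

# Hypothetical address of the TGI service sitting behind the backend.
TGI_URL = "http://localhost:8080"

def build_tgi_request(prompt: str, max_new_tokens: int = 64) -> dict:
    """Step 2: wrap the client's input into a TGI /generate payload."""
    return {
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }

def forward_to_tgi(prompt: str) -> str:
    """Steps 2-3: forward the request to TGI, then relay its response."""
    payload = json.dumps(build_tgi_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{TGI_URL}/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # Backend -> TGI; the generated text is then sent back to the client/app.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["generated_text"]
```

The question being raised is whether this proxy pattern, where the client receives TGI's output more or less verbatim, counts as granting users access to TGI's functionality.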

Why is this not a violation?

According to your license, "You may not distribute the Software as a hosted or managed, and paid service, where the service grants users access to any substantial set of the features or functionality of the Software." Access to an LLM served by TGI grants access to substantial functionality of the software, since you are getting a response directly back from TGI and sending it to the client/app.

Edit: Following up, it seems that you should define what "[giving] users access to any substantial set of the features or functionality of the Software" means inside the actual license instead of a FAQ, as the license is what "counts".

@abb128

abb128 commented Aug 1, 2023

We considered AGPL but it is much more restrictive commercially.

Why not dual-license with AGPL?

@jeffboudier
Member

@abb128

Why not dual-license with AGPL?

Because AGPL would be much more restrictive commercially for users of TGI. Under AGPL, TGI users would be required to publish their codebase integrating TGI as open source under AGPL. TGI has a lot of enterprise users who can continue to use and contribute to TGI under HFOIL, but who would not be able to use TGI under AGPL. With dual licensing, these enterprise users would need to reach out to us and set up a licensing agreement, which is a lot of friction compared with the commercially permissive license HFOIL offers them.

@abb128

abb128 commented Aug 1, 2023

@jeffboudier By dual licensing I mean granting users permission to use TGI under the terms of either license, AGPL or HFOIL, at their own discretion. If AGPL is too restrictive for the user's use case, they can use HFOIL. If HFOIL is too restrictive for the user's use case, they can use AGPL. There would be no need for enterprise users to reach out to set up a licensing agreement if the license already grants them use under the terms of HFOIL.

@abdullahsych

Does this license also restrict commercial usage of AWS DLCs with TGI v >= 1.0?

https://github.com/aws/deep-learning-containers/blob/master/available_images.md#huggingface-text-generation-inference-containers

@jeffboudier
Member

@abdullahsych no, your usage of TGI v1.0+ within the HuggingFace Text Generation Inference Containers DLCs on SageMaker is not restricted by the HFOIL 1.0 license, thanks to the agreement between Hugging Face and Amazon.

@OlivierDehaene
Member

OlivierDehaene commented Apr 8, 2024

I am very happy to announce that the license was reverted to Apache 2.0.

This concerns both TGI and the Text Embeddings Inference repository.

We reflected a lot on the change to HFOIL since July. We understand how alienating this change must have felt for our users, and we are sorry.

At the time we felt that this change was necessary to safeguard our inference solutions from larger companies. A lot of things have changed in the past year and the reasons we made the change in the first place are not applicable today.

This decision is irreversible; the repository will remain under the Apache 2.0 license for all forthcoming releases.

The team and I are super excited by this change and for the future of Hugging Face inference solutions :)

OlivierDehaene unpinned this issue Apr 8, 2024