
Changes in Seldon open source license to BSL #3380

Closed · terrytangyuan opened this issue Jan 22, 2024 · 22 comments

@terrytangyuan
Member

We need to take action in response to the recent license change in Seldon's projects, since we have an Alibi integration for the explainer.

Here's the full email from Seldon regarding the license change:

Since Seldon was founded in 2014, its mission has been to accelerate the adoption of machine learning. Thanks to unwavering support and guidance from our customers, community members and investors, this vision has turned into a reality, with over 10 million machine learning models deployed across the world's most innovative companies.

Seldon has always committed itself to an open core business model, i.e. using open source as the foundation of the product suite, with advanced features, support and governance under commercial licensing models. As the market and Seldon have matured, it has become clear the value the Core and Alibi projects deliver is extensive, and no longer commensurate with the licensing model. To enable us to deliver even more value across our products, have better control over how they are commercialized, and reduce the threat of others benefitting from our efforts, we have decided it is appropriate to change the license of Core and Alibi. Our entry level serving product, MLServer, will remain open source under Apache 2.0.

What's changed?
From today, January 22, 2024, the license for future releases of Seldon's projects Core 1, Core 2, Alibi Detect and Alibi Explain will change from the Apache License Version 2.0 to the source available Business Source License v1.1 ("BSL").

What does this mean?
In practice, this means that use of the Core and Alibi projects under the BSL will be for non-production uses only, i.e. pre-production, staging and testing, with production use requiring payment under a commercial license.

We believe this change is a re-commitment to our open core principles. We now have an open source foundation with the MLServer project, a source available step-up with the Core projects, and commercially licensed enterprise grade products, with pricing across the suite that reflects a fair exchange of value.

The world of MLOps and AI is ever-changing, and we're excited to continue pushing the boundaries of innovation alongside our community and customers. For further information about the open source licensing change, or if you have any questions, please check out my full blog post, our FAQs or message your Community Account Manager in the Slack community.

Kind regards,

James Perry
CEO, Seldon

@thesuperzapper

Here are two links for more context:

First, we need to confirm which of the components we currently use are now licensed differently.
For example, do we use both Seldon Alibi and Seldon Core? (Note: it seems Seldon MLServer is still Apache 2.0.)

After that, I guess there are a few options for next steps:

  1. Convince Seldon to reverse course
  2. Remove all components which are now BSL
  3. Fork and keep some/all of the components as new Apache 2.0 projects (if we think any specific component is worth doing that for).

@yuzisun
Member

yuzisun commented Jan 23, 2024

Two action items, I think:

@thesuperzapper

@yuzisun are we losing anything significant if we get rid of Alibi?

Should we start an effort to make something similar in a truly open source way? (I'm sure we can get organizations to provide resources if necessary)

We could even start with the code base from the previously Apache-2.0-licensed Alibi (if it's any good).

@yuzisun
Member

yuzisun commented Jan 23, 2024

I am not sure how many KServe users are using Alibi. Currently KServe depends on Alibi 0.9.4, which is still under the Apache license according to the FAQ.

I think what we can do is remove the alibi type from the InferenceService YAML specification. Users can still use previous Alibi versions (<0.9.5) under the Apache license via the custom explainer spec. We did the same when we deprecated the AIX explainer and left an example using the custom explainer spec for AIX.
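
For illustration only, a custom explainer along those lines might look roughly like the sketch below (not an official example: the name, image, storage URI and flags are placeholders, assuming a user-built image that bundles the Apache-2.0 licensed Alibi 0.9.4):

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: income-explainer                  # hypothetical name
spec:
  predictor:
    sklearn:
      storageUri: gs://example-bucket/income/model   # placeholder model location
  explainer:
    # Custom explainer: any container that serves the explain endpoint.
    containers:
      - name: kserve-container
        # Placeholder image built against Alibi 0.9.4 (the last Apache-2.0 release).
        image: example.registry.io/alibi-explainer:0.9.4
        args:
          # Flags depend entirely on the custom image; this one is illustrative.
          - --model_name=income-explainer
```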

@terrytangyuan
Member Author

@yuzisun Sounds good. We'll work on a PR to address this.

> Should we start an effort to make something similar in a truly open source way? (I'm sure we can get organizations to provide resources if necessary)

@thesuperzapper I have a separate proposal to integrate with TrustyAI. See #3381 for details.

@TimKleinloog
Contributor

TimKleinloog commented Jan 25, 2024

@yuzisun my team and I would like to maintain a fork of Alibi under the Apache license (or an alternative library) so that explainability stays available in KServe in a user-friendly way. What would be needed to align our efforts?

I opened a proposal in #3391.

@terrytangyuan
Member Author

terrytangyuan commented Jan 25, 2024

Let's use this issue to discuss Seldon-related dependencies. We are working on removing Alibi as outlined by @yuzisun in #3380 (comment) and will provide an example using the custom explainer spec.

Regarding a fork, I doubt we'd want to rely on a vendor-maintained fork, and we don't want to take on the risk of the fork going unmaintained at some point.

@thesuperzapper

@terrytangyuan @TimKleinloog if there is sufficient user demand and a group willing to maintain a fork of Alibi, I don't see any harm in at least starting the discussion.

@TimKleinloog
Contributor

@terrytangyuan I would also like to use this issue to propose an alternative, since I don't see a custom explainer as a good alternative. Especially in the EU, where we are active, explainability will increase in importance (due to transparency obligations in AI regulation). I would like to strengthen KServe's position in this field instead of weakening it.

@yuzisun
Member

yuzisun commented Jan 26, 2024

> @terrytangyuan I would also like to use this issue to propose an alternative, since I don't see a custom explainer as a good alternative. Especially in the EU, where we are active, explainability will increase in importance (due to transparency obligations in AI regulation). I would like to strengthen KServe's position in this field instead of weakening it.

We can still ship an explainer runtime out of the box in KServe once we have a good alternative, just like we provide the Triton and TorchServe predictor runtimes. I am suggesting we remove the hard-coded explainer type (alibi) from the InferenceService spec and make it more extensible, so that later we can bring in an official explainer runtime. If people still want Alibi they can use the custom explainer. Does this sound good to you?
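
For context, this is roughly the shape of a predictor ClusterServingRuntime today; an out-of-the-box explainer runtime could presumably follow a similar pattern (abbreviated sketch; image tag and resource settings omitted):

```yaml
apiVersion: serving.kserve.io/v1alpha1
kind: ClusterServingRuntime
metadata:
  name: kserve-sklearnserver      # existing predictor runtime, shown only as the shape to mirror
spec:
  supportedModelFormats:
    - name: sklearn
      version: "1"
      autoSelect: true
  containers:
    - name: kserve-container
      image: kserve/sklearnserver:latest
      args:
        - --model_name={{.Name}}
        - --model_dir=/mnt/models
        - --http_port=8080
```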

@nilayaishwarya

nilayaishwarya commented Jan 26, 2024

I see that there is a lot of traction around explainability. While we go ahead and remove Alibi, we should already set some targets for what alternative can be provided. So I agree with @TimKleinloog and @thesuperzapper: maybe we should already start working on other options. Explainability in a production-level setting is still one of KServe's USPs, and I would not like it to become completely custom, because that really reduces the reusability of whatever standards are out there for explainability.

@TimKleinloog
Contributor

@yuzisun that sounds like a solid plan. I would like to align our efforts so that we can create and maintain an explainer runtime and immediately have an alternative available.

@yuzisun
Member

yuzisun commented Jan 26, 2024

> I see that there is a lot of traction around explainability. While we go ahead and remove Alibi, we should already set some targets for what alternative can be provided. So I agree with @TimKleinloog and @thesuperzapper: maybe we should already start working on other options. Explainability in a production-level setting is still one of KServe's USPs, and I would not like it to become completely custom, because that really reduces the reusability of whatever standards are out there for explainability.

Just to be clear, it is not all custom: we want to make explainers pluggable like predictors, so users can easily add explainer runtimes, and KServe still ships a few official explainer runtimes out of the box. I am only suggesting using the custom explainer for Alibi if people want it.

@nilayaishwarya

@yuzisun Just to be clear: do you mean creating different runtimes and plugging them in? That is, an explainer spec similar to the model spec, where we can choose runtimes built for different explainability libraries?

@yuzisun
Member

yuzisun commented Jan 26, 2024

> @yuzisun Just to be clear: do you mean creating different runtimes and plugging them in? That is, an explainer spec similar to the model spec, where we can choose runtimes built for different explainability libraries?

Yes exactly.
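
Purely as a hypothetical illustration of that idea (none of the explainer fields below exist in KServe today, and the format and runtime names are invented), selecting an explainer runtime could mirror how a predictor selects a model runtime:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris                          # hypothetical example
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storageUri: gs://example-bucket/sklearn/iris     # placeholder
  explainer:
    model:                                    # hypothetical: mirrors the predictor's model spec
      modelFormat:
        name: trustyai                        # hypothetical explainer format
      runtime: kserve-trustyai-explainer      # hypothetical explainer runtime name
      storageUri: gs://example-bucket/explainer-config # placeholder
```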

@nilayaishwarya

Sounds good, so in this issue we just scope the removal of Alibi, and in the other issues, like the one suggested by @TimKleinloog in #3391 and by @terrytangyuan in #3381, we should take up the integration of new runtime approaches for explainability.

@terrytangyuan
Member Author

I am moving the pluggable explainer discussion to a separate issue to track: #3398

@terrytangyuan
Member Author

/assign

@bcvanmeurs

> - Remove Seldon MLServer as we have equivalent features in KServe now; we already started removing MLServer references in KServe docs.

Hi @yuzisun / @terrytangyuan, is this the plan going forward, even though MLServer is still Apache 2.0? We are actually considering migrating from Seldon Core to KServe because we could "just" switch the backend over to KServe while keeping the models running on MLServer (for now, and migrate them one by one later).

@thesuperzapper

@yuzisun @terrytangyuan can we confirm if we are planning to keep using MLServer in KServe?

It seems to still be Apache 2.0:

@terrytangyuan
Member Author

I don’t think we can remove it yet. See #3443 (comment)

@ksgnextuple

ksgnextuple commented Apr 3, 2024

I know it's a closed issue, but does the normal KServe serving runtime have the option of injecting a version like MLServer had? I am currently using MLServer, so I just wanted to know this before migrating.

tjandy98 pushed a commit to tjandy98/kserve that referenced this issue Apr 10, 2024