Feature Request
As a Python user, I would like to simplify the code required to load an inference model from Hugging Face (or private source) into Elasticsearch.
Use Case
Today, eland is the way we load inference models into Elasticsearch. Many users will only ever use eland to load models, so making it as simple as possible to use will improve the end-user experience.
Currently, in Python, it requires several lines of code, e.g.:
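The current multi-step flow looks roughly like the following (a sketch only: class and method names assume a recent eland release, and the endpoint, model, and helper-function name are illustrative):

```python
def load_hub_model(es_url: str, hub_model_id: str, task_type: str) -> str:
    """Download a model from the Hugging Face Hub, trace it, and upload
    it to Elasticsearch; returns the Elasticsearch model_id.

    Sketch of the steps eland requires today (hypothetical wrapper name).
    """
    import tempfile

    from elasticsearch import Elasticsearch
    from eland.ml.pytorch import PyTorchModel
    from eland.ml.pytorch.transformers import TransformerModel

    es = Elasticsearch(es_url)  # e.g. "http://localhost:9200"

    # Step 1: fetch the model from Hugging Face and convert to TorchScript
    tm = TransformerModel(model_id=hub_model_id, task_type=task_type)

    with tempfile.TemporaryDirectory() as tmp_dir:
        # Step 2: save the traced model, its config, and vocabulary locally
        model_path, config, vocab_path = tm.save(tmp_dir)

        # Step 3: upload everything to Elasticsearch
        ptm = PyTorchModel(es, tm.elasticsearch_model_id())
        ptm.import_model(
            model_path=model_path,
            config_path=None,
            vocab_path=vocab_path,
            config=config,
        )
    return ptm.model_id
```

Every caller has to repeat these download, save, and upload steps today.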
This could be abstracted into a simple function call such as
eland.ml.pytorch.import_model
and take in the minimum required parameters. It would return the model_id as it is known in Elasticsearch, so it can be used later in the code to call inference.
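A sketch of what that call site could look like (the function body and parameter names are hypothetical, not an existing eland API; the returned id mirrors eland's current convention of lowercasing the Hub id and replacing "/" with "__"):

```python
def import_model(es_client, hub_model_id: str, task_type: str) -> str:
    """Hypothetical one-call import: download hub_model_id from Hugging
    Face, upload it to Elasticsearch via es_client, and return the
    model_id as Elasticsearch knows it."""
    # Assumed id convention: lowercase, "/" replaced with "__"
    es_model_id = hub_model_id.replace("/", "__").lower()
    # ... download, trace, and upload steps elided in this sketch ...
    return es_model_id

# Proposed usage: one call, returning the id for later inference requests
model_id = import_model(
    None,  # an Elasticsearch client in real use
    "sentence-transformers/msmarco-MiniLM-L-12-v3",
    task_type="text_embedding",
)
```

The single returned value keeps the rest of the user's code decoupled from how eland derives the Elasticsearch-side id.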
cc: @joshdevins