
Question : API #70

Closed
theapache64 opened this issue Mar 24, 2020 · 10 comments

Comments

@theapache64 (Collaborator) commented Mar 24, 2020

I've been developing a Telegram bot using the API. Currently I'm using https://covid-middleware.deepset.ai/api/bert/question to get the answers.

curl -X POST \
  https://covid-middleware.deepset.ai/api/bert/question \
  -H 'content-type: application/json' \
  -d '{
	"question":"community spread?"
}'

But the Swagger docs don't list this API; they show a different one with a different request structure.

So my question is: which API should I use to get the answers? @tanaysoni

@tanaysoni (Contributor) commented Mar 24, 2020

Hi @theapache64, the middleware was deprecated during the hackathon. The new APIs are available at https://covid-backend.deepset.ai, and the documentation is at https://covid-backend.deepset.ai/docs.

I hope that clears up the confusion.

@theapache64 (Collaborator, Author) commented Mar 24, 2020

Alright.

Can you explain the parameters?

{
  "questions": [
    "string"
  ],
  "filters": {
    "additionalProp1": "string",
    "additionalProp2": "string",
    "additionalProp3": "string"
  },
  "top_k_reader": 0,
  "top_k_retriever": 0
}

@sfakir (Collaborator) commented Mar 24, 2020

@tanaysoni I propose we hide the "model" from the clients, because it's not the clients' responsibility to know which models are used in the backend.

I'll create an MR for a unified endpoint that routes internally to the right model. What do you think?

@tanaysoni (Contributor) commented Mar 24, 2020

@theapache64, a sample request looks like this:

{
    "questions": ["what are the symptoms?"],
    "top_k_retriever": 5 // return top 5 answers
}

The top_k_reader parameter can be ignored for FAQ style Question Answering.

With the current version, filters are tricky to implement as there's no endpoint to get a list of available filters. Feel free to create a PR if that's something useful for your use case!
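
For reference, here's a minimal sketch of sending that request from Python with the requests library. The endpoint path below is a placeholder, not the real one; take the actual path from https://covid-backend.deepset.ai/docs.

# Minimal sketch; "/models/1/faq-qa" is a hypothetical path, check the docs.
import requests

payload = {
    "questions": ["what are the symptoms?"],
    "top_k_retriever": 5,  # how many results to retrieve
}

response = requests.post(
    "https://covid-backend.deepset.ai/models/1/faq-qa",  # placeholder path
    json=payload,
    timeout=10,
)
response.raise_for_status()
print(response.json())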

@sfakir (Collaborator) commented Mar 24, 2020

@tanaysoni In MR74 I added POST /question/ask as a proxy endpoint for our models.

My recommendation is to keep this abstract endpoint so it won't have to change later; it also lets us swap the model internally without notifying all clients. We can have a chat about this later on.

@theapache64 (Collaborator, Author) commented Mar 24, 2020

@tanaysoni Which model should I use? 1 or 2?

@tanaysoni (Contributor) commented Mar 24, 2020

@sfakir a unified endpoint sounds good to me!

I would propose keeping both the unified endpoint and the model-specific endpoints.

In the unified endpoint, we detect the language of the request and route it to the appropriate model. For advanced use cases, clients can choose between the model-specific endpoints.
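
Roughly, the routing could look like this (just a sketch, not code from this repo; it assumes a language-detection helper such as the langdetect package, and the model IDs are placeholders):

# Illustrative sketch only: pick a model based on the detected question
# language. Assumes the langdetect package; model IDs are placeholders.
from langdetect import detect

DEFAULT_MODEL_ID = 1           # placeholder: multilingual model
MODEL_BY_LANGUAGE = {"en": 2}  # placeholder: English-specific model

def pick_model_id(question: str) -> int:
    try:
        language = detect(question)
    except Exception:
        # Detection failed (e.g. very short input): fall back to the default.
        return DEFAULT_MODEL_ID
    return MODEL_BY_LANGUAGE.get(language, DEFAULT_MODEL_ID)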

@tanaysoni (Contributor) commented Mar 24, 2020

@theapache64 model 1 is multilingual, while model 2 is English-only. If you're working only with English, then model 2 might give better results.

@theapache64 (Collaborator, Author) commented Mar 25, 2020

@tanaysoni Which languages are supported in model 1?

@tholor (Contributor) commented Mar 26, 2020

In theory, it's multilingual. In practice, we only have a reasonable amount of FAQ data for German and English right now.

We are currently switching to a generic API endpoint (see #72), so please use /question/ask once it's deployed (without any model ID). We will then take care of routing the request to the best model available for the detected language.
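
Once it's live, a request could look roughly like this (a sketch; the request body shape is assumed to match the schema shown earlier in the thread):

# Sketch of a call to the generic /question/ask endpoint once deployed;
# the body shape is an assumption based on the schema shown above.
import requests

response = requests.post(
    "https://covid-backend.deepset.ai/question/ask",
    json={"questions": ["community spread?"], "top_k_retriever": 5},
    timeout=10,
)
response.raise_for_status()
print(response.json())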
