This issue tracks API latency: documenting current response times and the effort to improve them.
Currently, response times are between 5.185s and 7s.
Test response times in the terminal:

```shell
time curl \
  -X POST \
  -H "Content-Type: application/json" \
  https://kindly-api.azurewebsites.net/detect \
  -d '{"text":"this movie is great"}'
```
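A single `time curl` run is noisy; averaging several requests gives a more reliable latency figure. Below is a minimal Python sketch of such a timing helper. The endpoint usage shown in the comment is the one from this issue; the helper itself is generic.

```python
import statistics
import time

def time_calls(fn, n=5):
    """Call fn() n times and return (min, median, max) durations in seconds."""
    durations = []
    for _ in range(n):
        start = time.perf_counter()
        fn()
        durations.append(time.perf_counter() - start)
    return min(durations), statistics.median(durations), max(durations)

# Hypothetical usage against the /detect endpoint (requires the `requests` package):
# import requests
# lo, mid, hi = time_calls(lambda: requests.post(
#     "https://kindly-api.azurewebsites.net/detect",
#     json={"text": "this movie is great"}))
# print(f"min {lo:.3f}s  median {mid:.3f}s  max {hi:.3f}s")
```

Reporting the median alongside min/max helps separate a consistently slow endpoint from one with occasional cold-start spikes.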
As a strategy to reduce latency, we could convert the Python model to a TensorFlow.js model and serve it from a Node.js endpoint to see if that is faster.
This can be a good resource to achieve that.
From what I have seen so far, you can only convert a model to TensorFlow.js if it was originally built with TensorFlow. I don't know whether the cardiffnlp model is TensorFlow-based. In the grand scheme of things, though, moving the backend entirely to Node.js would not be a problem if we train our own model, because we can both train and serve it with TensorFlow.js.
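One way to answer the "is the cardiffnlp model TensorFlow-based" question is to look at the file list of its Hugging Face Hub repo: by Hub convention, TensorFlow weights are published as `tf_model.h5` (PyTorch weights as `pytorch_model.bin`). A minimal sketch, assuming those naming conventions hold:

```python
def has_tf_weights(repo_files):
    """Return True if a Hub repo's file list includes TensorFlow weights.

    Assumes the Hub convention that TF checkpoints are named tf_model.h5.
    """
    return any(name == "tf_model.h5" for name in repo_files)

# Hypothetical usage with huggingface_hub (not run here):
# from huggingface_hub import list_repo_files
# print(has_tf_weights(list_repo_files("cardiffnlp/twitter-roberta-base-sentiment")))
```

If the repo only ships PyTorch weights, the model would first need to be loaded into a TensorFlow graph (or retrained) before a TensorFlow.js conversion is possible.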