feat(#1132): Introduce async version of our API #1391
Conversation
@frascuchon I did the refactoring discussed during the call. What's still missing is updating the tests and documentation/docstrings, and possibly improving the behavior of the monitoring module (retries, automatic chunking, better error messages), but that could also be tackled in a separate PR.
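As a rough illustration of the retry behavior mentioned above, here is a minimal, self-contained sketch (this helper is hypothetical and not part of the monitoring module):

```python
import time


def with_retries(fn, max_retries=3, backoff=0.5):
    """Call fn, retrying on exception with exponential backoff.

    Hypothetical helper: the actual monitoring module may implement
    retries differently.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                # Last attempt failed: surface the error to the caller.
                raise
            # Wait a bit longer after each failure before retrying.
            time.sleep(backoff * 2 ** attempt)
```

A transient failure would then be absorbed as long as one of the attempts succeeds, while a persistent failure still raises.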
Codecov Report

```diff
@@            Coverage Diff             @@
##           master    #1391      +/-   ##
==========================================
+ Coverage   94.53%   94.67%   +0.14%
==========================================
  Files         130      129       -1
  Lines        5889     5898       +9
==========================================
+ Hits         5567     5584      +17
+ Misses        322      314       -8
```

Flags with carried forward coverage won't be shown.
Since I didn't find a use case that strictly needs async, the call can instead be run in the background:

```python
future = rb.log(
    name="monitor-dataset",
    records=rb_records,
    background=True,
)
# you can always wait for the results:
future.result()
```

Anyway, the async log version remains visible in the API, in order to discover use cases that call the async function directly. I've also included a minimal example in the monitoring guides using BentoML + spaCy. Please feel free to add your comments.
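The `background=True` pattern above can be illustrated with a minimal, self-contained sketch based on `concurrent.futures` (the `log_records` function below is a stand-in for demonstration, not the actual client API):

```python
from concurrent.futures import ThreadPoolExecutor

# A single worker is enough to demonstrate the pattern.
_executor = ThreadPoolExecutor(max_workers=1)


def log_records(name, records, background=False):
    """Stand-in for a logging call.

    When background=True, returns a Future immediately instead of
    blocking until the records are processed.
    """
    def _do_log():
        # The real client would send the records to the server here.
        return {"name": name, "processed": len(records)}

    if background:
        return _executor.submit(_do_log)
    return _do_log()


# Fire-and-forget, then explicitly wait when the result is needed:
future = log_records("monitor-dataset", records=[1, 2, 3], background=True)
result = future.result()  # blocks until the background call finishes
```

The caller's code stays synchronous: it only pays the waiting cost at `future.result()`, which is why a separate async API surface may not be needed for most use cases.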
Co-authored-by: Daniel Vila Suero <daniel@recogn.ai>
I removed the […]

From my side, this is good to go.
YEAH!
Closes #1132
TODO:
- Log in batches. Server performance can be compromised otherwise (NOT IN THIS PR)
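The batching deferred to a follow-up PR could be sketched along these lines (a generic chunking helper, not this PR's implementation; the chunk size of 500 is an arbitrary assumption):

```python
def chunks(records, chunk_size=500):
    """Yield successive fixed-size chunks of records.

    Sending chunks rather than the full list in one request keeps each
    server call bounded in size.
    """
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]


# Each chunk could then be logged as its own request:
for batch in chunks(list(range(1200)), chunk_size=500):
    pass  # e.g. log the batch here
```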