Commit
feat(#1132): introduce async/background version of rb.log (#1391)
(cherry picked from commit aa67ed5)
frascuchon committed May 10, 2022
1 parent 0d5d884 commit 900307e
Showing 25 changed files with 590 additions and 441 deletions.
120 changes: 117 additions & 3 deletions docs/guides/monitoring.ipynb
@@ -216,6 +216,120 @@
"dataset.map(make_prediction)"
]
},
{
"cell_type": "markdown",
"id": "6987c362-61b3-4682-aa7b-693bed30d3ae",
"metadata": {},
"source": [
"## Using `rb.log` in background mode\n",
"\n",
"You can monitor your own models without adding a response delay by using the `background` param in rb.log\n",
"\n",
"Let's see an example using [BentoML](https://www.bentoml.com/) with a spaCy NER pipeline:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5a6e24b1-b665-4e44-be52-62e024927643",
"metadata": {},
"outputs": [],
"source": [
"import spacy\n",
"\n",
"nlp = spacy.load(\"en_core_web_sm\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "45740aee-e022-452d-8f77-fb8872eb7d11",
"metadata": {},
"outputs": [],
"source": [
"%%writefile spacy_model.py\n",
"\n",
"\n",
"from bentoml import BentoService, api, artifacts, env\n",
"from bentoml.adapters import JsonInput\n",
"from bentoml.frameworks.spacy import SpacyModelArtifact\n",
"\n",
"import rubrix as rb\n",
"\n",
"\n",
"@env(infer_pip_packages=True)\n",
"@artifacts([SpacyModelArtifact(\"nlp\")])\n",
"class SpacyNERService(BentoService):\n",
"\n",
" @api(input=JsonInput(), batch=True)\n",
" def predict(self, parsed_json_list):\n",
" result, rb_records = ([], [])\n",
" for index, parsed_json in enumerate(parsed_json_list):\n",
" doc = self.artifacts.nlp(parsed_json[\"text\"])\n",
" prediction = [{\"entity\": ent.text, \"label\": ent.label_} for ent in doc.ents]\n",
" rb_records.append(\n",
" rb.TokenClassificationRecord(\n",
" text=doc.text,\n",
" tokens=[t.text for t in doc],\n",
" prediction=[\n",
" (ent.label_, ent.start_char, ent.end_char) for ent in doc.ents\n",
" ],\n",
" )\n",
" )\n",
" result.append(prediction)\n",
" \n",
" rb.log(\n",
" name=\"monitor-for-spacy-ner\",\n",
" records=rb_records,\n",
" tags={\"framework\": \"bentoml\"},\n",
" background=True, \n",
" verbose=False\n",
" ) # By using the background=True, the model latency won't be affected\n",
" \n",
" return result\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "126b826f-c708-4b1d-898d-960058a7635d",
"metadata": {},
"outputs": [],
"source": [
"from spacy_model import SpacyNERService\n",
"\n",
"svc = SpacyNERService()\n",
"svc.pack('nlp', nlp)\n",
"\n",
"saved_path = svc.save()"
]
},
{
"cell_type": "markdown",
"id": "b5fadd39-87ed-4b7c-9f62-7cfdba0e2813",
"metadata": {},
"source": [
"You can predict some data without serving the model. Just launch following command:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "69736fa1-e263-4086-9a53-a0be399b22a1",
"metadata": {},
"outputs": [],
"source": [
"!bentoml run SpacyNERService:latest predict --input \"{\\\"text\\\":\\\"I am driving BMW\\\"}\""
]
},
{
"cell_type": "markdown",
"id": "8a1fe7dd-3ef0-42c6-bb2b-25a52a95c582",
"metadata": {},
"source": [
"If you're running Rubrix in local, go to http://localhost:6900/datasets/rubrix/monitor-for-spacy-ner and see that the new dataset `monitor-for-spacy-ner` contains your data"
]
},
{
"cell_type": "markdown",
"id": "c71b49ea-7384-423d-b2d8-ac6280b7a200",
@@ -229,7 +343,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
@@ -243,9 +357,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.5"
"version": "3.8.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
}
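The notebook above wires `rb.log(..., background=True)` into a BentoML service. Outside BentoML, the same non-blocking pattern is just a regular `rb.log` call with `background=True`. A minimal sketch, assuming a running Rubrix instance and the same spaCy pipeline; the dataset name `quick-ner-monitor` is illustrative:

```python
import rubrix as rb
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("I am driving BMW")

# Build one record from the spaCy prediction, as in the notebook.
record = rb.TokenClassificationRecord(
    text=doc.text,
    tokens=[t.text for t in doc],
    prediction=[(ent.label_, ent.start_char, ent.end_char) for ent in doc.ents],
)

# background=True hands the records off to a background task and returns
# immediately, so the caller's latency is not affected by logging.
rb.log(
    records=[record],
    name="quick-ner-monitor",
    tags={"framework": "spacy"},
    background=True,
    verbose=False,
)
```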
1 change: 0 additions & 1 deletion rublanding/rubrix-landing
Submodule rubrix-landing deleted from 342bc0
2 changes: 2 additions & 0 deletions src/rubrix/__init__.py
@@ -36,6 +36,7 @@
init,
load,
log,
log_async,
set_workspace,
)
from rubrix.client.datasets import (
@@ -62,6 +63,7 @@
"init",
"load",
"log",
"log_async",
"set_workspace",
],
"client.models": [
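The `log_async` export added above is the coroutine counterpart of `rb.log` that this commit introduces. A minimal sketch of calling it from async code, assuming it accepts the same keyword arguments as `rb.log` and can be awaited; the record contents here are illustrative:

```python
import asyncio

import rubrix as rb


async def monitor(records):
    # Assumption: log_async mirrors rb.log's keyword arguments and awaits
    # until the records have been sent, without blocking the event loop.
    await rb.log_async(
        records=records,
        name="monitor-for-spacy-ner",
        tags={"framework": "bentoml"},
        verbose=False,
    )


# Illustrative usage with a single hand-built record.
record = rb.TokenClassificationRecord(
    text="I am driving BMW",
    tokens=["I", "am", "driving", "BMW"],
    prediction=[("ORG", 13, 16)],
)
asyncio.run(monitor([record]))
```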
