forked from run-llama/llama-hub
Add downloadable modules (run-llama#646)
* wip
* wip
* json
* update readmes
* update readmes
* add zephyr query engine pack
* added llama packs for llava completion
* Update library.json
* Update base.py
* basic tests
* fix gmail agent
* llama pack implementation, documentation, and notebook
* incorporate pr feedback
* update readme
* Create streamlit_app.py
* Create requirements.txt
* Create README.md
* Add files via upload
* Update README.md
* wip
* wip
* wip
* Update README.md
* Update README.md
* incorporate pr feedback
* update format
* fix streamlit
* adding deeplake's packs
* adding trailing lines in inits
* Fix copy
* fix copy
* minor tweaks
* add library json
* add redis ingest
* add to library
* tests
* linting
* trulens packs
* readme center title
* readme header
* title
* cta in readme
* update readme to 3 pack
* md change
* md change to readme
* uncomment pip install
* linting
* linting
* reqs
* shorten readme
* add library json
* linting
* linting
* feat: OpenAI Image Generation Tool (run-llama#628)
* feat: dall-e-3
* chore: remove checkpoints
* lint
* cr
* chore: use multi-modal as an example
* chore: delete checkpoint
* chore: fix tests and lint
* cr
* lint
* wip
* first gradio chatbot up and running
* streaming works
* wip
* get everything working with two tools
* wip
* wip
* Add timescale vector auto retriever pack
* update readme
* update readmes
* delay import
* linting
* update readme

---------

Co-authored-by: wglantz <wglantz@arisglobal.com>
Co-authored-by: Wenqi Glantz <wenqiglantz@gmail.com>
Co-authored-by: Alexander Song <axiomofjoy@gmail.com>
Co-authored-by: Caroline Frasca (Lu) <42614552+carolinedlu@users.noreply.github.com>
Co-authored-by: Simon Suo <simonsdsuo@gmail.com>
Co-authored-by: AdkSarsen <adilkhan@activeloop.ai>
Co-authored-by: Josh Reini <joshreini1@gmail.com>
Co-authored-by: Emanuel Ferreira <contatoferreirads@gmail.com>
Co-authored-by: Andrei Fajardo <andrei@nerdai.io>
Co-authored-by: Matvey Arye <mat@timescale.com>
1 parent 6c3b049 · commit e43726e
Showing 73 changed files with 3,328 additions and 13 deletions.

llama_hub/llama_packs/arize_phoenix_query_engine/README.md (109 additions, 0 deletions)
@@ -0,0 +1,109 @@
<center>
    <p style="text-align:center">
    <img alt="phoenix logo" src="https://storage.googleapis.com/arize-assets/phoenix/assets/phoenix-logo-light.svg" width="200"/>
    <br>
    <a href="https://docs.arize.com/phoenix/">Docs</a>
    |
    <a href="https://github.com/Arize-ai/phoenix">GitHub</a>
    |
    <a href="https://join.slack.com/t/arize-ai/shared_invite/zt-1px8dcmlf-fmThhDFD_V_48oU7ALan4Q">Community</a>
    </p>
</center>
<h1 align="center">Arize-Phoenix LlamaPack</h1>

This LlamaPack instruments your LlamaIndex app for LLM tracing with [Phoenix](https://github.com/Arize-ai/phoenix), an open-source LLM observability library from [Arize AI](https://phoenix.arize.com/).

## CLI Usage

You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` python package:

```bash
llamaindex-cli download-llamapack ArizePhoenixQueryEnginePack --download-dir ./arize_pack
```

You can then inspect the files at `./arize_pack` and use them as a template for your own project!

## Code Usage

You can download the pack to the `./arize_pack` directory:

```python
from llama_index.llama_packs import download_llama_pack

# download and install dependencies
ArizePhoenixQueryEnginePack = download_llama_pack(
    "ArizePhoenixQueryEnginePack", "./arize_pack"
)
```

You can then inspect the files at `./arize_pack` or continue on to use the module.
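
The accompanying notebook takes a slightly different route and imports the pack directly from `llama-hub` (it first installs `arize-phoenix[llama-index]`, `llama-hub`, and `html2text`); that import looks like this:

```python
# Alternative to download_llama_pack: import the pack straight from the
# llama-hub package, as the accompanying notebook does.
from llama_hub.llama_packs.arize_phoenix_query_engine import (
    ArizePhoenixQueryEnginePack,
)
```

Either way, import the remaining dependencies: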

```python
import os

from llama_index.node_parser import SentenceSplitter
from llama_index.readers import SimpleWebPageReader
from tqdm.auto import tqdm
```

Configure your OpenAI API key.

```python
os.environ["OPENAI_API_KEY"] = "copy-your-openai-api-key-here"
```
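
If you prefer not to hard-code the key, a minimal alternative (assuming interactive use) is to prompt for it:

```python
import os
from getpass import getpass

# Prompt for the key only if it is not already set in the environment.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")
```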

Parse your documents into a list of nodes and pass them to your LlamaPack. This example uses nodes from a Paul Graham essay as input.

```python
documents = SimpleWebPageReader().load_data(
    [
        "https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt"
    ]
)
parser = SentenceSplitter()
nodes = parser.get_nodes_from_documents(documents)
phoenix_pack = ArizePhoenixQueryEnginePack(nodes=nodes)
```
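
Constructing the pack launches a Phoenix session and instruments LlamaIndex for tracing. If you ever want that setup without the pack, a rough manual sketch looks like the following (an assumption about the pack's internals, using the `arize-phoenix` and `llama-index` APIs current at the time of writing):

```python
import llama_index
import phoenix as px

# Launch a local Phoenix app to collect and visualize traces
# (assumed to mirror what the pack does when it is constructed).
session = px.launch_app()

# Route LlamaIndex callbacks/traces to Phoenix.
llama_index.set_global_handler("arize_phoenix")
```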

Run a set of queries via the pack's `run` method, which delegates to the underlying query engine.

```python
queries = [
    "What did Paul Graham do growing up?",
    "When and how did Paul Graham's mother die?",
    "What, in Paul Graham's opinion, is the most distinctive thing about YC?",
    "When and how did Paul Graham meet Jessica Livingston?",
    "What is Bel, and when and where was it written?",
]
for query in tqdm(queries):
    print("Query")
    print("=====")
    print(query)
    print()
    response = phoenix_pack.run(query)
    print("Response")
    print("========")
    print(response)
    print()
```

View your trace data in the Phoenix UI.

```python
phoenix_session_url = phoenix_pack.get_modules()["session_url"]
print(f"Open the Phoenix UI to view your trace data: {phoenix_session_url}")
```

You can access the internals of the LlamaPack, including your Phoenix session and your query engine, via the `get_modules` method.

```python
phoenix_pack.get_modules()
```
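
For example, you could query the underlying engine directly (a minimal sketch; the `"query_engine"` key name is an assumption, so inspect the returned dict for the exact keys in your run):

```python
# Assumes get_modules() exposes the underlying query engine under a
# "query_engine" key; check the returned dict to confirm.
modules = phoenix_pack.get_modules()
query_engine = modules["query_engine"]
response = query_engine.query("What did Paul Graham work on at Viaweb?")
print(response)
```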

Check out the [Phoenix documentation](https://docs.arize.com/phoenix/) for more information!

@@ -0,0 +1,5 @@
from llama_hub.llama_packs.arize_phoenix_query_engine.base import (
    ArizePhoenixQueryEnginePack,
)

__all__ = ["ArizePhoenixQueryEnginePack"]

llama_hub/llama_packs/arize_phoenix_query_engine/arize_phoenix_llama_pack.ipynb (186 additions, 0 deletions)
@@ -0,0 +1,186 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<center>\n",
    "    <p style=\"text-align:center\">\n",
    "    <img alt=\"phoenix logo\" src=\"https://storage.googleapis.com/arize-assets/phoenix/assets/phoenix-logo-light.svg\" width=\"200\"/>\n",
    "    <br>\n",
    "    <a href=\"https://docs.arize.com/phoenix/\">Docs</a>\n",
    "    |\n",
    "    <a href=\"https://github.com/Arize-ai/phoenix\">GitHub</a>\n",
    "    |\n",
    "    <a href=\"https://join.slack.com/t/arize-ai/shared_invite/zt-1px8dcmlf-fmThhDFD_V_48oU7ALan4Q\">Community</a>\n",
    "    </p>\n",
    "</center>\n",
    "<h1 align=\"center\">Arize-Phoenix LlamaPack</h1>\n",
    "\n",
    "This LlamaPack instruments your LlamaIndex app for LLM tracing with [Phoenix](https://github.com/Arize-ai/phoenix), an open-source LLM observability library from [Arize AI](https://phoenix.arize.com/)."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Install and Import Dependencies"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "!pip install \"arize-phoenix[llama-index]\" llama-hub html2text"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "from llama_hub.llama_packs.arize_phoenix_query_engine import ArizePhoenixQueryEnginePack\n",
    "from llama_index.node_parser import SentenceSplitter\n",
    "from llama_index.readers import SimpleWebPageReader\n",
    "from tqdm.auto import tqdm"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Configure your OpenAI API key."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "os.environ[\"OPENAI_API_KEY\"] = \"copy-your-openai-api-key-here\""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
"Parse your documents into a list of nodes and pass to your LlamaPack. In this example, use nodes from a Paul Graham essay as input." | ||
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "documents = SimpleWebPageReader().load_data(\n",
    "    [\n",
    "        \"https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt\"\n",
    "    ]\n",
    ")\n",
    "parser = SentenceSplitter()\n",
    "nodes = parser.get_nodes_from_documents(documents)\n",
    "phoenix_pack = ArizePhoenixQueryEnginePack(nodes=nodes)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Run a set of queries via the pack's `run` method, which delegates to the underlying query engine."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "queries = [\n",
    "    \"What did Paul Graham do growing up?\",\n",
    "    \"When and how did Paul Graham's mother die?\",\n",
    "    \"What, in Paul Graham's opinion, is the most distinctive thing about YC?\",\n",
    "    \"When and how did Paul Graham meet Jessica Livingston?\",\n",
    "    \"What is Bel, and when and where was it written?\",\n",
    "]\n",
    "for query in tqdm(queries):\n",
    "    print(\"Query\")\n",
    "    print(\"=====\")\n",
    "    print(query)\n",
    "    print()\n",
    "    response = phoenix_pack.run(query)\n",
    "    print(\"Response\")\n",
    "    print(\"========\")\n",
    "    print(response)\n",
    "    print()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "View your trace data in the Phoenix UI."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "phoenix_session_url = phoenix_pack.get_modules()[\"session_url\"]\n",
    "print(f\"Open the Phoenix UI to view your trace data: {phoenix_session_url}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can access the internals of the LlamaPack, including your Phoenix session and your query engine, via the `get_modules` method."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "phoenix_pack.get_modules()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Check out the [Phoenix documentation](https://docs.arize.com/phoenix/) for more information!"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "llmapps",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}