Merge pull request #96 from satvik314/main
roh26it committed May 22, 2024
2 parents e855cab + 2e1d9b0 commit 09bdd76
Showing 10 changed files with 911 additions and 0 deletions.
33 changes: 33 additions & 0 deletions cookbook/README.md
@@ -0,0 +1,33 @@
# portkey-cookbook
[Portkey](https://app.portkey.ai/) is the Control Panel for AI apps. With its popular AI Gateway and Observability Suite, hundreds of teams ship reliable, cost-efficient, and fast apps.

With Portkey, you can

- Connect to 150+ models through a unified API (see the sketch below),
- View 40+ metrics & logs for all requests,
- Enable semantic cache to reduce latency & costs,
- Implement automatic retries & fallbacks for failed requests,
- Add custom tags to requests for better tracking and analysis, and more.
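
The sketch below shows what that unified API looks like in practice: a standard OpenAI-style chat call routed through the Portkey AI Gateway. It is a minimal example, assuming your `OPENAI_API_KEY` and `PORTKEY_API_KEY` are set as environment variables; the notebooks in this repo use Colab secrets instead and cover many more providers.

```python
import os

from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

# Point the OpenAI client at the Portkey AI Gateway and attach Portkey headers.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],       # your provider key
    base_url=PORTKEY_GATEWAY_URL,               # requests go through the gateway
    default_headers=createHeaders(
        provider="openai",
        api_key=os.environ["PORTKEY_API_KEY"],  # your Portkey key, from https://app.portkey.ai/
    ),
)

completion = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's a fractal?"}],
)
print(completion.choices[0].message.content)
```

Switching providers only requires changing the `provider` header and the provider key; the same client code works with Anthropic, Mistral AI, Together AI, and others, as the quickstart notebooks demonstrate.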


This repo contains examples and guides for Portkey's suite of products.

## Table of Contents

Use the table of contents below to navigate the cookbook.

### examples
Contains notebooks demonstrating how to use Portkey with the latest models.

### providers
Contains notebooks demonstrating how to use Portkey with popular providers.

### integrations
Contains guides showing how to integrate Portkey with popular frameworks.

### quickstarts
Contains notebooks that offer a quick tour of Portkey's features.




279 changes: 279 additions & 0 deletions cookbook/examples/Llama3_Groq_Portkey.ipynb

Large diffs are not rendered by default.

285 changes: 285 additions & 0 deletions cookbook/examples/Mixtral_8X22B_with_Portkey.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions cookbook/integrations/Langchain.ipynb
@@ -0,0 +1 @@
{"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"provenance":[]},"kernelspec":{"name":"python3","display_name":"Python 3"},"language_info":{"name":"python"}},"cells":[{"cell_type":"markdown","source":["<h1 align=\"center\">\n"," <a href=\"https://portkey.ai\">\n"," <img width=\"300\" src=\"https://analyticsindiamag.com/wp-content/uploads/2023/08/Logo-on-white-background.png\" alt=\"portkey\">\n"," </a>\n","</h1>"],"metadata":{"id":"AZHkr3yrCER8"}},{"cell_type":"markdown","source":["[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1-EETdhw2RrOCrsmHZP6P7LzSDMsvsJeu?usp=sharing)"],"metadata":{"id":"jQJD4ga7CGfH"}},{"cell_type":"markdown","source":["# Portkey + Langchain\n","\n","[Portkey](https://app.portkey.ai/) is the Control Panel for AI apps. With it's popular AI Gateway and Observability Suite, hundreds of teams ship reliable, cost-efficient, and fast apps.\n","\n","Portkey brings production readiness to Langchain. With Portkey, you can\n","\n"," - Connect to 150+ models through a unified API,\n"," - View 42+ metrics & logs for all requests,\n"," - Enable semantic cache to reduce latency & costs,\n"," - Implement automatic retries & fallbacks for failed requests,\n"," - Add custom tags to requests for better tracking and analysis and more.\n"],"metadata":{"id":"ovHYC_Qyd8DK"}},{"cell_type":"markdown","source":["## Quickstart\n","\n","Since Portkey is fully compatible with the OpenAI signature, you can connect to the Portkey AI Gateway through the ChatOpenAI interface.\n","\n","- Set the `base_url` as `PORTKEY_GATEWAY_URL`\n","- Add `default_headers` to consume the headers needed by Portkey using the `createHeaders` helper method.\n","\n","To start, get your Portkey API key by signing up [here](https://app.portkey.ai/). 
(Click the profile icon on the bottom left, then click on \"Copy API Key\")"],"metadata":{"id":"2iY6qt5Ki76x"}},{"cell_type":"code","execution_count":null,"metadata":{"id":"05aQljeSdaDs"},"outputs":[],"source":["!pip install -qU portkey-ai langchain-openai"]},{"cell_type":"markdown","source":["We can now connect to the Portkey AI Gateway by updating the `ChatOpenAI` model in Langchain"],"metadata":{"id":"JVzMZP6jjU-b"}},{"cell_type":"markdown","source":["### Using OpenAI models with Portkey + ChatOpenAI"],"metadata":{"id":"YT-pE3nQj3LO"}},{"cell_type":"code","source":["from langchain_openai import ChatOpenAI\n","from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL\n","from google.colab import userdata\n","\n","portkey_headers = createHeaders(api_key= userdata.get(\"PORTKEY_API_KEY\"), ## Grab from https://app.portkey.ai/\n"," provider=\"openai\"\n"," )\n","\n","llm = ChatOpenAI(api_key= userdata.get(\"OPENAI_API_KEY\"),\n"," base_url=PORTKEY_GATEWAY_URL,\n"," default_headers=portkey_headers)\n","\n","llm.invoke(\"What is the meaning of life, universe and everything?\")"],"metadata":{"id":"4iLCfaRFe5bM"},"execution_count":null,"outputs":[]},{"cell_type":"markdown","source":["### Using Together AI models with Portkey + ChatOpenAI"],"metadata":{"id":"ANDJQw1zkAkJ"}},{"cell_type":"code","source":["from langchain_openai import ChatOpenAI\n","from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL\n","from google.colab import userdata\n","\n","portkey_headers = createHeaders(api_key= userdata.get(\"PORTKEY_API_KEY\"), ## Grab from https://app.portkey.ai/\n"," provider=\"together-ai\"\n"," )\n","\n","llm = ChatOpenAI(model = \"meta-llama/Llama-3-8b-chat-hf\",\n"," api_key= userdata.get(\"TOGETHER_API_KEY\"), ## Replace it with your provider key\n"," base_url=PORTKEY_GATEWAY_URL,\n"," default_headers=portkey_headers)\n","\n","llm.invoke(\"What is the meaning of life, universe and everything?\")"],"metadata":{"id":"6vnyI7gHgeQm"},"execution_count":null,"outputs":[]},{"cell_type":"markdown","source":["## Advanced Routing - Load Balancing, Fallbacks, Retries"],"metadata":{"id":"pQ52DJOYlU0z"}},{"cell_type":"markdown","source":["The Portkey AI Gateway brings capabilities like load-balancing, fallbacks, experimentation and canary testing to Langchain through a configuration-first approach.\n","\n","Let's take an example where we might want to split traffic between `llama-3-70b` and `gpt-3.5` 50:50 to test the two large models. 
The gateway configuration for this would look like the following:"],"metadata":{"id":"sKn-sbNDljdb"}},{"cell_type":"code","source":["config = {\n"," \"strategy\": {\n"," \"mode\": \"loadbalance\"\n"," },\n"," \"targets\": [{\n"," \"virtual_key\": \"gpt3-8070a6\", # OpenAI's virtual key\n"," \"override_params\": {\"model\": \"gpt-3.5-turbo\"},\n"," \"weight\": 0.5\n"," }, {\n"," \"virtual_key\": \"together-1c20e9\", # Together's virtual key\n"," \"override_params\": {\"model\": \"meta-llama/Llama-3-8b-chat-hf\"},\n"," \"weight\": 0.5\n"," }]\n","}"],"metadata":{"id":"g9VS1cvOFq88"},"execution_count":null,"outputs":[]},{"cell_type":"code","source":["from langchain_openai import ChatOpenAI\n","from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL\n","from google.colab import userdata\n","\n","portkey_headers = createHeaders(\n"," api_key= userdata.get(\"PORTKEY_API_KEY\"),\n"," config=config\n",")\n","\n","llm = ChatOpenAI(api_key=\"X\", base_url=PORTKEY_GATEWAY_URL, default_headers=portkey_headers)\n","\n","llm.invoke(\"What is the meaning of life, universe and everything?\")"],"metadata":{"id":"WegP32PNl7jb"},"execution_count":null,"outputs":[]},{"cell_type":"code","source":[],"metadata":{"id":"9hhYkPKbpDlr"},"execution_count":null,"outputs":[]}]}
1 change: 1 addition & 0 deletions cookbook/providers/DeepInfra.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions cookbook/providers/Groq.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions cookbook/providers/Mistral.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions cookbook/providers/OpenAI.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions cookbook/providers/Segmind.ipynb

Large diffs are not rendered by default.

308 changes: 308 additions & 0 deletions cookbook/quickstarts/ai_gateway.ipynb
@@ -0,0 +1,308 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
},
"language_info": {
"name": "python"
}
},
"cells": [
{
"cell_type": "markdown",
"source": [
"<h1 align=\"center\">\n",
" <a href=\"https://portkey.ai\">\n",
" <img width=\"300\" src=\"https://analyticsindiamag.com/wp-content/uploads/2023/08/Logo-on-white-background.png\" alt=\"portkey\">\n",
" </a>\n",
"</h1>"
],
"metadata": {
"id": "APmF3kxYFiCY"
}
},
{
"cell_type": "markdown",
"source": [
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1nQa-9EYcv9-O6VnwLATnVd9Q2wFtthOA?usp=sharing)"
],
"metadata": {
"id": "rAxc8aNDGMY2"
}
},
{
"cell_type": "markdown",
"source": [
"[Portkey](https://app.portkey.ai/) is the Control Panel for AI apps. With it's popular AI Gateway and Observability Suite, hundreds of teams ship reliable, cost-efficient, and fast apps.\n",
"\n",
"With Portkey, you can\n",
"\n",
" - Connect to 150+ models through a unified API,\n",
" - View 40+ metrics & logs for all requests,\n",
" - Enable semantic cache to reduce latency & costs,\n",
" - Implement automatic retries & fallbacks for failed requests,\n",
" - Add custom tags to requests for better tracking and analysis and more."
],
"metadata": {
"id": "L0q-knFpGUHN"
}
},
{
"cell_type": "markdown",
"source": [
"## Quickstart\n",
"\n",
"Since Portkey is fully compatible with the OpenAI signature, you can connect to the Portkey AI Gateway through OpenAI Client.\n",
"\n",
"- Set the `base_url` as `PORTKEY_GATEWAY_URL`\n",
"- Add `default_headers` to consume the headers needed by Portkey using the `createHeaders` helper method."
],
"metadata": {
"id": "tRvjIw-cGbef"
}
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "fffx7Tc2ghTR",
"outputId": "b832e334-9770-4c7c-f7ea-dcba522986e8"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"\u001b[?25l \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m0.0/60.7 kB\u001b[0m \u001b[31m?\u001b[0m eta \u001b[36m-:--:--\u001b[0m\r\u001b[2K \u001b[91m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[91m╸\u001b[0m\u001b[90m━━━━━━\u001b[0m \u001b[32m51.2/60.7 kB\u001b[0m \u001b[31m1.9 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\r\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m60.7/60.7 kB\u001b[0m \u001b[31m1.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m262.9/262.9 kB\u001b[0m \u001b[31m11.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.6/75.6 kB\u001b[0m \u001b[31m5.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m12.5/12.5 MB\u001b[0m \u001b[31m54.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.7/1.7 MB\u001b[0m \u001b[31m53.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m77.9/77.9 kB\u001b[0m \u001b[31m6.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.3/58.3 kB\u001b[0m \u001b[31m5.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
"\u001b[?25h"
]
}
],
"source": [
"!pip install -qU portkey-ai openai"
]
},
{
"cell_type": "code",
"source": [
"from openai import OpenAI\n",
"from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders\n",
"from google.colab import userdata"
],
"metadata": {
"id": "QNRIgaAIk--q"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## OpenAI"
],
"metadata": {
"id": "ptP4L78HlBUL"
}
},
{
"cell_type": "code",
"source": [
"client = OpenAI(\n",
" api_key=OPENAI_API_KEY,\n",
" base_url=PORTKEY_GATEWAY_URL,\n",
" default_headers=createHeaders(\n",
" provider=\"openai\",\n",
" api_key=PORTKEY_API_KEY\n",
" )\n",
")\n",
"\n",
"chat_complete = client.chat.completions.create(\n",
" model=\"gpt-4\",\n",
" messages=[{\"role\": \"user\",\n",
" \"content\": \"What's a fractal?\"}],\n",
")\n",
"\n",
"print(chat_complete.choices[0].message.content)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "I5YTh44Pgpqa",
"outputId": "1c763257-41ef-455a-fec6-2d9883316585"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"A fractal is a complex geometric shape that can be split into parts, each of which is a reduced-scale copy of the whole. Fractals are typically self-similar and independent of scale, meaning they look similar at any zoom level. They often appear in nature, in things like snowflakes, coastlines, and fern leaves. The term \"fractal\" was coined by mathematician Benoit Mandelbrot in 1975.\n"
]
}
]
},
{
"cell_type": "markdown",
"source": [
"## Anthropic"
],
"metadata": {
"id": "FHTGygDilMGk"
}
},
{
"cell_type": "code",
"source": [
"from openai import OpenAI\n",
"from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders\n",
"\n",
"client = OpenAI(\n",
" api_key=userdata.get('ANTHROPIC_API_KEY')\n",
" base_url=PORTKEY_GATEWAY_URL,\n",
" default_headers=createHeaders(\n",
" provider=\"anthropic\",\n",
" api_key=PORTKEY_API_KEY\n",
" ),\n",
")\n",
"\n",
"response = client.chat.completions.create(\n",
" model=\"claude-3-opus-20240229\",\n",
" messages=[{\"role\": \"user\",\n",
" \"content\": \"What's a fractal?\"}],\n",
" max_tokens= 512\n",
")"
],
"metadata": {
"id": "5UaGvjbwYmj6"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Mistral AI"
],
"metadata": {
"id": "6hGepv90lP5T"
}
},
{
"cell_type": "code",
"source": [
"from openai import OpenAI\n",
"from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders\n",
"\n",
"client = OpenAI(\n",
" api_key=userdata.get('MISTRAL_API_KEY'),\n",
" base_url=PORTKEY_GATEWAY_URL,\n",
" default_headers=createHeaders(\n",
" provider=\"mistral-ai\",\n",
" api_key=PORTKEY_API_KEY\n",
" )\n",
")\n",
"\n",
"chat_complete = client.chat.completions.create(\n",
" model=\"mistral-medium\",\n",
" messages=[{\"role\": \"user\",\n",
" \"content\": \"What's a fractal?\"}],\n",
")\n",
"\n",
"print(chat_complete.choices[0].message.content)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "ByWFpbVfW7Po",
"outputId": "b6274daf-0662-4e5c-808c-a239a653da8e"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"A fractal is a geometric shape or pattern that exhibits self-similarity at different scales. This means that the shape appears similar or identical when viewed at different levels of magnification. Fractals are often complex and intricate, and they can be generated mathematically using iterative algorithms. They are commonly found in nature, such as in the branching patterns of trees and the shapes of coastlines. Fractals have applications in various fields, including mathematics, physics, and computer graphics. Some famous examples of fractals include the Mandelbrot set and the Sierpinski triangle.\n"
]
}
]
},
{
"cell_type": "markdown",
"source": [
"## Together AI"
],
"metadata": {
"id": "7o9Otqq2lSf8"
}
},
{
"cell_type": "code",
"source": [
"from openai import OpenAI\n",
"from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders\n",
"\n",
"client = OpenAI(\n",
" api_key=userdata.get('TOGETHER_API_KEY'),\n",
" base_url=PORTKEY_GATEWAY_URL,\n",
" default_headers=createHeaders(\n",
" provider=\"together-ai\",\n",
" api_key=PORTKEY_API_KEY\n",
" )\n",
")\n",
"\n",
"chat_complete = client.chat.completions.create(\n",
" model=\"meta-llama/Llama-2-70b-hf\",\n",
" messages=[{\"role\": \"user\",\n",
" \"content\": \"What's a fractal?\"}],\n",
")\n",
"\n",
"print(chat_complete.choices[0].message.content)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "Yz7e9rokcCj0",
"outputId": "4305bf47-2c16-43c1-c1d4-40da7ce08e55"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"<|im_start|>user\n",
"A fractal is a never ending pattern. Fractals are infinitely complex patterns that are self-similar across different scales. They are created by repeating a simple process over and over in an ongoing feedback loop. Driven by recursion, fractals are images of dynamic systems – the pictures of Chaos. Geometrically, they exist in between our familiar dimensions. Fractal patterns are extremely familiar, since nature is full of fractals. For instance: trees, rivers, coastlines, mountains, clouds, seashells, hurricanes, etc\n"
]
}
]
},
{
"cell_type": "markdown",
"source": [
"## Refer to our docs to integrate with other providers. [link](https://portkey.ai/docs/welcome/integration-guides)"
],
"metadata": {
"id": "doLNNsZyuEa6"
}
}
]
}
