
add function calling program #12980

Merged · 6 commits · Apr 20, 2024
60 changes: 60 additions & 0 deletions docs/docs/examples/llm/mistralai.ipynb
@@ -631,6 +631,66 @@
" print(f\"Name: {s.tool_name}, Input: {s.raw_input}, Output: {str(s)}\")"
]
},
{
"cell_type": "markdown",
"id": "96430944-27b6-45e1-8013-41460252ecb2",
"metadata": {},
"source": [
"## Structured Prediction\n",
"\n",
    "An important use case for function calling is extracting structured objects. LlamaIndex provides an intuitive interface for this through `structured_predict`: define the target Pydantic class (which can be nested), supply a prompt, and the desired object is extracted from the LLM's response."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "13e43ccb-1cbf-49e3-aa34-fdcbe00865bf",
"metadata": {},
"outputs": [],
"source": [
"from llama_index.llms.mistralai import MistralAI\n",
"from llama_index.core.prompts import PromptTemplate\n",
"from pydantic import BaseModel\n",
"\n",
"\n",
"class Restaurant(BaseModel):\n",
" \"\"\"A restaurant with name, city, and cuisine.\"\"\"\n",
"\n",
" name: str\n",
" city: str\n",
" cuisine: str\n",
"\n",
"\n",
"llm = MistralAI(model=\"mistral-large-latest\")\n",
"prompt_tmpl = PromptTemplate(\n",
" \"Generate a restaurant in a given city {city_name}\"\n",
")\n",
"restaurant_obj = llm.structured_predict(\n",
" Restaurant, prompt_tmpl, city_name=\"Miami\"\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "850f02e7-1bb1-4e5f-870d-6528a36d4d22",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"Restaurant(name='Mandolin Aegean Bistro', city='Miami', cuisine='Greek')"
]
},
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"restaurant_obj"
]
},
{
"cell_type": "markdown",
"id": "5152a2b4-78e6-47a5-933d-f5186ec0f775",
Expand Down
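The notebook cells above note that the target Pydantic class "can be nested" but only show a flat one. The sketch below illustrates what a nested schema looks like; the `MenuItem` class and its fields are hypothetical additions for illustration, and the actual LLM call (which needs a `MISTRAL_API_KEY`) is shown in comments, mirroring the notebook.

```python
# A sketch of nested structured prediction. The nested `MenuItem`
# class is hypothetical, added for illustration only.
from typing import List

from pydantic import BaseModel


class MenuItem(BaseModel):
    """A menu item in a restaurant."""

    course_name: str
    is_vegetarian: bool


class Restaurant(BaseModel):
    """A restaurant with name, city, cuisine, and menu items."""

    name: str
    city: str
    cuisine: str
    menu_items: List[MenuItem]


# With an API key set, the call mirrors the notebook above:
#
#   from llama_index.llms.mistralai import MistralAI
#   from llama_index.core.prompts import PromptTemplate
#
#   llm = MistralAI(model="mistral-large-latest")
#   prompt_tmpl = PromptTemplate(
#       "Generate a restaurant in a given city {city_name}"
#   )
#   restaurant_obj = llm.structured_predict(
#       Restaurant, prompt_tmpl, city_name="Miami"
#   )

# The extracted result validates like any Pydantic model,
# including the nested list of menu items:
example = Restaurant(
    name="Mandolin Aegean Bistro",
    city="Miami",
    cuisine="Greek",
    menu_items=[MenuItem(course_name="Moussaka", is_vegetarian=False)],
)
print(example.menu_items[0].course_name)
```

Because `structured_predict` validates the model output against the schema, nested fields arrive as typed objects rather than raw dicts.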
64 changes: 63 additions & 1 deletion docs/docs/examples/llm/openai.ipynb
@@ -322,7 +322,9 @@
"source": [
"## Function Calling\n",
"\n",
"OpenAI models have native support for function calling. This conveniently integrates with LlamaIndex tool abstractions, letting you plug in any arbitrary Python function to the LLM."
    "OpenAI models have native support for function calling. This conveniently integrates with LlamaIndex tool abstractions, letting you plug arbitrary Python functions into the LLM.\n",
    "\n",
    "In the example below, we define a function to generate a `Song` object."
]
},
{
@@ -411,6 +413,66 @@
" print(f\"Name: {s.tool_name}, Input: {s.raw_input}, Output: {str(s)}\")"
]
},
{
"cell_type": "markdown",
"id": "7ede8d94-524b-4a51-8150-552df952f1bf",
"metadata": {},
"source": [
"## Structured Prediction\n",
"\n",
    "An important use case for function calling is extracting structured objects. LlamaIndex provides an intuitive interface for this through `structured_predict`: define the target Pydantic class (which can be nested), supply a prompt, and the desired object is extracted from the LLM's response."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "04312089-bf9a-48d0-918f-ca1b3808439b",
"metadata": {},
"outputs": [],
"source": [
"from llama_index.llms.openai import OpenAI\n",
"from llama_index.core.prompts import PromptTemplate\n",
"from pydantic import BaseModel\n",
"\n",
"\n",
"class Restaurant(BaseModel):\n",
" \"\"\"A restaurant with name, city, and cuisine.\"\"\"\n",
"\n",
" name: str\n",
" city: str\n",
" cuisine: str\n",
"\n",
"\n",
"llm = OpenAI(model=\"gpt-3.5-turbo\")\n",
"prompt_tmpl = PromptTemplate(\n",
" \"Generate a restaurant in a given city {city_name}\"\n",
")\n",
"restaurant_obj = llm.structured_predict(\n",
" Restaurant, prompt_tmpl, city_name=\"San Francisco\"\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4b6fffcd-ff1e-4755-a851-1c6757a8075e",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"Restaurant(name='Tasty Bites', city='San Francisco', cuisine='Italian')"
]
},
"execution_count": null,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"restaurant_obj"
]
},
{
"cell_type": "markdown",
"id": "df5fa1ab-f598-46da-80f3-f6af5dd2fe83",
Expand Down
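The notebook shows the extracted `Restaurant` repr; a natural follow-up is serializing that object for storage or indexing. The sketch below assumes Pydantic v2 (`model_dump` / `model_dump_json`) and uses a stand-in for the `structured_predict` result, since the real call requires an OpenAI API key.

```python
# A sketch of using the structured_predict result downstream,
# assuming Pydantic v2. The object below stands in for the
# notebook's restaurant_obj.
from pydantic import BaseModel


class Restaurant(BaseModel):
    """A restaurant with name, city, and cuisine."""

    name: str
    city: str
    cuisine: str


# Stand-in for: restaurant_obj = llm.structured_predict(...)
restaurant_obj = Restaurant(
    name="Tasty Bites", city="San Francisco", cuisine="Italian"
)

# The result is a plain Pydantic object, so it serializes cleanly
# to a dict or a JSON string:
as_dict = restaurant_obj.model_dump()
as_json = restaurant_obj.model_dump_json()
print(as_dict)
```

This is why `structured_predict` is convenient for extraction pipelines: the output is immediately usable as typed, serializable data rather than free text that needs parsing.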