Merged
7 changes: 7 additions & 0 deletions docs/api/models/function.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,10 @@
# `pydantic_ai.models.function`

A model controlled by a local function.

[`FunctionModel`][pydantic_ai.models.function.FunctionModel] is similar to [`TestModel`][pydantic_ai.models.test.TestModel],
but allows greater control over the model's behavior.

Its primary use case is for more advanced unit testing than is possible with `TestModel`.

::: pydantic_ai.models.function
17 changes: 17 additions & 0 deletions docs/api/models/gemini.md
@@ -1,3 +1,20 @@
# `pydantic_ai.models.gemini`

Custom interface to the `generativelanguage.googleapis.com` API using
[HTTPX](https://www.python-httpx.org/) and [Pydantic](https://docs.pydantic.dev/latest/).

The Google SDK for interacting with the `generativelanguage.googleapis.com` API
[`google-generativeai`](https://ai.google.dev/gemini-api/docs/quickstart?lang=python) reads like it was written by a
Java developer who thought they knew everything about OOP, spent 30 minutes trying to learn Python,
gave up and decided to build the library to prove how horrible Python is. It also doesn't use httpx for HTTP requests,
and tries to implement tool calling itself, but doesn't use Pydantic or equivalent for validation.

We therefore implement support for the API directly.

Despite these shortcomings, the Gemini model is actually quite powerful and very fast.

## Setup

For details on how to set up authentication with this model, see [model configuration for Gemini](../../install.md#gemini).

::: pydantic_ai.models.gemini
4 changes: 4 additions & 0 deletions docs/api/models/groq.md
@@ -1,3 +1,7 @@
# `pydantic_ai.models.groq`

## Setup

For details on how to set up authentication with this model, see [model configuration for Groq](../../install.md#groq).

::: pydantic_ai.models.groq
4 changes: 4 additions & 0 deletions docs/api/models/openai.md
@@ -1,3 +1,7 @@
# `pydantic_ai.models.openai`

## Setup

For details on how to set up authentication with this model, see [model configuration for OpenAI](../../install.md#openai).

::: pydantic_ai.models.openai
49 changes: 49 additions & 0 deletions docs/api/models/vertexai.md
@@ -1,3 +1,52 @@
# `pydantic_ai.models.vertexai`

Custom interface to the `*-aiplatform.googleapis.com` API for Gemini models.

This model uses [`GeminiAgentModel`][pydantic_ai.models.gemini.GeminiAgentModel] with just the URL and auth method
changed from [`GeminiModel`][pydantic_ai.models.gemini.GeminiModel]; it relies on the VertexAI
[`generateContent`](https://cloud.google.com/vertex-ai/docs/reference/rest/v1/projects.locations.endpoints/generateContent)
and
[`streamGenerateContent`](https://cloud.google.com/vertex-ai/docs/reference/rest/v1/projects.locations.endpoints/streamGenerateContent)
endpoints having the same schemas as the equivalent [Gemini endpoints][pydantic_ai.models.gemini.GeminiModel].

There are four advantages to using this API over the `generativelanguage.googleapis.com` API, which
[`GeminiModel`][pydantic_ai.models.gemini.GeminiModel] uses, and one big disadvantage.

## Setup

For details on how to set up authentication with this model, as well as a comparison with the `generativelanguage.googleapis.com` API used by [`GeminiModel`][pydantic_ai.models.gemini.GeminiModel],
see [model configuration for Gemini via VertexAI](../../install.md#gemini-via-vertexai).

## Example Usage

With the default Google Cloud project already configured in your environment using "application default credentials":

```py title="vertex_example_env.py"
from pydantic_ai import Agent
from pydantic_ai.models.vertexai import VertexAIModel

model = VertexAIModel('gemini-1.5-flash')
agent = Agent(model)
result = agent.run_sync('Tell me a joke.')
print(result.data)
#> Did you hear about the toothpaste scandal? They called it Colgate.
```

Or using a service account JSON file:

```py title="vertex_example_service_account.py"
from pydantic_ai import Agent
from pydantic_ai.models.vertexai import VertexAIModel

model = VertexAIModel(
    'gemini-1.5-flash',
    service_account_file='path/to/service-account.json',
)
agent = Agent(model)
result = agent.run_sync('Tell me a joke.')
print(result.data)
#> Did you hear about the toothpaste scandal? They called it Colgate.
```

::: pydantic_ai.models.vertexai
7 changes: 2 additions & 5 deletions docs/examples/index.md
@@ -20,12 +20,9 @@ If you clone the repo, you should instead use `uv sync --extra examples` to inst

### Setting model environment variables

-All these examples will need you to set either:
+These examples will need you to set up authentication with one or more of the LLMs, see the [model configuration](../install.md#model-configuration) docs for details on how to do this.

-* `OPENAI_API_KEY` to use OpenAI models, go to [platform.openai.com](https://platform.openai.com/) and follow your nose until you find how to generate an API key
-* or, `GEMINI_API_KEY` to use Google Gemini models, go to [aistudio.google.com](https://aistudio.google.com/) and do the same to generate an API key
-
-Then set the API key as an environment variable with:
+TL;DR: in most cases you'll need to set one of the following environment variables:

=== "OpenAI"
