Gemini-OpenAI-Proxy is a proxy that translates OpenAI API calls into the Google Gemini API protocol, so that software written against the OpenAI API can use Gemini models without any modification.
If you're interested in using Google Gemini but don't want to change your software, Gemini-OpenAI-Proxy is a great option. It lets you integrate the capabilities of Google Gemini without any extra development work.
Get an API key from https://makersuite.google.com/app/apikey
✅ Gemini Pro
```sh
curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, Who are you?"}],
    "temperature": 0.7
  }'
```
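The same request can also be made through the official OpenAI SDK by pointing its base URL at the proxy. The sketch below is illustrative only; it assumes the proxy is listening on http://localhost:8000 as in the curl example above and that your Gemini key is stored in a `GEMINI_API_KEY` environment variable.

```ts
// Minimal sketch: call the proxy through the openai npm package.
// Assumes the proxy runs at http://localhost:8000 and GEMINI_API_KEY is set.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,  // your Gemini API key, sent as the Bearer token
  baseURL: "http://localhost:8000/v1", // point the SDK at the proxy instead of api.openai.com
});

const completion = await client.chat.completions.create({
  model: "gpt-3.5-turbo", // mapped to a Gemini model by the proxy (see the table below)
  messages: [{ role: "user", content: "Hello, Who are you?" }],
  temperature: 0.7,
});

console.log(completion.choices[0].message.content);
```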
✅ Gemini Pro Vision
```sh
curl -s http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4-vision-preview",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What do you see in this picture?"
          },
          {
            "type": "image_url",
            "image_url": {
              "url": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAADAAAAAnAgMAAAA0vyM3AAAACVBMVEX/4WwCAgF3aTMcpbzGAAAAa0lEQVR4nGOgAWB1QOYEIHFEcXKmhCBxQqYgcSLEEGymAFEEhzFAFYmTwNoA53A6IDmB1YETidPAiLBVFGgEgrNqJYIzNTQU4Z5QZA6QNQ3hGpAZcNegceBOADFQOQlQDhfQyUwLkPxKVwAABbkRCcDA66QAAAAASUVORK5CYII="
            }
          }
        ]
      }
    ],
    "stream": false
  }'
```
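If you don't want to paste base64 by hand, the image part can be built programmatically. This is a minimal sketch, assuming a local `./picture.png` file and the same local proxy URL and `GEMINI_API_KEY` variable as above:

```ts
// Sketch: build a data: URL from a local PNG and send it as an image_url part.
// Assumes ./picture.png exists, the proxy runs at http://localhost:8000,
// and GEMINI_API_KEY holds your Gemini API key.
import { readFileSync } from "node:fs";

const imageBase64 = readFileSync("./picture.png").toString("base64");

const response = await fetch("http://localhost:8000/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.GEMINI_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4-vision-preview",
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "What do you see in this picture?" },
          { type: "image_url", image_url: { url: `data:image/png;base64,${imageBase64}` } },
        ],
      },
    ],
    stream: false,
  }),
});

console.log((await response.json()).choices[0].message.content);
```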
Supported API endpoints:

- `/v1/chat/completions`
  - stream (see the streaming sketch below)
  - complete
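Streaming responses work the same way as with the OpenAI API. Here is a minimal sketch using the openai SDK, under the same local-proxy and `GEMINI_API_KEY` assumptions as above:

```ts
// Sketch: stream a chat completion through the proxy with the openai SDK.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "http://localhost:8000/v1",
});

const stream = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Tell me a short story." }],
  stream: true, // request server-sent chunks instead of one complete response
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```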
| Request Model        | Target Gemini Model        |
| -------------------- | -------------------------- |
| gpt-3.5-turbo        | gemini-1.5-flash-8b-latest |
| gpt-4                | gemini-1.5-pro-latest      |
| gpt-4o               | gemini-1.5-flash-latest    |
| gpt-4o-mini          | gemini-1.5-flash-8b-latest |
| gpt-4-vision-preview | gemini-1.5-flash-latest    |
| gpt-4-turbo          | gemini-1.5-pro-latest      |
| gpt-4-turbo-preview  | gemini-2.0-flash-exp       |
| gemini*              | gemini*                    |
| ...(others)          | gemini-1.5-flash-latest    |
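The mapping behaves roughly as sketched below. This is only an illustration of the table above, not the proxy's actual source; the `MODEL_MAP` name, the helper function, and the fallback constant are assumptions.

```ts
// Illustrative sketch of the request-model -> Gemini-model mapping above.
const MODEL_MAP: Record<string, string> = {
  "gpt-3.5-turbo": "gemini-1.5-flash-8b-latest",
  "gpt-4": "gemini-1.5-pro-latest",
  "gpt-4o": "gemini-1.5-flash-latest",
  "gpt-4o-mini": "gemini-1.5-flash-8b-latest",
  "gpt-4-vision-preview": "gemini-1.5-flash-latest",
  "gpt-4-turbo": "gemini-1.5-pro-latest",
  "gpt-4-turbo-preview": "gemini-2.0-flash-exp",
};

function targetGeminiModel(requestModel: string): string {
  if (requestModel.startsWith("gemini")) return requestModel; // gemini* passes through unchanged
  return MODEL_MAP[requestModel] ?? "gemini-1.5-flash-latest"; // everything else falls back
}
```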
- Copy `main_cloudflare-workers.mjs` to Cloudflare Workers
- Copy `main_deno.mjs` to Deno Deploy
- Alternatively, deploy with the CLI: `vercel deploy`
- Serve locally: `vercel dev`
- Note that Vercel Functions limitations apply (with the Edge runtime)
Run locally with your preferred runtime:

- Deno: `deno task start:deno`
- Node: `node dist/main_node.mjs`
- Bun: `bun dist/main_bun.mjs`
```sh
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:deno
## or
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:bun
## or
docker run -d -p 8000:8000 ghcr.io/zuisong/gemini-openai-proxy:node
```