AlexJJAX/OpenAI_Responses_Conversation_API

🤖 OpenAI Responses & Conversations API (Python Examples)


A hands-on reference collection built with:

  • 🐍 Python 3.13+
  • 🤖 OpenAI Responses API (openai>=2.21.0)
  • 🌶 Flask (for stateful web app examples)
  • 🗄 SQLite + SQLAlchemy (for session and conversation persistence)
  • 📡 python-dotenv, requests, uuid-utils

This repository covers the full breadth of the OpenAI Responses API, from a single-line text completion to multi-user stateful conversation management with TTL-based session expiry. Numbered standalone scripts demonstrate individual API features, while the Conversation_API/ and weather app subdirectories provide progressively more complete backend applications.


📌 Overview

  • Responses_API/Basic_examples/ – 11 standalone scripts illustrating individual Responses API features (text, vision, file analysis, tools, streaming, function calling, structured output, response retrieval)
  • Conversation_API/ – four examples of the Conversations API, from a simple two-turn chat to a multi-user Flask backend with SQLite persistence and TTL-based session expiry
  • Responses_API/stateless_weather_app/ – a minimal stateless Flask backend + client that answers weather questions without preserving session history
  • Responses_API/weather_app_with_sessions/ – adds basic session support to the weather app via the Responses API
  • Responses_API/weather_app_advanced_sessions/ – full-featured session management with SQLAlchemy, conversation history storage, and a DB inspection utility

🧠 What Does "Conversation State" Mean Here?

The OpenAI Responses API natively supports a conversation parameter that links responses to a server-side conversation object. This means:

  • The model automatically receives prior context from earlier turns in the same conversation without the client needing to resend history.
  • A conversation is identified by a unique ID created with client.conversations.create().
  • From the client's perspective the exchange is stateless – the client only passes the conversation ID, never the message history.

The more advanced examples in this repo layer local SQLite storage on top of this to track which user owns each conversation and when it was last active.
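As a rough sketch of that layering (the table and column names here are illustrative; the actual apps define their schema with SQLAlchemy models), the local store only needs to map each conversation ID to its owner and its last-activity timestamp:

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative schema: all the local metadata boils down to
# "who owns this conversation" plus "when was it last used".
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE conversations (
        conversation_id TEXT PRIMARY KEY,  -- ID returned by OpenAI
        user_id         TEXT NOT NULL,     -- local owner of the conversation
        last_active_at  TEXT NOT NULL      -- ISO timestamp of the last turn
    )
""")

def record_turn(conversation_id: str, user_id: str) -> None:
    """Upsert the conversation row, refreshing its last-activity time."""
    now = datetime.now(timezone.utc).isoformat()
    db.execute(
        "INSERT INTO conversations VALUES (?, ?, ?) "
        "ON CONFLICT(conversation_id) DO UPDATE SET last_active_at = ?",
        (conversation_id, user_id, now, now),
    )

record_turn("conv_abc123", "alice")
owner = db.execute(
    "SELECT user_id FROM conversations WHERE conversation_id = ?",
    ("conv_abc123",),
).fetchone()[0]
print(owner)  # alice
```

The message content itself never needs to live in this table – OpenAI holds the conversation history server-side, and the local row exists only for ownership checks and TTL bookkeeping.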


✨ Features

  • Minimal text completions and multi-turn conversation via a conversation ID
  • Image analysis (remote URL and local upload)
  • Online file (PDF) analysis via a public URL
  • Local file upload and analysis (via Files API)
  • Built-in web search tool integration
  • Custom function calling with structured tool output
  • Remote MCP server integration via SSE
  • Server-sent event (SSE) streaming responses
  • Structured JSON output with a strict schema
  • Response creation and retrieval by ID
  • Multi-user Flask chat API with TTL-based conversation expiry
  • Stateful weather assistant with persistent SQLite session history

🗂 Project Structure

OpenAI_Responses_Conversation_API
├── .env                                   # API keys (not committed)
├── pyproject.toml                         # Project config and dependencies (uv)
├── uv.lock                                # Pinned dependency versions
├── README.md                              # This file
│
├── Conversation_API/                      # OpenAI Conversations API examples
│   ├── conversation_basic_example/        # Two-turn conversation
│   ├── conversation_multi_user_temp_memory_app/  # Flask API with TTL-based memory
│   ├── conversation_persistent_memory_app/       # Flask app with SQLite persistence
│   └── update_conversation_metadata/      # Metadata update/retrieve scripts
│
└── Responses_API/                         # Core Responses API examples
    ├── Basic_examples/                    # 11 standalone feature scripts
    │   ├── 1-basic_example.py             # Text completion
    │   ├── ...
    │   └── resources/                     # Sample PDFs for analysis
    │
    ├── stateless_weather_app/             # Simple stateless weather backend
    ├── weather_app_with_sessions/         # Session support via Responses API
    └── weather_app_advanced_sessions/     # Persistent weather session backend

βš™οΈ Setup and Requirements

Before running any script, make sure you have:

  • Python 3.13+
  • uv installed globally
  • An OpenAI API key with access to the Responses API
  • Internet access for API calls and any remote file/image examples

🔑 Environment Variables

Create a .env file in the project root and add:

OPENAI_API_KEY="your_openai_api_key"

Additional optional variable used by the multi-user app:

CONVERSATION_TTL_MINUTES=30   # Conversation expiry timeout (default: 30)
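
Reading this value reduces to a plain environment lookup with a fallback; a minimal sketch (the variable name comes from this repo, the helper function is illustrative):

```python
import os

def conversation_ttl_minutes() -> int:
    # Falls back to 30 minutes when CONVERSATION_TTL_MINUTES is unset.
    return int(os.getenv("CONVERSATION_TTL_MINUTES", "30"))

print(conversation_ttl_minutes())
```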

Note: Never commit .env to version control. It is already listed in .gitignore.


🚀 Running the Project

✅ 1. Install dependencies

uv sync

✅ 2. Create the environment file

cp .env.example .env
# Then fill in your OPENAI_API_KEY

✅ 3. Run a standalone example script

Each numbered script is self-contained:

uv run python Responses_API/Basic_examples/1-basic_example.py
uv run python Responses_API/Basic_examples/2-analyse_images.py
uv run python Responses_API/Basic_examples/5-model_with_web_search_tool.py
uv run python Responses_API/Basic_examples/8-with_server_sent_streaming.py
uv run python Responses_API/Basic_examples/10-structured_output.py

✅ 4. Run the Conversation API basic example

uv run python Conversation_API/conversation_basic_example/conversation_api_basic.py

✅ 5. Run the multi-user Flask chat app

uv run python Conversation_API/conversation_multi_user_temp_memory_app/app.py

Then send a request:

curl -X POST http://127.0.0.1:5000/chat \
  -H "Content-Type: application/json" \
  -d '{"user_id": "alice", "message": "Hello!"}'

✅ 6. Run the advanced weather app

uv run python Responses_API/weather_app_advanced_sessions/backend.py
uv run python Responses_API/weather_app_advanced_sessions/frontend.py

βš™οΈ How It Works

1️⃣ Standalone scripts (1–11)

Each numbered script directly calls client.responses.create(...) with a specific configuration. They require no server and run top to bottom. The key parameters that change across scripts are input (text, image, file), tools (web search, function, MCP), stream, and text.format (structured output).
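
Those differences can be summarised as alternative keyword arguments to one and the same call. The dictionaries below are an illustrative sketch, not the scripts' exact payloads (in particular the tool and schema dicts here are simplified examples):

```python
# Each dict holds the kwargs one family of scripts would pass to
# client.responses.create(...); only the highlighted keys change.
base = {"model": "gpt-5-nano"}

variants = {
    "text":       {**base, "input": "Say hello."},
    "web_search": {**base, "input": "Latest Python release?",
                   "tools": [{"type": "web_search"}]},
    "streaming":  {**base, "input": "Tell me a story.", "stream": True},
    "structured": {**base, "input": "Give me JSON.",
                   "text": {"format": {"type": "json_schema", "name": "demo",
                                       "schema": {"type": "object"}}}},
}

for name, kwargs in variants.items():
    extra = sorted(set(kwargs) - {"model", "input"})
    print(f"{name}: {extra or ['(input only)']}")
```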

2️⃣ Basic Conversation API example

conversation_api_basic.py first calls client.conversations.create() to obtain a persistent conversation ID, then sends multiple messages referencing that ID. The model remembers prior turns automatically via the server-side conversation object.

conversation = client.conversations.create()
response1 = client.responses.create(
    model="gpt-5-nano",
    conversation=conversation.id,
    input=[{"role": "user", "content": "Hello!"}]
)
# A second turn in the same conversation: the model sees the first
# exchange automatically; the client never resends the history.
response2 = client.responses.create(
    model="gpt-5-nano",
    conversation=conversation.id,
    input=[{"role": "user", "content": "What did I just say?"}]
)

3️⃣ Multi-user Flask chat API

conversation_multi_user_temp_memory_app/app.py wraps the Conversations API in a REST backend:

  • POST /chat – creates a new conversation on first call, then sends subsequent messages in the same conversation, storing metadata in SQLite
  • GET /conversations – lists conversations, filterable by user_id and status
  • POST /conversations/<id>/close – marks a conversation as closed
  • GET /responses/<conversation_id> – returns all stored response records for a conversation

Conversations that have not been active within CONVERSATION_TTL_MINUTES are automatically marked as expired on the next request.
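The lazy check amounts to comparing the stored timestamp against the TTL at request time. A minimal sketch, assuming timestamps are stored as timezone-aware datetimes (the function name is illustrative):

```python
from datetime import datetime, timedelta, timezone

def is_expired(last_active_at: datetime, ttl_minutes: int = 30) -> bool:
    """True when the conversation's last activity is older than the TTL."""
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=ttl_minutes)
    return last_active_at < cutoff

# A conversation idle for an hour exceeds the default 30-minute TTL.
stale = datetime.now(timezone.utc) - timedelta(hours=1)
print(is_expired(stale))  # True
```

Because the check runs per request, no background job is needed; an expired conversation simply gets rejected (and re-labelled) the next time someone touches it.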

4️⃣ Stateless vs. stateful weather apps

| App | State | Storage |
| --- | --- | --- |
| stateless_weather_app | None – each request is independent | – |
| weather_app_with_sessions | Conversation ID passed per session | OpenAI server |
| weather_app_advanced_sessions | Full history stored locally | SQLite (SQLAlchemy) |

🌐 API Endpoints (Multi-user Chat App)

POST /chat

Send a message for a user. Creates a new conversation on the first call.

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| user_id | string | ✅ | Unique user identifier |
| message | string | ✅ | Message to send |
| conversation_id | string | ❌ | Existing conversation ID to continue |

Example response:

{
  "conversation_id": "conv_abc123",
  "response_id": "resp_xyz789",
  "reply": "Hello! I can answer questions, write code, and more."
}

GET /conversations

List conversations. Accepts optional query params user_id and status.

POST /conversations/<conversation_id>/close

Marks the given conversation as closed.

GET /responses/<conversation_id>

Returns all stored response records for a given conversation sorted by creation time.


🔄 Conversation Lifecycle

POST /chat (no conversation_id)
   ↓
client.conversations.create()  →  new conversation_id stored in SQLite
   ↓
POST /chat (with conversation_id)
   ↓
Validate ownership + TTL check
   ├── Expired → 409 error
   └── Active  → client.responses.create(conversation=id)
                     ↓
               Response stored in SQLite
                     ↓
               last_active_at updated
                     ↓
               JSON reply returned to caller
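
In Python terms the lifecycle reduces to one branch per arrow. The sketch below runs offline against a stubbed client and an in-memory dict, so every name here (handle_chat, store, the stub methods) is illustrative rather than the app's actual code:

```python
from datetime import datetime, timedelta, timezone

TTL = timedelta(minutes=30)

def handle_chat(client, store, user_id, message, conversation_id=None):
    """Mirror of the lifecycle: create or validate, then respond."""
    now = datetime.now(timezone.utc)
    if conversation_id is None:
        # First call: mint a conversation and record its owner.
        conversation_id = client.conversations_create()
        store[conversation_id] = {"user_id": user_id, "last_active_at": now}
    else:
        meta = store.get(conversation_id)
        if meta is None or meta["user_id"] != user_id:
            return {"error": "not your conversation"}, 403
        if now - meta["last_active_at"] > TTL:
            return {"error": "conversation expired"}, 409
    reply = client.responses_create(conversation_id, message)
    store[conversation_id]["last_active_at"] = now
    return {"conversation_id": conversation_id, "reply": reply}, 200

# Stub standing in for the OpenAI client so the flow is runnable offline.
class StubClient:
    def conversations_create(self):
        return "conv_demo"
    def responses_create(self, conversation_id, message):
        return f"echo from {conversation_id}: {message}"

store = {}
body, status = handle_chat(StubClient(), store, "alice", "Hello!")
print(status, body["conversation_id"])  # 200 conv_demo
```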

πŸ— Architecture Diagram

Caller / Client script
        ↓
  HTTP Request (or direct Python call)
        ↓
  Flask Route (or standalone script)
        ↓
  client.responses.create(...)
        ↓
  OpenAI Responses API
  ├── Conversation state (server-side)
  ├── Tools: web_search / function / MCP / streaming
  └── Structured output / file analysis
        ↓
  Response text / JSON
        ↓
  (Advanced apps) SQLite via SQLAlchemy
        ↓
  JSON response returned to caller

πŸ” Notes

  • OPENAI_API_KEY must be in .env and is loaded via python-dotenv with override=True
  • Conversation IDs are created server-side by OpenAI; the local SQLite records are supplementary metadata
  • TTL expiry in the multi-user app is enforced lazily – a conversation is only marked expired when the next request arrives after the timeout window
  • Script 4 (upload_and_anslyse_local_file.py) requires a local PDF named Unit6Exercises.pdf to be present in the Responses_API/Basic_examples/resources/ directory
  • Script 7 (remote_sse_mcp.py) connects to a live remote MCP server and requires internet access
  • SQLite databases (app.db, conversations.db) are auto-generated on first run

🛠 Troubleshooting

| Issue | Likely Cause | Fix |
| --- | --- | --- |
| AuthenticationError | Missing or invalid API key | Check .env for a valid OPENAI_API_KEY |
| ModuleNotFoundError | venv not active or deps not installed | Run source .venv/bin/activate, then uv sync |
| 409 Conversation is expired | TTL window passed between messages | Start a new conversation or increase CONVERSATION_TTL_MINUTES |
| FileNotFoundError on script 4 | Unit6Exercises.pdf not present | Place the PDF in Responses_API/Basic_examples/resources/ |
| Flask port conflict | Port 5000 already in use | Change port=5000 in app.run(...) to a free port |

🎯 Project Goals

This project demonstrates:

  • The full surface area of the OpenAI Responses API in minimal, readable scripts
  • How the Conversations API removes client-side history management
  • Progressive architecture: stateless → session-aware → fully persisted
  • How to layer SQLite-backed metadata (ownership, TTL, history) on top of the OpenAI Conversations API
  • Function calling, tool use, streaming, vision, file analysis, and structured output patterns

📄 License

MIT License – feel free to use and modify.
