A hands-on reference collection built with:
- Python 3.13+
- OpenAI Responses API (`openai>=2.21.0`)
- Flask (for stateful web app examples)
- SQLite + SQLAlchemy (for session and conversation persistence)
- `python-dotenv`, `requests`, `uuid-utils`
This repository covers the full breadth of the OpenAI Responses API, from a single-line text completion to multi-user stateful conversation management with TTL-based session expiry. Numbered standalone scripts demonstrate individual API features, while the Conversation_API/ and weather app subdirectories provide progressively more complete backend applications.
- `OpenAI_Responses_Conversation_API/Basic_examples/`: 11 standalone scripts illustrating individual Responses API features (text, vision, file analysis, tools, streaming, function calling, structured output, response retrieval)
- `Conversation_API/`: four examples of the Conversations API, from simple two-turn chat to a multi-user Flask backend with SQLite persistence and TTL-based session expiry
- `OpenAI_Responses_Conversation_API/stateless_weather_app/`: a minimal stateless Flask backend + client that answers weather questions without preserving session history
- `OpenAI_Responses_Conversation_API/weather_app_with_sessions/`: adds basic session support to the weather app via the Responses API
- `OpenAI_Responses_Conversation_API/weather_app_advanced_sessions/`: full-featured session management with SQLAlchemy, conversation history storage, and a DB inspection utility
The OpenAI Responses API natively supports a `conversation` parameter that links responses to a server-side conversation object. This means:
- The model automatically receives prior context from earlier turns in the same conversation without the client needing to resend history.
- A conversation is identified by a unique ID created with `client.conversations.create()`.
- Conversations are stateless from the client's perspective: the client only needs to pass the conversation ID, not the message history.
The more advanced examples in this repo layer local SQLite storage on top of this to track which user owns each conversation and when it was last active.
- Minimal text completions and multi-turn conversation via a conversation ID
- Image analysis (remote URL and local upload)
- Online file (PDF) analysis via a public URL
- Local file upload and analysis (via Files API)
- Built-in web search tool integration
- Custom function calling with structured tool output
- Remote MCP server integration via SSE
- Server-sent event (SSE) streaming responses
- Structured JSON output with a strict schema
- Response creation and retrieval by ID
- Multi-user Flask chat API with TTL-based conversation expiry
- Stateful weather assistant with persistent SQLite session history
```text
OpenAI_Responses_Conversation_API
├── .env                                 # API keys (not committed)
├── pyproject.toml                       # Project config and dependencies (uv)
├── uv.lock                              # Fixed dependency versions
├── README.md                            # This file
│
├── Conversation_API/                    # OpenAI Conversations API examples
│   ├── conversation_basic_example/      # Two-turn conversation
│   ├── conversation_multi_user_temp_memory_app/  # Flask API with TTL-based memory
│   ├── conversation_persistent_memory_app/       # Flask app with SQLite persistence
│   └── update_conversation_metadata/    # Metadata update/retrieve scripts
│
└── Responses_API/                       # Core Responses API examples
    ├── Basic_examples/                  # 11 standalone feature scripts
    │   ├── 1-basic_example.py           # Text completion
    │   ├── ...
    │   └── resources/                   # Sample PDFs for analysis
    │
    ├── stateless_weather_app/           # Simple stateless weather backend
    ├── weather_app_with_sessions/       # Session support via Responses API
    └── weather_app_advanced_sessions/   # Persistent weather session backend
```
Before running any script, make sure you have:
- Python 3.13+
- `uv` installed globally
- An OpenAI API key with access to the Responses API
- Internet access for API calls and any remote file/image examples
Create a `.env` file in the project root and add:
```
OPENAI_API_KEY="your_openai_api_key"
```

Additional optional variable used by the multi-user app:

```
CONVERSATION_TTL_MINUTES=30   # Conversation expiry timeout (default: 30)
```

Note: Never commit `.env` to version control. It is already listed in `.gitignore`.
```bash
uv sync
cp .env.example .env
# Then fill in your OPENAI_API_KEY
```

Each numbered script is self-contained:
```bash
uv run python OpenAI_Responses_Conversation_API/Basic_examples/1-basic_example.py
uv run python OpenAI_Responses_Conversation_API/Basic_examples/2-analyse_images.py
uv run python OpenAI_Responses_Conversation_API/Basic_examples/5-model_with_web_search_tool.py
uv run python OpenAI_Responses_Conversation_API/Basic_examples/8-with_server_sent_streaming.py
uv run python OpenAI_Responses_Conversation_API/Basic_examples/10-structured_output.py
uv run python Conversation_API/conversation_basic_example/conversation_api_basic.py
uv run python Conversation_API/conversation_multi_user_temp_memory_app/app.py
```

Then send a request:
```bash
curl -X POST http://127.0.0.1:5000/chat \
  -H "Content-Type: application/json" \
  -d '{"user_id": "alice", "message": "Hello!"}'
```

To run the advanced weather app:

```bash
uv run python OpenAI_Responses_Conversation_API/weather_app_advanced_sessions/backend.py
uv run python OpenAI_Responses_Conversation_API/weather_app_advanced_sessions/frontend.py
```

Each numbered script directly calls `client.responses.create(...)` with a specific configuration. They require no server and run top-to-bottom. The key parameters that change across scripts are `input` (text, image, file), `tools` (web search, function, MCP), `stream`, and `text.format` (structured output).
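As a rough sketch of how those parameters vary, the following builds two illustrative request payloads. The tool type and schema shapes are assumptions based on the public Responses API documentation, not copied from the repo's scripts.

```python
def web_search_request(prompt: str) -> dict:
    """Payload for a call with the built-in web search tool (tool type is assumed)."""
    return {
        "model": "gpt-5-nano",   # model name as used elsewhere in this README
        "input": prompt,
        "tools": [{"type": "web_search"}],
    }

def structured_output_request(prompt: str) -> dict:
    """Payload for a strict-JSON-schema response (schema is a hypothetical example)."""
    return {
        "model": "gpt-5-nano",
        "input": prompt,
        "text": {
            "format": {
                "type": "json_schema",
                "name": "answer",
                "strict": True,
                "schema": {
                    "type": "object",
                    "properties": {"answer": {"type": "string"}},
                    "required": ["answer"],
                    "additionalProperties": False,
                },
            }
        },
    }

# Either dict would then be passed as client.responses.create(**payload).
```

Everything else in the call (authentication, response parsing) stays the same from script to script; only these payload fields move.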
`conversation_api_basic.py` first calls `client.conversations.create()` to obtain a persistent conversation ID, then sends multiple messages referencing that ID. The model remembers prior turns automatically via the server-side conversation object.
```python
conversation = client.conversations.create()

response1 = client.responses.create(
    model="gpt-5-nano",
    conversation=conversation.id,
    input=[{"role": "user", "content": "Hello!"}]
)
```

`conversation_multi_user_temp_memory_app/app.py` wraps the Conversations API in a REST backend:
- `POST /chat`: creates a new conversation on first call, then sends subsequent messages in the same conversation, storing metadata in SQLite
- `GET /conversations`: lists conversations filterable by `user_id` and `status`
- `POST /conversations/<id>/close`: marks a conversation as `closed`
- `GET /responses/<conversation_id>`: returns all stored response records for a conversation
Conversations that have not been active within `CONVERSATION_TTL_MINUTES` are automatically marked as `expired` on the next request.
| App | State | Storage |
|---|---|---|
| `stateless_weather_app` | None (each request is independent) | None |
| `weather_app_with_sessions` | Conversation ID passed per session | OpenAI server |
| `weather_app_advanced_sessions` | Full history stored locally | SQLite (SQLAlchemy) |
`POST /chat`: Send a message for a user. Creates a new conversation on the first call.
| Field | Type | Required | Description |
|---|---|---|---|
| `user_id` | string | Yes | Unique user identifier |
| `message` | string | Yes | Message to send |
| `conversation_id` | string | No | Existing conversation ID to continue |
Example response:
```json
{
  "conversation_id": "conv_abc123",
  "response_id": "resp_xyz789",
  "reply": "Hello! I can answer questions, write code, and more."
}
```

`GET /conversations`: List conversations. Accepts optional query params `user_id` and `status`.
`POST /conversations/<id>/close`: Closes a conversation and marks it `closed`.
`GET /responses/<conversation_id>`: Returns all stored response records for a given conversation, sorted by creation time.
```text
POST /chat (no conversation_id)
        ↓
client.conversations.create() → new conversation_id stored in SQLite
        ↓
POST /chat (with conversation_id)
        ↓
Validate ownership + TTL check
        ├── Expired → 409 error
        └── Active  → client.responses.create(conversation=id)
                ↓
        Response stored in SQLite
                ↓
        last_active_at updated
                ↓
        JSON reply returned to caller
```
```text
Caller / Client script
        ↓
HTTP Request (or direct Python call)
        ↓
Flask Route (or standalone script)
        ↓
client.responses.create(...)
        ↓
OpenAI Responses API
        ├── Conversation state (server-side)
        ├── Tools: web_search / function / MCP / streaming
        └── Structured output / file analysis
        ↓
Response text / JSON
        ↓
(Advanced apps) SQLite via SQLAlchemy
        ↓
JSON response returned to caller
```
- `OPENAI_API_KEY` must be in `.env` and is loaded via `python-dotenv` with `override=True`
- Conversation IDs are created server-side by OpenAI; the local SQLite records are supplementary metadata
- TTL expiry in the multi-user app is enforced lazily: a conversation is only marked expired when the next request arrives after the timeout window
- Script 4 (`upload_and_anslyse_local_file.py`) requires a local PDF named `Unit6Exercises.pdf` to be present in the `Responses_API/Basic_examples/resources/` directory
- Script 7 (`remote_sse_mcp.py`) connects to a live remote MCP server and requires internet access
- SQLite databases (`app.db`, `conversations.db`) are auto-generated on first run
| Issue | Likely Cause | Fix |
|---|---|---|
| `AuthenticationError` | Missing or invalid API key | Check `.env` for a valid `OPENAI_API_KEY` |
| `ModuleNotFoundError` | venv not active or deps not installed | Run `source .venv/bin/activate` then `uv sync` |
| `409 Conversation is expired` | TTL window passed between messages | Start a new conversation or increase `CONVERSATION_TTL_MINUTES` |
| `FileNotFoundError` on script 4 | `Unit6Exercises.pdf` not present | Place the PDF in `OpenAI_Responses_Conversation_API/Basic_examples/resources/` |
| Flask port conflict | Port 5000 already in use | Change `port=5000` in `app.run(...)` to a free port |
This project demonstrates:
- The full surface area of the OpenAI Responses API in minimal, readable scripts
- How the Conversations API removes client-side history management
- Progressive architecture: stateless → session-aware → fully persisted
- How to layer SQLite-backed metadata (ownership, TTL, history) on top of the OpenAI Conversations API
- Function calling, tool use, streaming, vision, file analysis, and structured output patterns
MIT License. Feel free to use and modify.