Simple and easy-to-understand Python examples for working with Ollama.
Create and activate a virtual environment:

```bash
python -m venv venv

# Windows
venv\Scripts\activate

# Linux/Mac
source venv/bin/activate
```
Install dependencies:

```bash
pip install -r requirements.txt
```
Make sure Ollama is running and you have the `llama3.2` model installed:

```bash
ollama pull llama3.2
```
Basic chat functionality with Ollama:

```bash
python chat_example.py
```
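As a rough sketch of what a basic chat call looks like with the `ollama` Python package (the helper names, model, and prompt here are illustrative, not the actual contents of `chat_example.py`):

```python
def make_messages(prompt):
    """Build the single-turn message list that ollama.chat expects."""
    return [{"role": "user", "content": prompt}]

def run_chat(prompt="Why is the sky blue?"):
    """Send one chat turn to the local Ollama service and return the reply text."""
    import ollama  # pip install ollama; the Ollama service must be running
    response = ollama.chat(model="llama3.2", messages=make_messages(prompt))
    return response["message"]["content"]  # dict-style access, as in the package README
```

With the service running, `print(run_chat())` prints the model's reply.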
Simple text generation using a prompt:

```bash
python generate_example.py
```
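A minimal sketch of single-prompt generation via `ollama.generate`, which takes a bare prompt rather than a chat message list (the template and topic below are illustrative):

```python
def build_prompt(topic):
    """Tiny prompt template; purely illustrative."""
    return f"Write one sentence about {topic}."

def run_generate(topic="the moon"):
    """Generate text from a single prompt, without chat message structure."""
    import ollama  # requires the Ollama service to be running
    result = ollama.generate(model="llama3.2", prompt=build_prompt(topic))
    return result["response"]  # generate() returns the text under the 'response' key
```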
Streaming responses chunk by chunk in real time:

```bash
python stream_example.py
```
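Streaming works by passing `stream=True` to the chat call, which yields partial responses instead of one final object. A hedged sketch (helper names and prompt are illustrative):

```python
def join_chunks(chunks):
    """Concatenate the text of streamed chat chunks into one string."""
    return "".join(chunk["message"]["content"] for chunk in chunks)

def run_stream(prompt="Count to five."):
    """Print the reply as chunks arrive, then return the full text."""
    import ollama  # requires the Ollama service to be running
    stream = ollama.chat(
        model="llama3.2",
        messages=[{"role": "user", "content": prompt}],
        stream=True,  # yield partial responses instead of one final object
    )
    collected = []
    for chunk in stream:
        print(chunk["message"]["content"], end="", flush=True)
        collected.append(chunk)
    print()
    return join_chunks(collected)
```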
Demonstrates maintaining context across multiple messages:

```bash
python multi_turn_conversation.py
```
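Context is maintained simply by resending the growing message history with each call, including the assistant's own replies. A sketch of that pattern (function names and questions are illustrative):

```python
def add_turn(history, role, content):
    """Return a new history list with one more turn appended."""
    return history + [{"role": role, "content": content}]

def run_conversation(questions=("My name is Sam.", "What is my name?")):
    """Ask several questions in one conversation, carrying the history forward."""
    import ollama  # requires the Ollama service to be running
    history = []
    for question in questions:
        history = add_turn(history, "user", question)
        reply = ollama.chat(model="llama3.2", messages=history)["message"]["content"]
        history = add_turn(history, "assistant", reply)  # context for the next turn
        print(f"> {question}\n{reply}\n")
    return history
```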
Lists all available Ollama models on your system:

```bash
python list_models.py
```
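A sketch of listing installed models with `ollama.list()`; the exact response shape assumed here (a `models` list of entries keyed by `model`) varies between versions of the package, so treat the field names as assumptions:

```python
def model_names(listing):
    """Extract tag names from an ollama.list() response.
    Assumes entries keyed by 'model'; older package versions used 'name'."""
    return [m["model"] for m in listing["models"]]

def run_list():
    """Print every model tag known to the local Ollama service."""
    import ollama  # requires the Ollama service to be running
    for name in model_names(ollama.list()):
        print(name)
```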
Shows how to use custom parameters like `temperature` and `top_p`:

```bash
python custom_parameters.py
```
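Sampling parameters are passed through the `options` argument of the chat call. A sketch, with illustrative values (not the defaults used by `custom_parameters.py`):

```python
def sampling_options(temperature=0.7, top_p=0.9):
    """Options forwarded to the model; the values here are illustrative."""
    return {"temperature": temperature, "top_p": top_p}

def run_custom(prompt="Invent a new word."):
    """One chat call with explicit sampling parameters."""
    import ollama  # requires the Ollama service to be running
    response = ollama.chat(
        model="llama3.2",
        messages=[{"role": "user", "content": prompt}],
        options=sampling_options(temperature=1.2),  # higher temperature -> more varied output
    )
    return response["message"]["content"]
```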
- Python 3.7+
- Ollama installed and running
- `ollama` Python package (installed via `requirements.txt`)
- All examples use the `llama3.2` model by default
- You can change the model name in each file to use different models
- Make sure the Ollama service is running before executing the examples