AI-Code-Geek/ollama-python-examples

Ollama Python Examples

Simple and easy-to-understand Python examples for working with Ollama.

Setup

  1. Create and activate a virtual environment:

    # Create the environment (any platform)
    python -m venv venv

    # Windows
    venv\Scripts\activate

    # Linux/Mac
    source venv/bin/activate
  2. Install dependencies:

    pip install -r requirements.txt
  3. Make sure Ollama is running and you have the llama3.2 model installed:

    ollama pull llama3.2

Examples

1. Chat Example (chat_example.py)

Basic chat functionality with Ollama.

python chat_example.py
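The core of a chat call with the `ollama` package looks roughly like this; a minimal sketch assuming dict-style responses, where the helper name `build_messages` and the prompt text are illustrative rather than taken from chat_example.py:

```python
def build_messages(prompt):
    # Ollama's chat API expects a list of {"role", "content"} dicts.
    return [{"role": "user", "content": prompt}]

def ask(prompt, model="llama3.2"):
    # Imported lazily so the sketch can be read without the package;
    # requires `pip install ollama` and a running Ollama server.
    import ollama
    response = ollama.chat(model=model, messages=build_messages(prompt))
    return response["message"]["content"]
```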

2. Generate Example (generate_example.py)

Simple text generation using a prompt.

python generate_example.py
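A one-shot generation call is even simpler; a sketch assuming `ollama.generate()` with dict-style responses (the function name `generate_once` is illustrative):

```python
def generate_once(prompt, model="llama3.2"):
    # generate() takes a bare prompt string with no message history,
    # which is the main difference from the chat API.
    # Requires `pip install ollama` and a running Ollama server.
    import ollama
    return ollama.generate(model=model, prompt=prompt)["response"]
```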

3. Stream Example (stream_example.py)

Streams the response chunk by chunk in real time.

python stream_example.py
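Streaming works by passing `stream=True`, which turns the response into an iterator of partial messages; a sketch under that assumption (the generator name `stream_reply` is illustrative):

```python
def stream_reply(prompt, model="llama3.2"):
    # With stream=True, chat() yields partial responses as they arrive
    # instead of returning one complete message.
    # Requires `pip install ollama` and a running Ollama server.
    import ollama
    for chunk in ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    ):
        yield chunk["message"]["content"]
```

Printing each piece with `print(piece, end="", flush=True)` gives the familiar token-by-token effect.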

4. Multi-turn Conversation (multi_turn_conversation.py)

Demonstrates maintaining context across multiple messages.

python multi_turn_conversation.py
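Context is kept simply by resending the full message history on every turn; a sketch of that pattern, assuming dict-style responses (the helper names are illustrative):

```python
def add_exchange(history, user_text, assistant_text):
    # Append one completed user/assistant round to the running history.
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})
    return history

def chat_turn(history, user_text, model="llama3.2"):
    # Send the whole history plus the new message, then record the reply
    # so the next turn sees it as context.
    # Requires `pip install ollama` and a running Ollama server.
    import ollama
    messages = history + [{"role": "user", "content": user_text}]
    reply = ollama.chat(model=model, messages=messages)["message"]["content"]
    add_exchange(history, user_text, reply)
    return reply
```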

5. List Models (list_models.py)

Lists all available Ollama models on your system.

python list_models.py
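The interesting part is unpacking the response from `ollama.list()`; a sketch assuming dict-style responses, hedging over the field name, which has varied between package versions (`name` in older releases, `model` in newer ones):

```python
def model_names(list_response):
    # ollama.list() returns {"models": [...]}; each entry carries the
    # model name under "name" or "model" depending on package version.
    return [m.get("name") or m.get("model")
            for m in list_response.get("models", [])]

def list_installed_models():
    # Requires `pip install ollama` and a running Ollama server.
    import ollama
    return model_names(ollama.list())
```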

6. Custom Parameters (custom_parameters.py)

Shows how to use custom parameters like temperature and top_p.

python custom_parameters.py
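Sampling parameters go through the `options` argument; a sketch with illustrative defaults (0.7 and 0.9 are assumptions, not values from custom_parameters.py):

```python
def build_options(temperature=0.7, top_p=0.9):
    # temperature controls randomness; top_p limits sampling to the
    # smallest token set whose cumulative probability exceeds top_p.
    return {"temperature": temperature, "top_p": top_p}

def generate_with_options(prompt, model="llama3.2", **sampling):
    # Requires `pip install ollama` and a running Ollama server.
    import ollama
    response = ollama.generate(
        model=model, prompt=prompt, options=build_options(**sampling)
    )
    return response["response"]
```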

Requirements

  • Python 3.7+
  • Ollama installed and running
  • ollama Python package (installed via requirements.txt)

Notes

  • All examples use the llama3.2 model by default
  • You can change the model name in each file to use different models
  • Make sure Ollama service is running before executing the examples
