Ollama Chat Python


A Python command-line interface for chatting with locally running Ollama LLMs from your terminal.

Features

  • Chat with Ollama models from the terminal
  • Multiple model support
  • Streamed responses for real-time output
  • Conversation history
  • System prompts
  • Easy model switching
  • Python API for integration
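Conversation history in a chat client like this is typically kept as a list of role-tagged messages, the format Ollama's chat endpoint expects. A minimal sketch of that idea (illustrative only; `ChatHistory` is a hypothetical name, not this package's actual internals):

```python
# Sketch of role-tagged conversation history, the message format
# Ollama's chat API consumes. Hypothetical helper, for illustration.

class ChatHistory:
    def __init__(self, system=None):
        self.messages = []
        if system:
            self.messages.append({"role": "system", "content": system})

    def add_user(self, content):
        self.messages.append({"role": "user", "content": content})

    def add_assistant(self, content):
        self.messages.append({"role": "assistant", "content": content})


history = ChatHistory(system="You are a helpful coding assistant")
history.add_user("Hello, how are you?")
history.add_assistant("Doing well! How can I help?")
print(len(history.messages))  # system + user + assistant = 3
```

Each new turn appends to `messages`, and the full list is sent with every request so the model sees prior context.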

Installation

From PyPI

pip install ollama-chat-python

From Source

git clone https://github.com/mizoz/ollama-chat-python.git
cd ollama-chat-python
pip install -e .

Usage

Prerequisites

  • Python 3.8+
  • Ollama installed and running locally
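Ollama serves on http://localhost:11434 by default (start it with `ollama serve`, and fetch a model with `ollama pull llama2`). Before opening a session you can verify the server is reachable; a small standard-library-only sketch:

```python
# Check whether a local Ollama server is reachable.
# Ollama listens on http://localhost:11434 by default.
import urllib.error
import urllib.request


def ollama_is_up(url="http://localhost:11434", timeout=2.0):
    """Return True if an HTTP server answers at `url`, else False."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    if not ollama_is_up():
        print("Ollama is not running - start it with `ollama serve`")
```

The function swallows connection errors and returns a boolean, so it is safe to call unconditionally at startup.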

Command Line

# Start chat session
ollama-chat

# Chat with specific model
ollama-chat --model llama2

# With system prompt
ollama-chat --system "You are a helpful coding assistant"

Python API

from ollama_chat import OllamaChat

chat = OllamaChat(model="llama2")
response = chat.chat("Hello, how are you?")
print(response)

# With system prompt
chat = OllamaChat(
    model="llama2",
    system="You are a Python expert"
)
response = chat.chat("How do I use list comprehensions?")
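Under the hood, clients like this talk to Ollama's REST API: a POST to `/api/chat` on the local server with a model name and the message list. A sketch of building that request body (just the payload; actually sending it requires a running server):

```python
# Build the JSON body for Ollama's POST /api/chat endpoint.
# The endpoint and field names are from Ollama's REST API; this is a
# sketch, not this package's actual request code.
import json


def build_chat_request(model, messages, stream=False):
    """Return the JSON body for a POST to http://localhost:11434/api/chat."""
    return json.dumps({"model": model, "messages": messages, "stream": stream})


body = build_chat_request(
    "llama2",
    [{"role": "user", "content": "How do I use list comprehensions?"}],
)
print(body)
```

With `stream` set to true, Ollama returns the reply incrementally as newline-delimited JSON chunks, which is how real-time output in the terminal works.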

CLI Options

Option          Description
-m, --model     Specify the model to use
-s, --system    Set the system prompt
-h, --history   Enable conversation history


Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License.

Author

mizoz


⭐ If you find this project useful, please consider giving it a star on GitHub!
