simple-agent-py

A minimal Python sandbox for experimenting with terminal-based LLM applications.

The project focuses on small, readable scripts that demonstrate core patterns for chat and tool-calling agents without introducing a larger framework. It currently includes a chat client, a coding agent, and a terminal word game generated with the agent itself.

Features

  • Lightweight Python CLIs with straightforward, readable implementations
  • OpenRouter-backed chat and agent examples
  • Local tool execution for basic coding workflows
  • Session logging for agent runs in logs/
  • A small example artifact, wordy.py, created through the agent

Included Scripts

chat.py

A minimal multi-turn chat client for OpenRouter.

  • Sends conversation history with each request
  • Maintains in-memory session state for the current run
  • Uses only the Python standard library
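The multi-turn pattern can be sketched in a few lines of standard library code: keep the message history in a list, append each user turn, and send the full history with every request. The endpoint URL, model name, and payload shape below are assumptions based on OpenRouter's chat completions API, not chat.py's actual code:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(history, user_input, model="openai/gpt-4o-mini"):
    """Append the user's turn to the in-memory history and build the
    HTTP request carrying the whole conversation so far."""
    history.append({"role": "user", "content": user_input})
    payload = {"model": model, "messages": history}
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

# After urllib.request.urlopen(req) returns, the assistant's reply is
# appended to history as well, so the next turn carries full context.
```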

agent.py

A compact coding agent that extends the same chat loop with local tools.

Available tools:

  • bash
  • read_file
  • write_file
  • list_files
  • search_files
  • check_lint

Each session is written to logs/chat_YYYYMMDD_HHMMSS.log for later inspection.
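A common way to wire tools like these is a name-to-function dispatch table. The sketch below shows that pattern alongside the timestamped log path; the tool signatures here are illustrative, not agent.py's actual definitions:

```python
import os
import subprocess
from datetime import datetime
from pathlib import Path

# Hypothetical tool implementations; the real agent.py may differ.
TOOLS = {
    "bash": lambda cmd: subprocess.run(
        cmd, shell=True, capture_output=True, text=True
    ).stdout,
    "read_file": lambda path: Path(path).read_text(),
    "write_file": lambda path, content: Path(path).write_text(content),
    "list_files": lambda path=".": sorted(os.listdir(path)),
}

def dispatch(name, *args):
    """Route a model-requested tool call to its local implementation."""
    if name not in TOOLS:
        return f"unknown tool: {name}"
    return TOOLS[name](*args)

def session_log_path():
    """Build the logs/chat_YYYYMMDD_HHMMSS.log path used per session."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return Path("logs") / f"chat_{stamp}.log"
```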

wordy.py

A terminal word-guessing game inspired by Wordle, with ANSI color output and an on-screen keyboard.
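Wordle-style feedback is typically rendered with ANSI escape codes: green for a letter in the right position, yellow for a letter present elsewhere, gray otherwise. A minimal sketch of that scoring-and-coloring step (not wordy.py's actual implementation):

```python
GREEN, YELLOW, GRAY, RESET = "\033[42m", "\033[43m", "\033[100m", "\033[0m"

def score_guess(guess, answer):
    """Return per-letter marks: 'g' exact, 'y' elsewhere, '.' absent.
    Two passes so duplicate letters are not over-credited."""
    marks = ["."] * len(guess)
    remaining = list(answer)
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            marks[i] = "g"
            remaining.remove(g)
    for i, g in enumerate(guess):
        if marks[i] == "." and g in remaining:
            marks[i] = "y"
            remaining.remove(g)
    return "".join(marks)

def colorize(guess, marks):
    """Wrap each guessed letter in the ANSI color for its mark."""
    colors = {"g": GREEN, "y": YELLOW, ".": GRAY}
    return "".join(f"{colors[m]}{c}{RESET}" for c, m in zip(guess, marks))
```

The two-pass scoring matters for repeated letters: a second occurrence of a letter only earns yellow if the answer actually contains it more than once.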

Requirements

  • Python 3.14+
  • An OPENROUTER_API_KEY environment variable for chat.py and agent.py

Getting Started

Create a virtual environment and activate it (PowerShell shown; on macOS/Linux use source .venv/bin/activate):

python -m venv .venv
.venv\Scripts\Activate.ps1

Set your OpenRouter API key (PowerShell shown; on macOS/Linux use export OPENROUTER_API_KEY=your_key_here):

$env:OPENROUTER_API_KEY="your_key_here"

You can also place the key in a local .env file in the repository root:

OPENROUTER_API_KEY=your_key_here

Usage

Run the chat client:

python chat.py

Run the coding agent:

python agent.py

Run the word game:

python wordy.py

Project Structure

.
|-- agent.py
|-- chat.py
|-- logs/
|-- pyproject.toml
|-- README.md
`-- wordy.py

Implementation Notes

  • chat.py and agent.py load a local .env file automatically when present.
  • The project is intentionally standard library-first and keeps its examples compact.
  • Model names, prompts, and tool definitions are currently configured directly in the scripts.
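The automatic .env loading mentioned above takes only a few lines of standard library code; a minimal sketch (the scripts' actual loader may handle more cases):

```python
import os
from pathlib import Path

def load_dotenv(path=".env"):
    """Read KEY=VALUE lines from a local .env file into os.environ.
    Existing environment variables win; blank lines and '#' comments
    are skipped."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

Using setdefault means a key already exported in the shell is never overwritten by the file.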
