
Ulisse Brain Banner

🧠 Ulisse Brain

A persistent memory infrastructure for LLMs.
Installable. Customizable. Private.



Turn any LLM into an entity with continuous memory.
Not just a chatbot: a full cognitive architecture.


✨ What is Ulisse?

Ulisse is a system that lets a language model remember what you have told it, even across different sessions. It is an architecture that turns a generic LLM into an entity with continuous memory, capable of updating its own knowledge as it talks to you.

💻 Local Models

Ollama, LM Studio, or any
OpenAI-compatible server

🔑 Remote APIs

OpenAI, DeepSeek, Claude,
or any provider with an API key

🔮 Ulisse Memo v1

Fine-tuned cloud model
โš ๏ธ Temporarily suspended


๐Ÿ—๏ธ Architecture โ€” Three Memory Layers

Ulisse is built on a three-layer cognitive architecture. The user doesn't need to configure any of this; it works out of the box.

```mermaid
graph TD
    subgraph "💬 Layer 1 · Current Conversation"
        A[User Message] --> B[AI Response]
        B --> A
    end

    subgraph "🔍 Layer 2 · Short-Term Memory · STM"
        C[(ChromaDB\nVector Store)]
        D[Semantic Search\n& Retrieval]
        C --> D
    end

    subgraph "📚 Layer 3 · Long-Term Memory · LTM"
        E[Semantic Wiki\nMarkdown-based]
        F[AI-Managed\nKnowledge Graph]
        E --> F
    end

    A -->|RAG Query| D
    D -->|Relevant Context| B
    B -->|Synthesize & Store| E
    A -->|Direct Lookup| E

    style A fill:#0d1b2a,stroke:#00d4ff,color:#fff
    style B fill:#0d1b2a,stroke:#00d4ff,color:#fff
    style C fill:#1a0a2e,stroke:#a855f7,color:#fff
    style D fill:#1a0a2e,stroke:#a855f7,color:#fff
    style E fill:#1a1a00,stroke:#ffd700,color:#fff
    style F fill:#1a1a00,stroke:#ffd700,color:#fff
```
| Layer | Name | Technology | Purpose |
|---|---|---|---|
| 1️⃣ | Current Conversation | In-context window | Immediate chat context |
| 2️⃣ | STM (RAG) | ChromaDB vectors | Context-aware retrieval from past sessions & docs |
| 3️⃣ | LTM (Wiki) | Markdown semantic wiki | AI-managed persistent knowledge, projects & facts |

⚡ Capabilities & Toolkits

Ulisse is not just a chatbot; it is an agentic system equipped with a specialized sub-agent and native tools to interact with the real world.

๐Ÿ› ๏ธ Native Tools

Directly built into the core for maximum reliability

| Tool | Description |
|---|---|
| 📂 Workspace Reader | List files and folders in your project |
| 📄 File Reader | Open and read any file within the workspace |
| 📝 Wiki Manager | Read, write, and organize semantic long-term memory |

🤖 Agno Agent (Sub-Agent)

A powerful autonomous engine for complex tasks

| Tool | Description |
|---|---|
| 🌐 Web Search | Browse the internet for real-time information |
| 🐍 Python Runner | Execute Python scripts for complex logic |
| 💻 Shell Access | Execute terminal commands in the workspace |
| 📊 CSV Analysis | Query large datasets via DuckDB SQL |
| 🕸️ Browser | Advanced web scraping (optional) |

🚀 Installation

Prerequisites

  • Python 3.11+
  • An LLM API key OR Ollama/LM Studio running locally OR Ulisse Memo v1 (currently offline)

> [!NOTE]
> First-time setup: on the first launch, the system will automatically download `all-MiniLM-L6-v2` (~80 MB) for RAG embeddings, along with ChromaDB dependencies. These are cached locally in `./hf_cache` and `./vectordb`, so subsequent startups are faster.

Quick Start

```shell
# 1. Clone the repository
git clone https://github.com/FridaAlma/UlisseAI.git
cd UlisseAI

# 2. Install dependencies
pip install -r requirements.txt

# 3. Configure environment
cp .env.example .env
```

Edit `.env` with your settings:

```shell
DEEPSEEK_API_KEY=your_key_here
DEEPSEEK_BASE_URL=https://api.deepseek.com
CORPUS_PATH=./corpus
VAULT_PATH=./vault
VECTORDB_PATH=./vectordb
```

```shell
# 4. Launch
python webapp/backend/app.py
```

Then open http://localhost:5000 🎉


๐ŸŒ Choosing the AI Model

In the chat interface, next to the Send button you'll find a 🌐 network button that opens a provider selector:

| Option | Icon | Description |
|---|---|---|
| LLM Locale | 💻 | Connects to a local model (Ollama, LM Studio, etc.). Uses `DEEPSEEK_BASE_URL` / `DEEPSEEK_API_KEY` from `.env` |
| API Key | 🔑 | Enter any provider's Base URL, API Key, and model name directly from the UI. Saved in `localStorage` |
| Ulisse Memo v1 | 🔮 | Cloud fine-tuned model. Currently offline for maintenance |

Your choice is persisted in the browser across reloads.

โš™๏ธ Advanced: Changing the Model Manually

Option A: Local / Default model

Edit `.env`:

```shell
DEEPSEEK_API_KEY=your_key_here
DEEPSEEK_BASE_URL=https://api.deepseek.com   # or http://localhost:11434/v1 for Ollama
```

Then in `webapp/backend/app.py`, find and change:

```python
chat_model = "deepseek-chat"  # → "gpt-4o", "llama3", "qwen2.5:7b", etc.
```

Option B: Ulisse Memo v1 endpoint

> [!NOTE]
> Maintenance Notice: the Ulisse Memo v1 endpoint is currently suspended. Please use a Local Model or a custom API Key in the meantime.

Option C: Fully custom provider in code

In `webapp/backend/app.py`, locate the provider routing block (~line 408):

```python
# === Provider routing ===
provider = data.get("provider", "local")
```

You can add new branches here to support additional providers at the server level.
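A new branch can be sketched as a small lookup table. The `PROVIDERS` dict and `resolve_provider` name below are illustrative, not the actual routing code in `app.py`:

```python
# Hypothetical server-side provider table, mirroring the routing block in app.py.
PROVIDERS = {
    "local":    {"base_url": "http://localhost:11434/v1", "model": "llama3"},
    "deepseek": {"base_url": "https://api.deepseek.com", "model": "deepseek-chat"},
}

def resolve_provider(data: dict) -> dict:
    # Fall back to "local", exactly as data.get("provider", "local") does.
    name = data.get("provider", "local")
    try:
        return PROVIDERS[name]
    except KeyError:
        raise ValueError(f"unknown provider: {name}")

cfg = resolve_provider({"provider": "deepseek"})
```

Adding a provider then means adding one dictionary entry rather than another `if/elif` branch.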


🧬 System Prompt

Ulisse operates using a specialized system prompt located in `corpus/system_prompt.md`.

> [!TIP]
> Customizing the System Prompt: you can personalize Ulisse's behavior, but proceed with care:
>
> - ✅ Identity and Role: freely modify to redefine who Ulisse is
> - ⚠️ Personality and Tone: adjustable, but heavy changes may increase hallucinations. Adaptability and Irony are safe to customize
> - 🚫 Technical Instructions: do not modify. The memory architecture (STM/LTM) and Wiki management depend on these specific instructions

๐Ÿ› ๏ธ Tech Stack

| Component | Technology |
|---|---|
| Runtime | Python 3.11+ |
| Vector DB | ChromaDB |
| LLM Provider | DeepSeek / OpenAI-compatible |
| Long-Term Memory | Semantic Wiki (Markdown) |
| Knowledge Graph | Obsidian-compatible |
| Backend | Flask |
| Frontend | Vanilla JS (single-page application) |


📊 Project Status

██████████████████░░░░░  ~75%

Active Development: functional and in the testing phase.
Contributions, bug reports, and ideas are welcome!


📜 License

This project is licensed under the Apache License 2.0.


Built with 🧠 by FridaAlma
Report Bug · Request Feature
