AeonCore

Core engine for Pegasus Loop — a prototype AI companion that remembers, adapts, and grows with you.

Bring your exported ChatGPT conversations into a tiny, local memory and explore them with a minimal AeonCore CLI.

What's in scope for the hackathon

  • Parse a conversations.json export from ChatGPT
  • Generate text embeddings with a local model (default) or OpenAI
  • Build a FAISS index for fast similarity search
  • Chat with a lightweight AeonCore REPL (optional)
  • Hack on top of a simple Typer-based CLI

What's out of scope

  • Legacy folders and experimental modules — see _CLEANUP_TODO.md
  • Cloud deployments, full LoopDesk apps, or production hardening
  • Proprietary models or services beyond what you configure yourself

Requirements

  • Python 3.10+
  • sentence-transformers and faiss for local embeddings (CPU or GPU)
  • Optional: openai package and API key for OpenAI embeddings
  • Optional: Ollama running locally for the chat REPL

Install

git clone https://example.com/aeoncore-mvp.git
cd aeoncore-mvp
python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\Activate.ps1
pip install -e .

⚠️ MVP Scope Notes

✅ Supported input: conversations.json export from ChatGPT (Go to Settings → Data Controls → Export Data, then unzip and point AeonCore MVP at the conversations.json file.)

❌ Not supported (for now): other JSON formats (Slack, Discord, Gmail, etc.). This is intentional — we focused the hackathon MVP on one clean path end-to-end.

💡 Why? Because hackathon time is short — we scoped narrowly so we could actually deliver something that works out of the box. Future formats can be added later.
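For orientation, a conversations.json export is typically a JSON array of conversation objects, each holding a "mapping" of message nodes. The snippet below is only an illustrative sketch of that shape (field names can differ between export versions); the real parsing lives in aeoncore/ingestion/chat_history_ingestor.py.

import json

with open("conversations.json", encoding="utf-8") as f:
    conversations = json.load(f)  # usually a list of conversation objects

for convo in conversations:
    title = convo.get("title") or "untitled"
    # Messages sit in a "mapping" of nodes keyed by node id.
    for node in convo.get("mapping", {}).values():
        message = node.get("message") or {}
        parts = (message.get("content") or {}).get("parts") or []
        text = " ".join(p for p in parts if isinstance(p, str)).strip()
        if text:
            role = (message.get("author") or {}).get("role", "unknown")
            print(f"[{title}] {role}: {text[:80]}")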

🧪 Tested on:

  • Fresh macOS (MacBook Air, Python 3.11 via Homebrew)
  • Fresh Linux (Ubuntu, Python 3.11)

Both verified with a clean git clone + pip install -e .

Quickstart

  1. Export your ChatGPT data
    In ChatGPT, go to Settings → Data Controls → Export Data, follow the email link to download the export archive, and unzip it to find conversations.json (see the OpenAI Help Center article "How do I export my ChatGPT history and data?").
  2. Ingest the export
    python aeoncore/ingestion/chat_history_ingestor.py path/to/conversations.json \
        --output dynamic_memory/chat_history_dump.jsonl
  3. Vectorize chats
    python scripts/build_faiss_index.py
  4. Chat with Aeon (optional; requires Ollama)
    aeon chat start
    Sample query: What did I talk about last week?
    Exit with :quit.
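Once steps 2 and 3 have produced dynamic_memory/chat_history_dump.jsonl and dynamic_memory/chat_history.faiss, you can also query the index by hand. The sketch below assumes the JSONL records and the index vectors share the same order and that queries are encoded with the same model that built the index; the model name is a placeholder, not the repository's default.

import json

import faiss
from sentence_transformers import SentenceTransformer

index = faiss.read_index("dynamic_memory/chat_history.faiss")
with open("dynamic_memory/chat_history_dump.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]

# Placeholder model: use whichever model built the index so the vectors match.
model = SentenceTransformer("all-MiniLM-L6-v2")
query = model.encode(["What did I talk about last week?"])
distances, ids = index.search(query, 5)  # top-5 nearest chat chunks

for rank, (dist, idx) in enumerate(zip(distances[0], ids[0]), start=1):
    print(rank, round(float(dist), 3), records[idx])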

Folder structure

aeoncore-mvp/
├── aeoncore/ingestion/              # ChatGPT export parser
├── scripts/                         # Utility scripts (FAISS builder)
├── src/aeon/cli/                    # Typer CLI entrypoints
├── src/aeon/core/loop_barometer/    # Embedding providers
└── dynamic_memory/                  # Created at runtime for vectors & memory

Configuration

The embedding CLI reads provider settings from environment variables:

export EMBED_PROVIDER=oss20b   # or openai
export EMBED_MODEL=oss-20b     # or text-embedding-3-small
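How these variables are consumed is up to the provider code under src/aeon/core/loop_barometer/. The sketch below only illustrates the kind of switch they imply; the function name and return type are illustrative, not the repository's actual API.

import os

def embed(texts: list[str]) -> list[list[float]]:
    # Illustrative only: pick a backend from the variables shown above.
    provider = os.getenv("EMBED_PROVIDER", "oss20b")
    model = os.getenv("EMBED_MODEL", "oss-20b")
    if provider == "openai":
        from openai import OpenAI  # optional dependency; needs an API key
        client = OpenAI()
        response = client.embeddings.create(model=model, input=texts)
        return [item.embedding for item in response.data]
    # Default path: a local sentence-transformers model (CPU or GPU).
    from sentence_transformers import SentenceTransformer
    return SentenceTransformer(model).encode(texts).tolist()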

Common commands

  • python aeoncore/ingestion/chat_history_ingestor.py <in> --output <out> – Convert ChatGPT export to JSONL
  • python scripts/build_faiss_index.py – Build dynamic_memory/chat_history.faiss
  • aeon hello – Sanity check for the CLI
  • aeon embed input.txt – Embed each line of input.txt
  • aeon chat start – Interactive REPL (needs Ollama)

Troubleshooting

  1. conversations.json not found – check the path and that the export was unzipped.
  2. ModuleNotFoundError: sentence_transformers – install sentence-transformers or set EMBED_PROVIDER=openai.
  3. faiss import errors – install faiss-cpu (or faiss-gpu) matching your platform.
  4. Long paths on Windows – run git config --system core.longpaths true.
  5. Proxy/SSL errors during pip install – ensure you have network access or configure proxy settings.

License & acknowledgments

License: see repository maintainers.
Built with Typer, sentence-transformers, and FAISS.
