Memory-based long conversation handling (no compaction) #686

@kovtcharov

Description

Problem

Long conversations overflow the model's context window. The naive solution (context compaction/summarization) loses critical information: OpenClaw's biggest failure was losing safety instructions during compaction.

Approach

Use the memory system + RAG instead of compaction. Important context is offloaded to persistent storage and retrieved via RAG when needed. The memory system IS the solution to long conversations, not summarization/pruning.

Design:

  • As conversation grows, agent proactively saves important facts/decisions to ~/.gaia/memory/
  • When context approaches limit, oldest messages are dropped BUT their key content is already in memory
  • RAG retrieves relevant memory when the agent needs past context
  • No information is permanently lost — it moves from conversation to memory
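The flow in the bullets above can be sketched as a minimal prototype. Everything here is an assumption for illustration, not the actual implementation: the `ConversationMemory` class name, one JSON file per fact under the memory directory, a message-count budget standing in for a token budget, and keyword overlap standing in for embedding-based RAG retrieval.

```python
import json
import os
import tempfile

class ConversationMemory:
    """Hypothetical sketch: memory-backed context window, no compaction.

    Assumptions (not from the issue): a window budget counted in
    messages rather than tokens, one JSON file per saved fact, and
    keyword-overlap retrieval as a placeholder for real RAG.
    """

    def __init__(self, memory_dir, max_messages=4):
        self.memory_dir = memory_dir
        self.max_messages = max_messages
        self.messages = []
        os.makedirs(memory_dir, exist_ok=True)

    def add_message(self, text, important=False):
        # Proactively persist important facts before they can be dropped.
        if important:
            self._save_fact(text)
        self.messages.append(text)
        # When the window overflows, drop the oldest messages; their key
        # content already lives in memory, so nothing is permanently lost.
        while len(self.messages) > self.max_messages:
            self.messages.pop(0)

    def _save_fact(self, text):
        # Flat file-per-fact layout under the memory directory (assumed).
        name = f"fact_{len(os.listdir(self.memory_dir))}.json"
        with open(os.path.join(self.memory_dir, name), "w") as f:
            json.dump({"text": text}, f)

    def retrieve(self, query):
        # Keyword overlap as a stand-in for embedding similarity search.
        q = set(query.lower().split())
        hits = []
        for name in sorted(os.listdir(self.memory_dir)):
            with open(os.path.join(self.memory_dir, name)) as f:
                text = json.load(f)["text"]
            if q & set(text.lower().split()):
                hits.append(text)
        return hits

# Usage: a safety instruction survives being dropped from the window.
mem = ConversationMemory(tempfile.mkdtemp(), max_messages=2)
mem.add_message("Never delete files outside the workspace.", important=True)
mem.add_message("User prefers tabs over spaces.")
mem.add_message("Chatted about the weather.")  # oldest message drops here
recovered = mem.retrieve("delete files")
```

In a real version, `important=` would be the agent's own judgment call rather than a caller-supplied flag, and retrieval would happen transparently whenever the agent needs past context.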

Dependencies

Acceptance Criteria

  • Agent saves important context to memory as conversation grows
  • Past context retrievable via RAG after messages are dropped
  • No critical information lost during long conversations
  • Safety instructions and user preferences always preserved
  • Works transparently — user doesn't need to manually save context

Metadata

Assignees

No one assigned

    Labels

    agents (Agent system changes), enhancement (New feature or request), p0 (high priority)

    Type

    No type

    Projects

    No projects

    Relationships

    None yet

    Development

    No branches or pull requests