AHIVE Memory Encoding Protocol - Plug-and-play memory management for AI agents.
AMEP is a plug-and-play memory management protocol library for AI agents, providing memory storage, semantic retrieval, and context management capabilities.
| Feature | Description |
|---|---|
| Water Meter Mode | Full message-flow management: record → retrieve → respond |
| Semantic Retrieval | BGE vector search with time-range filtering |
| Memory Persistence | Session recovery, automatic history loading |
| Memory Extraction | LLM-based key-information extraction |
| Multi-Backend | Local GGUF, Ollama, OpenAI-compatible APIs |
| Feature | AMEP | LangChain Memory | Mem0 |
|---|---|---|---|
| Zero-config startup | ✅ | ❌ | ❌ |
| Local model support | ✅ Built-in | ❌ | ❌ |
| Water Meter Mode | ✅ Fully managed | ⚠️ Manual | – |
| Chinese optimized | ✅ BGE-small-zh | – | – |
| Memory extraction | ✅ LLM automatic | ❌ | ✅ |
| Package size | ~1MB | ~50MB | ~10MB |
```
┌──────────────────────────────────────────────────────────┐
│                        Your Agent                        │
└─────────────────────────────┬────────────────────────────┘
                              │
                              ▼
┌──────────────────────────────────────────────────────────┐
│                  AMEP Water Meter Mode                   │
│  ┌────────┐   ┌────────┐   ┌──────────┐   ┌──────────┐  │
│  │ Record │──▶│ Decide │──▶│ Retrieve │──▶│ Call LLM │  │
│  └────────┘   └────────┘   └──────────┘   └──────────┘  │
└─────────────────────────────┬────────────────────────────┘
                              │
            ┌─────────────────┼────────────────┐
            ▼                 ▼                ▼
      ┌───────────┐    ┌─────────────┐   ┌──────────┐
      │ BGE Embed │    │ Faiss Index │   │ MD Store │
      └───────────┘    └─────────────┘   └──────────┘
```
```
    User Message
         │
         ▼
┌─────────────────┐
│ 1. Record User  │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ 2. Retrieval    │ ← LLM decides if retrieval is needed
│    Decision     │
└────────┬────────┘
         │ Need?
      ┌──┴────┐
      │ Yes   │ No
      ▼       │
┌───────────┐ │
│ 3. Search │ │
└─────┬─────┘ │
      │       │
      ▼       │
┌───────────┐ │
│ 4. Context│ │
└─────┬─────┘ │
      │       │
      └───┬───┘
          ▼
 ┌─────────────────┐
 │ 5. Call LLM     │
 └────────┬────────┘
          │
          ▼
 ┌─────────────────┐
 │ 6. Record Reply │
 └────────┬────────┘
          │
          ▼
   Return Response
```
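The six steps above can be sketched as a short, self-contained TypeScript loop. This is an illustrative reconstruction of the flow only, not AMEP's actual implementation: `decideRetrieval`, `search`, and the in-memory stores are hypothetical stand-ins for the real components.

```typescript
// Illustrative sketch of the Water Meter Mode flow (steps 1-6).
// All names here are hypothetical; AMEP's real internals may differ.
type Message = { role: 'user' | 'assistant'; content: string };
type LLM = (messages: Message[]) => Promise<string>;

const history: Message[] = [];    // message log (steps 1 and 6)
const memoryStore: string[] = []; // long-term memories

// Step 2: trivial stand-in for the LLM-based retrieval decision.
function decideRetrieval(message: string): boolean {
  return /yesterday|before|earlier|remember/i.test(message);
}

// Step 3: naive keyword search standing in for BGE vector search.
function search(query: string): string[] {
  const words = query.toLowerCase().split(/\W+/);
  return memoryStore.filter((m) =>
    words.some((w) => w.length > 3 && m.toLowerCase().includes(w)),
  );
}

async function processMessage(message: string, llm: LLM): Promise<string> {
  history.push({ role: 'user', content: message });    // 1. record user message
  let context = '';
  if (decideRetrieval(message)) {                      // 2. retrieval decision
    const hits = search(message);                      // 3. search memories
    context = hits.join('\n');                         // 4. build context
  }
  const prompt: Message[] = context
    ? [{ role: 'user', content: `Context:\n${context}\n\n${message}` }]
    : [{ role: 'user', content: message }];
  const reply = await llm(prompt);                     // 5. call LLM
  history.push({ role: 'assistant', content: reply }); // 6. record reply
  return reply;                                        // return response
}
```

With a stub LLM, a query containing "yesterday" triggers retrieval and the matching memory is injected into the prompt; other messages skip straight to step 5.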
| Metric | Value | Description |
|---|---|---|
| Embedding speed | ~10ms | BGE-small-zh single query |
| Retrieval latency | <50ms | Faiss HNSW index |
| Memory compression | 85-90% | After LLM extraction |
| Memory usage | ~100MB | Including embedding model |
| Startup time | <2s | Cold start |
```bash
# Basic installation (with mock embedding)
npm install amep-protocol

# Full installation (with BGE embedding model)
npm install amep-protocol @huggingface/transformers
```

```typescript
import { createAMEP } from 'amep-protocol';

const amep = createAMEP({
  storage: { basePath: './data/amep' },
});
await amep.initialize();

// Store memory
await amep.createMemory({
  userId: 'user-001',
  content: 'User prefers dark mode',
});

// Search memory
const memories = await amep.search({
  query: 'user preferences',
  userId: 'user-001',
});
```

```typescript
import type { LLMService } from 'amep-protocol';

// 1. Implement the LLMService interface
const llmService: LLMService = {
  chat: async (options) => {
    const response = await yourLLM.chat(options.messages);
    return { content: response.content };
  },
};

// 2. Use Water Meter Mode
const result = await amep.processMessage({
  message: 'What did we discuss yesterday?',
  userId: 'user-001',
  systemPrompt: 'You are a helpful assistant.',
  llmService,
});
console.log(result.content);            // LLM response
console.log(result.retrievalTriggered); // Whether retrieval was triggered
```

Full Usage Guide: USAGE.md
```typescript
const amep = createAMEP({
  storage: {
    basePath: './data/amep',   // Storage path
    retentionDays: 90,         // Memory retention (days)
  },
  embedding: {
    modelType: 'bge-small-zh', // Embedding model
  },
  retrieval: {
    maxResults: 5,             // Max retrieval results
  },
  session: {
    maxContextMessages: 20,    // Context message count
  },
});
```

| Method | Description |
|---|---|
| `initialize()` | Initialize the service |
| `createMemory(options)` | Store a memory |
| `search(options)` | Search memories |
| `processMessage(options)` | Water Meter Mode: process a message |
| `getStartupContext(options)` | Get startup context (restore history) |
| `endSession(sessionId)` | End a session (triggers memory extraction) |
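To make the expected call order concrete, here is a self-contained mock that mirrors the method names in the table above. It only records which methods run in which order; it is a stand-in for illustration, not `amep-protocol` itself.

```typescript
// Minimal mock mirroring the documented method names, used only to show
// the typical lifecycle order. This is NOT amep-protocol.
const calls: string[] = [];

const amepMock = {
  initialize: async () => { calls.push('initialize'); },
  getStartupContext: async (_opts: { userId: string }) => {
    calls.push('getStartupContext');
    return { messages: [] as string[] };
  },
  processMessage: async (_opts: { message: string; userId: string }) => {
    calls.push('processMessage');
    return { content: 'ok', retrievalTriggered: false };
  },
  endSession: async (_sessionId: string) => { calls.push('endSession'); },
};

async function sessionLifecycle(): Promise<string[]> {
  await amepMock.initialize();                              // once at startup
  await amepMock.getStartupContext({ userId: 'user-001' }); // restore history
  await amepMock.processMessage({                           // per message
    message: 'hi',
    userId: 'user-001',
  });
  await amepMock.endSession('session-001');                 // trigger extraction
  return calls;
}
```

Against the real library, the same sequence applies: initialize once, restore context on startup, route each message through `processMessage`, and call `endSession` when the conversation ends.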
Water Meter Mode means all messages flow through AMEP, just like water flows through a meter to be measured and recorded. AMEP automatically handles recording, retrieval, and LLM calls.
- bge-small-zh: Chinese optimized, 512 dimensions, ~100MB (recommended)
- bge-m3: Multilingual, 1024 dimensions
- mock: For testing, zero dependencies
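Assuming the `embedding.modelType` option shown in the configuration section, switching models is a one-key change. These option objects can be passed to `createAMEP`:

```typescript
// Pick one modelType in the embedding config (names from the list above):
const chineseOptimized = { embedding: { modelType: 'bge-small-zh' } }; // 512-dim, ~100MB
const multilingual     = { embedding: { modelType: 'bge-m3' } };       // 1024-dim
const zeroDependency   = { embedding: { modelType: 'mock' } };         // tests / CI
```

The `mock` type is handy in CI, where downloading a ~100MB model per run is wasteful.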
```typescript
// Adapt any local model by implementing the same LLMService interface
const llmService: LLMService = {
  chat: async (options) => {
    const response = await localModel.chat(options.messages);
    return { content: response.content };
  },
};
```

By default, AMEP uses Markdown file storage with:
- User/agent isolation
- Automatic archiving
- 90-day retention
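Because `LLMService` is just a `chat` function, the same shape can front an Ollama server, one of the supported backends. This is a hedged sketch: it assumes Ollama's standard `/api/chat` endpoint on the default port 11434, the model name is a placeholder, and `buildOllamaRequest` and `LLMServiceLike` are helpers introduced here, not part of AMEP.

```typescript
type ChatMessage = { role: string; content: string };

// Mirrors the documented LLMService shape (defined locally for this sketch).
type LLMServiceLike = {
  chat(options: { messages: ChatMessage[] }): Promise<{ content: string }>;
};

// Pure helper: build an Ollama /api/chat request body.
// The model name is a placeholder; use whatever model you have pulled.
function buildOllamaRequest(messages: ChatMessage[], model = 'qwen2.5:7b') {
  return { model, messages, stream: false };
}

// Adapter: forward chat calls to a local Ollama server.
const ollamaService: LLMServiceLike = {
  chat: async (options) => {
    const res = await fetch('http://localhost:11434/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(buildOllamaRequest(options.messages)),
    });
    const data = await res.json();
    return { content: data.message.content };
  },
};
```

Setting `stream: false` keeps the adapter simple: Ollama returns one JSON object instead of a stream of chunks.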
Yes. Use `userId` and `agentId` to distinguish different users and agents; memories are completely isolated between them.
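The isolation guarantee can be pictured as partitioning storage by a composite user/agent key. The sketch below illustrates that principle with a plain `Map`; it mirrors the behavior described above, not AMEP's actual storage code.

```typescript
// Illustrative per-user/per-agent isolation via composite keys.
// Not AMEP's real storage layer; just the principle.
const store = new Map<string, string[]>();

function keyFor(userId: string, agentId = 'default'): string {
  return `${userId}::${agentId}`;
}

function remember(userId: string, content: string, agentId?: string): void {
  const key = keyFor(userId, agentId);
  const list = store.get(key) ?? [];
  list.push(content);
  store.set(key, list);
}

function recall(userId: string, agentId?: string): string[] {
  // Only this user/agent's partition is ever consulted.
  return store.get(keyFor(userId, agentId)) ?? [];
}
```

One user (or one agent serving that user) can never read another partition, because every read and write goes through the composite key.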
MIT License © 2026 StarFuture Software Studio (AHIVE.CN)
Positioning: the "Windows DLL" of AI agent memory management.