[AI Bundle] Add memory provider configuration support #583
Conversation
Hmm, I know that it is injected into the prompt, but I still feel it should be a standalone config option, not nested below the prompt.
I thought the same, but it can only work with the SystemInputProcessor, and that is only registered once a prompt is defined 🤷♂️ Imagine you don't add a system prompt but do add memory: the memory then becomes your system prompt, which feels weird to me.
I think it works on every MessageBag with a system prompt, no matter whether it was pushed by the user or by the SystemPromptProcessor: ai/src/agent/src/Memory/MemoryInputProcessor.php, lines 72 to 83 in d8dfb30
I updated my comment; it's about the result.
Yeah, it feels a bit weird, but I was inspired by other libraries doing the same. The memory was always injected into the prompt message type, as the memory does not have its own message type. I have also briefly seen some internal OpenAI / Anthropic prompt collections where the memory was likewise injected into the prompt. Another approach I have seen was to have … I, for example, have replaced a statically configured fantasy calendar, which was "forced" tooling for each LLM call, with this static memory. Some other personal information is also injected here from different sources. I understand the tooling approach more as a dynamic memory, letting the LLM decide which information from the conversation should be stored into a memory that can then be injected again. An interesting overview for me of how complex prompts can be filled was https://github.com/asgeirtj/system_prompts_leaks/tree/main - there is a lot going on in those system prompts that goes beyond just the "System Prompt".
In my case I have a demo where I have a system prompt and add all activities and apartments of the client, roughly 15 apartments and 19 activities. I would love to be able to split the "data" from the prompt, so this feels handy.
No, don't get me wrong, I'm not questioning that this belongs in the system prompt when looking at the actual payload. I'm questioning whether, from the configuration point of view, it should go there:

    my_agent:
        platform: 'service_id'
        model: 'model_name'
        instruction:
            text: '...'
            ...
            memory: '...'
        tools: [...]

I think the memory is a feature of the agent, not a feature of the prompt - even though, from an implementation point of view, it is realized via the system prompt.
Fine, but validating that you can only use memory once a prompt is set - agreed?
The memory itself has a prompt - I think we should add this even if no system prompt is around - and this is actually what the …
This PR introduces comprehensive memory provider configuration capabilities for AI agents, enabling both simple static memory and advanced dynamic memory scenarios.
🚀 Key Features
Smart Detection Logic
The system automatically detects the intent:
- A plain string configures a StaticMemoryProvider with the string as static content
Memory as System Prompt
- When only memory is provided (no prompt), memory serves as the system prompt
- When both memory and prompt are provided, memory is prepended to the prompt
Configuration Examples
Static Memory (Most Common):
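The original example block did not survive extraction; the following is a rough sketch of what a static memory configuration could look like, based on the agent shape shown earlier in the conversation. The key names follow the wording of this description (prompt, memory), the placement of memory at the agent level and all concrete values are assumptions, not taken from the PR.

```yaml
# Hypothetical sketch (not the exact config from the PR):
# a plain string is detected as static content and wrapped
# in a StaticMemoryProvider.
my_agent:
    platform: 'service_id'
    model: 'model_name'
    prompt:
        text: 'You are a helpful booking assistant.'
    memory: 'The client offers 15 apartments and 19 activities.'
```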
Dynamic Memory (Advanced):
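Again, the original block is missing; this sketch only assumes that a dynamic provider is referenced as a service instead of a static string. The `service` key and the service id are purely illustrative and not confirmed by the PR.

```yaml
# Hypothetical sketch: referencing a custom memory provider
# service instead of a static string. The 'service' key and
# the service id are illustrative assumptions.
my_agent:
    platform: 'service_id'
    model: 'model_name'
    prompt:
        text: 'You are a helpful booking assistant.'
    memory:
        service: 'app.memory.apartment_catalog'
```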
Memory as System Prompt:
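Finally, a sketch of the memory-only case described above: no prompt is configured, so the memory content serves as the system prompt on its own. Key names and values again follow the earlier assumed examples.

```yaml
# Hypothetical sketch: memory without a prompt. Per the PR
# description, the memory content then acts as the system prompt.
my_agent:
    platform: 'service_id'
    model: 'model_name'
    memory: 'Always answer in formal German and keep responses concise.'
```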
This enhancement significantly improves the developer experience while maintaining full backward compatibility and adding powerful new memory capabilities for AI agents.
cc @DZunke