Allow memory access #263
Conversation
Missing gpt model (#262)
```diff
 # Intialize Conversation with a LLM
 #
 # @param llm [Object] The LLM to use for the conversation
 # @param options [Hash] Options to pass to the LLM, like temperature, top_k, etc.
 # @return [Langchain::Conversation] The Langchain::Conversation instance
-def initialize(llm:, **options, &block)
+def initialize(llm:, memory_class: ConversationMemory, **options, &block)
```
I think this is supposed to be `memory_class: BaseConversationMemory`.
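For context, the class-passing design under review could be sketched like this; `ConversationMemory` here is a simplified stand-in for the real class, and its constructor signature is an assumption, not the library's actual API:

```ruby
# Simplified stand-in for Langchain's memory class (illustrative only)
class ConversationMemory
  attr_reader :llm, :messages

  def initialize(llm:)
    @llm = llm
    @messages = []
  end
end

class Conversation
  attr_reader :memory

  # The PR's approach: accept a memory *class* and instantiate it internally
  def initialize(llm:, memory_class: ConversationMemory, **options)
    @llm = llm
    @memory = memory_class.new(llm: llm)
  end
end

conversation = Conversation.new(llm: :fake_llm)
conversation.memory  # an instance of the default ConversationMemory
```

The default value of the keyword argument is what makes the naming question matter: whichever class is named there becomes the out-of-the-box behavior.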
```ruby
# This method can be overridden in subclasses
end

def end_of_conversation
```
Where would this method get called?
@lukasedw I like this interface! Left you a few comments.
```diff
@@ -51,7 +51,7 @@ module Langchain
   autoload :Loader, "langchain/loader"
   autoload :Data, "langchain/data"
   autoload :Conversation, "langchain/conversation"
-  autoload :ConversationMemory, "langchain/conversation_memory"
+  autoload :BaseConversationMemory, "langchain/base_conversation_memory"
```
Why do we need to change the class name? I think `MyConversationMemory < Langchain::ConversationMemory` actually makes sense.
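The subclassing the reviewer has in mind might look like this minimal sketch; only the inheritance relationship comes from the comment above, the base class body here is hypothetical:

```ruby
# Hypothetical minimal base class standing in for Langchain::ConversationMemory
class ConversationMemory
  attr_reader :messages

  def initialize
    @messages = []
  end

  def append_message(message)
    @messages << message
  end
end

# A user's custom memory subclasses the library class under its existing name
class MyConversationMemory < ConversationMemory
  def append_message(message)
    super
    # e.g. persist the message to external storage here
  end
end

memory = MyConversationMemory.new
memory.append_message({role: "user", content: "hi"})
```

Keeping the name `ConversationMemory` means existing subclasses and documentation keep working, which is the point of the question.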
```diff
 # Intialize Conversation with a LLM
 #
 # @param llm [Object] The LLM to use for the conversation
 # @param options [Hash] Options to pass to the LLM, like temperature, top_k, etc.
 # @return [Langchain::Conversation] The Langchain::Conversation instance
-def initialize(llm:, **options, &block)
+def initialize(llm:, memory_class: ConversationMemory, **options, &block)
```
I'd pass a memory object instead of a class, so that the usage would be:

```ruby
llm = Langchain::LLM::OpenAI.new(...)
memory = MyCoolConversationMemory.new(llm: llm)
conversation = Langchain::Conversation.new(memory: memory)
```
> `memory = MyCoolConversationMemory.new(llm: llm)`

What would this be needed for? I think a better approach is:

```ruby
llm = Langchain::LLM::OpenAI.new(...)
memory = DatabaseConversationMemory.new(url: "postgres://...")
conversation = Langchain::Conversation.new(llm: llm, memory: memory)
```
`llm` is used inside of `ConversationMemory` to calculate the token limit and also to summarize the history once the limit is reached. We can change that somehow, but that's not related to my comment. My comment is about passing an instance of memory to the conversation instead of just a class name.
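The instance-passing design being argued for could be sketched roughly like this; class names, keyword names, and the placeholder database URL are all illustrative, not the library's actual API:

```ruby
# A memory backend that the caller constructs and configures themselves
class DatabaseConversationMemory
  attr_reader :url, :messages

  def initialize(url:)
    @url = url
    @messages = []
  end

  def append_message(message)
    @messages << message  # a real backend would also write to the database
  end
end

class Conversation
  attr_reader :memory

  # Review suggestion: accept a ready-made memory *instance*,
  # so the conversation never needs to know how to construct one
  def initialize(llm:, memory:)
    @llm = llm
    @memory = memory
  end

  def append_user_message(content)
    @memory.append_message({role: "user", content: content})
  end
end

memory = DatabaseConversationMemory.new(url: "postgres://localhost/chat")
conversation = Conversation.new(llm: :fake_llm, memory: memory)
conversation.append_user_message("hello")
```

The benefit is that memory backends with wildly different constructor arguments (a URL here, an `llm` elsewhere) all plug in through the same `memory:` keyword.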
```ruby
  after_add_message(message, "user")
end

def after_add_message(message, role)
```
Let's not create empty methods in Langchain's classes. It's better to do:

```ruby
def append_ai_message(content)
  append_message({role: "ai", content: content})
end

def append_message(message)
  @messages << message
end
```

and then in the subclass you can override `append_message`:

```ruby
def append_message(message)
  super
  # do stuff here
end
```
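Run end to end, the override-via-`super` pattern suggested above behaves like this; the class names here are invented for illustration:

```ruby
class Memory
  attr_reader :messages

  def initialize
    @messages = []
  end

  def append_ai_message(content)
    append_message({role: "ai", content: content})
  end

  def append_message(message)
    @messages << message
  end
end

class LoggingMemory < Memory
  attr_reader :log

  def initialize
    super
    @log = []
  end

  # One override intercepts every message, regardless of role,
  # so the base class needs no empty after_* hook methods
  def append_message(message)
    super
    @log << message[:role]
  end
end

memory = LoggingMemory.new
memory.append_ai_message("Hello!")
memory.log  # => ["ai"]
```

Because every role-specific helper funnels through `append_message`, a single override catches all of them, which is the reviewer's argument against empty hooks.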
```diff
@@ -1,10 +1,7 @@
 # frozen_string_literal: true
```
Please do not randomly remove this comment.