
Create an internal reasoning / search_memory tool #5

Open
syntex01 opened this issue Mar 23, 2023 · 12 comments

Comments

@syntex01
Contributor

I think the agent should speak to itself and reason about what topics would be helpful for it to know right now. We can then provide the closest match for each topic. In this internal chat, the bot could later also access various functions, like searching the web.
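
A minimal sketch of that inner step (the REASONING_PROMPT wording and the ask_model helper are hypothetical, not part of this repo): a hidden prompt asks the model which topics it would like to recall before answering, and we parse the list it returns.

REASONING_PROMPT = (
    "Before answering the user, list up to three topics from past "
    "conversations that would help you answer, one per line. "
    "If none would help, reply with NONE."
)

def plan_recall(user_message: str, ask_model) -> list[str]:
    # ask_model is any callable that sends a messages list to the chat model
    # and returns the assistant's reply as a string.
    reply = ask_model([
        {"role": "system", "content": REASONING_PROMPT},
        {"role": "user", "content": user_message},
    ])
    topics = [line.strip("- ").strip() for line in reply.splitlines()]
    return [t for t in topics if t and t.upper() != "NONE"]

Each returned topic could then be fed into a closest-match lookup, along the lines of the search_memory sketch later in this thread.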

@syntex01 changed the title from "We have to create an internal reasoning" to "Create an internal reasoning" on Mar 23, 2023
@kyb3r
Owner

kyb3r commented Mar 23, 2023

I read the Microsoft research paper on "Sparks of AGI": https://arxiv.org/abs/2303.12712. It seems like GPT-4 is a lot better at using tools and reasoning. GPT-3.5 is hit or miss: sometimes it uses the tool provided in in-context examples, sometimes it does not. One thing I found helped a lot is providing an example convo before the system message. GPT-3.5 tends to ignore the system message a lot.


[
    {"role": "user", "content": "I want to book an appointment with Dr Bob"},
    {"role": "assistant", "content": "AVAILABLE_TIMES('Dr Bob') -> …"},
    {"role": "assistant", "content": "Here are the available times for Dr Bob ..."},

    {"role": "system", "content": "You are a booking assistant blah blah ..."} # Example convo before this
    
]
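
For context, a hedged sketch of sending that message list, assuming the pre-1.0 openai Python SDK that was current at the time (openai.ChatCompletion.create); the ordering mirrors the snippet above, with the example convo first and the real system message last.

import openai  # pre-1.0 SDK

messages = [
    {"role": "user", "content": "I want to book an appointment with Dr Bob"},
    {"role": "assistant", "content": "AVAILABLE_TIMES('Dr Bob') -> ..."},
    {"role": "assistant", "content": "Here are the available times for Dr Bob ..."},
    {"role": "system", "content": "You are a booking assistant ..."},  # real system message goes last
]

response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(response["choices"][0]["message"]["content"])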

@syntex01
Contributor Author

I also noticed this with other projects. GPT-4 is very responsive and rarely makes mistakes when provided with clear instructions. You can even tell it to execute certain functions when it sees the need for it (by identifying the function name in the output).
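
One rough sketch of spotting that in code (KNOWN_FUNCTIONS and the call format are assumptions for illustration): scan the reply for a known function name followed by parenthesised arguments.

import re

KNOWN_FUNCTIONS = {"AVAILABLE_TIMES", "SEARCH_MEMORY"}  # hypothetical registry

def extract_call(reply: str):
    # Matches e.g. AVAILABLE_TIMES('Dr Bob') anywhere in the model's output.
    match = re.search(r"\b([A-Z_]+)\((.*?)\)", reply)
    if match and match.group(1) in KNOWN_FUNCTIONS:
        return match.group(1), match.group(2)
    return None

print(extract_call("AVAILABLE_TIMES('Dr Bob') -> ..."))  # ('AVAILABLE_TIMES', "'Dr Bob'")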

@kyb3r
Owner

kyb3r commented Mar 23, 2023

Take a look at this: https://openai.com/blog/chatgpt-plugins

@syntex01
Contributor Author

That's really nice

@kyb3r
Owner

kyb3r commented Mar 23, 2023

This is actually insane. It looks like OpenAI made models fine-tuned to learn how to use tools via their plugin specification.

For now, plugins are designed for calling backend APIs, but we are exploring plugins that can call client-side APIs as well.

Client-side APIs would benefit us; that would provide a simple way for us to make a tool for accessing memories.

@syntex01
Contributor Author

Yeah, the advancements GPT-4 brought allow for many things that were harder with 3.5. I am really happy to see the announcement about the plugins.

@kyb3r
Owner

kyb3r commented Mar 28, 2023

I have access to GPT-4 now and I have already tested out tools

[screenshot: IMG_0242]

It works well, GPT-4 is smart :)

It should be easy to create a memory tool that takes in a query as an argument
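
A minimal sketch of such a tool, assuming an embedding-similarity store (the MEMORY_STORE layout and the embed() placeholder are illustrative, not the repo's actual backend):

import numpy as np

MEMORY_STORE: list[tuple[str, np.ndarray]] = []  # (memory text, embedding) pairs

def embed(text: str) -> np.ndarray:
    # Placeholder: deterministic pseudo-embedding so the sketch runs standalone;
    # a real implementation would call an embeddings API here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(64)

def remember(text: str) -> None:
    MEMORY_STORE.append((text, embed(text)))

def search_memory(query: str, top_k: int = 3) -> list[str]:
    q = embed(query)
    def cosine(vec: np.ndarray) -> float:
        return float(q @ vec / (np.linalg.norm(q) * np.linalg.norm(vec) + 1e-9))
    ranked = sorted(MEMORY_STORE, key=lambda item: cosine(item[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]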

@syntex01
Contributor Author

Did you also get access to tools?

@kyb3r
Owner

kyb3r commented Mar 28, 2023

No, I just made my own implementation lol.

You define your own tools, and the ChatAgent class will let GPT-4 know about these tools and how to use them.

It works reliably with GPT-4. I finished implementing it in another (private) repo; I'll try to finish merging it into this one soon.
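
Since the real ChatAgent lives in that private repo, the following is only a guess at what registering a tool with it might look like, not its actual API:

from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    func: Callable[[str], str]

class ChatAgent:
    def __init__(self, tools: list[Tool]):
        self.tools = {t.name: t for t in tools}

    def tool_prompt(self) -> str:
        # Injected into the system message so GPT-4 knows what it may call.
        lines = [f"{t.name}(query): {t.description}" for t in self.tools.values()]
        return "You can call these tools:\n" + "\n".join(lines)

    def call(self, name: str, argument: str) -> str:
        return self.tools[name].func(argument)

agent = ChatAgent([Tool("search_memory", "Search past conversations.", lambda q: f"results for {q!r}")])
print(agent.tool_prompt())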

@kyb3r
Owner

kyb3r commented Mar 28, 2023

Alrighty I got it to work!

You: Tell me about the HMCS system we discussed in the past 

FUNCTION CALL:  search_memory {'query': 'HMCS system'}

Agent: In the past, we discussed the Hierarchical Memory Consolidation System (HMCS), which was developed in 2023. HMCS is designed to enhance the memory capacity of large language models, particularly those that rely on natural language processing. This new memory system can improve the performance of AI models, making them more efficient and effective in processing natural language data. Note that HMCS is not related to Her Majesty's Canadian Ship, which is a prefix used for ships in the Canadian Navy.
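
A rough sketch of the dispatch step behind a transcript like that (the FUNCTION CALL line format is copied from the output above; the tools dict is a stand-in):

import ast
import re

def dispatch(reply: str, tools: dict) -> str | None:
    # Looks for a line like: FUNCTION CALL: search_memory {'query': 'HMCS system'}
    match = re.search(r"FUNCTION CALL:\s*(\w+)\s*(\{.*\})", reply)
    if not match:
        return None
    name, raw_args = match.groups()
    kwargs = ast.literal_eval(raw_args)  # {'query': 'HMCS system'}
    return tools[name](**kwargs)  # result gets fed back into the convo before the final answer

tools = {"search_memory": lambda query: f"(memories matching {query!r})"}
print(dispatch("FUNCTION CALL:  search_memory {'query': 'HMCS system'}", tools))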

@syntex01
Contributor Author

Ah, really nice :) Just so you know, I am currently skiing, so I can't implement my ideas right now. But I will be back in 6 days :) I will check the updates you make to the code, however.

@kyb3r changed the title from "Create an internal reasoning" to "Create an internal reasoning / search_memory tool" on Mar 28, 2023
@kyb3r
Owner

kyb3r commented Mar 29, 2023

Currently, this only works with GPT-4 (since it's smart enough to reason about when to use a tool and when not to).
