

ollama context fix #5

Closed

Conversation

s-kostyaev
Contributor

No description provided.

ahyatt added a commit that referenced this pull request Oct 24, 2023
There were two issues: when we set llm-chat-prompt-interactions, we set it to
the value itself (a vector) rather than a list containing that vector.
Additionally, we did not check correctly for the existence of the context.

This is an alternate fix to #5.
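For readers unfamiliar with the bug, here is a minimal Emacs Lisp sketch of the pattern the commit describes. The struct and function names below are simplified stand-ins for illustration only, not the actual llm-ollama code.

```elisp
;; Sketch only: `my-chat-prompt' stands in for the real llm-chat-prompt
;; struct, and the helpers are hypothetical.
(require 'cl-lib)

(cl-defstruct my-chat-prompt
  interactions)  ; expected to hold a list whose car is the context

(defun my-store-context (prompt context)
  "Store CONTEXT, a vector of token ids returned by Ollama, in PROMPT.
The slot gets a list containing the vector, not the bare vector."
  (setf (my-chat-prompt-interactions prompt) (list context)))

(defun my-context-if-any (prompt)
  "Return the stored context vector, or nil when none has been set."
  (let ((interactions (my-chat-prompt-interactions prompt)))
    (and interactions
         (vectorp (car interactions))
         (car interactions))))

;; Example:
;; (let ((p (make-my-chat-prompt)))
;;   (my-store-context p [128 42 7])
;;   (my-context-if-any p))  ;; => [128 42 7]
```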
@ahyatt
Owner

ahyatt commented Oct 24, 2023

Thank you, this fix showed me what was going wrong. I fixed it in a different way, because I want just one object in the car of llm-chat-prompt-interactions, which should be the context, whatever that is; in this case, it's a vector. Your fix works, but it adds the whole list of numbers rather than a vector, which ends up being the same once it is transformed back into JSON.

I've checked in the fix in the conversation-fix branch, so please try and verify that it works.
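To illustrate the point about JSON equivalence: with Emacs's built-in json.el, a list of numbers and a vector of numbers encode to the same JSON array, so either representation of the context looks identical once it is sent back to Ollama. A quick check (assuming json.el's json-encode; the exact serializer the package uses may differ):

```elisp
(require 'json)

(json-encode '(1 2 3))  ;; => "[1,2,3]"
(json-encode [1 2 3])   ;; => "[1,2,3]"
```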

@s-kostyaev
Contributor Author

Works for me, thank you.

@s-kostyaev s-kostyaev closed this Oct 24, 2023
@s-kostyaev s-kostyaev deleted the conversation-fix branch October 24, 2023 06:12