
Use correct ollama client in ChatEdit screen. Fix #76 #77

Merged (3 commits) Apr 22, 2024

Conversation

habaneraa (Contributor)

A simple fix for issue #76.

In addition, I think we could extract these local variables into a single module-level client, which would naturally act as a singleton object.

oterm/oterm/ollama.py, lines 22 to 24 in 147cd2f:

```python
client = AsyncClient(
    host=envConfig.OLLAMA_URL, verify=envConfig.OTERM_VERIFY_SSL
)
```

oterm/oterm/ollama.py, lines 39 to 41 in 147cd2f:

```python
client = AsyncClient(
    host=envConfig.OLLAMA_URL, verify=envConfig.OTERM_VERIFY_SSL
)
```

@ggozad (Owner) left a comment

Thank you for this!
Some (minor) comments:
I am happy with a singleton here, but then having OllamaLLM in parallel seems like a missed opportunity. So I would either:

1. move the functionality for list, show to OllamaLLM and create a singleton out of it, or
2. remove OllamaLLM altogether and use your singleton directly.

Happy to discuss if you think differently; I don't have strong opinions on this.

Could you also add an entry to CHANGES.txt with your username and a description?
As soon as we merge I will release :)

@ggozad (Owner)

ggozad commented Apr 21, 2024

And of course I should have coffee before opening my mouth...
We don't want a singleton here for OllamaLLM. Perhaps it's best if list, show, etc. become static methods on OllamaLLM?
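A sketch of that proposal might look like the following. The `_FakeClient` and its return values are invented stand-ins so the example is self-contained; oterm's real code would delegate to the shared `ollama` client instead:

```python
# Sketch: list/show as static methods on OllamaLLM, sharing one
# module-level client rather than per-screen client instances.
# _FakeClient stands in for the real Ollama client for illustration.

class _FakeClient:
    def list(self) -> dict:
        return {"models": [{"name": "llama3"}]}

    def show(self, name: str) -> dict:
        return {"modelfile": f"FROM {name}"}

_client = _FakeClient()  # shared module-level client

class OllamaLLM:
    @staticmethod
    def list() -> dict:
        # Delegates to the shared client; callers need no OllamaLLM instance.
        return _client.list()

    @staticmethod
    def show(name: str) -> dict:
        return _client.show(name)
```

A screen such as ChatEdit could then call `OllamaLLM.list()` directly, with no client construction of its own.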

@habaneraa (Contributor, Author)

habaneraa commented Apr 21, 2024

You are right. Static methods are absolutely better here.

Note that I kept the "list" and "show" requests synchronous, just as before. I observed that the app raises a strange NoScreen exception if an async call in the on_mount() method of ChatEdit finishes late (to reproduce, add await asyncio.sleep(1.0) to it). I currently have no idea why a slow request leads to a NoScreen exception and will investigate later. In any case, this PR works well.

@ggozad (Owner)

ggozad commented Apr 21, 2024

Thank you so much, this looks great! I am away from a computer at the moment, but will merge and release tomorrow!

@ggozad ggozad merged commit 3c136dc into ggozad:main Apr 22, 2024