Feat/ollama integration #55
Conversation
❌ Changes requested. Reviewed everything up to 7e44fa5 in 2 minutes and 6 seconds
More details
- Looked at 448 lines of code in 10 files
- Skipped 0 files when reviewing
- Skipped posting 5 drafted comments based on config settings
1. backend/app/nodes/llm/llm_utils.py:19
   - Draft comment: Ensure `litellm.set_verbose = True` is a valid usage. If `set_verbose` is not a valid attribute or method, this line should be removed or corrected.
   - Reason this comment was not posted: Comment did not seem useful.
2. backend/app/nodes/llm/llm_utils.py:20
   - Draft comment: `load_dotenv()` is called multiple times across different files. Consider centralizing this call to avoid redundancy.
   - Reason this comment was not posted: Confidence changes required: 50%. The `load_dotenv()` function is called multiple times across different files. This is unnecessary and can be centralized to avoid redundancy.
3. backend/app/nodes/llm/llm_utils.py:193
   - Draft comment: Ensure consistent handling of `api_base`. Check for non-empty values before using it to avoid unexpected behavior.
   - Reason this comment was not posted: Comment did not seem useful.
4. backend/app/nodes/llm/single_llm_call.py:17
   - Draft comment: `load_dotenv()` is called multiple times across different files. Consider centralizing this call to avoid redundancy.
   - Reason this comment was not posted: Confidence changes required: 50%. The `load_dotenv()` function is called multiple times across different files. This is unnecessary and can be centralized to avoid redundancy.
5. frontend/src/utils/api.ts:258
   - Draft comment: Update `listApiKeys` to return `response.data` instead of `response.data.keys` to match the backend response format.
   - Reason this comment was not posted: Comment looked like it was already resolved.
Workflow ID: wflow_bC91faIyjfRF7mCK
Want Ellipsis to fix these issues? Tag @ellipsis-dev in a comment. You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
👍 Looks good to me! Incremental review on ac6d9c5 in 31 seconds
More details
- Looked at 13 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 1 drafted comment based on config settings
1. backend/app/nodes/llm/llm_utils.py:285
   - Draft comment: The docstring mentions parameters (`response_model`, `max_retries`, `initial_wait`, `max_wait`) that are not in the function signature. Please update the docstring to reflect the actual parameters.
   - Reason this comment was not posted: Comment was on unchanged code.
Workflow ID: wflow_0zfuDA4MNdNL9smj
You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
👍 Looks good to me! Incremental review on 162192c in 23 seconds
More details
- Looked at 13 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 1 drafted comment based on config settings
1. backend/app/nodes/llm/llm_utils.py:284
   - Draft comment: The retry attempts have been increased from 1 to 3, which aligns with the retry logic used in other functions like `completion_with_backoff`. This change should improve reliability.
   - Reason this comment was not posted: Confidence changes required: 0%. The change in retry attempts from 1 to 3 in the `ollama_with_backoff` function is consistent with the retry logic used in other functions like `completion_with_backoff`. This change is likely intended to improve reliability.
Workflow ID: wflow_UdbByDj9hSNBQ7bb
You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
👍 Looks good to me! Incremental review on 8a5f886 in 12 seconds
More details
- Looked at 55 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 0 drafted comments based on config settings
Workflow ID: wflow_IRKFF4AsKwrmZ3sC
You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
Add support for models hosted by ollama
This PR adds support for connecting self-hosted models served by Ollama to PySpur. The Ollama endpoint must be specified in the .env file before launching the PySpur service.
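As a rough illustration of the endpoint setup described above, a .env entry might look like the following. The variable name `OLLAMA_BASE_URL` is an assumption for illustration; the exact name is defined by `.env.example` in this PR.

```shell
# Hypothetical .env entry pointing PySpur at a self-hosted Ollama server.
# The variable name is illustrative; check .env.example for the real one.
# 11434 is Ollama's default port; host.docker.internal lets the Docker
# containers reach an Ollama instance running on the host machine.
OLLAMA_BASE_URL=http://host.docker.internal:11434
```

When running under Docker Compose, `host.docker.internal` typically also needs the `extra_hosts` mapping that this PR adds to docker-compose.yml.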
Summary of changes:
Important
Integrates Ollama models, centralizes environment management, updates key management APIs, and adjusts Docker and frontend for these changes.
- `ollama` Python SDK in `llm_utils.py`.
- `OllamaOptions` class for API call options.
- `ollama_with_backoff()` function for API calls with retry logic.
- Changes in `key_management.py`.
- `.env.example` with Ollama configuration.
- `key_management.py` to use `MODEL_PROVIDER_KEYS`.
- `docker-compose.yml` to include `.env` file and extra hosts.
- `test_ollama.sh` script to verify Ollama connection in `entrypoint.sh`.
- `SettingsModal.tsx` and `api.ts` to handle API keys more effectively.

This description was created by Ellipsis for 8a5f886. It will automatically update as commits are pushed.
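The retry-with-backoff behavior described in the changes above (and in the review comment about raising retries from 1 to 3) can be sketched as follows. This is a minimal illustration, not the actual implementation in `llm_utils.py`; the `OllamaOptions` field names and the `with_backoff` helper signature are assumptions.

```python
import time
from dataclasses import dataclass
from typing import Callable, TypeVar

T = TypeVar("T")


@dataclass
class OllamaOptions:
    # Illustrative option container; the real OllamaOptions fields
    # in llm_utils.py may differ.
    temperature: float = 0.7
    num_ctx: int = 2048

    def to_dict(self) -> dict:
        return {"temperature": self.temperature, "num_ctx": self.num_ctx}


def with_backoff(
    fn: Callable[[], T],
    max_retries: int = 3,
    initial_wait: float = 1.0,
    max_wait: float = 8.0,
) -> T:
    """Call fn, retrying with exponential backoff on failure.

    Mirrors the retry logic the PR describes for ollama_with_backoff():
    up to max_retries attempts, doubling the wait between attempts.
    """
    wait = initial_wait
    last_exc: Exception | None = None
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as exc:  # real code would catch narrower errors
            last_exc = exc
            if attempt == max_retries - 1:
                break  # no more attempts left
            time.sleep(wait)
            wait = min(wait * 2, max_wait)
    assert last_exc is not None
    raise last_exc
```

In the actual code, `fn` would wrap a call to the `ollama` SDK with `OllamaOptions(...).to_dict()` passed as options; the sketch keeps it generic so the backoff logic stands on its own.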