
Support for Phi4 from Ollama #83

Merged
JeanKaddour merged 1 commit into PySpur-Dev:main from pafend:feat/OllamaPhi4 on Jan 9, 2025


Conversation

@pafend (Contributor) commented on Jan 9, 2025

Important

Add support for the OLLAMA_PHI4 model in the LLMModels enum and get_model_info() in _utils.py.

  • Models:
    • Add OLLAMA_PHI4 to LLMModels enum in _utils.py.
    • Update get_model_info() in _utils.py to include OLLAMA_PHI4 with constraints: max_tokens=4096, max_temperature=2.0.
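The change described above follows a common pattern for registering a new model. As a rough sketch of that pattern (the actual enum members, the ModelInfo structure, and the phi4 model identifier here are assumptions, not copied from `_utils.py`):

```python
from dataclasses import dataclass
from enum import Enum


class LLMModels(str, Enum):
    # Pre-existing entry, shown for context (illustrative name/value).
    OLLAMA_LLAMA3 = "ollama/llama3"
    # New entry added by this PR (value is an assumed identifier).
    OLLAMA_PHI4 = "ollama/phi4"


@dataclass
class ModelInfo:
    max_tokens: int
    max_temperature: float


# Constraints per model; OLLAMA_PHI4 uses the limits stated in the PR.
_MODEL_INFO = {
    LLMModels.OLLAMA_LLAMA3: ModelInfo(max_tokens=4096, max_temperature=2.0),
    LLMModels.OLLAMA_PHI4: ModelInfo(max_tokens=4096, max_temperature=2.0),
}


def get_model_info(model: LLMModels) -> ModelInfo:
    """Look up the constraints for a given model."""
    return _MODEL_INFO[model]
```

With this shape, `get_model_info(LLMModels.OLLAMA_PHI4)` returns the new model's constraints, and any code that iterates over the enum picks up phi4 automatically.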

This description was created by Ellipsis for ebd6eb8. It will automatically update as commits are pushed.


@ellipsis-dev (bot) left a comment


👍 Looks good to me! Reviewed everything up to ebd6eb8 in 12 seconds

More details
  • Looked at 25 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 1 drafted comment based on config settings.
1. backend/app/nodes/llm/_utils.py:97
  • Draft comment:
    The addition of the OLLAMA_PHI4 model is consistent with the existing pattern for adding models. Ensure that the max_tokens value of 4096 is correct for this model.
  • Reason this comment was not posted:
    Confidence changes required: 33%
    The addition of the new model 'OLLAMA_PHI4' seems consistent with the existing pattern for adding models. However, the max_tokens value for this model is set to 4096, which is consistent with other Ollama models, so it seems correct.

Workflow ID: wflow_LJVgWgPcHcIrSrJL


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

@JeanKaddour JeanKaddour merged commit c510c02 into PySpur-Dev:main Jan 9, 2025
@JeanKaddour (Contributor) commented

Thank you @pafend !
