
Recreated settings changes - Adds several options for llamacpp and ollama #1703

Merged 5 commits on Mar 11, 2024

Commits on Mar 11, 2024

  1. Recreated settings changes

    icsy7867 committed Mar 11, 2024 (10ffebe)
  2. post check

    icsy7867 committed Mar 11, 2024 (1afa6e1)
  3. Fixed variable value

    icsy7867 committed Mar 11, 2024 (942f2b1)
  4. Set default of num_predict of ollama to None, so that it automatically uses the context window size

    icsy7867 committed Mar 11, 2024 (2154fd2)
  5. Fixed suspect black issue

    icsy7867 committed Mar 11, 2024 (41865dc)
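Commit 4 defaults Ollama's `num_predict` to `None` so the server's own behavior (tied to the context window) takes over instead of a hard token cap. A minimal sketch of that pattern, assuming a hypothetical helper `build_ollama_options` (not from this PR's code): the option is only included in the request options when explicitly set.

```python
def build_ollama_options(num_predict=None, temperature=0.1):
    """Build an Ollama-style options dict.

    num_predict is omitted entirely when None, so the Ollama server
    falls back to its own default rather than a fixed token limit.
    """
    options = {"temperature": temperature}
    if num_predict is not None:
        options["num_predict"] = num_predict
    return options


# When unset, the key is absent and the server decides the limit:
defaults = build_ollama_options()
# When set, the cap is passed through as-is:
capped = build_ollama_options(num_predict=256)
```

Omitting the key, rather than sending `num_predict: None`, avoids the earlier behavior where a hard-coded default silently truncated responses.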