Conversation

@manstis (Contributor) commented on May 16, 2025

Description

Support providing the path to a llama-stack configuration file when using llama-stack "as a library" (in-process) instead of connecting to a remote server.
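
For context, this is roughly what "as a library" usage with an explicit configuration path looks like. This is a minimal sketch: the import path and the initialize() step reflect llama-stack's library-client API at the time and may differ between versions.

```python
# Sketch only: assumes llama-stack's in-process "library" client; the import path
# and the initialize() step may differ between llama-stack releases.
from llama_stack.distribution.library_client import LlamaStackAsLibraryClient

# library_client_config_path from the service configuration
client = LlamaStackAsLibraryClient(
    "/home/manstis/.llama/distributions/ollama/ollama-run.yaml"
)
client.initialize()  # loads the distribution defined by the run.yaml
```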

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change

Related Tickets & Documents

  • Related Issue #
  • Closes #

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

To test, try a configuration file such as:

```yaml
name: foo bar baz
llama_stack:
  use_as_library_client: true
  library_client_config_path: /home/manstis/.llama/distributions/ollama/ollama-run.yaml
  url: http://localhost:8321
  api_key: xyzzy
  chat_completion_mode: true
```
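
As a rough illustration of how a service could consume the llama_stack block above and choose between the in-process library client and a remote HTTP client: the LlamaStackConfig model, get_client helper, and file name below are hypothetical names for illustration, not this repository's actual code, and the client imports reflect the llama-stack / llama-stack-client Python packages and may vary by version.

```python
# Hypothetical sketch: LlamaStackConfig, get_client and the file name are illustrative
# names, not code from this repository. Client imports may vary by llama-stack version.
from dataclasses import dataclass

import yaml


@dataclass
class LlamaStackConfig:
    use_as_library_client: bool = False
    library_client_config_path: str | None = None
    url: str | None = None
    api_key: str | None = None
    chat_completion_mode: bool = False


def get_client(cfg: LlamaStackConfig):
    if cfg.use_as_library_client:
        # In-process mode: llama-stack is driven by the run.yaml at the given path.
        from llama_stack.distribution.library_client import LlamaStackAsLibraryClient

        client = LlamaStackAsLibraryClient(cfg.library_client_config_path)
        client.initialize()
        return client
    # Remote mode: talk to a running llama-stack server over HTTP.
    from llama_stack_client import LlamaStackClient

    return LlamaStackClient(base_url=cfg.url, api_key=cfg.api_key)


with open("lightspeed-stack.yaml") as f:  # hypothetical config file name
    raw = yaml.safe_load(f)

client = get_client(LlamaStackConfig(**raw["llama_stack"]))
```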

@manstis requested a review from tisnik May 16, 2025 13:57
@tisnik (Contributor) left a comment

nice one, thank you!

@tisnik merged commit 0b72ed5 into lightspeed-core:main May 16, 2025
2 checks passed