
Conversation

@tisnik (Contributor) commented on May 12, 2025

Description

Llama-stack API key in configuration

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change

@tisnik merged commit f3e3d73 into lightspeed-core:main on May 12, 2025
2 checks passed
@@ -1 +1,4 @@
name: foo bar baz
llama_stack:
  url: http://localhost:8321
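
The new llama_stack section could be loaded into a small typed structure. The sketch below is purely illustrative: PyYAML and pydantic are assumed, and the class, field, and file names are not taken from lightspeed-stack. The api_key field is included only because the PR description mentions an API key; its exact name in the real configuration is not visible in the diff above.

```python
# Illustrative only: class, field, and file names are assumptions,
# not necessarily what lightspeed-stack actually uses.
import yaml
from pydantic import BaseModel


class LlamaStackConfig(BaseModel):
    url: str
    # The PR description mentions an API key in the configuration;
    # the exact field name is not shown in the diff above.
    api_key: str | None = None


class Configuration(BaseModel):
    name: str
    llama_stack: LlamaStackConfig


def load_configuration(path: str) -> Configuration:
    """Load and validate the YAML configuration file."""
    with open(path, encoding="utf-8") as f:
        return Configuration(**yaml.safe_load(f))


# Example usage (hypothetical file name):
# config = load_configuration("lightspeed-stack.yaml")
```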
@manstis (Contributor) commented:

Are you planning on running llama-stack (the server) separately from your own uvicorn instance?

The default server launched with llama-stack is here.

I suspect you could use Meta's code as a starting point to integrate llama-stack itself into your code?

@tisnik (Contributor, Author) commented:

Hi @manstis, it still needs to be decided, but the current approach is to run llama-stack as a separate image; the service will call it via a LlamaStackClient object. Do you think it's doable this way?
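
For reference, a minimal sketch of that separate-image approach, assuming the llama-stack-client Python package and a llama-stack server already listening on the URL from the configuration above. Passing api_key to the constructor is an assumption about how the key from the configuration would be wired in, and the printed field depends on the client version.

```python
from llama_stack_client import LlamaStackClient

# Connect to a llama-stack server running as a separate image/container.
# The base_url comes from the llama_stack section of the configuration;
# passing api_key here is an assumption about the client's constructor.
client = LlamaStackClient(
    base_url="http://localhost:8321",
    api_key="<llama-stack API key from configuration>",
)

# Smoke test: list the models registered with the server.
for model in client.models.list():
    print(model.identifier)
```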

@manstis (Contributor) commented:

Hi @tisnik, running llama-stack as a separate image may be difficult.

llama-stack will take some (I don't know how much) configuration from lightspeed-stack: providers, models, etc. Your distribution would then need to manage two containers, one for lightspeed-stack and one for llama-stack.

You may want to look at this too.

i.e. lightspeed-stack is the server but uses llama-stack as a library internally.
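
For comparison, a minimal sketch of the library-mode approach suggested above, based on the upstream llama-stack documentation at the time of writing; the distribution name ("ollama") is a placeholder, and the import path may differ between llama-stack versions.

```python
from llama_stack.distribution.library_client import LlamaStackAsLibraryClient

# Run llama-stack in-process instead of calling out to a separate server image.
# "ollama" is a placeholder distribution/template name.
client = LlamaStackAsLibraryClient("ollama")
client.initialize()

# The in-process client exposes the same surface as the remote LlamaStackClient,
# so the calling code in lightspeed-stack could stay largely the same.
for model in client.models.list():
    print(model.identifier)
```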

@tisnik (Contributor, Author) commented:

That one looks reasonable, thank you @manstis!

@tisnik (Contributor, Author) commented:

Tested it just now; it is definitely a possible solution, yes.
