README.md: 38 additions, 0 deletions

Llama Stack Client. It is a library available for Python, Swift, Node.js or
Kotlin, which "wraps" the REST API stack in a way that is easier for many
applications to use.


![Integration with Llama Stack](docs/core2llama-stack_interface.png)


```yaml
user_data_collection:
  transcripts_storage: "/tmp/data/transcripts"
```

### MCP Server and Tool Configuration

**Note**: The `run.yaml` configuration is currently an implementation detail. In the future, all configuration will be available directly from the lightspeed-core config.

#### Configuring MCP Servers

MCP (Model Context Protocol) servers provide tools and capabilities to the AI agents. These are configured in the `mcp_servers` section of your `lightspeed-stack.yaml`:

```yaml
mcp_servers:
  - name: "filesystem-tools"
    provider_id: "model-context-protocol"
    url: "http://localhost:3000"
  - name: "git-tools"
    provider_id: "model-context-protocol"
    url: "http://localhost:3001"
  - name: "database-tools"
    provider_id: "model-context-protocol"
    url: "http://localhost:3002"
```

**Important**: Only MCP servers defined in the `lightspeed-stack.yaml` configuration are available to the agents. Tools configured in the llama-stack `run.yaml` are not accessible to lightspeed-core agents.

> **Review thread** on `provider_id: "model-context-protocol"`:
>
> **Contributor:** Is the `provider_id: "model-context-protocol"` necessary? As it is a llama-stack "thing", can it be added automatically?
>
> **Contributor:** But I wonder if you can have two MCPs with the same name and different providers? If this is supported by llama-stack, I fear we will need to expose it as well.
>
> **Author:** This is a bit complicated, because the `provider_id` needs to match an id defined in `tool_runtime`. By default we just have the following defined in the `run.yaml`:
>
> ```yaml
> tool_runtime:
>     - provider_id: model-context-protocol
>       provider_type: remote::model-context-protocol
>       config: {}
> ```
>
> The providers supported today are here. But `name` is the `toolgroup_id` today, which must be unique no matter the provider id (see https://github.com/lightspeed-core/lightspeed-stack/blob/main/src/utils/common.py#L56). If we want to allow the same name with a different provider, we could generate a `toolgroup_id` or something, but that sounds messy.
>
> **Contributor:** There is another MCP provider in lightspeed-providers. This is why we asked for the provider to be a parameter too.
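The thread above hinges on every `provider_id` in `lightspeed-stack.yaml` matching a provider declared under `tool_runtime` in the llama-stack `run.yaml`. A minimal sketch of such a cross-check follows; the file paths and the assumption that `run.yaml` nests `tool_runtime` under a top-level `providers` key are illustrative, not part of the documented interface.

```python
# Illustrative cross-check (not part of lightspeed-core): verify that every
# mcp_servers entry in lightspeed-stack.yaml references a tool_runtime provider
# that is actually declared in the llama-stack run.yaml.
import yaml  # PyYAML

with open("lightspeed-stack.yaml") as f:  # assumed path
    ls_config = yaml.safe_load(f)
with open("run.yaml") as f:  # assumed path
    run_config = yaml.safe_load(f)

# Assumes run.yaml lists tool_runtime providers under the top-level "providers" key.
declared = {
    p["provider_id"]
    for p in run_config.get("providers", {}).get("tool_runtime", [])
}

for server in ls_config.get("mcp_servers", []):
    if server["provider_id"] not in declared:
        print(f'{server["name"]}: provider_id {server["provider_id"]!r} '
              "is not declared under tool_runtime")
```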

#### Configuring MCP Headers

MCP headers allow you to pass authentication tokens, API keys, or other metadata to MCP servers. These are configured **per request** via the `MCP-HEADERS` HTTP header:

```bash
curl -X POST "http://localhost:8080/v1/query" \
-H "Content-Type: application/json" \
-H "MCP-HEADERS: {\"filesystem-tools\": {\"Authorization\": \"Bearer token123\"}}" \
  -d '{"query": "List files in /tmp"}'
```

> **Review thread** on the `MCP-HEADERS` header name:
>
> **Contributor:** "Just as in HTTP/1.x, header field names are strings of ASCII characters that are compared in a case-insensitive fashion. However, header field names MUST be converted to lowercase prior to their encoding in HTTP/2. A request or response containing uppercase header field names MUST be treated as malformed" -- RFC 7540, Hypertext Transfer Protocol Version 2 (HTTP/2), Section 8.1.2. Should `MCP-HEADERS` be uppercase?
>
> **Contributor:** @ldjebran I believe we use uppercase from Ansible Lightspeed (as, IIRC, the header was contributed by us). Changing the case could break us, unless other code in lightspeed-core looks the header up in a case-insensitive way?
>
> **ldjebran** (Aug 13, 2025)**:** @manstis we are already using uppercase when sending the header, just as in the curl command above. I don't think this should affect anyone, as long as headers are converted to lowercase or looked up in a case-insensitive way. Forcing clients to send lowercase headers would certainly break us.
>
> **Contributor:** curl actually sends `Mcp-headers` in the request. And as the HTTP/1.1 standard says, header names are case-insensitive, so don't worry much ;)
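For clients not using curl, the same request can be sent from Python. This is a minimal sketch using the `requests` library; the endpoint, server name, token, and query are the illustrative values from the example above.

```python
# Minimal client-side sketch of the curl example above (illustrative values only).
import json

import requests

# Per-server headers, keyed by the MCP server "name" from lightspeed-stack.yaml.
mcp_headers = {"filesystem-tools": {"Authorization": "Bearer token123"}}

response = requests.post(
    "http://localhost:8080/v1/query",
    headers={
        # The whole mapping is passed as one JSON-encoded header value.
        "MCP-HEADERS": json.dumps(mcp_headers),
    },
    json={"query": "List files in /tmp"},  # sets Content-Type: application/json
    timeout=60,
)
print(response.json())
```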

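As for the case-sensitivity question in the thread above, a server built on Starlette (which FastAPI uses) looks headers up case-insensitively, so `MCP-HEADERS`, `Mcp-Headers`, and `mcp-headers` all resolve to the same value. A small illustrative check, independent of the lightspeed-core code itself:

```python
# Illustrative only: Starlette header lookups are case-insensitive.
from starlette.datastructures import Headers

headers = Headers(raw=[(b"mcp-headers", b'{"filesystem-tools": {}}')])
assert (
    headers.get("MCP-HEADERS")
    == headers.get("Mcp-Headers")
    == headers.get("mcp-headers")
)
```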

### Llama Stack project and configuration

**Note**: The `run.yaml` configuration is currently an implementation detail. In the future, all configuration will be available directly from the lightspeed-core config.

To run Llama Stack in a separate process, you need to have all its dependencies installed. The easiest way to do that is to create a separate repository with a Llama Stack project file `pyproject.toml` and a Llama Stack configuration file `run.yaml`. The project file might look like:

```toml