Merged
@@ -10,7 +10,7 @@ Cloud deployments of LlamaAgents is still in alpha. You can try it out locally,

Agent Data is a JSON store tied to a `deploymentName` and `collection`. Use the official JavaScript SDK with strong typing for CRUD, search, and aggregation.

-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview) for concepts, constraints, and environment details.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview) for concepts, constraints, and environment details.

Install:
```bash
@@ -102,7 +102,7 @@ for (const r of results.items) {
}
```

-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview#filter-dsl) for more details on filters.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview#filter-dsl) for more details on filters.

- Filter keys target `data` fields, except `created_at`/`updated_at` which are top-level.
- Sort with comma-separated specs; prefix data fields in `orderBy` (e.g., `"data.name desc, created_at"`).
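The filter and sort semantics in the bullets above can be sketched with a toy in-memory model (illustrative only; the item shape and helper names are assumptions, not the SDK's API):

```typescript
// Toy model of Agent Data filter/sort key resolution; not the SDK itself.
interface Item {
  created_at: string;               // top-level timestamps
  updated_at: string;
  data: Record<string, unknown>;    // filter keys resolve here by default
}

// created_at/updated_at are top-level; every other key targets `data`.
function resolve(item: Item, key: string): unknown {
  if (key === "created_at" || key === "updated_at") return item[key];
  return item.data[key.replace(/^data\./, "")];
}

// Parse comma-separated specs like "data.name desc, created_at".
function orderBy(items: Item[], specs: string): Item[] {
  const parsed = specs.split(",").map((s) => {
    const [key, dir = "asc"] = s.trim().split(/\s+/);
    return { key, desc: dir.toLowerCase() === "desc" };
  });
  return [...items].sort((a, b) => {
    for (const { key, desc } of parsed) {
      const [x, y] = [resolve(a, key), resolve(b, key)] as [any, any];
      if (x < y) return desc ? 1 : -1;
      if (x > y) return desc ? -1 : 1;
    }
    return 0;
  });
}
```

Here `orderBy(items, "data.name desc, created_at")` sorts by the `name` data field descending, breaking ties on the top-level `created_at`.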
@@ -63,5 +63,5 @@ SDKs and environments:
- The **Python SDK** runs server‑side and uses your API key and an optional base URL.

Next steps:
-- Python usage: see [Agent Data (Python)](/python/cloud/llamaagents/agent-data-python)
-- JavaScript usage: see [Agent Data (JavaScript)](/python/cloud/llamaagents/agent-data-javascript)
+- Python usage: see [Agent Data (Python)](/python/llamaagents/llamactl/agent-data-python)
+- JavaScript usage: see [Agent Data (JavaScript)](/python/llamaagents/llamactl/agent-data-javascript)
@@ -7,7 +7,7 @@ sidebar:
Cloud deployments of LlamaAgents are still in alpha. You can try it out locally, or [request access by contacting us](https://landing.llamaindex.ai/llamaagents?utm_source=docs).
:::

-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview) for concepts, constraints, and environment details.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview) for concepts, constraints, and environment details.

### Install

docs/src/content/docs/llamaagents/llamactl/getting-started.md (10 changes: 5 additions & 5 deletions)
@@ -52,7 +52,7 @@ This will prompt for some details, and create a Python module that contains Llam
When you run `llamactl init`, the scaffold also includes AI assistant-facing docs: `AGENTS.md`, `CLAUDE.md`, and `GEMINI.md`. These contain quick references and instructions for using LlamaIndex libraries to assist coding. These files are optional and safe to customize or remove — they do not affect your builds, runtime, or deployments.
:::

-Application configuration is managed within your project's `pyproject.toml`, where you can define Python workflow instances that should be served, environment details, and configuration for how the UI should be built. See the [Deployment Config Reference](/python/cloud/llamaagents/configuration-reference) for details on all configurable fields.
+Application configuration is managed within your project's `pyproject.toml`, where you can define Python workflow instances that should be served, environment details, and configuration for how the UI should be built. See the [Deployment Config Reference](/python/llamaagents/llamactl/configuration-reference) for details on all configurable fields.
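As a hypothetical sketch of such a configuration (the `[tool.llamadeploy]` table name and most field names here are assumptions; `ui.directory` and `ui.build_output_dir` appear elsewhere in these docs, and everything should be verified against the Deployment Config Reference):

```toml
# pyproject.toml (sketch; table and field names are assumptions,
# not the documented schema)
[tool.llamadeploy]
name = "my-agents"

[tool.llamadeploy.workflows]
# workflow name -> "importable.module:workflow_instance"
my-workflow = "my_agents.workflow:workflow"

[tool.llamadeploy.ui]
directory = "ui"
build_output_dir = "ui/dist"
```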

## Develop and Run Locally

@@ -90,9 +90,9 @@ workflow = MyWorkflow()

At this point, you can get to coding. The development server will detect changes as you save files. It will even resume in-progress workflows!

-For more information about CLI flags available, see [`llamactl serve`](/python/cloud/llamaagents/llamactl-reference/commands-serve).
+For more information about CLI flags available, see [`llamactl serve`](/python/llamaagents/llamactl-reference/commands-serve).

-For a more detailed reference on how to define and expose workflows, see [Workflows & App Server API](/python/cloud/llamaagents/workflow-api).
+For a more detailed reference on how to define and expose workflows, see [Workflows & App Server API](/python/llamaagents/llamactl/workflow-api).

## Create a Cloud Deployment

@@ -116,7 +116,7 @@ llamactl deployments create
:::info
The first time you run this, you'll be prompted to log into LlamaCloud.

-Username/password sign-in is not yet supported. If you do not have a supported social sign-in provider, you can use token-based authentication via `llamactl auth token`. See [`llamactl auth`](/python/cloud/llamaagents/llamactl-reference/commands-auth) for details.
+Username/password sign-in is not yet supported. If you do not have a supported social sign-in provider, you can use token-based authentication via `llamactl auth token`. See [`llamactl auth`](/python/llamaagents/llamactl-reference/commands-auth) for details.
:::

This will open an interactive Terminal UI (TUI). You can tab through fields, or even point and click with your mouse if your terminal supports it. All required fields should be automatically detected from your environment, but can be customized:
@@ -135,4 +135,4 @@ After creation, the TUI will show deployment status and logs.

---

-Next: Read about defining and exposing workflows in [Workflows & App Server API](/python/cloud/llamaagents/workflow-api).
+Next: Read about defining and exposing workflows in [Workflows & App Server API](/python/llamaagents/llamactl/workflow-api).
docs/src/content/docs/llamaagents/llamactl/ui-build.md (6 changes: 3 additions & 3 deletions)
@@ -16,7 +16,7 @@ The LlamaAgents toolchain is unopinionated about your UI stack — bring your ow

## How the integration works

-`llamactl` starts and proxies your frontend during development by calling your `npm run dev` command. When you deploy, it builds your UI statically with `npm run build`. These commands are configurable; see [UIConfig](/python/cloud/llamaagents/configuration-reference#uiconfig-fields) in the configuration reference. You can also use other package managers if you have [corepack](https://nodejs.org/download/release/v19.9.0/docs/api/corepack.html) enabled.
+`llamactl` starts and proxies your frontend during development by calling your `npm run dev` command. When you deploy, it builds your UI statically with `npm run build`. These commands are configurable; see [UIConfig](/python/llamaagents/llamactl/configuration-reference#uiconfig-fields) in the configuration reference. You can also use other package managers if you have [corepack](https://nodejs.org/download/release/v19.9.0/docs/api/corepack.html) enabled.

During development, `llamactl` starts its workflow server (port `4501` by default) and starts the UI, passing a `PORT` environment variable (set to `4502` by default) and a `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH` (for example, `/deployments/<name>/ui`) where the UI will be served. It then proxies requests from the server to the client app from that base path.

@@ -27,7 +27,7 @@ Once deployed, the Kubernetes operator builds your application with the configur
1. Serve the dev UI on the configured `PORT`. This environment variable tells your dev server which port to use during development. Many frameworks, such as Next.js, read this automatically.
2. Set your app's base path to the value of `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH`. LlamaAgents applications rely on this path to route to multiple workflow deployments. The proxy leaves this path intact so your application can link internally using absolute paths. Your development server and router need to be aware of this base path. Most frameworks provide a way to configure it. For example, Vite uses [`base`](https://vite.dev/config/shared-options.html#base).
3. Re-export the `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH` env var to your application. Read this value (for example, in React Router) to configure a base path. This is also often necessary to link static assets correctly.
-4. If you're integrating with LlamaCloud, re-export the `LLAMA_DEPLOY_PROJECT_ID` env var to your application and use it to scope your LlamaCloud requests to the same project. Read more in the [Configuration Reference](/python/cloud/llamaagents/configuration-reference#authorization).
+4. If you're integrating with LlamaCloud, re-export the `LLAMA_DEPLOY_PROJECT_ID` env var to your application and use it to scope your LlamaCloud requests to the same project. Read more in the [Configuration Reference](/python/llamaagents/llamactl/configuration-reference#authentication).
5. We also recommend re-exporting `LLAMA_DEPLOY_DEPLOYMENT_NAME`, which can be helpful for routing requests to your workflow server correctly.
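Steps 1 and 2 above can be sketched for a Vite app (a minimal illustration, assuming Vite's `base` and `server.port` options; adapt it to your framework):

```typescript
// vite.config.ts, minimal sketch: wire llamactl's PORT and base path into Vite.
// The env var names come from the docs above; the rest is an assumption.
const env: Record<string, string | undefined> =
  (globalThis as any).process?.env ?? {};

const config = {
  // Serve the app under the base path llamactl proxies (step 2)
  base: env.LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH ?? "/",
  server: {
    // llamactl passes PORT (4502 by default) to the dev server (step 1)
    port: Number(env.PORT ?? 4502),
  },
};

export default config;
```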

## Examples
@@ -138,4 +138,4 @@ export default function Logo() {

## Configure the UI output directory

-Your UI must output static assets that the platform can locate. Configure `ui.directory` and `ui.build_output_dir` as described in the [Deployment Config Reference](/python/cloud/llamaagents/configuration-reference#uiconfig-fields). Default: `${ui.directory}/dist`.
+Your UI must output static assets that the platform can locate. Configure `ui.directory` and `ui.build_output_dir` as described in the [Deployment Config Reference](/python/llamaagents/llamactl/configuration-reference#uiconfig-fields). Default: `${ui.directory}/dist`.
docs/src/content/docs/llamaagents/llamactl/workflow-api.md (2 changes: 1 addition & 1 deletion)
@@ -10,7 +10,7 @@ LlamaAgents runs your LlamaIndex workflows locally and in the cloud. Author your

## Learn the basics (LlamaIndex Workflows)

-LlamaAgents is built on top of LlamaIndex workflows. If you're new to workflows, start here: [LlamaIndex Workflows](/python/workflows).
+LlamaAgents is built on top of LlamaIndex workflows. If you're new to workflows, start here: [LlamaIndex Workflows](/python/llamaagents/workflows).

## Author a workflow (quick example)

Expand Down