From 77cb2158326096ba02e7557f46d2a8f62212e106 Mon Sep 17 00:00:00 2001
From: TuanaCelik
Date: Wed, 29 Oct 2025 15:38:40 +0100
Subject: [PATCH] fixing internal urls

---
 .../docs/llamaagents/llamactl/agent-data-javascript.md | 4 ++--
 .../docs/llamaagents/llamactl/agent-data-overview.md   | 4 ++--
 .../docs/llamaagents/llamactl/agent-data-python.md     | 2 +-
 .../docs/llamaagents/llamactl/getting-started.md       | 10 +++++-----
 docs/src/content/docs/llamaagents/llamactl/ui-build.md | 6 +++---
 .../content/docs/llamaagents/llamactl/workflow-api.md  | 2 +-
 6 files changed, 14 insertions(+), 14 deletions(-)

diff --git a/docs/src/content/docs/llamaagents/llamactl/agent-data-javascript.md b/docs/src/content/docs/llamaagents/llamactl/agent-data-javascript.md
index b47af90d..10609607 100644
--- a/docs/src/content/docs/llamaagents/llamactl/agent-data-javascript.md
+++ b/docs/src/content/docs/llamaagents/llamactl/agent-data-javascript.md
@@ -10,7 +10,7 @@ Cloud deployments of LlamaAgents is still in alpha. You can try it out locally,
 
 Agent Data is a JSON store tied to a `deploymentName` and `collection`. Use the official JavaScript SDK with strong typing for CRUD, search, and aggregation.
 
-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview) for concepts, constraints, and environment details.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview) for concepts, constraints, and environment details.
 
 Install:
 ```bash
@@ -102,7 +102,7 @@ for (const r of results.items) {
 }
 ```
 
-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview#filter-dsl) for more details on filters.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview#filter-dsl) for more details on filters.
 
 - Filter keys target `data` fields, except `created_at`/`updated_at` which are top-level.
 - Sort with comma-separated specs; prefix data fields in `orderBy` (e.g., `"data.name desc, created_at"`).
diff --git a/docs/src/content/docs/llamaagents/llamactl/agent-data-overview.md b/docs/src/content/docs/llamaagents/llamactl/agent-data-overview.md
index fe1b556b..77c276bd 100644
--- a/docs/src/content/docs/llamaagents/llamactl/agent-data-overview.md
+++ b/docs/src/content/docs/llamaagents/llamactl/agent-data-overview.md
@@ -63,5 +63,5 @@ SDKs and environments:
 - The **Python SDK** runs server‑side and uses your API key and an optional base URL.
 
 Next steps:
-- Python usage: see [Agent Data (Python)](/python/cloud/llamaagents/agent-data-python)
-- JavaScript usage: see [Agent Data (JavaScript)](/python/cloud/llamaagents/agent-data-javascript)
+- Python usage: see [Agent Data (Python)](/python/llamaagents/llamactl/agent-data-python)
+- JavaScript usage: see [Agent Data (JavaScript)](/python/llamaagents/llamactl/agent-data-javascript)
diff --git a/docs/src/content/docs/llamaagents/llamactl/agent-data-python.md b/docs/src/content/docs/llamaagents/llamactl/agent-data-python.md
index 9973b9cd..1fd7be5a 100644
--- a/docs/src/content/docs/llamaagents/llamactl/agent-data-python.md
+++ b/docs/src/content/docs/llamaagents/llamactl/agent-data-python.md
@@ -7,7 +7,7 @@ sidebar:
 Cloud deployments of LlamaAgents is still in alpha. You can try it out locally, or [request access by contacting us](https://landing.llamaindex.ai/llamaagents?utm_source=docs)
 :::
 
-See the [Agent Data Overview](/python/cloud/llamaagents/agent-data-overview) for concepts, constraints, and environment details.
+See the [Agent Data Overview](/python/llamaagents/llamactl/agent-data-overview) for concepts, constraints, and environment details.
 
 ### Install

diff --git a/docs/src/content/docs/llamaagents/llamactl/getting-started.md b/docs/src/content/docs/llamaagents/llamactl/getting-started.md
index d07dcef0..8b7c8435 100644
--- a/docs/src/content/docs/llamaagents/llamactl/getting-started.md
+++ b/docs/src/content/docs/llamaagents/llamactl/getting-started.md
@@ -52,7 +52,7 @@ This will prompt for some details, and create a Python module that contains Llam
 When you run `llamactl init`, the scaffold also includes AI assistant-facing docs: `AGENTS.md`, `CLAUDE.md`, and `GEMINI.md`. These contain quick references and instructions for using LlamaIndex libraries to assist coding. These files are optional and safe to customize or remove — they do not affect your builds, runtime, or deployments.
 :::
 
-Application configuration is managed within your project's `pyproject.toml`, where you can define Python workflow instances that should be served, environment details, and configuration for how the UI should be built. See the [Deployment Config Reference](/python/cloud/llamaagents/configuration-reference) for details on all configurable fields.
+Application configuration is managed within your project's `pyproject.toml`, where you can define Python workflow instances that should be served, environment details, and configuration for how the UI should be built. See the [Deployment Config Reference](/python/llamaagents/llamactl/configuration-reference) for details on all configurable fields.
 
 ## Develop and Run Locally
 
@@ -90,9 +90,9 @@ workflow = MyWorkflow()
 
 At this point, you can get to coding. The development server will detect changes as you save files. It will even resume in-progress workflows!
 
-For more information about CLI flags available, see [`llamactl serve`](/python/cloud/llamaagents/llamactl-reference/commands-serve).
+For more information about CLI flags available, see [`llamactl serve`](/python/llamaagents/llamactl-reference/commands-serve).
 
-For a more detailed reference on how to define and expose workflows, see [Workflows & App Server API](/python/cloud/llamaagents/workflow-api).
+For a more detailed reference on how to define and expose workflows, see [Workflows & App Server API](/python/llamaagents/llamactl/workflow-api).
 
 ## Create a Cloud Deployment
 
@@ -116,7 +116,7 @@ llamactl deployments create
 :::info
 The first time you run this, you'll be prompted to log into LlamaCloud.
-Username/password sign-in is not yet supported. If you do not have a supported social sign-in provider, you can use token-based authentication via `llamactl auth token`. See [`llamactl auth`](/python/cloud/llamaagents/llamactl-reference/commands-auth) for details.
+Username/password sign-in is not yet supported. If you do not have a supported social sign-in provider, you can use token-based authentication via `llamactl auth token`. See [`llamactl auth`](/python/llamaagents/llamactl-reference/commands-auth) for details.
 :::
 
 This will open an interactive Terminal UI (TUI). You can tab through fields, or even point and click with your mouse if your terminal supports it. All required fields should be automatically detected from your environment, but can be customized:
 
@@ -135,4 +135,4 @@ After creation, the TUI will show deployment status and logs.
 
 ---
 
-Next: Read about defining and exposing workflows in [Workflows & App Server API](/python/cloud/llamaagents/workflow-api).
+Next: Read about defining and exposing workflows in [Workflows & App Server API](/python/llamaagents/llamactl/workflow-api).
diff --git a/docs/src/content/docs/llamaagents/llamactl/ui-build.md b/docs/src/content/docs/llamaagents/llamactl/ui-build.md
index d25a1125..b6b1e293 100644
--- a/docs/src/content/docs/llamaagents/llamactl/ui-build.md
+++ b/docs/src/content/docs/llamaagents/llamactl/ui-build.md
@@ -16,7 +16,7 @@ The LlamaAgents toolchain is unopinionated about your UI stack — bring your ow
 
 ## How the integration works
 
-`llamactl` starts and proxies your frontend during development by calling your `npm run dev` command. When you deploy, it builds your UI statically with `npm run build`. These commands are configurable; see [UIConfig](/python/cloud/llamaagents/configuration-reference#uiconfig-fields) in the configuration reference. You can also use other package managers if you have [corepack](https://nodejs.org/download/release/v19.9.0/docs/api/corepack.html) enabled.
+`llamactl` starts and proxies your frontend during development by calling your `npm run dev` command. When you deploy, it builds your UI statically with `npm run build`. These commands are configurable; see [UIConfig](/python/llamaagents/llamactl/configuration-reference#uiconfig-fields) in the configuration reference. You can also use other package managers if you have [corepack](https://nodejs.org/download/release/v19.9.0/docs/api/corepack.html) enabled.
 
 During development, `llamactl` starts its workflow server (port `4501` by default) and starts the UI, passing a `PORT` environment variable (set to `4502` by default) and a `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH` (for example, `/deployments//ui`) where the UI will be served. It then proxies requests from the server to the client app from that base path.
 
@@ -27,7 +27,7 @@ Once deployed, the Kubernetes operator builds your application with the configur
 1. Serve the dev UI on the configured `PORT`. This environment variable tells your dev server which port to use during development. Many frameworks, such as Next.js, read this automatically.
 2. Set your app's base path to the value of `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH`. LlamaAgents applications rely on this path to route to multiple workflow deployments. The proxy leaves this path intact so your application can link internally using absolute paths. Your development server and router need to be aware of this base path. Most frameworks provide a way to configure it. For example, Vite uses [`base`](https://vite.dev/config/shared-options.html#base).
 3. Re-export the `LLAMA_DEPLOY_DEPLOYMENT_BASE_PATH` env var to your application. Read this value (for example, in React Router) to configure a base path. This is also often necessary to link static assets correctly.
-4. If you're integrating with LlamaCloud, re-export the `LLAMA_DEPLOY_PROJECT_ID` env var to your application and use it to scope your LlamaCloud requests to the same project. Read more in the [Configuration Reference](/python/cloud/llamaagents/configuration-reference#authorization).
+4. If you're integrating with LlamaCloud, re-export the `LLAMA_DEPLOY_PROJECT_ID` env var to your application and use it to scope your LlamaCloud requests to the same project. Read more in the [Configuration Reference](/python/llamaagents/llamactl/configuration-reference#authentication).
 5. We also recommend re-exporting `LLAMA_DEPLOY_DEPLOYMENT_NAME`, which can be helpful for routing requests to your workflow server correctly.
 
 ## Examples
 
@@ -138,4 +138,4 @@ export default function Logo() {
 
 ## Configure the UI output directory
 
-Your UI must output static assets that the platform can locate. Configure `ui.directory` and `ui.build_output_dir` as described in the [Deployment Config Reference](/python/cloud/llamaagents/configuration-reference#uiconfig-fields). Default: `${ui.directory}/dist`.
+Your UI must output static assets that the platform can locate. Configure `ui.directory` and `ui.build_output_dir` as described in the [Deployment Config Reference](/python/llamaagents/llamactl/configuration-reference#uiconfig-fields). Default: `${ui.directory}/dist`.
diff --git a/docs/src/content/docs/llamaagents/llamactl/workflow-api.md b/docs/src/content/docs/llamaagents/llamactl/workflow-api.md
index ac39889b..a8e2e356 100644
--- a/docs/src/content/docs/llamaagents/llamactl/workflow-api.md
+++ b/docs/src/content/docs/llamaagents/llamactl/workflow-api.md
@@ -10,7 +10,7 @@ LlamaAgents runs your LlamaIndex workflows locally and in the cloud. Author your
 
 ## Learn the basics (LlamaIndex Workflows)
 
-LlamaAgents is built on top of LlamaIndex workflows. If you're new to workflows, start here: [LlamaIndex Workflows](/python/workflows).
+LlamaAgents is built on top of LlamaIndex workflows. If you're new to workflows, start here: [LlamaIndex Workflows](/python/llamaagents/workflows).
 
 ## Author a workflow (quick example)