Commit bb40231

Fixed broken links

Signed-off-by: hbelmiro <helber.belmiro@gmail.com>
hbelmiro committed Jun 12, 2024
1 parent 9665cfe commit bb40231
Showing 12 changed files with 29 additions and 29 deletions.
2 changes: 1 addition & 1 deletion content/en/_redirects
@@ -25,7 +25,7 @@
/docs/pipelines/ /docs/components/pipelines
/docs/pipelines/output-viewer/ /docs/components/pipelines/legacy-v1/sdk/output-viewer/
/docs/pipelines/pipelines-metrics/ /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
-/docs/pipelines/build-component/ /docs/components/pipelines/legacy-v1/sdk/build-component/
+/docs/pipelines/build-component/ /docs/components/pipelines/legacy-v1/sdk/component-development/
/docs/pipelines/install-sdk/ /docs/components/pipelines/legacy-v1/sdk/install-sdk/
/docs/pipelines/lightweight-python-components/ /docs/components/pipelines/legacy-v1/sdk/python-function-components/
/docs/pipelines/sdk/lightweight-python-components/ /docs/components/pipelines/legacy-v1/sdk/python-function-components/
2 changes: 1 addition & 1 deletion content/en/docs/components/pipelines/concepts/component.md
@@ -62,6 +62,6 @@ deserialize the data for use in the downstream component.
to deploy Kubeflow and run a sample pipeline directly from the Kubeflow
Pipelines UI.
* Build your own
-[component and pipeline](/docs/components/pipelines/legacy-v1/sdk/build-component/).
+[component and pipeline](/docs/components/pipelines/legacy-v1/sdk/component-development/).
* Build a [reusable component](/docs/components/pipelines/legacy-v1/sdk/component-development/) for
sharing in multiple pipelines.
@@ -54,7 +54,7 @@ kubectl edit configMap kfp-launcher -n ${namespace}
This pipeline root will be the default pipeline root for all pipelines running in the Kubernetes namespace unless you override it using one of the following options:

#### Via Building Pipelines
-You can configure a pipeline root through the `kfp.dsl.pipeline` annotation when [building pipelines](https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/build-pipeline/#build-your-pipeline)
+You can configure a pipeline root through the `kfp.dsl.pipeline` annotation when [building pipelines](/docs/components/pipelines/legacy-v1/sdk/build-pipeline/#build-your-pipeline)

#### Via Submitting a Pipeline through SDK
You can configure pipeline root via `pipeline_root` argument when you submit a Pipeline using one of the following:
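For reference, the namespace-level default that the `kubectl edit` command above exposes looks roughly like the following ConfigMap (the bucket path is a placeholder; `defaultPipelineRoot` is the key the launcher reads):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: kfp-launcher
  namespace: kubeflow       # replace with your profile namespace
data:
  # Default object-store prefix for every pipeline run in this namespace,
  # unless overridden at compile or submission time as described above.
  defaultPipelineRoot: "gs://my-example-bucket/pipeline-root"
```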
@@ -32,9 +32,9 @@ improvements can make it the default executor that most people should use going
* Security: more secure
* No `privileged` access.
* Cannot escape the privileges of the pod's service account.
-* Migration: `command` must be specified in [Kubeflow Pipelines component specification](https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/).
+* Migration: `command` must be specified in [Kubeflow Pipelines component specification](/docs/components/pipelines/reference/component-spec/).

-Note, the same migration requirement is required by [Kubeflow Pipelines v2 compatible mode](https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/v2-compatibility/), refer to
+Note, the same migration requirement is required by [Kubeflow Pipelines v2 compatible mode](/docs/components/pipelines/legacy-v1/sdk/v2-compatibility/), refer to
[known caveats & breaking changes](https://github.com/kubeflow/pipelines/issues/6133).

#### Migrate to Emissary Executor
@@ -105,7 +105,7 @@ existing clusters.
##### Migrate pipeline components to run on emissary executor

Some pipeline components require manual updates to run on emissary executor.
-For [Kubeflow Pipelines component specification](https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/) YAML,
+For [Kubeflow Pipelines component specification](/docs/components/pipelines/reference/component-spec/) YAML,
the `command` field must be specified.
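As a sketch, a minimal v1 component specification with an explicit `command` might look like this (component name, image, and paths are placeholders):

```yaml
name: Example step
inputs:
- {name: input_path, type: String}
implementation:
  container:
    image: python:3.9
    # The explicit command is what emissary needs; components that relied
    # solely on the image's ENTRYPOINT must add one during migration.
    command: [python, /app/main.py]
    args: [--input, {inputValue: input_path}]
```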

Step by step component migration tutorial:
@@ -152,7 +152,7 @@ Step by step component migration tutorial:
1. The updated component can run on emissary executor now.

Note: Kubeflow Pipelines SDK compiler always specifies a command for
-[python function based components](https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/).
+[python function based components](/docs/components/pipelines/legacy-v1/sdk/python-function-components/).
Therefore, these components will continue to work on emissary executor without
modifications.

@@ -221,11 +221,11 @@ when designing a pipeline.
into a single file.

[container-op]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/dsl.html#kfp.dsl.ContainerOp
-[component-spec]: https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/
-[python-function-component]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/
-[component-dev]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/component-development/
-[python-function-component-data-passing]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/#understanding-how-data-is-passed-between-components
-[prebuilt-components]: https://www.kubeflow.org/docs/examples/shared-resources/
+[component-spec]: /docs/components/pipelines/reference/component-spec/
+[python-function-component]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/
+[component-dev]: /docs/components/pipelines/legacy-v1/sdk/component-development/
+[python-function-component-data-passing]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/#understanding-how-data-is-passed-between-components
+[prebuilt-components]: /docs/examples/shared-resources/


@@ -319,7 +319,7 @@ $ head merged_data.csv
The following example shows the updated `merge_csv` function.

[web-download-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/components/web/Download/component.yaml
-[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/python-function-components/
+[python-function-components]: /docs/components/pipelines/legacy-v1/sdk/python-function-components/
[input-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html?highlight=inputpath#kfp.components.InputPath
[output-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html?highlight=outputpath#kfp.components.OutputPath
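The merging step itself needs nothing beyond the standard library; here is a hypothetical sketch of the core logic (in the real component, the two path parameters would be annotated with `kfp.components.InputPath` and `kfp.components.OutputPath` so the SDK injects file locations instead of values):

```python
import csv
import glob
import os

def merge_csv(input_dir: str, output_csv: str) -> None:
    """Concatenate every CSV file found in input_dir into output_csv.

    With the KFP v1 SDK, input_dir/output_csv would be declared via
    InputPath/OutputPath; the merging logic itself stays the same.
    """
    with open(output_csv, "w", newline="") as out:
        writer = csv.writer(out)
        # Sort for a deterministic row order across runs.
        for path in sorted(glob.glob(os.path.join(input_dir, "*.csv"))):
            with open(path, newline="") as f:
                for row in csv.reader(f):
                    writer.writerow(row)
```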

@@ -415,14 +415,14 @@ kfp.compiler.Compiler().compile(
2. Upload and run your `pipeline.yaml` using the Kubeflow Pipelines user interface.
See the guide to [getting started with the UI][quickstart].

-[quickstart]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/overview/quickstart
+[quickstart]: /docs/components/pipelines/legacy-v1/overview/quickstart

#### Option 2: run the pipeline using Kubeflow Pipelines SDK client

1. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].

[kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/client.html#kfp.Client
-[connect-api]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/connect-api
+[connect-api]: /docs/components/pipelines/legacy-v1/sdk/connect-api


@@ -450,8 +450,8 @@ client.create_run_from_pipeline_func(
pipeline][k8s-resources] (Experimental).

[conditional]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py
-[recursion]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/dsl-recursion/
-[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/manipulate-resources/
+[recursion]: /docs/components/pipelines/legacy-v1/sdk/dsl-recursion/
+[k8s-resources]: /docs/components/pipelines/legacy-v1/sdk/manipulate-resources/


<div class="notebook-links">
@@ -395,7 +395,7 @@ The following examples demonstrate how to specify your component's interface.
```

[dsl-types]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/types.py
-[dsl-type-checking]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/static-type-checking/
+[dsl-type-checking]: /docs/components/pipelines/legacy-v1/sdk/static-type-checking/
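For example, the typed interface portion of a v1 component specification looks like this (names and types here are illustrative, not from the changed file):

```yaml
inputs:
- {name: training_data, type: GCSPath}
- {name: learning_rate, type: Float, default: '0.1'}
outputs:
- {name: model, type: GCSPath}
```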

### Specify your component's metadata

@@ -277,4 +277,4 @@ For better understanding, please refer to the following samples:
[Kubeflow Pipelines domain-specific language (DSL)](/docs/components/pipelines/legacy-v1/sdk/dsl-overview/),
a set of Python libraries that you can use to specify ML pipelines.
* For quick iteration,
-[build components and pipelines](/docs/components/pipelines/legacy-v1/sdk/build-component/).
+[build components and pipelines](/docs/components/pipelines/legacy-v1/sdk/component-development/).
@@ -131,7 +131,7 @@
"source": [
"3. Create and run your pipeline. [Learn more about creating and running pipelines][build-pipelines].\n",
"\n",
-"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/build-component/"
+"[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/component-development/"
]
},
{
@@ -47,7 +47,7 @@ Python function-based components make it easier to iterate quickly by letting you
component code as a Python function and generating the [component specification][component-spec] for you.
This document describes how to build Python function-based components and use them in your pipeline.

-[component-spec]: https://www.kubeflow.org/docs/components/pipelines/reference/component-spec/
+[component-spec]: /docs/components/pipelines/reference/component-spec/
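As a sketch of the idea (a hypothetical function; with the v1 SDK installed, `create_component_from_func` would turn it into a component factory):

```python
def add(a: float, b: float) -> float:
    """Return the sum of two numbers.

    A function-based component must be self-contained: any imports it
    needs have to appear inside the function body, because only the
    function's source is packaged into the container command.
    """
    return a + b

# With the KFP v1 SDK available, the wrapping call would look like:
#   from kfp.components import create_component_from_func
#   add_op = create_component_from_func(add, base_image='python:3.9')
```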

## Before you begin

@@ -70,7 +70,7 @@ from kfp.components import create_component_from_func
3. Create an instance of the [`kfp.Client` class][kfp-client] following steps in [connecting to Kubeflow Pipelines using the SDK client][connect-api].

[kfp-client]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/client.html#kfp.Client
-[connect-api]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/connect-api
+[connect-api]: /docs/components/pipelines/legacy-v1/sdk/connect-api


@@ -110,7 +110,7 @@ add_op = create_component_from_func(

3. Create and run your pipeline. [Learn more about creating and running pipelines][build-pipelines].

-[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/build-component/
+[build-pipelines]: /docs/components/pipelines/legacy-v1/sdk/component-development/


@@ -323,8 +323,8 @@ including component metadata and metrics.
[dockerfile]: https://docs.docker.com/engine/reference/builder/
[named-tuple-hint]: https://docs.python.org/3/library/typing.html#typing.NamedTuple
[named-tuple]: https://docs.python.org/3/library/collections.html#collections.namedtuple
-[kfp-visualize]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/output-viewer/
-[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
+[kfp-visualize]: /docs/components/pipelines/legacy-v1/sdk/output-viewer/
+[kfp-metrics]: /docs/components/pipelines/legacy-v1/sdk/pipelines-metrics/
[input-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.InputPath
[output-path]: https://kubeflow-pipelines.readthedocs.io/en/stable/source/components.html#kfp.components.OutputPath

@@ -563,7 +563,7 @@ def calc_pipeline(

5. Compile and run your pipeline. [Learn more about compiling and running pipelines][build-pipelines].

-[build-pipelines]: https://www.kubeflow.org/docs/components/pipelines/legacy-v1/sdk/build-pipeline/#compile-and-run-your-pipeline
+[build-pipelines]: /docs/components/pipelines/legacy-v1/sdk/build-pipeline/#compile-and-run-your-pipeline


@@ -10,7 +10,7 @@ static type checking for fast development iterations.

## Motivation

-A pipeline is a workflow consisting of [components](/docs/components/pipelines/legacy-v1/sdk/build-component#overview-of-pipelines-and-components) and each
+A pipeline is a workflow consisting of [components](/docs/components/pipelines/legacy-v1/sdk/component-development/#overview-of-pipelines-and-components) and each
component contains inputs and outputs. The DSL compiler supports static type checking to ensure the type consistency among the component
I/Os within the same pipeline. Static type checking helps you to identify component I/O inconsistencies without running the pipeline.
It also shortens the development cycles by catching the errors early.
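Conceptually (an illustration only, not KFP's actual implementation), the compiler's check reduces to comparing the declared types of each wired output/input pair before anything executes:

```python
def check_connection(output_type: str, input_type: str) -> None:
    """Raise at compile time when a wired output/input pair disagrees."""
    if output_type != input_type:
        raise TypeError(
            f"type mismatch: upstream produces {output_type!r}, "
            f"downstream expects {input_type!r}"
        )

# A mismatch surfaces without running a single component:
# check_connection("GCSPath", "Integer") would raise TypeError.
```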
@@ -33,7 +33,7 @@ wget -O ${PIPELINE_FILE} ${PIPELINE_URL}
dsl-compile --py ${PIPELINE_FILE} --output ${PIPELINE_NAME}.tar.gz
```

-After running the commands above, you should get two files in your current directory: `sequential.py` and `sequential.tar.gz`. Run the following command to deploy the generated `.tar.gz` file as you would do using the [Kubeflow Pipelines UI](/docs/components/pipelines/legacy-v1/sdk/build-component/#deploy-the-pipeline), but this time using the REST API.
+After running the commands above, you should get two files in your current directory: `sequential.py` and `sequential.tar.gz`. Run the following command to deploy the generated `.tar.gz` file as you would do using the [Kubeflow Pipelines UI](/docs/components/pipelines/legacy-v1/sdk/component-development/#deploy-the-pipeline), but this time using the REST API.

```
SVC=localhost:8888
@@ -129,6 +129,6 @@ The following notebooks are available:
* Learn the various ways to use the [Kubeflow Pipelines
SDK](/docs/components/pipelines/legacy-v1/sdk/sdk-overview/).
* See how to
-[build your own pipeline components](/docs/components/pipelines/legacy-v1/sdk/build-component/).
+[build your own pipeline components](/docs/components/pipelines/legacy-v1/sdk/component-development/).
* Read more about
[building lightweight components](/docs/components/pipelines/legacy-v1/sdk/lightweight-python-components/).
