fix: use canonical v1 link throughout pipelines docs (#3240)
* fix: use canonical v1 links throughout docs

* use master for roadmap

* fix other broken links

* use sdk/release-1.8 instead of commit hash
connor-mccarthy committed May 5, 2022
1 parent 1116a6e commit 0649c6a
Showing 36 changed files with 107 additions and 107 deletions.
@@ -162,17 +162,17 @@ and use it as your working directory.
### Deploy on GCP with Cloud SQL and Google Cloud Storage

**Note**: This is recommended for production environments. For more details about customizing your environment
-for GCP, see the [Kubeflow Pipelines GCP manifests](https://github.com/kubeflow/pipelines/tree/master/manifests/kustomize/env/gcp).
+for GCP, see the [Kubeflow Pipelines GCP manifests](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/manifests/kustomize/env/gcp).

### Change deployment namespace

To deploy Kubeflow Pipelines standalone in namespace `<my-namespace>`:

1. Set the `namespace` field to `<my-namespace>` in
-[dev/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/env/dev/kustomization.yaml) or
-[gcp/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/env/gcp/kustomization.yaml).
+[dev/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/dev/kustomization.yaml) or
+[gcp/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/gcp/kustomization.yaml).

-1. Set the `namespace` field to `<my-namespace>` in [cluster-scoped-resources/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/cluster-scoped-resources/kustomization.yaml)
+1. Set the `namespace` field to `<my-namespace>` in [cluster-scoped-resources/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/cluster-scoped-resources/kustomization.yaml)

1. Apply the changes to update the Kubeflow Pipelines deployment:

@@ -181,7 +181,7 @@ To deploy Kubeflow Pipelines standalone in namespace `<my-namespace>`:
kubectl apply -k manifests/kustomize/env/dev
```
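
As a side note (not part of this commit), the `namespace` field can also be set with the kustomize CLI rather than by editing the files by hand. This is a minimal sketch, assuming the `kustomize` binary is installed and the commands are run from the repository root:

```
# Set the namespace field in both kustomizations, then apply them.
(cd manifests/kustomize/cluster-scoped-resources && kustomize edit set namespace <my-namespace>)
(cd manifests/kustomize/env/dev && kustomize edit set namespace <my-namespace>)
kubectl apply -k manifests/kustomize/cluster-scoped-resources
kubectl apply -k manifests/kustomize/env/dev
```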

-**Note**: If using GCP Cloud SQL and Google Cloud Storage, set the proper values in [manifests/kustomize/env/gcp/params.env](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/env/gcp/params.env), then apply with this command:
+**Note**: If using GCP Cloud SQL and Google Cloud Storage, set the proper values in [manifests/kustomize/env/gcp/params.env](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/gcp/params.env), then apply with this command:

```
kubectl apply -k manifests/kustomize/cluster-scoped-resources
@@ -192,15 +192,15 @@ To deploy Kubeflow Pipelines standalone in namespace `<my-namespace>`:

By default, the KFP standalone deployment installs an [inverting proxy agent](https://github.com/google/inverting-proxy) that exposes a public URL. If you want to skip the installation of the inverting proxy agent, complete the following:

-1. Comment out the proxy components in the base `kustomization.yaml`. For example in [manifests/kustomize/env/dev/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/env/dev/kustomization.yaml) comment out `inverse-proxy`.
+1. Comment out the proxy components in the base `kustomization.yaml`. For example in [manifests/kustomize/env/dev/kustomization.yaml](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/dev/kustomization.yaml) comment out `inverse-proxy`.

1. Apply the changes to update the Kubeflow Pipelines deployment:

```
kubectl apply -k manifests/kustomize/env/dev
```

-**Note**: If using GCP Cloud SQL and Google Cloud Storage, set the proper values in [manifests/kustomize/env/gcp/params.env](https://github.com/kubeflow/pipelines/blob/master/manifests/kustomize/env/gcp/params.env), then apply with this command:
+**Note**: If using GCP Cloud SQL and Google Cloud Storage, set the proper values in [manifests/kustomize/env/gcp/params.env](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/manifests/kustomize/env/gcp/params.env), then apply with this command:

```
kubectl apply -k manifests/kustomize/env/gcp
8 changes: 4 additions & 4 deletions content/en/docs/components/pipelines/introduction.md
@@ -64,7 +64,7 @@ and [components](/docs/components/pipelines/concepts/component/).
The screenshots and code below show the `xgboost-training-cm.py` pipeline, which
creates an XGBoost model using structured data in CSV format. You can see the
source code and other information about the pipeline on
-[GitHub](https://github.com/kubeflow/pipelines/tree/master/samples/core/xgboost_training_cm).
+[GitHub](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples/core/xgboost_training_cm).

### The runtime execution graph of the pipeline

@@ -79,7 +79,7 @@ Kubeflow Pipelines UI:

Below is an extract from the Python code that defines the
`xgboost-training-cm.py` pipeline. You can see the full code on
-[GitHub](https://github.com/kubeflow/pipelines/tree/master/samples/core/xgboost_training_cm).
+[GitHub](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples/core/xgboost_training_cm).

```python
@dsl.pipeline(
@@ -229,9 +229,9 @@ At a high level, the execution of a pipeline proceeds as follows:

* **Python SDK**: You create components or specify a pipeline using the Kubeflow
Pipelines domain-specific language
-([DSL](https://github.com/kubeflow/pipelines/tree/master/sdk/python/kfp/dsl)).
+([DSL](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/sdk/python/kfp/dsl)).
* **DSL compiler**: The
-[DSL compiler](https://github.com/kubeflow/pipelines/tree/master/sdk/python/kfp/compiler)
+[DSL compiler](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/sdk/python/kfp/compiler)
transforms your pipeline's Python code into a static configuration (YAML).
* **Pipeline Service**: You call the Pipeline Service to create a
pipeline run from the static configuration.
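
To make the flow above concrete, here is a minimal sketch (not taken from the commit or the sample) of a pipeline written with the v1 DSL, compiled by the DSL compiler, and submitted through the Pipeline Service client; the image, names, and endpoint are illustrative:

```python
import kfp
from kfp import dsl


def echo_op():
    # A one-step component that simply runs a container command.
    return dsl.ContainerOp(
        name='echo',
        image='alpine:3.6',
        command=['sh', '-c'],
        arguments=['echo "hello world"'])


# Python SDK: describe the pipeline with the KFP DSL.
@dsl.pipeline(
    name='hello-pipeline',
    description='Minimal pipeline illustrating the compile-and-run flow.')
def hello_pipeline():
    echo_task = echo_op()


# DSL compiler: transform the Python definition into a static configuration.
kfp.compiler.Compiler().compile(hello_pipeline, 'hello_pipeline.yaml')

# Pipeline Service: create a run from the compiled package
# (assumes a reachable KFP endpoint; kfp.Client() picks up the ambient one).
client = kfp.Client()
client.create_run_from_pipeline_package('hello_pipeline.yaml', arguments={})
```
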
4 changes: 2 additions & 2 deletions content/en/docs/components/pipelines/overview/quickstart.md
@@ -56,7 +56,7 @@ workload:
alt="Run results on the pipelines UI"
class="mt-3 mb-3 border border-info rounded">

-You can find the [source code for the **Data passing in python components** tutorial](https://github.com/kubeflow/pipelines/tree/master/samples/tutorials/Data%20passing%20in%20python%20components) in the Kubeflow Pipelines repo.
+You can find the [source code for the **Data passing in python components** tutorial](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples/tutorials/Data%20passing%20in%20python%20components) in the Kubeflow Pipelines repo.

## Run an ML pipeline

@@ -89,7 +89,7 @@ Follow these steps to run the sample:
alt="XGBoost results on the pipelines UI"
class="mt-3 mb-3 border border-info rounded">

-You can find the [source code for the **XGBoost - Iterative model training** demo](https://github.com/kubeflow/pipelines/tree/master/samples/core/xgboost_training_cm) in the Kubeflow Pipelines repo.
+You can find the [source code for the **XGBoost - Iterative model training** demo](https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/samples/core/xgboost_training_cm) in the Kubeflow Pipelines repo.

## Next steps

@@ -284,7 +284,7 @@ <h3 id="tag-PipelineService" class="swagger-summary-tag">Tag: PipelineService</h
changes to the pipeline&#39;s most recent pipeline version. If there are no
remaining pipeline versions, the pipeline will have no default version.
Examine the run_service_api.ipynb notebook to learn more about creating a
-run using a pipeline version (<a href="https://github.com/kubeflow/pipelines/blob/master/tools/benchmarks/run_service_api.ipynb">https://github.com/kubeflow/pipelines/blob/master/tools/benchmarks/run_service_api.ipynb</a>).</p>
+run using a pipeline version (<a href="https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/tools/benchmarks/run_service_api.ipynb">https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/tools/benchmarks/run_service_api.ipynb</a>).</p>
</td>
</tr>
<tr>
@@ -2067,7 +2067,7 @@ <h3 class="panel-title"><span class="operation-name">POST</span> <strong>/apis/v
changes to the pipeline&apos;s most recent pipeline version. If there are no
remaining pipeline versions, the pipeline will have no default version.
Examine the run_service_api.ipynb notebook to learn more about creating a
-run using a pipeline version (<a href="https://github.com/kubeflow/pipelines/blob/master/tools/benchmarks/run_service_api.ipynb">https://github.com/kubeflow/pipelines/blob/master/tools/benchmarks/run_service_api.ipynb</a>).</div>
+run using a pipeline version (<a href="https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/tools/benchmarks/run_service_api.ipynb">https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/tools/benchmarks/run_service_api.ipynb</a>).</div>
<h3 class="panel-title"><span class="operation-name">DELETE</span> <strong>/apis/v1beta1/pipeline_versions/{version_id}</strong></h3>
Tags:
<a href="#tag-PipelineService">PipelineService</a>
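
For readers following the note above about creating a run from a pipeline version, here is a hedged client-side sketch (not part of this commit; the version ID and experiment name are placeholders):

```python
import kfp

client = kfp.Client()  # assumes a reachable Kubeflow Pipelines API endpoint

experiment = client.create_experiment(name='my-experiment')
run = client.run_pipeline(
    experiment_id=experiment.id,
    job_name='run-from-pipeline-version',
    version_id='<pipeline-version-id>',  # placeholder for an existing pipeline version ID
)
print(run.id)
```
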
@@ -51,7 +51,7 @@ See some examples of real-world
## Detailed specification (ComponentSpec)

This section describes the
-[ComponentSpec](https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/components/_structures.py).
+[ComponentSpec](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/components/_structures.py).

### Metadata

@@ -80,7 +80,7 @@ This section describes the
as hints for pipeline authors and can be used by the pipeline system/UI
to validate arguments and connections between components. Basic types
are **String**, **Integer**, **Float**, and **Bool**. See the full list
-of [types](https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/types.py)
+of [types](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/types.py)
defined by the Kubeflow Pipelines SDK.
* `optional`: Specifies if input is optional or not. This is of type
**Bool**, and defaults to **False**. **Only valid for inputs.**
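
To tie the fields above together, here is a hedged sketch of a small component definition with typed and optional inputs, loaded with the SDK's `load_component_from_text`; the component name, image, and logic are illustrative, not taken from the repository:

```python
from kfp import components

add_component_text = """
name: Add two numbers
description: Sums two integers and writes the result to an output file.
inputs:
- {name: a, type: Integer}
- {name: b, type: Integer, optional: true, default: '0'}
outputs:
- {name: sum, type: Integer}
implementation:
  container:
    image: python:3.9
    command: [python3, -c]
    args:
    - |
      import pathlib, sys
      a, b, out = int(sys.argv[1]), int(sys.argv[2]), sys.argv[3]
      pathlib.Path(out).parent.mkdir(parents=True, exist_ok=True)
      pathlib.Path(out).write_text(str(a + b))
    - {inputValue: a}
    - {inputValue: b}
    - {outputPath: sum}
"""

# The returned factory is used like any other component inside a pipeline function.
add_op = components.load_component_from_text(add_component_text)
```
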
12 changes: 6 additions & 6 deletions content/en/docs/components/pipelines/sdk-v2/build-pipeline.ipynb
@@ -242,7 +242,7 @@
"All outputs are returned as files, using the the paths that Kubeflow Pipelines\n",
"provides.\n",
"\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py\n",
"\n",
"Python function-based components make it easier to build pipeline components\n",
"by building the component specification for you. Python function-based\n",
@@ -411,11 +411,11 @@
"\n",
"The following example shows the updated `merge_csv` function.\n",
"\n",
"[web-download-component]: https://github.com/kubeflow/pipelines/blob/master/components/web/Download/component.yaml\n",
"[web-download-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/components/web/Download/component.yaml\n",
"[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/sdk-v2/python-function-components/\n",
"[input]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py\n",
"[output]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py\n",
"[dsl-component]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/_component.py"
"[input]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py\n",
"[output]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py\n",
"[dsl-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/_component.py"
]
},
{
@@ -621,7 +621,7 @@
" pipeline][conditional].\n",
"\n",
"\n",
"[conditional]: https://github.com/kubeflow/pipelines/blob/master/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py\n",
"[conditional]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py\n",
"[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/sdk/manipulate-resources/"
]
}
12 changes: 6 additions & 6 deletions content/en/docs/components/pipelines/sdk-v2/build-pipeline.md
@@ -238,7 +238,7 @@ depending on their data type.
All outputs are returned as files, using the paths that Kubeflow Pipelines
provides.

-[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
+[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py

Python function-based components make it easier to build pipeline components
by building the component specification for you. Python function-based
@@ -373,11 +373,11 @@ Learn more about [building Python function-based components][python-function-com

The following example shows the updated `merge_csv` function.

-[web-download-component]: https://github.com/kubeflow/pipelines/blob/master/components/web/Download/component.yaml
+[web-download-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/components/web/Download/component.yaml
[python-function-components]: https://www.kubeflow.org/docs/components/pipelines/sdk-v2/python-function-components/
-[input]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
-[output]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
-[dsl-component]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/_component.py
+[input]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py
+[output]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py
+[dsl-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/_component.py


```python
@@ -505,7 +505,7 @@ client.create_run_from_pipeline_func(
pipeline][conditional].


-[conditional]: https://github.com/kubeflow/pipelines/blob/master/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py
+[conditional]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/samples/tutorials/DSL%20-%20Control%20structures/DSL%20-%20Control%20structures.py
[k8s-resources]: https://www.kubeflow.org/docs/components/pipelines/sdk/manipulate-resources/
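
Since this hunk points at the conditional-execution tutorial, here is a minimal sketch of the pattern with the v2-compatible DSL available in this SDK release (the components and condition are illustrative, not excerpts from the tutorial):

```python
from kfp.v2 import dsl


@dsl.component
def flip_coin() -> str:
    import random
    return 'heads' if random.random() < 0.5 else 'tails'


@dsl.component
def announce(result: str):
    print(f'The coin landed on: {result}')


@dsl.pipeline(name='conditional-sketch')
def conditional_pipeline():
    flip_task = flip_coin()
    # The announce step runs only when the condition holds at runtime.
    with dsl.Condition(flip_task.output == 'heads'):
        announce(result=flip_task.output)
```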


@@ -423,7 +423,7 @@ The following examples demonstrate how to specify your component's interface.
]
```

-[dsl-types]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/types.py
+[dsl-types]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/types.py
[dsl-type-checking]: https://www.kubeflow.org/docs/components/pipelines/sdk/static-type-checking/

### Specify your component's metadata
@@ -584,7 +584,7 @@ components/<component group>/<component name>/

See this [sample component][org-sample] for a real-life component example.

-[org-sample]: https://github.com/kubeflow/pipelines/tree/master/components/sample/keras/train_classifier
+[org-sample]: https://github.com/kubeflow/pipelines/tree/sdk/release-1.8/components/sample/keras/train_classifier

## Next steps

@@ -606,4 +606,4 @@ See this [sample component][org-sample] for a real-life component example.
resources](/docs/examples/shared-resources/).


-[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
+[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py
@@ -309,7 +309,7 @@
"All outputs are returned as files, using the the paths that Kubeflow Pipelines\n",
"provides.\n",
"\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py\n",
"\n",
"The following sections describe how to pass parameters and artifacts to your function. \n",
"\n",
@@ -340,7 +340,7 @@
"[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/sdk/pipelines-metrics/\n",
"[input-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/kfp.components.html#kfp.components.InputPath\n",
"[output-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/kfp.components.html#kfp.components.OutputPath\n",
"[vs-dsl-component]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/v2/dsl/component_decorator.py"
"[vs-dsl-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/v2/components/component_decorator.py"
]
},
{
@@ -467,7 +467,7 @@
"\n",
"[input]: https://github.com/kubeflow/pipelines/blob/c5daa7532d18687b180badfca8d750c801805712/sdk/python/kfp/dsl/io_types.py\n",
"[output]: https://github.com/kubeflow/pipelines/blob/c5daa7532d18687b180badfca8d750c801805712/sdk/python/kfp/dsl/io_types.py\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py\n",
"[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py\n",
"\n"
]
},
@@ -279,7 +279,7 @@ depending on their data type.
All outputs are returned as files, using the paths that Kubeflow Pipelines
provides.

-[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
+[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py

The following sections describe how to pass parameters and artifacts to your function.

@@ -310,7 +310,7 @@ The following example demonstrates how to return multiple outputs by value.
[kfp-metrics]: https://www.kubeflow.org/docs/components/pipelines/sdk/pipelines-metrics/
[input-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/kfp.components.html#kfp.components.InputPath
[output-path]: https://kubeflow-pipelines.readthedocs.io/en/latest/source/kfp.components.html#kfp.components.OutputPath
-[vs-dsl-component]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/v2/dsl/component_decorator.py
+[vs-dsl-component]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/v2/components/component_decorator.py


```python
@@ -419,7 +419,7 @@ To return a file as an output, use one of the following type annotations:

[input]: https://github.com/kubeflow/pipelines/blob/c5daa7532d18687b180badfca8d750c801805712/sdk/python/kfp/dsl/io_types.py
[output]: https://github.com/kubeflow/pipelines/blob/c5daa7532d18687b180badfca8d750c801805712/sdk/python/kfp/dsl/io_types.py
-[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py
+[kfp-artifact]: https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py
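
As a companion to the "return multiple outputs by value" hunk above, here is a hedged sketch using a `NamedTuple` return annotation with the v2 `@component` decorator; the function and field names are illustrative:

```python
from typing import NamedTuple

from kfp.v2.dsl import component


@component
def product_and_sum(a: float, b: float) -> NamedTuple(
        'Outputs', [('product', float), ('sum', float)]):
    # Small values are returned by value; each named field becomes its own
    # output parameter of the component.
    from collections import namedtuple
    outputs = namedtuple('Outputs', ['product', 'sum'])
    return outputs(a * b, a + b)
```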



@@ -64,7 +64,7 @@ introduces the following changes:
are always passed by value, which means that they are inserted into the
command used to execute the component. Parameters are stored in ML Metadata.

-* [Artifacts](https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/io_types.py)
+* [Artifacts](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/io_types.py)
are larger inputs or outputs, such as datasets or models. Input
artifacts are always passed as a reference to a path.
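
To make the parameter/artifact distinction concrete, here is a hedged sketch of a v2-compatible component in this SDK release; the training logic is a placeholder and the names are illustrative:

```python
from kfp.v2.dsl import Dataset, Input, Model, Output, component


@component
def train_model(
    learning_rate: float,           # parameter: passed by value, recorded in ML Metadata
    training_data: Input[Dataset],  # input artifact: read from training_data.path
    model: Output[Model],           # output artifact: written to model.path
):
    # Placeholder body: read the dataset, "train", and write the model file.
    with open(training_data.path) as data_file:
        _ = data_file.read()
    with open(model.path, 'w') as model_file:
        model_file.write('serialized-model-placeholder')
```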

@@ -75,7 +75,7 @@ introduces the following changes:
* The following changes affect how you define a pipeline:

* Pipeline functions must be decorated with
-[`@kfp.dsl.pipeline`](https://github.com/kubeflow/pipelines/blob/master/sdk/python/kfp/dsl/_pipeline.py). Specify the following arguments for the
+[`@kfp.dsl.pipeline`](https://github.com/kubeflow/pipelines/blob/sdk/release-1.8/sdk/python/kfp/dsl/_pipeline.py). Specify the following arguments for the
`@pipeline` annotation.

* `name`: The pipeline name is used when querying MLMD to store or lookup