From e6e897a3affa1ae459732a6a3425ee62b957717c Mon Sep 17 00:00:00 2001
From: Linghua Jin
Date: Fri, 10 Oct 2025 00:24:33 -0700
Subject: [PATCH] fix links in examples

---
 docs/docs/core/flow_methods.mdx                   | 2 +-
 docs/docs/examples/examples/image_search.md       | 2 +-
 docs/docs/examples/examples/multi_format_index.md | 2 +-
 docs/docs/examples/examples/photo_search.md       | 4 ++--
 docs/docs/examples/examples/postgres_source.md    | 2 +-
 examples/amazon_s3_embedding/README.md            | 2 +-
 examples/azure_blob_embedding/README.md           | 2 +-
 examples/gdrive_text_embedding/README.md          | 2 +-
 8 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/docs/docs/core/flow_methods.mdx b/docs/docs/core/flow_methods.mdx
index c9098689..39374fc6 100644
--- a/docs/docs/core/flow_methods.mdx
+++ b/docs/docs/core/flow_methods.mdx
@@ -210,7 +210,7 @@ A data source may enable one or multiple *change capture mechanisms*:
 
 * Configured with a [refresh interval](flow_def#refresh-interval), which is generally applicable to all data sources.
 * Specific data sources also provide their specific change capture mechanisms.
-  For example, [`Postgres` source](../sources/#postgres) listens to PostgreSQL's change notifications, [`AmazonS3` source](../sources/#amazons3) watches S3 bucket's change events, and [`GoogleDrive` source](../sources#googledrive) allows polling recent modified files.
+  For example, [`Postgres` source](../sources/postgres) listens to PostgreSQL's change notifications, [`AmazonS3` source](../sources/amazons3) watches S3 bucket's change events, and [`GoogleDrive` source](../sources/googledrive) allows polling recent modified files.
   See documentations for specific data sources.
 
 Change capture mechanisms enable CocoIndex to continuously capture changes from the source data and update the target data accordingly, under live update mode.
diff --git a/docs/docs/examples/examples/image_search.md b/docs/docs/examples/examples/image_search.md
index 783108c0..3e468784 100644
--- a/docs/docs/examples/examples/image_search.md
+++ b/docs/docs/examples/examples/image_search.md
@@ -66,7 +66,7 @@ def image_object_embedding_flow(flow_builder, data_scope):
 The `add_source` function sets up a table with fields like `filename` and `content`.
 Images are automatically re-scanned every minute.
 
-
+
 
 ## Process Each Image and Collect the Embedding

diff --git a/docs/docs/examples/examples/multi_format_index.md b/docs/docs/examples/examples/multi_format_index.md
index b062fa29..2b0e9e31 100644
--- a/docs/docs/examples/examples/multi_format_index.md
+++ b/docs/docs/examples/examples/multi_format_index.md
@@ -52,7 +52,7 @@ data_scope["documents"] = flow_builder.add_source(
     cocoindex.sources.LocalFile(path="source_files", binary=True)
 )
 ```
-
+
 
 ## Convert Files to Pages

diff --git a/docs/docs/examples/examples/photo_search.md b/docs/docs/examples/examples/photo_search.md
index 1474b6a2..d17c998c 100644
--- a/docs/docs/examples/examples/photo_search.md
+++ b/docs/docs/examples/examples/photo_search.md
@@ -65,8 +65,8 @@ def face_recognition_flow(flow_builder, data_scope):
 
 This creates a table with `filename` and `content` fields. 📂
 
-You can connect it to your [S3 Buckets](https://cocoindex.io/docs/ops/sources#amazons3) (with SQS integration, [example](https://cocoindex.io/blogs/s3-incremental-etl))
-or [Azure Blob store](https://cocoindex.io/docs/ops/sources#azureblob).
+You can connect it to your [S3 Buckets](https://cocoindex.io/docs/ops/sources/amazons3) (with SQS integration, [example](https://cocoindex.io/blogs/s3-incremental-etl))
+or [Azure Blob store](https://cocoindex.io/docs/ops/sources/azureblob).
 ## Detect and Extract Faces

diff --git a/docs/docs/examples/examples/postgres_source.md b/docs/docs/examples/examples/postgres_source.md
index 00cf99e5..5f3d4914 100644
--- a/docs/docs/examples/examples/postgres_source.md
+++ b/docs/docs/examples/examples/postgres_source.md
@@ -59,7 +59,7 @@ CocoIndex incrementally sync data from Postgres. When new or updated rows are fo
 - `notification` enables change capture based on Postgres LISTEN/NOTIFY. Each change triggers an incremental processing on the specific row immediately.
 - Regardless if `notification` is provided or not, CocoIndex still needs to scan the full table to detect changes in some scenarios (e.g. between two `update` invocation), and the `ordinal_column` provides a field that CocoIndex can use to quickly detect which row has changed without reading value columns.
 
-Check [Postgres source](https://cocoindex.io/docs/ops/sources#postgres) for more details.
+Check [Postgres source](https://cocoindex.io/docs/ops/sources/postgres) for more details.
 
 If you use the Postgres database hosted by Supabase, please click Connect on your project dashboard and find the URL there.
 Check [DatabaseConnectionSpec](https://cocoindex.io/docs/core/settings#databaseconnectionspec) for more details.

diff --git a/examples/amazon_s3_embedding/README.md b/examples/amazon_s3_embedding/README.md
index bae588f4..4224498d 100644
--- a/examples/amazon_s3_embedding/README.md
+++ b/examples/amazon_s3_embedding/README.md
@@ -9,7 +9,7 @@ Before running the example, you need to:
 1. [Install Postgres](https://cocoindex.io/docs/getting_started/installation#-install-postgres) if you don't have one.
 2. Prepare for Amazon S3.
-   See [Setup for AWS S3](https://cocoindex.io/docs/ops/sources#setup-for-amazon-s3) for more details.
+   See [Setup for AWS S3](https://cocoindex.io/docs/sources/amazons3#setup-for-amazon-s3) for more details.
 3. Create a `.env` file with your Amazon S3 bucket name and (optionally) prefix.
 Start from copying the `.env.example`, and then edit it to fill in your bucket name and prefix.

diff --git a/examples/azure_blob_embedding/README.md b/examples/azure_blob_embedding/README.md
index c5d250e2..582b1b08 100644
--- a/examples/azure_blob_embedding/README.md
+++ b/examples/azure_blob_embedding/README.md
@@ -9,7 +9,7 @@ Before running the example, you need to:
 1. [Install Postgres](https://cocoindex.io/docs/getting_started/installation#-install-postgres) if you don't have one.
 2. Prepare for Azure Blob Storage.
-   See [Setup for Azure Blob Storage](https://cocoindex.io/docs/ops/sources#setup-for-azure-blob-storage) for more details.
+   See [Setup for Azure Blob Storage](https://cocoindex.io/docs/sources/azureblob#setup-for-azure-blob-storage) for more details.
 3. Create a `.env` file with your Azure Blob Storage container name and (optionally) prefix.
 
 Start from copying the `.env.example`, and then edit it to fill in your bucket name and prefix.

diff --git a/examples/gdrive_text_embedding/README.md b/examples/gdrive_text_embedding/README.md
index 6cb4cfa7..55bac06d 100644
--- a/examples/gdrive_text_embedding/README.md
+++ b/examples/gdrive_text_embedding/README.md
@@ -30,7 +30,7 @@ Before running the example, you need to:
    - Setup a service account in Google Cloud, and download the credential file.
    - Share folders containing files you want to import with the service account's email address.
-   See [Setup for Google Drive](https://cocoindex.io/docs/ops/sources#setup-for-google-drive) for more details.
+   See [Setup for Google Drive](https://cocoindex.io/docs/sources/googledrive#setup-for-google-drive) for more details.
 3. Create `.env` file with your credential file and folder IDs.
    Starting from copying the `.env.example`, and then edit it to fill in your credential file path and folder IDs.