diff --git a/dev-ai-data-platform/data-products/data-products.md b/dev-ai-data-platform/data-products/data-products.md
index 6aca2aa0..7b0d7f52 100644
--- a/dev-ai-data-platform/data-products/data-products.md
+++ b/dev-ai-data-platform/data-products/data-products.md
@@ -95,7 +95,7 @@ In this Lab, you will:
10. On the **Select Tables** page, choose the table to share with the **Risk Department**:
- * Select the **Share\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column.
+ * Select the **Shared\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column.
* Click the **move (>)** button to add it to the **Shared Tables** column

@@ -104,7 +104,7 @@ In this Lab, you will:
11. In the **Recipients** section, let's define who we want to create this data share for by clicking **New Recipients**.
- 
+ 
12. In the **Create Share Recipient** window, enter the following:
@@ -118,7 +118,7 @@ In this Lab, you will:
13. Back on the **Create Share** page, select the newly created recipient from the list of recipients.
- 
+ 
14. Click the **Copy** icon to copy the recipient's activation link to your clipboard.
@@ -128,11 +128,11 @@ In this Lab, you will:
15. Now, publish your share by clicking the **Publish** button from the actions menu.
- 
+ 
16. This will turn the Share Icon green with a state of Published Share.
- 
+ 
>***Congratulations!!!*** You’ve just **created and published a data product share**.
By defining the share, selecting the right data, and authorizing a recipient, you’ve set up a **governed, reusable pipeline for cross-team collaboration**.
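+
+If you later prefer to script this flow rather than click through the UI, Autonomous Database also exposes sharing through the **DBMS\_SHARE** PL/SQL package. The block below is only a rough sketch of the equivalent steps (all names are illustrative, and parameter names can vary by database version, so check the DBMS\_SHARE documentation before running it):
+
+    ```text
+    BEGIN
+      -- Illustrative names throughout; not the lab's exact objects
+      DBMS_SHARE.CREATE_SHARE(share_name => 'LOAN_RISK_SHARE');
+      DBMS_SHARE.ADD_TO_SHARE(share_name  => 'LOAN_RISK_SHARE',
+                              object_name => 'SHARED_LOAN_DATA_RISK_VW');
+      DBMS_SHARE.CREATE_SHARE_RECIPIENT(recipient_name => 'RISK_DEPT',
+                                        email          => 'risk@example.com');
+      DBMS_SHARE.GRANT_TO_RECIPIENT(share_name     => 'LOAN_RISK_SHARE',
+                                    recipient_name => 'RISK_DEPT');
+      DBMS_SHARE.PUBLISH_SHARE(share_name => 'LOAN_RISK_SHARE');
+    END;
+    /
+    ```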
@@ -143,11 +143,11 @@ In this Lab, you will:
1. **Download the Activation Link Profile for the Data Share** that we will use in the upcoming lab
Paste the activation link you copied earlier into a separate browser tab and click **Get Profile Information** to download the recipient profile file (the default name is `delta_share_profile.json`).
->If you experience an error with your activation link don't worry, the steps below will show you how to get a new copy and try again.
+>If you experience an error with your activation link, don't worry; you can get a new link in step 3 below and try again.
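+
+    For reference, the downloaded profile is a standard Delta Sharing profile file. Its general shape is sketched below with placeholder values (your file will contain the real endpoint and token):
+
+    ```text
+    {
+      "shareCredentialsVersion": 1,
+      "endpoint": "https://<share-server>/delta-sharing/",
+      "bearerToken": "<generated-token>",
+      "expirationTime": "<timestamp>"
+    }
+    ```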

-2. A list of **share recipients** and their **Profile Activation link** can also be retrieved from the **Provide Share** page, by clicking the **Actions** icon next to your data product share.
-Then selecting **Recipients and Profiles**.
+2. You can get a list of the **share recipients** and their **Profile Activation link** from the **Provide Share** page by clicking the **Actions** icon next to your data product share, then selecting **Recipients and Profiles**.

@@ -172,5 +172,5 @@ At SeersEquities, this means **smoother handoffs, faster risk evaluation, and be
## Acknowledgements
* **Authors** - Eddie Ambler
-* **Last Updated By/Date** - September 2025, Eddie Ambler
+* **Last Updated By/Date** - Eddie Ambler - September 2025
diff --git a/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png b/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png
index 891eaa57..98760635 100644
Binary files a/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png and b/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png differ
diff --git a/dev-ai-data-platform/load-transform/load-transform.md b/dev-ai-data-platform/load-transform/load-transform.md
index dfdf6425..2076ea88 100644
--- a/dev-ai-data-platform/load-transform/load-transform.md
+++ b/dev-ai-data-platform/load-transform/load-transform.md
@@ -60,8 +60,8 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
```
- 5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab.
- * Click **Database Actions** in the top banner **of the new tab**.
+ 5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab.
+ * Click **Database Actions** in the top banner **of the new tab**.

@@ -89,7 +89,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
11. Return to **SQL | Oracle Database Actions** browser tab.
- * In the PL/SQL block modify the ***source\_uri*** definition in the **PL/SQL Block in the SQL Worksheet**, as shown below:
+ * In the PL/SQL block modify the ***source\_uri*** definition in the **PL/SQL Block in the SQL Worksheet**, as shown below:
```text
    source_uri VARCHAR2(100) := 'Paste the LOANAPP_FUNDING uri you copied here';
@@ -103,11 +103,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

- Click **Close** to exit.
+ * Click **Close** to exit.
14. Return to **SQL | Oracle Database Actions** browser tab.
- * In the PL/SQL block modify the ***target\_uri*** definition in the **PL/SQL block we placed in the SQL Worksheet**, as shown below:
+ * In the PL/SQL block modify the ***target\_uri*** definition in the **PL/SQL block we placed in the SQL Worksheet**, as shown below:
```text
    target_uri VARCHAR2(100) := 'Paste the MYDEMOBUCKET uri you copied here';
@@ -121,34 +121,33 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
16. Return to **Data Load | Oracle Database** tab.
-* Click the **Actions** icon in the **MyDemoBucket** panel, then select **Objects** from the context menu.
+ * Click the **Actions** icon in the **MyDemoBucket** panel, then select **Objects** from the context menu.

-17. Click the folder icon to confirm that the **funding\_commitments1.json** file in the **LOANAPP\_FUNDING** bucket has been successfully copied here.
+17. Expand the **FUNDING** folder to confirm that the **funding\_commitments1.json** file from the **LOANAPP\_FUNDING** bucket has been successfully copied here.

- Click **Close** to exit.
+ * Click **Close** to exit.
***Congratulations!*** You have now successfully interacted with data in object storage using PL/SQL from the Data Studio tools and your Autonomous Database.
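+
+For reference, the copy step you just ran follows the general pattern below. This is a minimal sketch with placeholder URIs (your lab's PL/SQL block may differ in its details, such as passing a credential name):
+
+    ```text
+    DECLARE
+      -- Placeholders: use the URIs you copied from your own buckets
+      source_uri  VARCHAR2(100) := 'https://objectstorage.../LOANAPP_FUNDING/o/';
+      target_uri  VARCHAR2(100) := 'https://objectstorage.../MYDEMOBUCKET/o/';
+      object_name VARCHAR2(200) := 'funding_commitments1.json';
+    BEGIN
+      -- Copy one object between buckets
+      DBMS_CLOUD.COPY_OBJECT(
+        source_object_uri => source_uri || object_name,
+        target_object_uri => target_uri || object_name);
+    END;
+    /
+    ```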
## Task 2: Build Initial Live Feed Table
- 1. From the **Data Load | Oracle Database** tab - Navigate to Live Feed.
+1. From the **Data Load | Oracle Database** tab, navigate to Live Feed.
- * On Left rail expand **Data Load**, then click on **Live Feed**.
+    * On the left rail, expand **Data Load**, then click **Live Feed**.

>You should now see the Live Feed Page
+2. Click the **Create Live Table Feed** button to enter the Create Live Feed wizard
- 2. Click the **Create Live Table Feed** button to enter the Create Live Feed wizard
+ 
- 
-
- 3. Enter details for the Live Table Feed Preview.
+3. Enter details for the Live Table Feed Preview.
* Select Cloud Store Location: **MyDemoBucket**
    * Select Radio Button: **Basic**
@@ -161,28 +160,28 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
* Click the **Next** button to proceed.
-
- 4. Configure Live Feed Table Settings as follows:
+4. Configure Live Feed Table Settings as follows:
    * **For Option**: Choose **Merge Into Table** from the drop-down list
    * **For Target Table Name**: Enter the name of the target table for the Live Feed -- **FUNDING\_PROVIDER\_OFFER\_STG** (***in ALL CAPS***)
- * Then modify **Mapping** details exactly as shown below:
- >**Modify mapping to update Data Type** to NUMBER for: FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID
- >**For Merge Key**: Select FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID
+ * Then modify **Mapping** details exactly as shown below:
+ * **Update Data Type** to NUMBER for: FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID
+ * **For Merge Key**: Select FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID
+    * **Unselect the last row** to exclude the SYSTIMESTAMP source

* Click the **Next** button to proceed.
- 5. Review the information shown on the Preview page.
+5. Review the information shown on the Preview page.

* Click **Next** to proceed.
- 6. Enter remaining details for the **Live Table Feed**
+6. Enter remaining details for the **Live Table Feed**
    a. Enter the live feed name **LOANAPP\_FUNDING\_FEED**
    b. Check the box to **Enable for Scheduling**.
@@ -192,11 +191,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
* Click **Create**
- 7. When the popup box appears, select **Yes** to run the Live Feed.
+7. When the popup box appears, select **Yes** to run the Live Feed.

- 8. Review Live Feed Table and set page Refresh Interval
+8. Review Live Feed Table and set page Refresh Interval
* **You should see 3 rows loaded**
* **Set page refresh interval to 10 seconds** to see when new data is loaded
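+
+    If you prefer to confirm the load from SQL, a quick count against the target table from step 4 works as well:
+
+    ```text
+    SELECT COUNT(*) FROM FUNDING_PROVIDER_OFFER_STG;
+    ```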
@@ -205,11 +204,12 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
***Congratulations!*** You have successfully created your Live Feed table.
+
## Task 3: Test Live Feed Table Data Population
1. Return to the **SQL | Oracle Database Actions** tab where your PL/SQL Block exists.
-* Load a file into our Live Feed Bucket to trigger the Live Feed process by modifying the **object\_name** definition in the PL/SQL block, as shown below:
+ * Load a file into our Live Feed Bucket to trigger the Live Feed process by modifying the **object\_name** definition in the PL/SQL block, as shown below:
```text
object_name VARCHAR2(200) := 'funding_commitments2.json';
@@ -232,7 +232,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
* Clear the SQL Worksheet and Run the following code to populate new loan products.
- ```text
+ ```text
DECLARE
New_Funding_Offers NUMBER;
@@ -260,9 +260,9 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
```
- 
+ 
- ***Congratulations!*** On creating a Live Feed that can automatically load data from object storage into your database and be integrated into an automated business process.
+    ***Congratulations!*** You have created a Live Feed that can automatically load data from object storage into your database and integrate it into an automated business process.
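+
+The loan-product block you ran above follows this general shape. The sketch below is simplified and only prints a count (the lab's actual block goes on to create loan products):
+
+    ```text
+    SET SERVEROUTPUT ON
+    DECLARE
+      New_Funding_Offers NUMBER;
+    BEGIN
+      -- Count the offers staged by the Live Feed
+      SELECT COUNT(*) INTO New_Funding_Offers
+      FROM FUNDING_PROVIDER_OFFER_STG;
+      DBMS_OUTPUT.PUT_LINE(New_Funding_Offers || ' funding offers staged.');
+    END;
+    /
+    ```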
## Conclusion
In this lab, you built a data pipeline using the Oracle Live Table Feed tool and successfully used the data from the pipeline in a PL/SQL block that was run from Oracle Autonomous Database to automate business processes.
diff --git a/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png b/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png
new file mode 100644
index 00000000..ab19623c
Binary files /dev/null and b/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png differ
diff --git a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png
index 4de95e05..d5b18b0e 100644
Binary files a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png and b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png differ
diff --git a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png
index 8356233b..ed77be72 100644
Binary files a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png and b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png differ
diff --git a/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md b/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md
index 8b662096..a58012e6 100644
--- a/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md
+++ b/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md
@@ -58,7 +58,7 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec
* Select **Create Share Provider** as share source
* Select **From File** as **Share Provider JSON**
- * Select the file you downloaded in the lab **Create & Share Trusted Data Products**
+    * Select the file named **delta\_share\_profile.json** that you downloaded in Lab 3
* In **Provider Name** text box enter: **Demo-Data-Share**
* In **Description** text box enter: **Demo-Data-Share**
@@ -112,9 +112,10 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec

-7. Once the job is complete, check that you see the link icon next to the data link card to confirm success.
+7. Once the job completes, confirm Data Link success:
- >If after a few seconds the status does not update, click the refresh icon on the right.
+    * Set the refresh interval on the right to 60 seconds.
+    * Check that the link icon appears next to the data link card, confirming success.

@@ -142,6 +143,13 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec

+    >**This will cause the SQL Worksheet to appear** and automatically run a select-all query against our linked table, as shown below.
+    >The **query results section** shows that the query was able to access the data in the object storage file and return it to the dashboard.
+
+
+
+ 
+
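+
+    If you want to repeat the check later, a select against the linked table does the same thing (assuming the link kept the shared object's name from Lab 3):
+
+    ```text
+    SELECT * FROM SHARED_LOAN_DATA_RISK_VW;
+    ```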
***Congratulations!*** You’ve now subscribed to a shared data product and validated that you can query it directly from Autonomous Database.
This ensures the Risk team at SeersEquities can work with live, trusted data—ready for real-time analysis without unnecessary data movement.
## Conclusion
@@ -157,5 +165,4 @@ This workflow ensures faster risk analysis, smarter decisions, and tighter colla
## Acknowledgements
* **Authors** - Eddie Ambler
-* **Last Updated By/Date** - September 2025, Eddie Ambler
-
+* **Last Updated By/Date** - Eddie Ambler - September 2025
\ No newline at end of file
diff --git a/dev-ai-data-platform/sync-link/images/task1-scrn-14.png b/dev-ai-data-platform/sync-link/images/task1-scrn-14.png
index 20834d92..086b5b21 100644
Binary files a/dev-ai-data-platform/sync-link/images/task1-scrn-14.png and b/dev-ai-data-platform/sync-link/images/task1-scrn-14.png differ
diff --git a/dev-ai-data-platform/sync-link/images/task1-scrn-16.png b/dev-ai-data-platform/sync-link/images/task1-scrn-16.png
index e8ae20cb..699fbb13 100644
Binary files a/dev-ai-data-platform/sync-link/images/task1-scrn-16.png and b/dev-ai-data-platform/sync-link/images/task1-scrn-16.png differ
diff --git a/dev-ai-data-platform/sync-link/images/task2-scrn-14.png b/dev-ai-data-platform/sync-link/images/task2-scrn-14.png
index 697a5f45..1344b094 100644
Binary files a/dev-ai-data-platform/sync-link/images/task2-scrn-14.png and b/dev-ai-data-platform/sync-link/images/task2-scrn-14.png differ
diff --git a/dev-ai-data-platform/sync-link/sync-link.md b/dev-ai-data-platform/sync-link/sync-link.md
index 6aa4f350..979cc83c 100644
--- a/dev-ai-data-platform/sync-link/sync-link.md
+++ b/dev-ai-data-platform/sync-link/sync-link.md
@@ -27,7 +27,9 @@ By the end of this lab, you will:
By the end, you’ll have the skills to turn raw, external data into a seamless part of SeersEquities’ analytics workflow—ready to power better loan decisions and smarter risk management.
-## Task 1: Load Object Storage Data into Autonomous Database using the Catalog Tool.
+## Task 1: Load Object Storage Data into Autonomous Database using the Catalog Tool.
+
+>***Skip to Step 5*** **if you are continuing from Lab 1**.
1. If you are not yet logged in to **Database Actions**, click **View Login Info**. Copy your **DB ADMIN Password**, and click the **SQL Worksheet** link.
@@ -41,56 +43,60 @@ By the end, you’ll have the skills to turn raw, external data into a seamless

-4. Click the **Data Objects** tab at the top of the catalog page to view the contents from your object storage buckets.
+4. Click the **Data Objects** tab at the top of the catalog page to view the contents from your object storage buckets.
+
+ 
- 
+5. From the list of Data Objects, select **LoanAppcustomer_extension.csv** to open the **Cloud Object Entity** page.
-5. From the list, select **LoanAppcustomer_extension.csv** to open the **Cloud Object Entity** page.
+ 
- 
+6. Click **Load to Table**.
-6. Click **Load to Table**.
+ 
- 
+7. Select **Create Table** in the table section, then change the table name to **CUSTOMER\_EXTENSION**.
-7. Select **Create Table** in the table section, then change the table name to something more meanful -- like, **CUSTOMER EXTENSION**.
+ 
- 
+ Click **Load Data**.
- Click **Load Data**.
+8. In the popup window, click **Go to Data Load** to continue.
-8. In the popup window, click **Go to Data Load** to continue.
+ 
- 
+9. Set the refresh rate to 60 seconds and check the table load job for completion.
-9. Once the job completes, the table appears under **Table and View Loads** on the page. Click **Report** to review job details.
+    * Click the **Refresh** drop-down and set the frequency to 60 seconds.
+    * The table will appear under **Table and View Loads**; once the load completes, the job will move out of the queued state and show the number of rows loaded.
+ * Click **Report** to review job details.
- 
+ 
-10. Review the job details. Click **SQL** to review the code used.
+10. Review the job details. Click **SQL** to review the code used.
- 
+ 
-13. The SQL code is displayed.
+11. The SQL code is displayed.
- 
+ 
- Click **Close**.
+ Click **Close**.
-14. To analyze the data load, click **Query**.
+12. To validate the data load, click **Query**.
- 
+ 
-15. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset.
+13. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset.
- 
+ 
-16. Click **Catalog** in the left rail and you’ll see the **CUSTOMER_EXTENSION** table now listed in the catalog.
+14. Click **Catalog** in the left rail, then select the **Tables and Views** filter to see that the **CUSTOMER_EXTENSION** table is now listed in the catalog.
- 
+ 
-You’ve just loaded external object storage data directly into your Autonomous Database—turning a static file into a query-ready table. This move helps optimize performance and makes your data ready for analytics, joins, and future products.
+***Congratulations!*** You’ve just loaded external object storage data directly into your Autonomous Database, turning a static file into a query-ready table. This move helps optimize performance and makes your data ready for analytics, joins, and future products.
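+
+As a quick sanity check, you can also query the new table directly from the SQL Worksheet:
+
+    ```text
+    SELECT COUNT(*) FROM CUSTOMER_EXTENSION;
+    ```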
## Task 2: Link Object Storage Data from Data Catalog to ADB.
@@ -102,39 +108,39 @@ You’ve just loaded external object storage data directly into your Autonomous

- 3. Select **Create External Table** in the table section, then change the table name to something more meaningful --- **CUSTOMER_SEGMENT**.
+ 3. Select **Create External Table** in the table section, then change the table name to **CUSTOMER\_SEGMENT**.

Click **Link Data**.
- 7. In the popup, click **Go To Data Link** to continue.
+ 4. In the popup, click **Go To Data Load** to continue.

- 8. Once the job completes, the external table appears under **Table and View Loads** on the page. Click **Report** to review job details.
+ 5. Once the job completes, the external table appears under **Table and View Loads** on the page. Click **Report** to review job details.
- 
+ 
- 9. Review the job details. Click **SQL** to review the code used.
+ 6. Review the job details. Click **SQL** to review the code used.

- 10. The SQL code is displayed.
+ 7. The SQL code is displayed.

Click **Close**.
- 11. To analyze the data link, click Query.
+ 8. To test the data link, click **Query**.

- 12. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset.
+ 9. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset.
- 
+ 
-You’ve just linked **external object storage data** to your database—**no loading required**. With this external table in place, you can **run queries instantly while avoiding data duplication** and keeping your analytics agile and efficient.
+***Congratulations!*** You’ve just linked **external object storage data** to your database, **with no data loading required**. With this external table in place, you can **run queries instantly while avoiding data duplication** and keeping your analytics agile and efficient.
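+
+Under the hood, **Link Data** generates a DBMS\_CLOUD.CREATE\_EXTERNAL\_TABLE call along these lines. The sketch below uses a placeholder file URI and illustrative columns (click **SQL** in the step above to see the exact code your job ran):
+
+    ```text
+    BEGIN
+      DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
+        table_name    => 'CUSTOMER_SEGMENT',
+        file_uri_list => 'https://objectstorage.../<your_bucket>/o/<segment_file>.csv',
+        format        => '{"type":"csv","skipheaders":1}',
+        column_list   => 'CUSTOMER_ID NUMBER, SEGMENT_NAME VARCHAR2(100)');
+    END;
+    /
+    ```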
## Task 3: Query Data in Object Storage and ADB Database.
@@ -166,7 +172,7 @@ You’ve just linked **external object storage data** to your database—**no lo

- 4. Combine data from both tables using a **join**. Paste the query below, then click **Run**:
+ 4. Combine data in the database and in object storage using a **join**. Paste the query below, then click **Run**:
```text
@@ -181,7 +187,7 @@ You’ve just linked **external object storage data** to your database—**no lo

-You’ve now combined external object storage data with internal database data—all from a single query. This unlocks richer analytics, enabling SeersEquities to connect customer attributes with segmentation strategies in real time.
+***Congratulations!*** You’ve now combined external object storage data with internal database data, all from a single query. This unlocks richer analytics, enabling SeersEquities to connect customer attributes with segmentation strategies in real time.
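+
+The join you ran follows this general shape, shown here with an assumed shared CUSTOMER\_ID column and illustrative column names (the lab step above has the exact query):
+
+    ```text
+    SELECT e.*, s.SEGMENT_NAME
+    FROM   CUSTOMER_EXTENSION e
+    JOIN   CUSTOMER_SEGMENT s
+      ON   e.CUSTOMER_ID = s.CUSTOMER_ID;
+    ```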
## Conclusion
@@ -193,4 +199,4 @@ For SeersEquities, this means **faster decisions, smarter loan products, and mor
## Acknowledgements
* **Authors** - Eddie Ambler, Otis Barr
-* **Last Updated By/Date** - Kamryn Vinson, June 2025
+* **Last Updated By/Date** - Eddie Ambler, September 2025