16 changes: 8 additions & 8 deletions dev-ai-data-platform/data-products/data-products.md
@@ -95,7 +95,7 @@ In this Lab, you will:

10. On the **Select Tables** page, choose the table to share with the **Risk Department**:

* Select the **Share\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column.
* Select the **Shared\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column.
    * Click the **move (>)** button to add it to the **Shared Tables** column.

![Define a Data Product Recipient](./images/select-items-for-share.png "Define a Data Product Recipient")
@@ -104,7 +104,7 @@ In this Lab, you will:

11. In the **Recipients** section, let's define who we want to create this data share for by clicking **New Recipients**.

![Define a Data Product Recipient](./images/define-data-product-share-recipient-10.png "Define a Data Product Recipient")

12. In the **Create Share Recipient** window, enter the following:

@@ -118,7 +118,7 @@

13. Back on the **Create Share** page, select the newly created recipient from the list of recipients.

![Define a Data Product Recipient](./images/selectrecipientdrop.png "Define")

14. Click the **Copy** icon to copy the recipient's activation link to your clipboard.

@@ -128,11 +128,11 @@

15. Now, publish your share by clicking the **Publish** button from the actions menu.

![publish Data Product ](./images/publishshare.png "")

16. This turns the share icon green and sets its state to **Published Share**.

![created Data Product ](./images/sharecreated.png "")

>***Congratulations!!!*** You’ve just **created and published a data product share**. <br>
By defining the share, selecting the right data, and authorizing a recipient, you’ve set up a **governed, reusable pipeline for cross-team collaboration**.
@@ -143,11 +143,11 @@ In this Lab, you will:
1. **Download the Activation Link Profile for the Data Share** that we will use in the upcoming lab <br>
Paste the activation link you copied earlier into a separate browser tab and click **Get Profile Information** to download the recipient profile file (the default name is `delta_share_profile.json`).

>If you experience an error with your activation link don't worry, the steps below will show you how to get a new copy and try again.
>If you experience an error with your activation link, don't worry; you can get a new link in step 3 below and try again.

![Data Product activation link](./images/Paste-activation-link-in-window.png "")

2. A list of **share recipients** and their **Profile Activation link** can also be retrieved from the **Provide Share** page, by clicking the **Actions** icon next to your data product share. <br> Then selecting **Recipients and Profiles**.
2. You can get a list of the **share recipients** and their **Profile Activation link** from the **Provide Share** page by clicking the **Actions** icon next to your data product share, <br> then selecting **Recipients and Profiles**.

![created Data Product ](./images/manageshare.png "")

@@ -172,5 +172,5 @@ At SeersEquities, this means **smoother handoffs, faster risk evaluation, and be

## Acknowledgements
* **Authors** - Eddie Ambler
* **Last Updated By/Date** - September 2025, Eddie Ambler
* **Last Updated By/Date** - Eddie Ambler - September 2025

54 changes: 27 additions & 27 deletions dev-ai-data-platform/load-transform/load-transform.md
@@ -60,8 +60,8 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
</copy>
```

5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab. <br>
* Click **Database Actions** in the top banner **of the new tab**.
5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab.
* Click **Database Actions** in the top banner **of the new tab**.

![Open DB Actions in Duplicate Tab](./images/open-another-browser-tab.png "")

@@ -89,7 +89,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

11. Return to the **SQL | Oracle Database Actions** browser tab. <br>

* In the PL/SQL block modify the ***source\_uri*** definition in the **PL/SQL Block in the SQL Worksheet**, as shown below:

```text
source_uri VARCHAR2(100) := 'Paste the LOANAPP_FUNDING URI you copied here';
@@ -103,11 +103,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

![Copy MYDEMOBUCKET URI](./images/mydemobucket-uri.png "")

Click **Close** to exit.
* Click **Close** to exit.

14. Return to the **SQL | Oracle Database Actions** browser tab. <br>

* In the PL/SQL block modify the ***target\_uri*** definition in the **PL/SQL block we placed in the SQL Worksheet**, as shown below:

```text
target_uri VARCHAR2(100) := 'Paste the MYDEMOBUCKET URI you copied here';
@@ -121,34 +121,33 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

16. Return to the **Data Load | Oracle Database** tab.

* Click the **Actions** icon in the **MyDemoBucket** panel, then select **Objects** from the context menu.

![Switch Tab & Select LOANAPP_FUNDING Connection](./images/move-data-file1.png "")

17. Click the folder icon to confirm that the **funding\_commitments1.json** file in the **LOANAPP\_FUNDING** bucket has been successfully copied here.
17. Expand the **FUNDING** folder icon to confirm that the **funding\_commitments1.json** file from the **LOANAPP\_FUNDING** bucket has been successfully copied here.

![Confirm File 1 Copy](./images/confirm-move-data-file1.png "")

Click **Close** to exit.
* Click **Close** to exit.

***Congratulations!*** You have now successfully interacted with data in object storage using PL/SQL from the Data Studio tools and your Autonomous Database.
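
For reference, a minimal sketch of what the copy step behind this task could look like is shown below. It is an illustration only, not the lab's exact block: the `DBMS_CLOUD.COPY_OBJECT` call, its parameter names, and the `OCI$RESOURCE_PRINCIPAL` credential are assumptions you should verify against the `DBMS_CLOUD` documentation and your own tenancy.

```text
DECLARE
  -- Placeholder values: the lab has you paste the real LOANAPP_FUNDING and MYDEMOBUCKET URIs
  source_uri  VARCHAR2(100) := 'https://objectstorage.../LOANAPP_FUNDING/o/';
  target_uri  VARCHAR2(100) := 'https://objectstorage.../MYDEMOBUCKET/o/';
  object_name VARCHAR2(200) := 'funding_commitments1.json';
BEGIN
  -- Copy one JSON object from the source bucket to the target bucket
  DBMS_CLOUD.COPY_OBJECT(
    source_credential_name => 'OCI$RESOURCE_PRINCIPAL',  -- assumed credential
    source_object_uri      => source_uri || object_name,
    target_object_uri      => target_uri || object_name
  );
END;
/
```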

## Task 2: Build Initial Live Feed Table

1. From the **Data Load | Oracle Database** tab, navigate to Live Feed.

    * On the left rail, expand **Data Load**, then click **Live Feed**.

![Navigate from Data Load Connections to Live Feed](./images/navigate-connections-to-live-feed.png "")

    >You should now see the Live Feed page.

2. Click the **Create Live Table Feed** button to enter the Create Live Feed wizard.

    ![Create Live Feed Wizard](./images/live-feed-wizard-step1.png "")

3. Enter details for the Live Table Feed Preview.

* Select Cloud Store Location: **MyDemoBucket**
    * Select Radio Button: **Basic**
@@ -161,28 +160,28 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

* Click the **Next** button to proceed.


4. Configure Live Feed Table Settings as follows:

    * **For Option**: Choose **Merge Into Table** from the drop-down list

    * **For Target Table Name**: Enter the name of the Live Feed target table, **FUNDING\_PROVIDER\_OFFER\_STG** ***(in ALL CAPS)***. <br>

* Then modify **Mapping** details exactly as shown below:
>**Modify mapping to update Data Type** to NUMBER for: FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID <br>
>**For Merge Key**: Select FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID
* <u>Then modify **Mapping** details exactly as shown below:</u>
* **Update Data Type** to NUMBER for: FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID <br>
* **For Merge Key**: Select FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID<br>
* **Unselect last row:** Inclusion of SYSTIMESTAMP Source

![Create Live Feed Wizard - step 2](./images/live-feed-wizard-step2-table-settings.png "")

* Click the **Next** button to proceed.
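
    Conceptually, the **Merge Into Table** option with these merge keys behaves like a SQL `MERGE` keyed on the two ID columns. The sketch below is illustrative only: `funding_offers_src` stands for the rows parsed from the JSON files that land in **MyDemoBucket**, and `FUNDING_OFFER_AMOUNT` is an assumed payload column rather than a confirmed column name.

    ```text
    MERGE INTO FUNDING_PROVIDER_OFFER_STG tgt
    USING funding_offers_src src   -- hypothetical staging source over the incoming JSON files
       ON (    tgt.FUNDING_PROVIDER_ID        = src.FUNDING_PROVIDER_ID
           AND tgt.FUNDING_OFFER_REFERENCE_ID = src.FUNDING_OFFER_REFERENCE_ID)
    WHEN MATCHED THEN
      UPDATE SET tgt.FUNDING_OFFER_AMOUNT = src.FUNDING_OFFER_AMOUNT
    WHEN NOT MATCHED THEN
      INSERT (FUNDING_PROVIDER_ID, FUNDING_OFFER_REFERENCE_ID, FUNDING_OFFER_AMOUNT)
      VALUES (src.FUNDING_PROVIDER_ID, src.FUNDING_OFFER_REFERENCE_ID, src.FUNDING_OFFER_AMOUNT);
    ```

    The wizard builds the equivalent logic for you, so existing offers are updated in place and new key combinations are inserted each time a new file arrives.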

5. Review the information shown on the Preview page.

![Create Live Feed Wizard - step 3](./images/live-feed-preview.png "")

* Click **Next** to proceed.

6. Enter remaining details for the **Live Table Feed**

a. Enter live feed name **LOANAPP\_FUNDING\_FEED** <br>
b. Check box to **Enable for Scheduling**. <br>
@@ -192,11 +191,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

* Click **Create**

7. When the popup box appears, select **Yes** to run the Live Feed.

![Run Initial Live Table Feed](./images/do-run-live-feed.png)

8. Review Live Feed Table and set page Refresh Interval

* **You should see 3 rows loaded**
* **Set page refresh interval to 10 seconds** to see when new data is loaded
@@ -205,11 +204,12 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

***Congratulations!*** You have successfully created your Live Feed table.


## Task 3: Test Live Feed Table Data Population

1. Return to the **SQL | Oracle Database Actions** tab where your PL/SQL Block exists.

* Load a file into our Live Feed Bucket to trigger the Live Feed process by modifying the **object\_name** definition in the PL/SQL block, as shown below:

```text
object_name VARCHAR2(200) := 'funding_commitments2.json';
@@ -232,7 +232,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline

    * Clear the SQL Worksheet and run the following code to populate new loan products.

```text
<copy>
DECLARE
New_Funding_Offers NUMBER;
@@ -260,9 +260,9 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline
</copy>
```

![Review Convert Funding to Loan Procedure Output](./images/review-funding-to-loan-conversion.png)
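
As a supplementary check (not one of the lab's numbered steps), you can count the rows in the staging table from the SQL Worksheet to confirm that the second file was merged by the Live Feed:

```text
-- Expect the count to grow beyond the 3 rows loaded from funding_commitments1.json
SELECT COUNT(*) AS staged_offers
  FROM FUNDING_PROVIDER_OFFER_STG;
```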

***Congratulations!*** You have created a Live Feed that automatically loads data from object storage into your database and can be integrated into an automated business process.

## Conclusion
In this lab, you built a data pipeline using the Oracle Live Table Feed tool and successfully used the data from the pipeline in a PL/SQL block that was run from Oracle Autonomous Database to automate business processes.
17 changes: 12 additions & 5 deletions dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md
@@ -58,7 +58,7 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec

* Select **Create Share Provider** as share source
* Select **From File** as **Share Provider JSON**
* Select the file you downloaded in the lab **Create & Share Trusted Data Products**
* Select the file named **delta\_share\_profile.json** that you downloaded in lab 3
* In **Provider Name** text box enter: **Demo-Data-Share**
* In **Description** text box enter: **Demo-Data-Share**

@@ -112,9 +112,10 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec

![Create Data Product Share](./images/select-shared-data-5.png )

7. Once the job is complete, check that you see the link icon next to the data link card to confirm success.
7. Once the job completes, confirm Data Link success:

>If after a few seconds the status does not update, click the refresh icon on the right.
    * Using the refresh icon on the right, set the refresh interval to 60 seconds.
    * Check that you see the link icon next to the data link card to confirm success.

![Create Data Product Share](./images/select-shared-data-6.png )

@@ -142,6 +143,13 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec

![Create Data Product Share](./images/select-shared-data-4a.png )

>**This will cause the SQL Worksheet to appear** and automatically run a select-all query against our linked table, as shown below. <br>
>The **query result section** confirms that the query was able to access the data in the object storage file and return it to the dashboard.
<br>


![Query Shared Data Product](./images/query-shared-data-product.png )
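
If you want to rerun the query yourself, a statement like the one below works from the SQL Worksheet. The table name assumes the Data Link kept the share's view name, **SHARED\_LOAN\_DATA\_RISK\_VW**; substitute whatever name appears on your data link card.

```text
-- Adjust the table name to match your linked object
SELECT *
  FROM SHARED_LOAN_DATA_RISK_VW
 FETCH FIRST 10 ROWS ONLY;
```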

***Congratulations!*** You’ve now subscribed to a shared data product and validated that you can query it directly from Autonomous Database. <br> This ensures the Risk team at SeersEquities can work with live, trusted data—ready for real-time analysis without unnecessary data movement.

## Conclusion
@@ -157,5 +165,4 @@ This workflow ensures faster risk analysis, smarter decisions, and tighter colla

## Acknowledgements
* **Authors** - Eddie Ambler
* **Last Updated By/Date** - September 2025, Eddie Ambler

* **Last Updated By/Date** - Eddie Ambler - September 2025
Binary file modified dev-ai-data-platform/sync-link/images/task1-scrn-14.png
Binary file modified dev-ai-data-platform/sync-link/images/task1-scrn-16.png
Binary file modified dev-ai-data-platform/sync-link/images/task2-scrn-14.png