diff --git a/dev-ai-data-platform/data-products/data-products.md b/dev-ai-data-platform/data-products/data-products.md index 6aca2aa0..7b0d7f52 100644 --- a/dev-ai-data-platform/data-products/data-products.md +++ b/dev-ai-data-platform/data-products/data-products.md @@ -95,7 +95,7 @@ In this Lab, you will: 10. On the **Select Tables** page, choose the table to share with the **Risk Department**: - * Select the **Share\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column. + * Select the **Shared\_Loan\_Data\_Risk\_VW** table in the **Available Tables** column. * Click the **move (>)** button to add it to the **Shared Tables** column ![Define a Data Product Recipient](./images/select-items-for-share.png "Define a Data Product Recipient") @@ -104,7 +104,7 @@ In this Lab, you will: 11. In the **Recipients** section, lets define who we want to create this data share for by clicking **New Recipients**. - ![Define a Data Product Recipient](./images/define-data-product-share-recipient-10.png "Define a Data Product Recipient") + ![Define a Data Product Recipient](./images/define-data-product-share-recipient-10.png "Define a Data Product Recipient") 12. In the **Create Share Recipient** window, enter the following: @@ -118,7 +118,7 @@ In this Lab, you will: 13. Back on the **Create Share** page, select the newly created recipient from the list of recipients. - ![Define a Data Product Recipient](./images/selectrecipientdrop.png "Define") + ![Define a Data Product Recipient](./images/selectrecipientdrop.png "Define") 14. Click the **Copy** icon to copy the recipient's activation link to your clipboard. @@ -128,11 +128,11 @@ In this Lab, you will: 15. Now, publish your share by clicking the **Publish** button from the actions menu. - ![publish Data Product ](./images/publishshare.png "") + ![publish Data Product ](./images/publishshare.png "") 16. This will turn the Share Icon green with a state of Published Share. 
- ![created Data Product ](./images/sharecreated.png "") + ![created Data Product ](./images/sharecreated.png "") >***Congratulations!*** You’ve just **created and published a data product share**.
By defining the share, selecting the right data, and authorizing a recipient, you’ve set up a **governed, reusable pipeline for cross-team collaboration**. @@ -143,11 +143,11 @@ In this Lab, you will: 1. **Download the Activation Link Profile for the Data Share** that we will use in the upcoming lab
Paste the activation link you copied earlier into a separate browser tab and click **Get Profile Information** to download the recipient profile file (the default name is `delta_share_profile.json`). ->If you experience an error with your activation link don't worry, the steps below will show you how to get a new copy and try again. +>If you experience an error with your activation link, don't worry; you can get a new link in step 3 below and try again. ![Data Product activation link](./images/Paste-activation-link-in-window.png "") -2. A list of **share recipients** and their **Profile Activation link** can also be retrieved from the **Provide Share** page, by clicking the **Actions** icon next to your data product share.
Then selecting **Recipients and Profiles**. +2. You can get a list of the **share recipients** and their **Profile Activation link** from the **Provide Share** page by clicking the **Actions** icon next to your data product share.
Then selecting **Recipients and Profiles**. ![created Data Product ](./images/manageshare.png "") @@ -172,5 +172,5 @@ At SeersEquities, this means **smoother handoffs, faster risk evaluation, and be ## Acknowledgements * **Authors** - Eddie Ambler -* **Last Updated By/Date** - September 2025, Eddie Ambler +* **Last Updated By/Date** - Eddie Ambler - September 2025 diff --git a/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png b/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png index 891eaa57..98760635 100644 Binary files a/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png and b/dev-ai-data-platform/load-transform/images/confirm-move-data-file1.png differ diff --git a/dev-ai-data-platform/load-transform/load-transform.md b/dev-ai-data-platform/load-transform/load-transform.md index dfdf6425..2076ea88 100644 --- a/dev-ai-data-platform/load-transform/load-transform.md +++ b/dev-ai-data-platform/load-transform/load-transform.md @@ -60,8 +60,8 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline ``` - 5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab.
- * Click **Database Actions** in the top banner **of the new tab**. + 5. Right-click on your browser tab and select **Duplicate** from the context menu to open another tab. + * Click **Database Actions** in the top banner **of the new tab**. ![Open DB Actions in Duplicate Tab](./images/open-another-browser-tab.png "") @@ -89,7 +89,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline 11. Return to **SQL | Oracle Database Actions** browser tab.
- * In the PL/SQL block modify the ***source\_uri*** definition in the **PL/SQL Block in the SQL Worksheet**, as shown below: + * Modify the ***source\_uri*** definition in the **PL/SQL block in the SQL Worksheet**, as shown below: ```text source_uri VARCHAR2(100) := 'Paste the LOANAPP_FUNDING URI you copied here'; ``` @@ -103,11 +103,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline ![Copy MYDEMOBUCKET URI](./images/mydemobucket-uri.png "") - Click **Close** to exit. + * Click **Close** to exit. 14. Return to **SQL | Oracle Database Actions** browser tab.
- * In the PL/SQL block modify the ***target\_uri*** definition in the **PL/SQL block we placed in the SQL Worksheet**, as shown below: + * Modify the ***target\_uri*** definition in the **PL/SQL block we placed in the SQL Worksheet**, as shown below: ```text target_uri VARCHAR2(100) := 'Paste the MYDEMOBUCKET URI you copied here'; ``` @@ -121,34 +121,33 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline 16. Return to **Data Load | Oracle Database** tab. -* Click the **Actions** icon in the **MyDemoBucket** panel, then select **Objects** from the context menu. + * Click the **Actions** icon in the **MyDemoBucket** panel, then select **Objects** from the context menu. ![Switch Tab & Select LOANAPP_FUNDING Connection](./images/move-data-file1.png "") -17. Click the folder icon to confirm that the **funding\_commitments1.json** file in the **LOANAPP\_FUNDING** bucket has been successfully copied here. +17. Expand the **FUNDING** folder icon to confirm that the **funding\_commitments1.json** file from the **LOANAPP\_FUNDING** bucket has been successfully copied here. ![Confirm File 1 Copy](./images/confirm-move-data-file1.png "") - Click **Close** to exit. + * Click **Close** to exit. ***Congratulations!*** You have now successfully interacted with data in object storage using PL/SQL from the Data Studio tools and your Autonomous Database. ## Task 2: Build Initial Live Feed Table - 1. From the **Data Load | Oracle Database** tab - Navigate to Live Feed. +1. From the **Data Load | Oracle Database** tab, navigate to Live Feed. - * On Left rail expand **Data Load**, then click on **Live Feed**. + * On the left rail, expand **Data Load**, then click **Live Feed**. ![Navigate from Data Load Connections to Live Feed](./images/navigate-connections-to-live-feed.png "") >You should now see the Live Feed Page +2. Click the **Create Live Table Feed** button to enter the Create Live Feed wizard. - 2. 
Click the **Create Live Table Feed** button to enter the Create Live Feed wizard + ![Create Live Feed Wizard](./images/live-feed-wizard-step1.png "") - ![Create Live Feed Wizard](./images/live-feed-wizard-step1.png "") - - 3. Enter details for the Live Table Feed Preview. +3. Enter details for the Live Table Feed Preview. * Select Cloud Store Location: **MyDemoBucket** * Select Radial Box: **Basic** @@ -161,28 +160,28 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline * Click the **Next** button to proceed. - - 4. Configure Live Feed Table Settings as follows: +4. Configure Live Feed Table Settings as follows: * **For Option**: Choose **Merge Into Table** from drop-down list * **For target Table Name**: Enter the name of the target table of the Live Feed -- **FUNDING\_PROVIDER\_OFFER\_STG**. ***In ALL CAPS***
- * Then modify **Mapping** details exactly as shown below: - >**Modify mapping to update Data Type** to NUMBER for: FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID
- >**For Merge Key**: Select FUNDING_PROVIDER_ID and FUNDING_OFFER_REFERENCE_ID + * Then modify **Mapping** details exactly as shown below: + * **Update Data Type** to NUMBER for: FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID
+ * **For Merge Key**: Select FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID
+ * **Unselect last row:** Inclusion of SYSTIMESTAMP Source ![Create Live Feed Wizard - step 2](./images/live-feed-wizard-step2-table-settings.png "") * Click the **Next** button to proceed. - 5. Review the information shown on the Preview page. +5. Review the information shown on the Preview page. ![Create Live Feed Wizard - step 3](./images/live-feed-preview.png "") * Click **Next** to proceed. - 6. Enter remaining details for the **Live Table Feed** +6. Enter remaining details for the **Live Table Feed** a. Enter live feed name **LOANAPP\_FUNDING\_FEED**
b. Check the box to **Enable for Scheduling**.
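For context on what the wizard's **Merge Into Table** option configures: with FUNDING\_PROVIDER\_ID and FUNDING\_OFFER\_REFERENCE\_ID as merge keys, each incoming file is upserted rather than appended. A rough sketch of the equivalent SQL `MERGE` (the non-key column `OFFER_AMOUNT` and the `incoming_offers` source are hypothetical placeholders, not names from this lab):

```text
MERGE INTO FUNDING_PROVIDER_OFFER_STG t
USING incoming_offers s
   ON (t.FUNDING_PROVIDER_ID = s.FUNDING_PROVIDER_ID
       AND t.FUNDING_OFFER_REFERENCE_ID = s.FUNDING_OFFER_REFERENCE_ID)
WHEN MATCHED THEN
  UPDATE SET t.OFFER_AMOUNT = s.OFFER_AMOUNT   -- existing offers are refreshed in place
WHEN NOT MATCHED THEN
  INSERT (FUNDING_PROVIDER_ID, FUNDING_OFFER_REFERENCE_ID, OFFER_AMOUNT)
  VALUES (s.FUNDING_PROVIDER_ID, s.FUNDING_OFFER_REFERENCE_ID, s.OFFER_AMOUNT);
```

This is why re-loading the same file does not create duplicate rows: rows whose two key columns already match are updated, not inserted.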
@@ -192,11 +191,11 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline * Click **Create** - 7. When the popup box appears, select **Yes** to run the Live Feed. +7. When the popup box appears, select **Yes** to run the Live Feed. ![Run Initial Live Table Feed](./images/do-run-live-feed.png) - 8. Review Live Feed Table and set page Refresh Interval +8. Review Live Feed Table and set page Refresh Interval * **You should see 3 rows loaded** * **Set page refresh interval to 10 seconds** to see when new data is loaded @@ -205,11 +204,12 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline ***Congratulations!*** You have successfully created your Live Feed table. + ## Task 3: Test Live Feed Table Data Population 1. Return to the **SQL | Oracle Database Actions** tab where your PL/SQL Block exists. -* Load a file into our Live Feed Bucket to trigger the Live Feed process by modifying the **object\_name** definition in the PL/SQL block, as shown below: + * Load a file into our Live Feed Bucket to trigger the Live Feed process by modifying the **object\_name** definition in the PL/SQL block, as shown below: ```text object_name VARCHAR2(200) := 'funding_commitments2.json'; @@ -232,7 +232,7 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline * Clear the SQL Worksheet and Run the following code to populate new loan products. - ```text + ```text DECLARE New_Funding_Offers NUMBER; @@ -260,9 +260,9 @@ Leverage Data Studio Tools to Build a Live Feed Data Pipeline ``` - ![Review Convert Funding to Loan Procudure Output](./images/review-funding-to-loan-conversion.png) + ![Review Convert Funding to Loan Procudure Output](./images/review-funding-to-loan-conversion.png) - ***Congratulations!*** On creating a Live Feed that can automatically load data from object storage into your database and be integrated into an automated business process. 
+ ***Congratulations!*** On creating a Live Feed that can automatically load data from object storage into your database and be integrated into an automated business process. ## Conclusion In this lab, you built a data pipeline using the Oracle Live Table Feed tool and successfully used the data from the pipeline in a PL/SQL block that was run from Oracle Autonomous Database to automate business processes. diff --git a/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png b/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png new file mode 100644 index 00000000..ab19623c Binary files /dev/null and b/dev-ai-data-platform/mktplace-subscribe/images/query-shared-data-product.png differ diff --git a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png index 4de95e05..d5b18b0e 100644 Binary files a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png and b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-2a.png differ diff --git a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png index 8356233b..ed77be72 100644 Binary files a/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png and b/dev-ai-data-platform/mktplace-subscribe/images/select-shared-data-6.png differ diff --git a/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md b/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md index 8b662096..a58012e6 100644 --- a/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md +++ b/dev-ai-data-platform/mktplace-subscribe/mktplace-subscribe.md @@ -58,7 +58,7 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec * Select **Create Share Provider** as share source * Select **From File** as **Share Provider JSON** - * Select the file you 
downloaded in the lab **Create & Share Trusted Data Products** + * Select the file named **delta\_share\_profile.json** that you downloaded in Lab 3 * In **Provider Name** text box enter: **Demo-Data-Share** * In **Description** text box enter: **Demo-Data-Share** @@ -112,9 +112,10 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec ![Create Data Product Share](./images/select-shared-data-5.png ) -7. Once the job is complete, check that you see the link icon next to the data link card to confirm success. +7. Once the job completes, confirm Data Link success: - >If after a few seconds the status does not update, click the refresh icon on the right. + * Set the **Refresh** interval on the right to 60 seconds. + * Check that you see the link icon next to the data link card to confirm success. ![Create Data Product Share](./images/select-shared-data-6.png ) @@ -142,6 +143,13 @@ In this lab, you will use the Data Studio tools to Consume a Data Share as a Rec ![Create Data Product Share](./images/select-shared-data-4a.png ) + >**This will cause the SQL Worksheet to appear** and automatically run a SELECT query on our linked table, as shown below.
>The **query result section** shows that the query was able to access the data in the object storage file and return it to the dashboard +
+ + + ![Query Shared Data Product](./images/query-shared-data-product.png ) + ***Congratulations!*** You’ve now subscribed to a shared data product and validated that you can query it directly from Autonomous Database.
This ensures the Risk team at SeersEquities can work with live, trusted data—ready for real-time analysis without unnecessary data movement. ## Conclusion @@ -157,5 +165,4 @@ This workflow ensures faster risk analysis, smarter decisions, and tighter colla ## Acknowledgements * **Authors** - Eddie Ambler -* **Last Updated By/Date** - September 2025, Eddie Ambler - +* **Last Updated By/Date** - Eddie Ambler - September 2025 \ No newline at end of file diff --git a/dev-ai-data-platform/sync-link/images/task1-scrn-14.png b/dev-ai-data-platform/sync-link/images/task1-scrn-14.png index 20834d92..086b5b21 100644 Binary files a/dev-ai-data-platform/sync-link/images/task1-scrn-14.png and b/dev-ai-data-platform/sync-link/images/task1-scrn-14.png differ diff --git a/dev-ai-data-platform/sync-link/images/task1-scrn-16.png b/dev-ai-data-platform/sync-link/images/task1-scrn-16.png index e8ae20cb..699fbb13 100644 Binary files a/dev-ai-data-platform/sync-link/images/task1-scrn-16.png and b/dev-ai-data-platform/sync-link/images/task1-scrn-16.png differ diff --git a/dev-ai-data-platform/sync-link/images/task2-scrn-14.png b/dev-ai-data-platform/sync-link/images/task2-scrn-14.png index 697a5f45..1344b094 100644 Binary files a/dev-ai-data-platform/sync-link/images/task2-scrn-14.png and b/dev-ai-data-platform/sync-link/images/task2-scrn-14.png differ diff --git a/dev-ai-data-platform/sync-link/sync-link.md b/dev-ai-data-platform/sync-link/sync-link.md index 6aa4f350..979cc83c 100644 --- a/dev-ai-data-platform/sync-link/sync-link.md +++ b/dev-ai-data-platform/sync-link/sync-link.md @@ -27,7 +27,9 @@ By the end of this lab, you will: By the end, you’ll have the skills to turn raw, external data into a seamless part of SeersEquities’ analytics workflow—ready to power better loan decisions and smarter risk management. -## Task 1: Load Object Storage Data into Autonomous Database using the Catalog Tool. +## Task 1: Load Object Storage Data into Autonomous Database using the Catalog Tool. 
+ +>***Skip to Step 5,*** **if continuing from Lab 1** 1. If you are not yet logged in to **Database Actions**, click **View Login Info**. Copy your **DB ADMIN Password**, and click the **SQL Worksheet** link. @@ -41,56 +43,60 @@ By the end, you’ll have the skills to turn raw, external data into a seamless ![Select CATALOG from Navigation Menu](./images/task1-scrn-7.png "") -4. Click the **Data Objects** tab at the top of the catalog page to view the contents from your object storage buckets. +4. Click the **Data Objects** tab at the top of the catalog page to view the contents from your object storage buckets. + + ![Click Data Objects](./images/task1-scrn-8.png "") - ![Click Data Objects](./images/task1-scrn-8.png "") +5. From the list of Data Objects, select **LoanAppcustomer_extension.csv** to open the **Cloud Object Entity** page. -5. From the list, select **LoanAppcustomer_extension.csv** to open the **Cloud Object Entity** page. + ![Select Customer_Extention.csv file](./images/click-csv.png "") - ![Select Customer_Extention.csv file](./images/click-csv.png "") +6. Click **Load to Table**. -6. Click **Load to Table**. + ![Select Load to Table](./images/task1-scrn-9.png "") - ![Select Load to Table](./images/task1-scrn-9.png "") +7. Select **Create Table** in the table section, then change the table name to **CUSTOMER\_EXTENSION**. -7. Select **Create Table** in the table section, then change the table name to something more meanful -- like, **CUSTOMER EXTENSION**. + ![Select Create Table & provide name](./images/define-load-data-table.png "") - ![Select Create Table & provide name](./images/define-load-data-table.png "") + Click **Load Data**. - Click **Load Data**. +8. In the popup window, click **Go to Data Load** to continue. -8. In the popup window, click **Go to Data Load** to continue. + ![Click Go To Data Load](./images/go-to-data-load.png "") - ![Click Go To Data Load](./images/go-to-data-load.png "") +9. 
Set the Refresh Rate to 60 seconds and check table load job completion + * Click the **Refresh** drop-down and set the frequency to 60 seconds + * The table will appear under **Table and View Loads**; once the load completes, the job will move from queued and show the number of rows loaded + * Click **Report** to review job details. - ![Click Report to review load job](./images/task1-scrn-14.png "") + ![Click Report to review load job](./images/task1-scrn-14.png "") -10. Review the job details. Click **SQL** to review the code used. +10. Review the job details. Click **SQL** to review the code used. - ![Click SQL to review load code](./images/review-data-load-job.png "") + ![Click SQL to review load code](./images/review-data-load-job.png "") -13. The SQL code is displayed. +11. The SQL code is displayed. - ![Examine load SQL code](./images/task1-scrn-15.png "") + ![Examine load SQL code](./images/task1-scrn-15.png "") - Click **Close**. + Click **Close**. -14. To analyze the data load, click **Query**. +12. To validate the data load, click **Query**. - ![Analyze data load](./images/task1-scrn-16.png "") + ![Analyze data load](./images/task1-scrn-16.png "") -15. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset. +13. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset. - ![Load Data set analysis](./images/task1-scrn-17.png "") + ![Load Data set analysis](./images/task1-scrn-17.png "") -16. Click **Catalog** in the left rail and you’ll see the **CUSTOMER_EXTENSION** table now listed in the catalog. +14. Click **Catalog** in the left rail, then select the **Tables and Views** filter to see that the **CUSTOMER_EXTENSION** table is now listed in the catalog. 
- ![Review CUSTOMER_EXTENSION table in Catalog](./images/task1-scrn-18.png "") + ![Review CUSTOMER_EXTENSION table in Catalog](./images/task1-scrn-18.png "") -You’ve just loaded external object storage data directly into your Autonomous Database—turning a static file into a query-ready table. This move helps optimize performance and makes your data ready for analytics, joins, and future products. +***Congratulations!*** You’ve just loaded external object storage data directly into your Autonomous Database, turning a static file into a query-ready table. This move helps optimize performance and makes your data ready for analytics, joins, and future products. ## Task 2: Link Object Storage Data from Data Catalog to ADB. @@ -102,39 +108,39 @@ You’ve just loaded external object storage data directly into your Autonomous ![Select Link to Table](./images/task2-scrn-10a.png "") - 3. Select **Create External Table** in the table section, then change the table name to something more meaningful --- **CUSTOMER_SEGMENT**. + 3. Select **Create External Table** in the table section, then change the table name to **CUSTOMER\_SEGMENT**. ![Select Create External Table & provide name](./images/task2-scrn-12.png "") Click **Link Data**. - 7. In the popup, click **Go To Data Link** to continue. + 4. In the popup, click **Go To Data Load** to continue. ![Click Go To Data Link](./images/go-to-data-load.png "") - 8. Once the job completes, the external table appears under **Table and View Loads** on the page. Click **Report** to review job details. + 5. Once the job completes, the external table appears under **Table and View Loads** on the page. Click **Report** to review job details. - ![Click Report to review link job](./images/task2-scrn-14.png "") + ![Click Report to review link job](./images/task2-scrn-14.png "") - 9. Review the job details. Click **SQL** to review the code used. + 6. Review the job details. Click **SQL** to review the code used. 
![Click SQL to review link code](./images/review-data-link-job.png "") - 10. The SQL code is displayed. + 7. The SQL code is displayed. ![Examine link SQL code](./images/review-data-link-sql.png "") Click **Close**. - 11. To analyze the data link, click Query. + 8. To test the data link, click Query. ![Analyze data link](./images/analyze-data-link.png "") - 12. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset. + 9. The SQL Worksheet opens with the query pre-loaded, the results displayed, and an analysis of the dataset. - ![Link Data set analysis](./images/task2-scrn-17.png "") + ![Link Data set analysis](./images/task2-scrn-17.png "") -You’ve just linked **external object storage data** to your database—**no loading required**. With this external table in place, you can **run queries instantly while avoiding data duplication** and keeping your analytics agile and efficient. +***Congratulations!*** You’ve just linked **external object storage data** to your database, **with no data loading required**. With this external table in place, you can **run queries instantly while avoiding data duplication** and keeping your analytics agile and efficient. ## Task 3: Query Data in Object Storage and ADB Database. @@ -166,7 +172,7 @@ You’ve just linked **external object storage data** to your database—**no lo ![Query Data in Object Storage](./images/task3-scrn-3.png "Query Data in Object Storage") - 4. Combine data from both tables using a **join**. Paste the query below, then click **Run**: + 4. Combine data in the database and in object storage using a **join**. Paste the query below, then click **Run**: ```text @@ -181,7 +187,7 @@ You’ve just linked **external object storage data** to your database—**no lo ![Query Data in Object Storage](./images/task3-scrn-4a.png "") -You’ve now combined external object storage data with internal database data—all from a single query. 
This unlocks richer analytics, enabling SeersEquities to connect customer attributes with segmentation strategies in real time. +***Congratulations!*** You’ve now combined external object storage data with internal database data, all from a single query. This unlocks richer analytics, enabling SeersEquities to connect customer attributes with segmentation strategies in real time. ## Conclusion @@ -193,4 +199,4 @@ For SeersEquities, this means **faster decisions, smarter loan products, and mor ## Acknowledgements * **Authors** - Eddie Ambler, Otis Barr -* **Last Updated By/Date** - Kamryn Vinson, June 2025 +* **Last Updated By/Date** - Eddie Ambler, September 2025
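As a recap of the pattern in this lab, the loaded table and the linked external table can be queried together in one statement. A minimal sketch (the join column `CUSTOMER_ID` and the selected column `SEGMENT_NAME` are assumptions; use the column names shown in your SQL Worksheet):

```text
-- CUSTOMER_EXTENSION: loaded into the database in Task 1
-- CUSTOMER_SEGMENT:   external table linked to object storage in Task 2
SELECT ce.CUSTOMER_ID,
       cs.SEGMENT_NAME
FROM   CUSTOMER_EXTENSION ce
JOIN   CUSTOMER_SEGMENT cs
ON     ce.CUSTOMER_ID = cs.CUSTOMER_ID;
```

Because CUSTOMER\_SEGMENT is an external table, the rows on its side of the join are read from the object storage file at query time, with no copy kept in the database.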