From 5c700232d985a38a024ec8b382499767904f31a3 Mon Sep 17 00:00:00 2001 From: Jos de Bruijn Date: Tue, 7 Jun 2016 16:48:19 -0700 Subject: [PATCH] documentation is moved to MSDN --- .../documentation/README.md | 7 - .../documentation/root.md | 68 ------ .../documentation/wwi-data-generation.md | 39 ---- .../documentation/wwi-etl.md | 58 ----- .../documentation/wwi-olap-catalog.md | 82 ------- .../documentation/wwi-olap-installation.md | 54 ----- .../documentation/wwi-olap-sample-queries.md | 5 - .../documentation/wwi-olap-sql-features.md | 95 -------- .../documentation/wwi-oltp-htap-catalog.md | 202 ------------------ .../wwi-oltp-htap-installation.md | 63 ------ .../wwi-oltp-htap-sample-queries.md | 5 - .../wwi-oltp-htap-sql-features.md | 27 --- .../documentation/wwi-overview.md | 49 ----- 13 files changed, 754 deletions(-) delete mode 100644 samples/databases/wide-world-importers/documentation/README.md delete mode 100644 samples/databases/wide-world-importers/documentation/root.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-data-generation.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-etl.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-olap-catalog.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-olap-installation.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-olap-sample-queries.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-olap-sql-features.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-oltp-htap-catalog.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-oltp-htap-installation.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sample-queries.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sql-features.md delete mode 100644 samples/databases/wide-world-importers/documentation/wwi-overview.md diff --git a/samples/databases/wide-world-importers/documentation/README.md b/samples/databases/wide-world-importers/documentation/README.md deleted file mode 100644 index 72f6c331b7..0000000000 --- a/samples/databases/wide-world-importers/documentation/README.md +++ /dev/null @@ -1,7 +0,0 @@ -# Documentation for the WideWorldImporters Sample Database - -This folder contains documentation for the sample. - -Start with [root.md](root.md) - -Note that these contents will most likely be migrated to MSDN. diff --git a/samples/databases/wide-world-importers/documentation/root.md b/samples/databases/wide-world-importers/documentation/root.md deleted file mode 100644 index dfdad25c6e..0000000000 --- a/samples/databases/wide-world-importers/documentation/root.md +++ /dev/null @@ -1,68 +0,0 @@ -# Wide World Importers Sample for SQL Server and Azure SQL Database - -Wide World Importers is a comprehensive database sample that both illustrates database design, and illustrates how SQL Server features can be leveraged in an application. - -Note that the sample is meant to be representative of a typical database. It does not include every feature of SQL Server. The design of the database follows one common set of standards, but there are many ways one might build a database. 
- -**Latest release**: -[wide-world-importers-release](http://go.microsoft.com/fwlink/?LinkID=800630) - -**Source code for the sample**: -[wide-world-importers](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers). - -**Feedback**: please send to -[sqlserversamples@microsoft.com](mailto:sqlserversamples@microsoft.com). - -The documentation for the sample is organized as follows: - -## Overview - -__[Wide World Importers Overview](wwi-overview.md)__ - -Overview of the sample company Wide World Importers, and the workflows addressed by the sample. - -## Main OLTP Database WideWorldImporters - -__[WideWorldImporters Installation and Configuration](wwi-oltp-htap-installation.md)__ - -Instructions for the installation and configuration of the core database WideWorldImporters that is used for transaction processing (OLTP - OnLine Transaction Processing) and operational analytics (HTAP - Hybrid Transactional/Analytical Processing). - -__[WideWorldImporters Database Catalog](wwi-oltp-htap-catalog.md)__ - -Description of the schemas and tables used in the WideWorldImporters database. - -__[WideWorldImporters Use of SQL Server Features and Capabilities](wwi-oltp-htap-sql-features.md)__ - -Describes how WideWorldImporters leverages core SQL Server features. - -__[WideWorldImporters Sample Queries](wwi-oltp-htap-sample-queries.md)__ - -Sample queries for the WideWorldImporters database. - -## Data Warehousing and Analytics Database WideWorldImportersDW - -__[WideWorldImportersDW Installation and Configuration](wwi-olap-installation.md)__ - -Instructions for the installation and configuration of the OLAP database WideWorldImportersDW. - -__[WideWorldImportersDW OLAP Database Catalog](wwi-olap-catalog.md)__ - -Description of the schemas and tables used in the WideWorldImportersDW database, which is the sample database for data warehousing and analytics processing (OLAP). - -__[WideWorldImporters ETL Workflow](wwi-etl.md)__ - -Workflow for the ETL (Extract, Transform, Load) process that migrates data from the transactional database WideWorldImporters to the data warehouse WideWorldImportersDW. - -__[WideWorldImportersDW Use of SQL Server Features and Capabilities](wwi-olap-sql-features.md)__ - -Describes how the WideWorldImportersDW leverages SQL Server features for analytics processing. - -__[WideWorldImportersDW OLAP Sample Queries](wwi-olap-sample-queries.md)__ - -Sample analytics queries leveraging the WideWorldImportersDW database. - -## Data generation - -__[WideWorldImporters Data Generation](wwi-data-generation.md)__ - -Describes how additional data can be generated in the sample database, for example inserting sales and purchase data up to the current date. diff --git a/samples/databases/wide-world-importers/documentation/wwi-data-generation.md b/samples/databases/wide-world-importers/documentation/wwi-data-generation.md deleted file mode 100644 index 1ed9df1b47..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-data-generation.md +++ /dev/null @@ -1,39 +0,0 @@ -# WideWorldImporters Data Generation - -The released versions of the WideWorldImporters and WideWorldImportersDW databases contains data starting January 1st 2013, up to the day these databases were generated. - -If the sample databases are used at a later date, for demonstration or illustration purposes, it may be beneficial to include more recent sample data in the database. 
- -## Data Generation in WideWorldImporters - -To generate sample data up to the current date, follow these steps: - -1. If you have not yet done so, install a clean version of the WideWorldImporters database. For installation instructions, [WideWorldImporters Installation and Configuration](wwi-oltp-htap-installation.md). -2. Execute the following statement in the database: - -``` - EXEC DataLoadSimulation.PopulateDataToCurrentDate - @AverageNumberOfCustomerOrdersPerDay = 60, - @SaturdayPercentageOfNormalWorkDay = 50, - @SundayPercentageOfNormalWorkDay = 0, - @IsSilentMode = 1, - @AreDatesPrinted = 1; -``` - -This statement adds sample sales and purchase data in the database, up to the current date. It outputs the progress of the data generation day-by-day. It will take rougly 10 minutes for every year that needs data. Note that there are some differences in the data generated between runs, since there is a random factor in the data generation. - -To increase or decrease the amount of data generated, in terms of orders per day, change the value for the parameter `@AverageNumberOfCustomerOrdersPerDay`. The parameters `@SaturdayPercentageOfNormalWorkDay` and `@SundayPercentageOfNormalWorkDay` are used to determine the order volume for weekend days. - -## Importing Data in WideWorldImportersDW - -To import sample data up to the current date in the OLAP database WideWorldImportersDW, follow these steps: - -1. Execute the data generation logic in the WideWorldImporters OLTP database, using the steps above. -2. If you have not yet done so, install a clean version of the WideWorldImportersDW database. For installation instructions, [WideWorldImporters Installation and Configuration](wwi-olap-installation.md). -3. Reseed the OLAP database by executing the following statement in the database: - -``` - EXECUTE [Application].Configuration_ReseedETL -``` - -4. Run the SSIS package **Daily ETL.ispac** to import the data into the OLAP database. For instructions on how to run the ETL job, see [WideWorldImporters ETL Workflow](wwi-etl.md). diff --git a/samples/databases/wide-world-importers/documentation/wwi-etl.md b/samples/databases/wide-world-importers/documentation/wwi-etl.md deleted file mode 100644 index a88e18a19d..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-etl.md +++ /dev/null @@ -1,58 +0,0 @@ -# WideWorldImporters ETL Workflow - -The ETL package WWI_Integration is used to migrate data from the WideWorldImporters database to the WideWorldImportersDW database as the data changes. The package is run periodically (most commonly daily). - -## Overview - -The design of the package uses SQL Server Integration Services (SSIS) to orchestrate bulk T-SQL operations (rather than as separate transformations within SSIS) to ensure high performance. - -Dimensions are loaded first, followed by Fact tables. The package can be re-run at any time after a failure. - -The workflow is as follows: - -![Alt text](/media/wide-world-importers-etl-workflow.png "WideWorldImporters ETL Workflow") - -It starts with an expression task that works out the appropriate cutoff time. This time is the current time less a few seconds. (This is more robust than requesting data right to the current time). It then truncates any milliseconds from the time. - -The main processing starts by populating the Date dimension table. It ensures that all dates for the current year have been populated in the table. - -After this, a series of data flow tasks loads each dimension, then each fact. 
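As a rough illustration of that cutoff calculation, here is a minimal T-SQL sketch (this is not the actual expression used inside the SSIS package, and the 10-second offset and anchor date are assumed values):

    -- Take the current time, step back a few seconds, then strip the fractional
    -- seconds so the cutoff falls on a whole second.
    DECLARE @Now datetime2(7) = SYSDATETIME();
    DECLARE @Cutoff datetime2(0) =
        DATEADD(second, DATEDIFF(second, '20000101', DATEADD(second, -10, @Now)), '20000101');
    SELECT @Cutoff AS CutoffTime;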
- -## Prerequisites - -- SQL Server 2016 (or higher) with the databases WideWorldImporters and WideWorldImportersDW. These can be on the same or different instances of SQL Server. -- SQL Server Management Studio (SSMS) -- SQL Server 2016 Integration Services (SSIS). - - Make sure you have created an SSIS Catalog. If not, right click **Integration Services** in SSMS Object Explorer, and choose **Add Catalog**. Follow the defaults. It will ask you to enable sqlclr and provide a password. - - -## Download - -The latest release of the sample: - -[wide-world-importers-v0.1](https://github.com/Microsoft/sql-server-samples/releases/tag/wide-world-importers-v0.1) - -Download the SSIS package file **Daily ETL.ispac**. - -Source code to recreate the sample database is available from the following location. - -[wide-world-importers](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/wwi-integration-etl) - -## Install - -1. Deploy the SSIS package. - - Open the "Daily ETL.ispac" package from Windows Explorer. This will launch the Integration Services Deployment Wizard. - - Under "Select Source" follow the default Project Deployment, with the path pointing to the "Daily ETL.ispac" package. - - Under "Select Destination" enter the name of the server that hosts the SSIS catalog. - - Select a path under the SSIS catalog, for example under a new folder "WideWorldImporters". - - Finalize the wizard by clicking Deploy. - -2. Create a SQL Server Agent job for the ETL process. - - In SSMS, right-click "SQL Server Agent" and select New->Job. - - Pick a name, for example "WideWorldImporters ETL". - - Add a Job Step of type "SQL Server Integration Services Package". - - Select the server with the SSIS catalog, and select the "Daily ETL" package. - - Under Configuration->Connection Managers ensure the connections to the source and target are configured correctly. The default is to connect to the local instance. - - Click OK to create the Job. - -3. Execute or schedule the Job. diff --git a/samples/databases/wide-world-importers/documentation/wwi-olap-catalog.md b/samples/databases/wide-world-importers/documentation/wwi-olap-catalog.md deleted file mode 100644 index 88e5c093a1..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-olap-catalog.md +++ /dev/null @@ -1,82 +0,0 @@ -# WideWorldImportersDW OLAP Database Catalog - -The WideWorldImportersDW database is used for data warehousing and analytical processing. The transactional data about sales and purchases is generated in the WideWorldImporters database, and loaded into the WideWorldImportersDW database using a [daily ETL process](wwi-etl.md). - -The data in WideWorldImportersDW thus mirrors the data in WideWorldImporters, but the tables are organized differently. While WideWorldImporters has a traditional normalized schema, WideWorldImportersDW uses the [star schema](https://wikipedia.org/wiki/Star_schema) approach for its table design. Besides the fact and dimension tables, the database includes a number of staging tables that are used in the ETL process. - -## Schemas - -The different types of tables are organized in three schemas. - -|Schema|Description| -|-----------------------------|---------------------| -|Dimension|Dimension tables.| -|Fact|Fact tables.| -|Integration|Staging tables and other objects needed for ETL.| - -## Tables - -The dimension and fact tables are listed below. The tables in the Integration schema are used only for the ETL process, and are not listed. 
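To browse these schemas in an installed copy of WideWorldImportersDW, a simple catalog query can list the tables per schema. This is an illustrative sketch only and is not part of the sample scripts:

    -- List the tables in the Dimension, Fact, and Integration schemas.
    SELECT s.name AS SchemaName, t.name AS TableName
    FROM sys.tables AS t
    INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE s.name IN (N'Dimension', N'Fact', N'Integration')
    ORDER BY s.name, t.name;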
- -### Dimension tables - -WideWorldImportersDW has the following dimension tables. The description includes the relationship with the source tables in the WideWorldImporters database. - -|Table|Source tables| -|-----------------------------|---------------------| -|City|`Application.Cities`, `Application.StateProvinces`, `Application.Countries`.| -|Customer|`Sales.Customers`, `Sales.BuyingGroups`, `Sales.CustomerCategories`.| -|Date|New table with information about dates, including financial year (based on November 1st start for financial year).| -|Employee|`Application.People`.| -|StockItem|`Warehouse.StockItems`, `Warehouse.Colors`, `Warehouse.PackageType`.| -|Supplier|`Purchasing.Suppliers`, `Purchasing.SupplierCategories`.| -|PaymentMethod|`Application.PaymentMethods`.| -|TransactionType|`Application.TransactionTypes`.| - -### Fact tables - -WideWorldImportersDW has the following fact tables. The description includes the relationship with the source tables in the WideWorldImporters database, as well as the classes of analytics/reporting queries each fact table is typically used with. - -|Table|Source tables|Sample Analytics| -|-----------------------------|---------------------|---------------------| -|Order|`Sales.Orders` and `Sales.OrderLines`|Sales people, picker/packer productivity, and on time to pick orders. In addition, low stock situations leading to back orders.| -|Sale|`Sales.Invoices` and `Sales.InvoiceLines`|Sales dates, delivery dates, profitability over time, profitability by sales person.| -|Purchase|`Purchasing.PurchaseOrderLines`|Expected vs actual lead times| -|Transaction|`Sales.CustomerTransactions` and `Purchasing.SupplierTransactions`|Measuring issue dates vs finalization dates, and amounts.| -|Movement|`Warehouse.StockTransactions`|Movements over time.| -|Stock Holding|`Warehouse.StockItemHoldings`|On-hand stock levels and value.| - -## Stored procedures - -The stored procedures are used primarily for the ETL process and for configuration purposes. - -Any extensions of the sample are encouraged to use the `Reports` schema for Reporting Services reports, and the `PowerBI` schema for Power-BI access. - -### Application Schema - -These procedures are used to configure the sample. They are used to apply enterprise edition features to the standard edition version of the sample, add PolyBase, and reseed ETL. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|Configuration_ApplyPartitionedColumnstoreIndexing|Applies both partitioning and columnstore indexes for fact tables.| -|Configuration_ConfigureForEnterpriseEdition|Applies partitioning, columnstore indexing and in-memory.| -|Configuration_EnableInMemory|Replaces the integration staging tables with SCHEMA_ONLY memory-optimized tables to improve ETL performance.| -|Configuration_ApplyPolybase|Configures an external data source, file format, and table.| -|Configuration_PopulateLargeSaleTable|Applied enterprise edition changes, then populates a larger amount of data for the 2012 calendar year as additional history.| -|Configuration_ReseedETL|Removes existing data and restarts the ETL seeds. This allows for repopulating the OLAP database to match updated rows in the OLTP database.| - -### Integration Schema - -Procedures used in the ETL process fall in these categories: -- Helper procedures for the ETL package - All Get* procedures. -- Procedures used by the ETL package for migrating staged data into the DW tables - All Migrate* procedures. 
-- `PopulateDateDimensionForYear` - Takes a year and ensures that all dates for that year are populated in the `Dimension.Date` table. - -### Sequences Schema - -Procedures to configure the sequences in the database. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|ReseedAllSequences|Calls the procedure `ReseedSequenceBeyondTableValue` for all sequences.| -|ReseedSequenceBeyondTableValue|Used to reposition the next sequence value beyond the value in any table that uses the same sequence. (Like a `DBCC CHECKIDENT` for identity columns equivalent for sequences but across potentially multiple tables).| diff --git a/samples/databases/wide-world-importers/documentation/wwi-olap-installation.md b/samples/databases/wide-world-importers/documentation/wwi-olap-installation.md deleted file mode 100644 index fcfa2c9043..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-olap-installation.md +++ /dev/null @@ -1,54 +0,0 @@ -# WideWorldImportersDW Installation and Configuration - - -- [SQL Server 2016](https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016) (or higher) or [Azure SQL Database](https://azure.microsoft.com/services/sql-database/). To use the Full version of the sample, use SQL Server Evaluation/Developer/Enterprise Edition. -- [SQL Server Management Studio](https://msdn.microsoft.com/library/mt238290.aspx). For the best results use the April 2016 preview or later. - -## Download - -The latest release of the sample: - -[wide-world-importers-release](http://go.microsoft.com/fwlink/?LinkID=800630) - -Download the sample WideWorldImportersDW database backup/bacpac that corresponds to your edition of SQL Server or Azure SQL Database. - -Source code to recreate the sample database is available from the following location. Note that data population is based on ETL from the OLTP database (WideWorldImporters): - -[wide-world-importers-source](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/wwi-dw-database-scripts) - -## Install - - -### SQL Server - -To restore a backup to a SQL Server instance, you can use Management Studio. - -1. Open SQL Server Management Studio and connect to the target SQL Server instance. -2. Right-click on the **Databases** node, and select **Restore Database**. -3. Select **Device** and click on the button **...** -4. In the dialog **Select backup devices**, click **Add**, navigate to the database backup in the filesystem of the server, and select the backup. Click **OK**. -5. If needed, change the target location for the data and log files, in the **Files** pane. Note that it is best practice to place data and log files on different drives. -6. Click **OK**. This will initiate the database restore. After it completes, you will have the database WideWorldImporters installed on your SQL Server instance. - -### Azure SQL Database - -To import a bacpac into a new SQL Database, you can use Management Studio. - -1. (optional) If you do not yet have a SQL Server in Azure, navigate to the [Azure portal](https://portal.azure.com/) and create a new SQL Database. In the process of create a database, you will create a server. Make note of the server. - - See [this tutorial](https://azure.microsoft.com/documentation/articles/sql-database-get-started/) to create a database in minutes -2. Open SQL Server Management Studio and connect to your server in Azure. -3. Right-click on the **Databases** node, and select **Import Data-Tier Application**. -4. 
In the **Import Settings** select **Import from local disk** and select the bacpac of the sample database from your file system. -5. Under **Database Settings** change the database name to *WideWorldImportersDW* and select the target edition and service objective to use. -6. Click **Next** and **Finish** to kick off deployment. It will take a few minutes to complete. When specifying a service objective lower than S2 it may take longer. - -## Configuration - -[Applies to SQL Server 2016 (and later) Developer/Evaluation/Enterprise Edition] - -The sample database can make use of PolyBase to query files in Hadoop or Azure blob storage. However, that feature is not installed by default with SQL Server - you need to select it during SQL Server setup. Therefore, a post-installation step is required. - -1. In SQL Server Management Studio, connect to the WideWorldImportersDW database and open a new query window. -2. Run the following T-SQL command to enable the use of PolyBase in the database: - - EXECUTE [Application].[Configuration_ApplyPolyBase] diff --git a/samples/databases/wide-world-importers/documentation/wwi-olap-sample-queries.md b/samples/databases/wide-world-importers/documentation/wwi-olap-sample-queries.md deleted file mode 100644 index 171bbe7287..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-olap-sample-queries.md +++ /dev/null @@ -1,5 +0,0 @@ -# WideWorldImportersDW OLAP Sample Queries - -Refer to the sample-scripts.zip file that is included with the release of the sample, or refer to the source code: - -[wide-world-importers/sample-scripts](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/sample-scripts) diff --git a/samples/databases/wide-world-importers/documentation/wwi-olap-sql-features.md b/samples/databases/wide-world-importers/documentation/wwi-olap-sql-features.md deleted file mode 100644 index 99921b6d2f..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-olap-sql-features.md +++ /dev/null @@ -1,95 +0,0 @@ -# WideWorldImportersDW Use of SQL Server Features and Capabilities - -WideWorldImportersDW is designed to showcase many of the key features of SQL Server that are suitable for data warehousing and analytics. The following is a list of SQL Server features and capabilities, and a description of how they are used in WideWorldImportersDW. - -## PolyBase - -[Applies to SQL Server (2016 and later)] - -PolyBase is used to combine sales information from WideWorldImportersDW with a public data set about demographics to understand which cities might be of interest for further expansion of sales. - -To enable the use of PolyBase in the sample database, make sure it is installed, and run the following stored procedure in the database: - - EXEC [Application].[Configuration_ApplyPolybase] - -This will create an external table `dbo.CityPopulationStatistics` that references a public data set that contains population data for cities in the United States, hosted in Azure blob storage. You are encouraged to review the code in the stored procedure to understand the configuration process. If you want to host your own data in Azure blob storage and keep it secure from general public access, you will need to undertake additional configuration steps. 
The following query returns the data from that external data set: - - SELECT CityID, StateProvinceCode, CityName, YearNumber, LatestRecordedPopulation FROM dbo.CityPopulationStatistics; - -To understand which cities might be of interest for further expansion, the following query looks at the growth rate of cities, and returns the top 100 largest cities with significant growth, and where Wide World Importers does not have a sales presence. The query involves a join between the remote table `dbo.CityPopulationStatistics` and the local table `Dimension.City`, and a filter involving the local table `Fact.Sales`. - - WITH PotentialCities - AS - ( - SELECT cps.CityName, - cps.StateProvinceCode, - MAX(cps.LatestRecordedPopulation) AS PopulationIn2016, - (MAX(cps.LatestRecordedPopulation) - MIN(cps.LatestRecordedPopulation)) * 100.0 - / MIN(cps.LatestRecordedPopulation) AS GrowthRate - FROM dbo.CityPopulationStatistics AS cps - WHERE cps.LatestRecordedPopulation IS NOT NULL - AND cps.LatestRecordedPopulation <> 0 - GROUP BY cps.CityName, cps.StateProvinceCode - ), - InterestingCities - AS - ( - SELECT DISTINCT pc.CityName, - pc.StateProvinceCode, - pc.PopulationIn2016, - FLOOR(pc.GrowthRate) AS GrowthRate - FROM PotentialCities AS pc - INNER JOIN Dimension.City AS c - ON pc.CityName = c.City - WHERE GrowthRate > 2.0 - AND NOT EXISTS (SELECT 1 FROM Fact.Sale AS s WHERE s.[City Key] = c.[City Key]) - ) - SELECT TOP(100) CityName, StateProvinceCode, PopulationIn2016, GrowthRate - FROM InterestingCities - ORDER BY PopulationIn2016 DESC; - -## Clustered Columnstore Indexes - -(Full version of the sample) - -Clustered Columnstore Indexes (CCI) are used with all the fact tables, to reduce storage footprint and improve query performance. With the use of CCI, the base storage for the fact tables uses column compression. - -Nonclustered indexes are used on top of the clustered columnstore index, to facilitate primary key and foreign key constraints. These constraints were added out of an abundance of caution - the ETL process sources the data from the WideWorldImporters database, which has constraints to enforce integrity. Removing primary and foreign key constraints, and their supporting indexes, would reduce the storage footprint of the fact tables. - -**Data size** - -The sample database has limited data size, to make it easy to download and install the sample. However, to see the real performance benefits of columnstore indexes, you would want to use a larger data set. - -You can run the following statement to increase the size of the `Fact.Sales` table by inserting another 12 million rows of sample data. These rows are all inserted for the year 2012, such that there is no interference with the ETL process. - - EXECUTE [Application].[Configuration_PopulateLargeSaleTable] - -This statement will take around 5 minutes to run. To insert more than 12 million rows, pass the desired number of rows to insert as a parameter to this stored procedure. - -To compare query performance with and without columnstore, you can drop and/or recreate the clustered columnstore index. - -To drop the index: - - DROP INDEX [CCX_Fact_Order] ON [Fact].[Order] - -To recreate: - - CREATE CLUSTERED COLUMNSTORE INDEX [CCX_Fact_Order] ON [Fact].[Order] - -## Partitioning - -(Full version of the sample) - -Data size in a Data Warehouse can grow very large. Therefore it is best practice to use partitioning to manage the storage of the large tables in the database. - -All of the larger fact tables are partitioned by year. 
The only exception is `Fact.Stock Holdings`, which is not date-based and has limited data size compared with the other fact tables. - -The partition function used for all partitioned tables is `PF_Date`, and the partition scheme being used is `PS_Date`. - -## In-Memory OLTP - -(Full version of the sample) - -WideWorldImportersDW uses SCHEMA_ONLY memory-optimized tables for the staging tables. All `Integration.`\*`_Staging` tables are SCHEMA_ONLY memory-optimized tables. - -The advantage of SCHEMA_ONLY tables is that they are not logged, and do not require any disk access. This improves the performance of the ETL process. Since these tables are not logged, their contents are lost if there is a failure. However, the data source is still available, so the ETL process can simply be restarted if a failure occurs. diff --git a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-catalog.md b/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-catalog.md deleted file mode 100644 index 9daf521b76..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-catalog.md +++ /dev/null @@ -1,202 +0,0 @@ -# WideWorldImporters Database Catalog - -The WideWorldImporters database contains all the transaction information and daily data for sales and purchases, as well as sensor data for vehicles and cold rooms. - -## Schemas - -WideWorldImporters uses schemas for different purposes, such as storing data, defining how users can access the data, and providing objects for data warehouse development and integration. - -### Data schemas - -These schemas contain the data. A number of tables are needed by all other schemas and are located in the Application schema. - -|Schema|Description| -|-----------------------------|---------------------| -|Application|Application-wide users, contacts, and parameters. This also contains reference tables with data that is used by multiple schemas| -|Purchasing|Stock item purchases from suppliers and details about suppliers.| -|Sales|Stock item sales to retail customers, and details about customers and sales people. | -|Warehouse|Stock item inventory and transactions.| - -### Secure-access schemas - -These schemas are used for external applications that are not allowed to access the data tables directly. They contain views and stored procedures used by external applications. - -|Schema|Description| -|-----------------------------|---------------------| -|Website|All access to the database from the company website is through this schema.| -|Reports|All access to the database from Reporting Services reports is through this schema.| -|PowerBI|All access to the database from the Power BI dashboards via the Enterprise Gateway is through this schema.| - -Note that the Reports and PowerBI schemas are not used in the initial release of the sample database. However, all Reporting Services and Power BI samples built on top of this database are encouraged to use these schemas. - -### Development schemas - -Special-purpose schemas - -|Schema|Description| -|-----------------------------|---------------------| -|Integration|Objects and procedures required for data warehouse integration (i.e. migrating the data to the WideWorldImportersDW database).| -|Sequences|Holds sequences used by all tables in the application.| - -## Tables - -All tables in the database are in the data schemas. - -### Application Schema - -Details of parameters and people (users and contacts), along with common reference tables (common to multiple other schemas). 
- -|Table|Description| -|-----------------------------|---------------------| -|SystemParameters|Contains system-wide configurable parameters.| -|People|Contains user names, contact information, for all who use the application, and for the people that the Wide World Importers deals with at customer organizations. This includes staff, customers, suppliers, and any other contacts. For people who have been granted permission to use the system or website, the information includes login details.| -|Cities|There are many addresses stored in the system, for people, customer organization delivery addresses, pickup addresses at suppliers, etc. Whenever an address is stored, there is a reference to a city in this table. There is also a spatial location for each city.| -|StateProvinces|Cities are part of states or provinces. This table has details of those, including spatial data describing the boundaries each state or province.| -|Countries|States or Provinces are part of countries. This table has details of those, including spatial data describing the boundaries of each country.| -|DeliveryMethods|Choices for delivering stock items (e.g., truck/van, post, pickup, courier, etc.)| -|PaymentMethods|Choices for making payments (e.g., cash, check, EFT, etc.)| -|TransactionTypes|Types of customer, supplier, or stock transactions (e.g., invoice, credit note, etc.)| - -### Purchasing Schema - -Details of suppliers and of stock item purchases. - -|Table|Description| -|-----------------------------|---------------------| -|Suppliers|Main entity table for suppliers (organizations)| -|SupplierCategories|Categories for suppliers (e.g., novelties, toys, clothing, packaging, etc.)| -|SupplierTransactions|All financial transactions that are supplier-related (invoices, payments)| -|PurchaseOrders|Details of supplier purchase orders| -|PurchaseOrderLines|Detail lines from supplier purchase orders| - -  -### Sales Schema - -Details of customers, salespeople, and of stock item sales. - -|Table|Description| -|-----------------------------|---------------------| -|Customers|Main entity tables for customers (organizations or individuals)| -|CustomerCategories|Categories for customers (ie novelty stores, supermarkets, etc.)| -|BuyingGroups|Customer organizations can be part of groups that exert greater buying power| -|CustomerTransactions|All financial transactions that are customer-related (invoices, payments)| -|SpecialDeals|Special pricing. This can include fixed prices, discount in dollars or discount percent.| -|Orders|Detail of customer orders| -|OrderLines|Detail lines from customer orders| -|Invoices|Details of customer invoices| -|InvoiceLines|Detail lines from customer invoices| - -### Warehouse Schema - -Details of stock items, their holdings and transactions. - -|Table|Description| -|-----------------------------|---------------------| -|StockItems|Main entity table for stock items| -|StockItemHoldings|Non-temporal columns for stock items. 
These arefrequently updated columns.| -|StockGroups|Groups for categorizing stock items (e.g., novelties, toys, edible novelties, etc.)| -|StockItemStockGroups|Which stock items are in which stock groups (many to many)| -|Colors|Stock items can (optionally) have colors| -|PackageTypes|Ways that stock items can be packaged (e.g., box, carton, pallet, kg, etc.| -|StockItemTransactions|Transactions covering all movements of all stock items (receipt, sale, write-off)| -|VehicleTemperatures|Regularly recorded temperatures of vehicle chillers| -|ColdRoomTemperatures|Regularly recorded temperatures of cold room chillers| - - -## Design considerations - -Database design is subjective and there is no right or wrong way to design a database. The schemas and tables in this database show ideas for how you can design your own database. - -### Schema design - -WideWorldImporters uses a small number of schemas so that it is easy to understand the database system and demonstrate database principles. - -Wherever possible, the database collocates tables that are commonly queried together into the same schema to minimize join complexity. - -The database schema has been code-generated based on a series of metadata tables in another database WWI_Preparation. This gives WideWorldImporters a very high degree of design consistency, naming consistency, and completeness. For details on how the schema has been generated see the source code: [wide-world-importers/wwi-database-scripts](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/wwi-database-scripts) - -### Table design - -- All tables have single column primary keys for join simplicity. -- All schemas, tables, columns, indexes, and check constraints have a Description extended property that can be used to identify the purpose of the object or column. Memory-optimized tables are an exception to this since they don’t currently support extended properties. -- All foreign keys are automatically indexed unless there is another non-clustered index that has the same left-hand component. -- Auto-numbering in tables is based on sequences. These sequences are easier to work with across linked servers and similar environments than IDENTITY columns. Memory-optimized tables use IDENTITY columns since they don’t support in SQL Server 2016. -- A single sequence (TransactionID) is used for these tables: CustomerTransactions, SupplierTransactions, and StockItemTransactions. This demonstrates how a set of tables can have a single sequence. -- Some columns have appropriate default values. - -### Security schemas - -For security, WideWorldImporters does not allow external applications to access data schemas directly. To isolate access, WideWorldImporters uses security-access schemas that do not hold data, but contain views and stored procedures. External applications use the security schemas to retrieve the data that they are allowed to view. This way, users can only run the views and stored procedures in the secure-access schemas - -For example, this sample includes Power BI dashboards. An external application accesses these Power BI dashboards from the Power BI gateway as a user that has read-only permission on the PowerBI schema. For read-only permission, the user only needs SELECT and EXECUTE permission on the PowerBI schema. A database administrator at WWI assigns these permissions as needed. - -## Stored Procedures - -Stored procedures are organized in schemas. Most of the schemas are used for configuration or sample purposes. 
- -The `Website` schema contains the stored procedures that can be used by a Web front-end. - -The `Reports` and `PowerBI` schemas are meant for reporting services and PowerBI purposes. Any extensions of the sample are encouraged to use these schemas for reporting purposes. - -### Website schema - -These are the procedures used by a client application, such as a Web front-end. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|ActivateWebsiteLogon|Allows a person (from `Application.People`) to have access to the website.| -|ChangePassword|Changes a user’s password (for users that are not using external authentication mechanisms).| -|InsertCustomerOrders|Allows inserting one or more customer orders (including the order lines).| -|InvoiceCustomerOrders|Takes a list of orders to be invoiced and processes the invoices.| -|RecordColdRoomTemperatures|Takes a sensor data list, as a table-valued parameter (TVP), and applies the data to the `Warehouse.ColdRoomTemperatures` temporal table.| -|RecordVehicleTemperature|Takes a JSON array and uses it to update `Warehouse.VehicleTemperatures`.| -|SearchForCustomers|Searches for customers by name or part of name (either the company name or the person name).| -|SearchForPeople|Searches for people by name or part of name.| -|SearchForStockItems|Searches for stock items by name or part of name or marketing comments.| -|SearchForStockItemsByTags|Searches for stock items by tags.| -|SearchForSuppliers|Searches for suppliers by name or part of name (either the company name or the person name).| - -### Integration Schema - -The stored procedures in this schema are used by the ETL process. They obtain the data needed from various tables for the timeframe required by the [ETL package](wwi-etl.md). - -### DataLoadSimulation Schema - -Simulates a workload that inserts sales and purchases. The main stored procedure is `PopulateDataToCurrentDate`, which is used to insert sample data up to the current date. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|Configuration_ApplyDataLoadSimulationProcedures|Recreates the procedures needed for data load simulation. This is needed for bringing data up to the current date.| -|Configuration_RemoveDataLoadSimulationProcedures|This removes the procedures again after data simulation is complete.| -|DeactiveTemporalTablesBeforeDataLoad|Removes the temporal nature of all temporal tables and where applicable, applies a trigger so that changes can be made as though they were being applied at an earlier date than the sys-temporal tables allow.| -|PopulateDataToCurrentDate|Used to bring the data up to the current date. Should be run before any other configuration options after restoring the database from an initial backup.| -|ReactivateTemporalTablesAfterDataLoad|Re-establishes the temporal tables, including checking for data consistency. (Removes the associated triggers).| - - -### Application Schema - -These procedures are used to configure the sample. They are used to apply enterprise edition features to the standard edition version of the sample, and also to add auditing and full-text indexing. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|AddRoleMemberIfNonexistant|Adds a member to a role if the member isn’t already in the role| -|Configuration_ApplyAuditing|Adds auditing. 
Server auditing is applied for standard edition databases; additional database auditing is added for enterprise edition.| -|Configuration_ApplyColumnstoreIndexing|Applies columnstore indexing to `Sales.OrderLines` and `Sales.InvoiceLines` and reindexes appropriately.| -|Configuration_ApplyFullTextIndexing|Applies fulltext indexes to `Application.People`, `Sales.Customers`, `Purchasing.Suppliers`, and `Warehouse.StockItems`. Replaces `Website.SearchForPeople`, `Website.SearchForSuppliers`, `Website.SearchForCustomers`, `Website.SearchForStockItems`, `Website.SearchForStockItemsByTags` with replacement procedures that use fulltext indexing.| -|Configuration_ApplyPartitioning|Applies table partitioning to `Sales.CustomerTransactions and `Purchasing.SupplierTransactions`, and rearranges the indexes to suit.| -|Configuration_ApplyRowLevelSecurity|Applies row level security to filter customers by sales territory related roles.| -|Configuration_ConfigureForEnterpriseEdition|Applies columnstore indexing, full text, in-memory, polybase, and partitioning.| -|Configuration_EnableInMemory|Adds a memory-optimized filegroup (when not working in Azure), replaces `Warehouse.ColdRoomTemperatures`, `Warehouse.VehicleTemperatures` with in-memory equivalents, and migrates the data, recreates the `Website.OrderIDList`, `Website.OrderList`, `Website.OrderLineList`, `Website.SensorDataList` table types with memory-optimized equivalents, drops and recreates the procedures `Website.InvoiceCustomerOrders`, `Website.InsertCustomerOrders`, and `Website.RecordColdRoomTemperatures` that uses these table types.| -|Configuration_RemoveAuditing|Removes the auditing configuration.| -|Configuration_RemoveRowLevelSecurity|Removes the row level security configuration (this is needed for changes to the associated tables).| -|CreateRoleIfNonExistant|Creates a database role if it doesn’t already exist.| - - -### Sequences Schema - -Procedures to configure the sequences in the database. - -|Procedure|Purpose| -|-----------------------------|---------------------| -|ReseedAllSequences|Calls the procedure ReseedSequenceBeyondTableValue for all sequences.| -|ReseedSequenceBeyondTableValue|Used to reposition the next sequence value beyond the value in any table that uses the same sequence. (Like a DBCC CHECKIDENT for identity columns equivalent for sequences but across potentially multiple tables).| diff --git a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-installation.md b/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-installation.md deleted file mode 100644 index 68671f525c..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-installation.md +++ /dev/null @@ -1,63 +0,0 @@ -# WideWorldImporters Installation and Configuration - -## Prerequisites - -- [SQL Server 2016](https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016) (or higher) or [Azure SQL Database](https://azure.microsoft.com/services/sql-database/). To use the Full version of the sample, use SQL Server Evaluation/Developer/Enterprise Edition. -- [SQL Server Management Studio](https://msdn.microsoft.com/library/mt238290.aspx). For the best results use the April 2016 preview or later. - -## Download - -The latest release of the sample: - -[wide-world-importers-release](http://go.microsoft.com/fwlink/?LinkID=800630) - -Download the sample WideWorldImporters database backup/bacpac that corresponds to your edition of SQL Server or Azure SQL Database. 
- -Source code to recreate the sample database is available from the following location. Note that recreating the sample will result in slight differences in the data, since there is a random factor in the data generation: - -[wide-world-importers](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/wwi-database-scripts) - -## Install - - -### SQL Server - -To restore a backup to a SQL Server instance, you can use Management Studio. - -1. Open SQL Server Management Studio and connect to the target SQL Server instance. -2. Right-click on the **Databases** node, and select **Restore Database**. -3. Select **Device** and click on the button **...** -4. In the dialog **Select backup devices**, click **Add**, navigate to the database backup in the filesystem of the server, and select the backup. Click **OK**. -5. If needed, change the target location for the data and log files, in the **Files** pane. Note that it is best practice to place data and log files on different drives. -6. Click **OK**. This will initiate the database restore. After it completes, you will have the database WideWorldImporters installed on your SQL Server instance. - -### Azure SQL Database - -To import a bacpac into a new SQL Database, you can use Management Studio. - -1. (optional) If you do not yet have a SQL Server in Azure, navigate to the [Azure portal](https://portal.azure.com/) and create a new SQL Database. In the process of create a database, you will create a server. Make note of the server. - - See [this tutorial](https://azure.microsoft.com/documentation/articles/sql-database-get-started/) to create a database in minutes -2. Open SQL Server Management Studio and connect to your server in Azure. -3. Right-click on the **Databases** node, and select **Import Data-Tier Application**. -4. In the **Import Settings** select **Import from local disk** and select the bacpac of the sample database from your file system. -5. Under **Database Settings** change the database name to *WideWorldImporters* and select the target edition and service objective to use. -6. Click **Next** and **Finish** to kick off deployment. It will take a few minutes to complete. When specifying a service objective lower than S2 it may take longer. - -## Configuration - -### Full-Text Indexing - -The sample database can make use of Full-Text Indexing. However, that feature is not installed by default with SQL Server - you need to select it during SQL Server setup (it is enabled by default in Azure SQL DB). Therefore, a post-installation step is required. - -1. In SQL Server Management Studio, connect to the WideWorldImporters database and open a new query window. -2. Run the following T-SQL command to enable the use of Full-Text Indexing in the database: - - EXECUTE Application.Configuration_ApplyFullTextIndexing - -### SQL Server Audit - -Enabling auditing in SQL Server requires server configuration. To enable SQL Server audit for the WideWorldImporters sample, run the following statement in the database: - - EXECUTE [Application].[Configuration_ApplyAuditing] - -In Azure SQL Database, Audit is configured through the [Azure portal](https://portal.azure.com/). 
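To verify the installation (whether restored from a backup or imported from a bacpac), a quick sanity check such as the following can confirm that the sample schemas exist and contain data. This is an illustrative sketch and not part of the official installation steps:

    -- Confirm that the main data schemas are present.
    SELECT name FROM sys.schemas
    WHERE name IN (N'Application', N'Purchasing', N'Sales', N'Warehouse');

    -- Confirm that sample data was loaded.
    SELECT COUNT(*) AS OrderCount FROM Sales.Orders;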
diff --git a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sample-queries.md b/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sample-queries.md deleted file mode 100644 index 9a08faf06d..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sample-queries.md +++ /dev/null @@ -1,5 +0,0 @@ -# WideWorldImporters Sample Queries - -Refer to the sample-scripts.zip file that is included with the release of the sample, or refer to the source code: - -[wide-world-importers/sample-scripts](https://github.com/Microsoft/sql-server-samples/tree/master/samples/databases/wide-world-importers/sample-scripts) diff --git a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sql-features.md b/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sql-features.md deleted file mode 100644 index 9e4ad1e865..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-oltp-htap-sql-features.md +++ /dev/null @@ -1,27 +0,0 @@ -# WideWorldImporters Use of SQL Server Features and Capabilities - -WideWorldImporters is designed to showcase many of the key features of SQL Server, including the latest features introduced in SQL Server 2016. The following is a list of SQL Server features and capabilities, and a description of how they are used in WideWorldImporters. - -|SQL Server feature or capability|Use in WideWorldImporters| -|-----------------------------|---------------------| -|Temporal tables|There are many temporal tables, including all look-up style reference tables and main entities such as StockItems, Customers, and Suppliers. Using temporal tables allows to conveniently keep track of the history of these entities.| -|AJAX calls for JSON|The application frequently uses AJAX calls to query these tables: Persons, Customers, Suppliers, and StockItems. The calls return JSON payloads (i.e. the data that is returned is formatted as JSON data). See, for example, the stored procedure `Website.SearchForCustomers`.| -|JSON property/value bags|A number of tables have columns that hold JSON data to extend the relational data in the table. For example, `Application.SystemParameters` has a column for application settings and `Application.People` has a column to record user preferences. These tables use an `nvarchar(max)` column to record the JSON data, along with a CHECK constraint using the built-in function `ISJSON` to ensure the column values are valid JSON.| -|Row-level security (RLS)|Row Level Security (RLS) is used to limit access to the Customers table, based on role membership. Each sales territory has a role and a user. To see this in action, use the corresponding script in sample-script.zip, which is part of the [release of the sample](http://go.microsoft.com/fwlink/?LinkID=800630).| -|Real-time Operational Analytics|(Full version of the database) The core transactional tables `Sales.InvoiceLines` and `Sales.OrderLines` both have a non-clustered columnstore index to support efficient execution of analytical queries in the transactional database with minimal impact on the operational workload. Running transactions and analytics in the same database is also referred to as [Hybrid Transactional/Analytical Processing (HTAP)](https://wikipedia.org/wiki/Hybrid_Transactional/Analytical_Processing_(HTAP)). 
To see this in action, use the corresponding script in sample-script.zip, which is part of the [release of the sample](http://go.microsoft.com/fwlink/?LinkID=800630).| -|PolyBase|To see PolyBase in action, using an external table with a public data set hosted in Azure blob storage, use the corresponding script in sample-script.zip, which is part of the [release of the sample](http://go.microsoft.com/fwlink/?LinkID=800630).| -|In-Memory OLTP|(Full version of the database) The table types are all memory-optimized, such that table-valued parameters (TVPs) all benefit from memory-optimization.

The two monitoring tables, `Warehouse.VehicleTemperatures` and `Warehouse.ColdRoomTemperatures`, are memory-optimized. This allows the ColdRoomTemperatures table to be populated at a higher speed than a traditional disk-based table. The VehicleTemperatures table holds the JSON payload and lends itself to extension towards IoT scenarios, as well as to scenarios involving Event Hubs, Stream Analytics, and Power BI.

The stored procedure `Website.RecordColdRoomTemperatures` is natively compiled to further improve the performance of recording cold room temperatures.

To see an example of In-Memory OLTP in action, see the vehicle-locations workload driver in workload-drivers.zip, which is part of the [release of the sample](http://go.microsoft.com/fwlink/?LinkID=800630).| -|Clustered columnstore index|(Full version of the database) The table `Warehouse.StockItemTransactions` uses a clustered columnstore index. The number of rows in this table is expected to grow large, and the clustered columnstore index significantly reduces the on-disk size of the table, and improves query performance. The modifications on this table are insert-only - there is no update/delete on this table in the online workload - and the clustered columnstore index performs well for insert workloads.| -|Dynamic Data Masking|In the database schema, Data Masking has been applied to the bank details held for Suppliers, in the table `Purchasing.Suppliers`. Non-admin staff will not have access to this information.| -|Always Encrypted|A demo for Always Encrypted is included in the downloadable samples.zip, which is part of the [release of the sample](http://go.microsoft.com/fwlink/?LinkID=800630). The demo creates an encryption key, a table using encryption for sensitive data, and a small sample application that inserts data into the table.| -|Stretch database|The `Warehouse.ColdRoomTemperatures` table has been implemented as a temporal table, and is memory-optimized in the Full version of the sample database. The archive table is disk-based and can be stretched to Azure.| -|Full-text indexes|Full-text indexes improve searches for People, Customers, and StockItems. The indexes are applied to queries only if you have full-text indexing installed on your SQL Server instance. A non-persistent computed column is used to create the data that is full-text indexed in the StockItems table.

`CONCAT` is used for concatenating the fields to create SearchData that is full-text indexed.
To enable the use of full-text indexes in the sample, execute the following statement in the database:

`EXECUTE [Application].[Configuration_ConfigureFullTextIndexing]`

The procedure creates a default fulltext catalog if one doesn’t already exist, then replaces the search views with full-text versions of those views.

Note that using full-text indexes in SQL Server requires selecting the Full-Text option during installation. Azure SQL Database does not require any specific configuration to enable full-text indexes.| -|Indexed persisted computed columns|Indexed persisted computed columns are used in SupplierTransactions and CustomerTransactions.| -|Check constraints|A relatively complex check constraint is in `Sales.SpecialDeals`. This ensures that one and only one of DiscountAmount, DiscountPercentage, and UnitPrice is configured.| -|Unique constraints|A many-to-many construction (and unique constraints) is set up for `Warehouse.StockItemStockGroups`.| -|Table partitioning|(Full version of the database) The tables `Sales.CustomerTransactions` and `Purchasing.SupplierTransactions` are both partitioned by year using the partition function `PF_TransactionDate` and the partition scheme `PS_TransactionDate`. Partitioning is used to improve the manageability of large tables.| -|List processing|An example table type `Website.OrderIDList` is provided. It is used by an example procedure `Website.InvoiceCustomerOrders`. The procedure uses Common Table Expressions (CTEs), TRY/CATCH, JSON_MODIFY, XACT_ABORT, NOCOUNT, THROW, and XACT_STATE to demonstrate the ability to process a list of orders rather than just a single order, to minimize round trips from the application to the database engine.| -|GZip compression|The `Warehouse.VehicleTemperatures` table holds full sensor data but when this data is more than a few months old, it is compressed to conserve space using the COMPRESS function, which uses GZip compression.

The view `Website.VehicleTemperatures` uses the DECOMPRESS function when retrieving data that was previously compressed.| -|Query Store|Query Store is enabled on the database. After running a few queries, open the database in Management Studio, open the node Query Store, which is under the database, and open the report Top Resource Consuming Queries to see the query executions and the plans for the queries you just ran.| -|STRING_SPLIT|The column `DeliveryInstructions` in the table `Sales.Invoices` has a comma-delimited value that can be used to demonstrate STRING_SPLIT.| -|Audit|SQL Server Audit can be enabled for this sample database by running the following statement in the database:

`EXECUTE [Application].[Configuration_ApplyAuditing]`

In Azure SQL Database, auditing is enabled through the [Azure portal](https://portal.azure.com/).

Security operations involving logins, roles and permissions are logged on all systems where audit is enabled (including standard edition systems). Audit is directed to the application log because this is available on all systems and does not require additional permissions. A warning is given that for higher security, it should be redirected to the security log or to a file in a secure folder. A link is provided to describe the required additional configuration.

For evaluation/developer/enterprise edition systems, access to all financial transactional data is audited.| diff --git a/samples/databases/wide-world-importers/documentation/wwi-overview.md b/samples/databases/wide-world-importers/documentation/wwi-overview.md deleted file mode 100644 index 352b4c8a68..0000000000 --- a/samples/databases/wide-world-importers/documentation/wwi-overview.md +++ /dev/null @@ -1,49 +0,0 @@ -# Wide World Importers Overview - -This is an overview of the fictitious company Wide World Importers and the workflows that are addressed in the WideWorldImporters sample databases for SQL Server and Azure SQL Database. - -Wide World Importers (WWI) is a wholesale novelty goods importer and distributor operating from the San Francisco bay area. - -As a wholesaler, WWI’s customers are mostly companies who resell to individuals. WWI sells to retail customers across the United States including specialty stores, supermarkets, computing stores, tourist attraction shops, and some individuals. WWI also sells to other wholesalers via a network of agents who promote the products on WWI’s behalf. While all of WWI’s customers are currently based in the United States, the company is intending to push for expansion into other countries. - -WWI buys goods from suppliers including novelty and toy manufacturers, and other novelty wholesalers. They stock the goods in their WWI warehouse and reorder from suppliers as needed to fulfil customer orders. They also purchase large volumes of packaging materials, and sell these in smaller quantities as a convenience for the customers. - -Recently WWI started to sell a variety of edible novelties such as chilly chocolates. The company previously did not have to handle chilled items. Now, to meet food handling requirements, they must monitor the temperature in their chiller room and any of their trucks that have chiller sections. - -## Workflow for warehouse stock items - -The typical flow for how items are stocked and distributed is as follows: -- WWI creates purchase orders and submits the orders to the suppliers. -- Suppliers send the items, WWI receives them and stocks them in their warehouse. -- Customers order items from WWI -- WWI fills the customer order with stock items in the warehouse, and when they do not have sufficient stock, they order the additional stock from the suppliers. -- Some customers do not want to wait for items that are not in stock. If they order say five different stock items, and four are available, they want to receive the four items and backorder the remaining item. The item would them be sent later in a separate shipment. -- WWI invoices customers for the stock items, typically by converting the order to an invoice. -- Customers might order items that are not in stock. These items are backordered. -- WWI delivers stock items to customers either via their own delivery vans, or via other couriers or freight methods. -- Customers pay invoices to WWI. -- Periodically, WWI pays suppliers for items that were on purchase orders. This is often sometime after they have received the goods. - -## Data Warehouse and analysis workflow - -While the team at WWI use SQL Server Reporting Services to generate operational reports from the WideWorldImporters database, they also need to perform analytics on their data and need to generate strategic reports. The team have created a dimensional data model in a database WideWorldImportersDW. This database is populated by an Integration Services package. 
- -SQL Server Analysis Services is used to create analytic data models from the data in the dimensional data model. SQL Server Reporting Services is used to generate strategic reports directly from the dimensional data model, and also from the analytic model. Power BI is used to create dashboards from the same data. The dashboards are used on websites, and on phones and tablets. *Note: these data models and reports are not yet available* - -## Additional workflows - -These are additional workflows. -- WWI issues credit notes when a customer does not receive the good for some reason, or when the goods are faulty. These are treated as negative invoices. -- WWI periodically counts the on-hand quantities of stock items to ensure that the stock quantities shown as available on their system are accurate. (The process of doing this is called a stocktake). -- Cold room temperatures. Perishable goods are stored in refrigerated rooms. Sensor data from these rooms is ingested into the database for monitoring and analytics purposes. -- Vehicle location tracking. Vehicles that transport goods for WWI include sensors that track the location. This location is again ingested into the database for monitoring and further analytics. - -## Fiscal year - -The company operates with a financial year that starts on November 1st. - -## Terms of use - -The license for the sample database and the sample code is described here: [license.txt](https://github.com/Microsoft/sql-server-samples/blob/master/license.txt) - -The sample database includes public data that has been loaded from data.gov and Natural EarthData. The terms of use are here: [http://www.naturalearthdata.com/about/terms-of-use/](http://www.naturalearthdata.com/about/terms-of-use/)