Shows how to use an external Hive metastore (SQL Server) along with ADLS Gen 1 as part of a Databricks initialization script that runs when the cluster is created.
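A minimal sketch of the cluster Spark configuration such a setup typically applies, assuming a SQL Server-backed metastore and service-principal access to ADLS Gen 1; every angle-bracketed value is a placeholder, not something from the linked repo:

```
# External Hive metastore on SQL Server (JDO connection settings)
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<server>.database.windows.net:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <password>

# ADLS Gen 1 access via an Azure AD service principal
spark.hadoop.dfs.adls.oauth2.access.token.provider.type ClientCredential
spark.hadoop.dfs.adls.oauth2.client.id <application-id>
spark.hadoop.dfs.adls.oauth2.credential <client-secret>
spark.hadoop.dfs.adls.oauth2.refresh.url https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

In practice these keys can be set in the cluster's Spark config or written out by the init script; secrets are better referenced from a secret scope than inlined.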
Updated Nov 12, 2018 - PowerShell
Upload a folder to Azure Data Lake Store
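A folder upload like this can be sketched with the AzureRM-era Data Lake Store cmdlets (matching the page's Nov 2018 timeframe); the subscription, account name, and paths below are placeholders:

```powershell
# Sign in and select the subscription that owns the ADLS Gen 1 account
Connect-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "<subscription-name>"

# Recursively upload a local folder to the store; names and paths are placeholders
Import-AzureRmDataLakeStoreItem -AccountName "<adls-account>" `
    -Path "C:\data\reports" `
    -Destination "/raw/reports" `
    -Recurse
```

`-Recurse` walks the local folder tree; add `-Force` to overwrite files that already exist at the destination.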
Queues up files to copy from one ADLS account to another. You can also use this for on-premises and/or Blob storage.
Submitting a U-SQL Job to Azure Data Lake Analytics
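Submitting a U-SQL job can be sketched with the AzureRM Data Lake Analytics cmdlets; the account name, script path, and parallelism here are placeholders:

```powershell
# Submit a U-SQL script to a Data Lake Analytics account
$job = Submit-AzureRmDataLakeAnalyticsJob -Account "<adla-account>" `
    -Name "Sample U-SQL job" `
    -ScriptPath "C:\usql\query.usql" `
    -DegreeOfParallelism 2

# Block until the job reaches a terminal state
Wait-AzureRmDataLakeAnalyticsJob -Account "<adla-account>" -JobId $job.JobId
```

`Wait-AzureRmDataLakeAnalyticsJob` polls the service, so it is convenient in scripts that need the job result before continuing.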
Places a resource lock on your ADLS resources so you cannot accidentally delete them.
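A delete lock of this kind can be sketched with `New-AzureRmResourceLock`; the lock name, account, and resource group are placeholders:

```powershell
# Place a CannotDelete lock on an ADLS Gen 1 account
New-AzureRmResourceLock -LockName "adls-no-delete" `
    -LockLevel CannotDelete `
    -ResourceName "<adls-account>" `
    -ResourceType "Microsoft.DataLakeStore/accounts" `
    -ResourceGroupName "<resource-group>"
```

With a `CannotDelete` lock in place, deletion fails until the lock is removed with `Remove-AzureRmResourceLock`.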
PowerShell to copy a Data Lake file from a local computer.