
Commit e76e58c

Update readme and quickstart guide (#173)
- cleanup readme
- add settings and commands to the quickstart
- cleanup commands
- fix "full sync" command
- removed some unused commands
1 parent 78fc04e commit e76e58c

12 files changed: +129 -176 lines changed
Lines changed: 24 additions & 69 deletions
@@ -1,91 +1,46 @@
 # Databricks extension for VSCode

-| System | Status |
-| ----------------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| Build ([main branch](https://github.com/databricks/databricks-vscode/commits/main)) | [![GitHub CI Status](https://github.com/databricks/databricks-vscode/actions/workflows/push.yml/badge.svg?branch=main)](https://github.com/databricks/databricks-vscode/actions/workflows/push.yml) [![Coverage](https://img.shields.io/codecov/c/github/databricks/databricks-vscode/main.svg)](https://codecov.io/gh/databricks/databricks-vscode/branch/main) [![LGTM Grade](https://img.shields.io/lgtm/grade/javascript/github/databricks/databricks-vscode)](https://lgtm.com/projects/g/databricks/databricks-vscode/) |
-| [Marketplace](https://marketplace.visualstudio.com/items?itemName=databricks.databricks-vscode) | [![Marketplace Version](https://img.shields.io/vscode-marketplace/v/databricks.databricks-vscode.svg) ![Marketplace Downloads](https://img.shields.io/vscode-marketplace/d/databricks.databricks-vscode.svg)](https://marketplace.visualstudio.com/items?itemName=databricks.databricks-vscode) |
+| System | Status |
+| ----------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| Build ([main branch](https://github.com/databricks/databricks-vscode/commits/main)) | [![GitHub CI Status](https://github.com/databricks/databricks-vscode/actions/workflows/push.yml/badge.svg?branch=main)](https://github.com/databricks/databricks-vscode/actions/workflows/push.yml) |
+| [Marketplace](https://marketplace.visualstudio.com/items?itemName=databricks.databricks-vscode) | [![Marketplace Version](https://img.shields.io/vscode-marketplace/v/databricks.databricks-vscode.svg) ![Marketplace Downloads](https://img.shields.io/vscode-marketplace/d/databricks.databricks-vscode.svg)](https://marketplace.visualstudio.com/items?itemName=databricks.databricks-vscode) |

 The Databricks extension for VSCode allows you to develop for the Databricks Lakehouse platform from VSCode.

 The extension is available from the [Visual Studio Marketplace](https://marketplace.visualstudio.com/itemdetails?itemName=databricks.databricks-vscode).

-This is an open source project because we want you to be involved. We love issues, feature requests, code reviews, pull
-requests or any positive contribution. See [CONTRIBUTING.md](CONTRIBUTING.md).
+This is an open source project because we want you to be involved. We love issues, feature requests, code reviews, pull requests or any positive contribution. See [CONTRIBUTING.md](CONTRIBUTING.md).

 ## Features

-<mark>TODO</mark>
+- Synchronize code to a Databricks workspace
+- Run Python files on a Databricks cluster
+- Run notebooks and Python files as Workflows

-Describe specific features of your extension including screenshots of your extension in action. Image paths are relative to this README file.
-
-For example if there is an image subfolder under your extension project workspace:
-
-\!\[feature X\]\(images/feature-x.png\)
-
-> Tip: Many popular extensions utilize animations. This is an excellent way to show off your extension! We recommend short, focused animations that are easy to follow.
+![run](./images/run.gif)

 ## Requirements

-<mark>TODO</mark>
-
-If you have any requirements or dependencies, add a section describing those and how to install and configure them.
-
-## Extension Settings
-
-<mark>TODO</mark>
-
-Include if your extension adds any VS Code settings through the `contributes.configuration` extension point.
+In order to use this extension you need access to a Databricks workspace:

-For example:
+1. Databricks workspace with:
+   1. `Repos` enabled
+   2. `Files in Repos` enabled
+2. Permission to access the workspace using a personal access token (PAT)
+3. Access to an interactive cluster or permissions to create a cluster
+4. Permissions to create Databricks repos

-This extension contributes the following settings:
+## Documentation

-- `myExtension.enable`: enable/disable this extension
-- `myExtension.thing`: set to `blah` to do something
-
-## Known Issues
-
-<mark>TODO</mark>
-
-Calling out known issues can help limit users opening duplicate issues against your extension.
+- The [Quick Start Guide](README.quickstart.md) provides an overview
+  of common features.
+- <mark>The [User Guide](https://docs.databricks.com/)
+  contains comprehensive documentation about the Databricks extension. (TODO: Link not available yet)</mark>

 ## Release Notes

-<mark>TODO</mark>
-
-Users appreciate release notes as you update your extension.
-
-### 1.0.0
-
-Initial release of ...
-
-### 1.0.1
-
-Fixed issue #.
-
-### 1.1.0
-
-Added features X, Y, and Z.
-
----
-
-## Following extension guidelines
-
-Ensure that you've read through the extensions guidelines and follow the best practices for creating your extension.
-
-- [Extension Guidelines](https://code.visualstudio.com/api/references/extension-guidelines)
-
-## Working with Markdown
-
-**Note:** You can author your README using Visual Studio Code. Here are some useful editor keyboard shortcuts:
-
-- Split the editor (`Cmd+\` on macOS or `Ctrl+\` on Windows and Linux)
-- Toggle preview (`Shift+CMD+V` on macOS or `Shift+Ctrl+V` on Windows and Linux)
-- Press `Ctrl+Space` (Windows, Linux) or `Cmd+Space` (macOS) to see a list of Markdown snippets
-
-### For more information
+### 0.0.1

-- [Visual Studio Code's Markdown Support](http://code.visualstudio.com/docs/languages/markdown)
-- [Markdown Syntax Reference](https://help.github.com/articles/markdown-basics/)
+Preview version of the VSCode extension for Databricks

-**Enjoy!**
+**Happy Coding!**

packages/databricks-vscode/README.quickstart.md

Lines changed: 48 additions & 6 deletions
@@ -8,11 +8,22 @@ The Databricks extension for VSCode allows you to develop for the Databricks Lak
 - Run Python files on a Databricks cluster
 - Run notebooks and Python files as Workflows

+## <a id="toc"></a>Table of Contents
+
+- [Getting Started](#setup-steps)
+- [Configure Extension](#configure-extension)
+- [Running Code](#running-code)
+- [Running PySpark Code](#running-pyspark-code)
+- [Running PySpark Code and Notebooks as Workflows](#running-code-as-workflows)
+- [Advanced: Running using custom run configurations](#run-configurations)
+- [Extension Settings](#settings)
+- [`Databricks:` Commands](#commands)
+
 ---

 # <a id="setup-steps"></a>Getting Started

-## Configure Extension
+## <a id="configure-extension"></a>Configure Extension

 1. Open the Databricks panel by clicking on the Databricks icon on the left
 2. Click the "Configure Databricks" button
@@ -26,30 +37,61 @@ The Databricks extension for VSCode allows you to develop for the Databricks Lak

 ![configure](./images/configure.gif)

-## Running Code
+## <a id="running-code"></a>Running Code

 Once you have your project configured you can sync your local code to the repo and run it on a cluster. You can use the https://github.com/databricks/ide-best-practices repository as an example.

-### Running PySpark code
+### <a id="running-pyspark-code"></a>Running PySpark code

 1. Create python file
 2. Add PySpark code to the python file.
 3. Click the "Run" icon in the tab bar and select "Run File on Databricks"

 This will start the code synchronization and run the active python file on the configured cluster. The result is printed in the "debug" output panel.

-![configure](./images/run.gif)
+![run](./images/run.gif)

-### Running PySpark and notebooks as a Workflow
+### <a id="running-code-as-workflows"></a>Running PySpark and notebooks as a Workflow

 1. Create a python file or a python based notebook
    1. You can create a python based notebook by exporting a notebook from the Databricks web application or use a notebook that is already tracked in git, such as https://github.com/databricks/notebook-best-practices
 2. Click the "Run" icon in the tab bar and select "Run File as Workflow on Databricks"

 This will run the file using the Jobs API on the configured cluster and render the result in a WebView.

-### Advanced: Running using custom run configurations
+### <a id="run-configurations"></a>Advanced: Running using custom run configurations

 Both ways of running code on a cluster are also available in custom run configurations. In the "Run and Debug" panel you can click "Add configuration..." and select either "Databricks: Launch" or "Databricks: Launch as Workflow". Using run configuration you can also pass in command line arguments and run your code by simply pressing `F5`.

 ![configure](./images/custom-runner.gif)
+
+## <a id="settings"></a>Extension Settings
+
+This extension contributes the following settings:
+
+- `databricks.logs.maxFieldLength`: The maximum length of each field displayed in the logs output panel
+- `databricks.logs.truncationDepth`: The max depth of logs to show without truncation
+- `databricks.logs.maxArrayLength`: The maximum number of items to show for array fields
+- `databricks.logs.enabled`: Enable/disable logging. Reload window for changes to take effect
+
+## <a id="commands"></a>`Databricks:` Commands
+
+The Databricks extension provides commands (prefixed with `Databricks:`) to the VS Code _command
+palette_, available by selecting _View > Command Palette_ or by typing
+`CTRL-SHIFT-p` (macOS: `CMD-SHIFT-p`).
+
+| Databricks Command | Description |
+| :----------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------- |
+| `Databricks: Configure workspace` | Configure the Databricks workspace to use for the current project |
+| `Databricks: Logout` | Logs you out from your Databricks workspace |
+| `Databricks: Configure cluster` | Select an interactive cluster to use for running PySpark code in this project |
+| `Databricks: Detach cluster` | Detach the configured cluster |
+| `Databricks: Configure sync destination` | Configure the target directory for synchronizing code to the configured Databricks workspace |
+| `Databricks: Detach sync destination` | Detach the configured sync destination |
+| `Databricks: Start synchronization` | Start synchronizing local code to the Databricks workspace. This command performs an incremental sync. |
+| `Databricks: Start synchronization (full sync)` | Start synchronizing local code to the Databricks workspace. This command performs a full sync even if an incremental sync is possible. |
+| `Databricks: Stop synchronization` | Stop the sync process. |
+| `Databricks: Run File on Databricks` | Runs the selected Python file on the configured Databricks cluster |
+| `Databricks: Run File as Workflow on Databricks` | Runs the selected Python file as a Workflow on the configured Databricks cluster |
+| `Databricks: Show Quickstart` | Show the Quickstart panel |
+| `Databricks: Open Databricks configuration file` | Opens the Databricks configuration file for the current project |
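
For the "Running PySpark code" steps above, a minimal Python file might look like the sketch below. The file name and sample data are illustrative, and the sketch assumes the `spark` session object that Databricks clusters provide to user code, with a local fallback so the file also runs outside Databricks.

```python
# hello_databricks.py (illustrative) - a small PySpark script to try with
# "Run File on Databricks". On a Databricks cluster a `spark` session is
# already available; the fallback below only matters when running locally.
try:
    spark  # noqa: F821 - provided by the Databricks runtime
except NameError:
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [("Alice", 34), ("Bob", 45), ("Carol", 29)],
    schema=["name", "age"],
)

# Output from show() is printed to the extension's "debug" output panel.
df.filter(df.age > 30).orderBy("name").show()
```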
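
For the workflow steps, a Python-based notebook is a plain `.py` file in the Databricks export format, recognizable by its `# Databricks notebook source` header and `# COMMAND ----------` cell markers. The cell contents below are illustrative; `spark` and `display` are provided by the Databricks runtime when the file runs as a workflow.

```python
# Databricks notebook source
# An exported Databricks notebook is a regular Python file with the header
# above; "Run File as Workflow on Databricks" submits it via the Jobs API.

df = spark.range(10).withColumnRenamed("id", "n")

# COMMAND ----------

# Each "COMMAND" marker starts a new notebook cell; results are rendered
# in the WebView that the extension opens for workflow runs.
display(df.selectExpr("n", "n * n AS n_squared"))
```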
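
The `databricks.logs.*` settings listed above can be set in an ordinary VS Code `settings.json`. The values below are illustrative assumptions, not documented defaults.

```jsonc
// .vscode/settings.json (illustrative values only)
{
  // Reload the window after toggling logging for the change to take effect.
  "databricks.logs.enabled": true,
  // Truncate individual log fields beyond this many characters.
  "databricks.logs.maxFieldLength": 40,
  // Maximum nesting depth shown before log objects are truncated.
  "databricks.logs.truncationDepth": 2,
  // Maximum number of array items shown per field.
  "databricks.logs.maxArrayLength": 2
}
```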

0 commit comments
