This repository was archived by the owner on Sep 25, 2025. It is now read-only.
Merged
1 change: 1 addition & 0 deletions docs/_sidebar.md
@@ -59,6 +59,7 @@
- [Configure Service Connections](/how-tos/datacoves/how_to_service_connections.md)
- [Manage Users](/how-tos/datacoves/how_to_manage_users.md)
- [Update Repository](/getting-started/Admin/configure-repository.md)
- [Configure VSCode Environment Variables](/how-tos/datacoves/how_to_environment_variables.md)
- [Datahub](/how-tos/datahub/)
- [Configure dbt metadata ingestion](/how-tos/datahub/how_to_datahub_dbt.md)
- [Configure Snowflake metadata ingestion](/how-tos/datahub/how_to_datahub_snowflake.md)
Binary file added docs/how-tos/datacoves/assets/env_vars_1.png
Binary file added docs/how-tos/datacoves/assets/env_vars_2.png
Binary file added docs/how-tos/datacoves/assets/env_vars_3.png
Binary file added docs/how-tos/datacoves/assets/env_vars_4.png
30 changes: 30 additions & 0 deletions docs/how-tos/datacoves/how_to_environment_variables.md
@@ -0,0 +1,30 @@
# How to use custom VSCode Environment Variables

Even though you can create environment variables in your VSCode instance with the traditional `export` command, configuring your Environment with custom variables is an important time-saver: unlike exported variables, they are not cleared when your Environment restarts (due to inactivity, maintenance, or changes to its settings).
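For contrast, this is a minimal sketch of the session-only approach; the variable name and value are hypothetical, and the variable is lost whenever the Environment restarts:

```shell
# Session-only: this variable disappears when the VSCode Environment restarts.
export DBT_TARGET_SCHEMA="dev_analytics"
echo "$DBT_TARGET_SCHEMA"
```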

You can configure environment variables in 3 different places:

- `Project Settings`: applies to the entire Project (including all of its Environments)
- `Environment Settings`: applies to the selected Environment
- `User Settings`: applies to the user's VSCode instance

As the UI is almost the same across the 3 pages, we'll illustrate the process using the `Environment Settings` screen.

To configure custom VSCode environment variables:

- Navigate to `VS Code Environment Variables` inside your Environment settings. Once there, click `Add`.
> Review comment (Contributor): @BAntonellini we can set env vars at the project, env, or user settings, no?
![VS Code Environment Variables](./assets/env_vars_1.png)

- A pop-up will appear, where you must specify a `Key` and a `Value` for your environment variable. Once set, click `Add`.

![Configure ENV VAR](./assets/env_vars_2.png)

- You will be sent back to your Environment settings, where you should see the newly created environment variable.
- Once there, make sure to click `Save Changes` to apply it to your Environment.

![Save Changes](./assets/env_vars_3.png)

That's all! Now you can use this new persistent variable in your VSCode instance.

![VSCode](./assets/env_vars_4.png)
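For example, in a VSCode terminal the persisted variable behaves like any other environment variable. The key and value below are hypothetical placeholders (in Datacoves the variable is injected automatically, so the first line only simulates that for illustration):

```shell
# Simulate the persisted variable (in Datacoves it is already present):
MY_API_URL="https://api.example.com"

# Use it as you would any environment variable:
echo "MY_API_URL is: ${MY_API_URL}"
```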
28 changes: 23 additions & 5 deletions docs/reference/airflow/datacoves-decorators.md
@@ -18,7 +18,7 @@ This custom decorator is an extension of Airflow's default @task decorator and s

**Params:**

- `env`: Pass in a dictionary of variables, e.g. `"my_var": "{{ var.value.my_var }}"`. Use the `{{ var.value.my_var }}` syntax to avoid Airflow re-parsing the value every 30 seconds.
- `outlets`: Used to connect a task to an object in datahub or update a dataset
- `append_env`: Add env vars to existing ones like `DATACOVES__DBT_HOME`
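The merge itself is performed by the decorator at runtime, but the intended effect of `append_env` can be sketched like this. The paths and keys below are illustrative placeholders, not guaranteed values:

```python
# Sketch of how `append_env=True` might combine your `env` dict with the
# worker's existing environment variables (e.g. DATACOVES__DBT_HOME).
existing = {"DATACOVES__DBT_HOME": "/config/workspace/transform"}
env = {"my_var": "{{ var.value.my_var }}"}  # Jinja template, rendered by Airflow at runtime

merged = {**existing, **env}  # existing vars are preserved, new ones added
print(sorted(merged.keys()))
```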

@@ -43,8 +43,23 @@ This custom decorator is an extension of the @task decorator and simplifies runn
- It runs dbt commands inside the dbt Project Root, not the Repository root.

**Params:**

Datacoves dbt decorator supports all the [Datacoves dbt Operator params](/reference/airflow/datacoves-operator#datacoves-dbt-operator) plus:

- `connection_id`: The [service connection](/how-tos/datacoves/how_to_service_connections.md) that is automatically added to Airflow when you select `Airflow Connection` as the `Delivery Mode`.
- `overrides`: Pass in a dictionary with override parameters such as warehouse, role, or database.

**dbt profile generation:**

With the `connection_id` mentioned above, we create a temporary dbt profile (it only exists at runtime inside the Airflow DAG's worker). By default, this dbt profile contains the selected Service Credential connection details.

The dbt profile `name` is defined in either the Project or the Environment settings, in their `Profile name` field. It can be overridden by passing a custom `DATACOVES__DBT_PROFILE` environment variable to the decorator.

Users can also customize this dbt profile's connection details and/or target with the following params:

- `overrides`: a dictionary with override parameters such as warehouse, role, database, etc.
- `target`: the target name this temporary dbt profile will receive. Defaults to `default`.
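Putting these pieces together, the temporary profile might look roughly like the following sketch. The profile name `analytics`, the target name, and all placeholder values are hypothetical; the real values come from your Project/Environment settings and the selected service connection:

```yaml
# Hypothetical generated profiles.yml (exists only inside the worker at runtime)
analytics:            # profile name from settings, or DATACOVES__DBT_PROFILE if set
  target: testing     # from the `target` param (defaults to "default")
  outputs:
    testing:
      type: snowflake
      account: <from service connection>
      user: <from service connection>
      role: <from service connection>
      database: <from service connection>
      schema: <from service connection>
      warehouse: my_custom_wh   # value replaced via `overrides`
```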

Basic example:

```python
def my_dbt_dag():
    ...
```

@@ -63,9 +78,12 @@ Example with overrides.

```python
def my_dbt_dag():
    @task.datacoves_dbt(
        connection_id="main",
        overrides={"warehouse": "my_custom_wh"},
        env={"DATACOVES__DBT_PROFILE": "prod"},
        target="testing",
    )
    def dbt_test() -> str:
        # Make sure to pass `-t {target}` if you are using a custom target name.
        return "dbt debug -t testing"

dag = my_dbt_dag()
```
@@ -114,7 +132,7 @@ The new datacoves_dbt parameters are:

- `db_type`: The data warehouse you are using. Currently supports `redshift` or `snowflake`.
- `destination_schema`: The destination schema where the Airflow tables will end up. By default, the schema is named `airflow-{datacoves environment slug}`, for example `airflow-qwe123`.
- `connection_id`: The name of your Airflow [service connection](/how-tos/datacoves/how_to_service_connections.md), which is automatically added to Airflow when you select `Airflow Connection` as the `Delivery Mode`.
- `additional_tables`: A list of additional tables you would want to add to the default set.
- `tables`: A list of tables that overrides the default set above. Warning: an empty list `[]` will perform a full-database sync.
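The documented default for `destination_schema` can be sketched as a small helper; the environment slug below is the hypothetical example from the parameter description:

```python
# Sketch of the documented default naming for `destination_schema`:
# "airflow-{datacoves environment slug}".
def default_destination_schema(env_slug: str) -> str:
    return f"airflow-{env_slug}"

print(default_destination_schema("qwe123"))  # → airflow-qwe123
```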

1 change: 1 addition & 0 deletions docs/reference/airflow/datacoves-operator.md
@@ -73,6 +73,7 @@ Params:

- `bash_command`: command to run
- `project_dir` (optional): relative path from repo root to a specific dbt project.
- `run_dbt_deps` (optional): boolean that forces a `dbt deps` run.

```python
import datetime