diff --git a/docs.json b/docs.json
index 56f97d90..1bb2a5d6 100644
--- a/docs.json
+++ b/docs.json
@@ -68,7 +68,7 @@
{
"group": "Exploring Data in Lightdash",
"pages": [
- "guides/ai-analyst",
+ "guides/ai-agents",
"guides/metrics-catalog",
"guides/limiting-data-using-filters",
"guides/interactive-dashboards",
@@ -170,7 +170,7 @@
"references/filters",
"references/table-calculations",
"references/custom-fields",
- "references/custom-tooltips",
+ "references/custom-tooltip",
"references/custom-charts"
]
},
diff --git a/get-started/setup-lightdash/connect-project.mdx b/get-started/setup-lightdash/connect-project.mdx
index d71974ab..728c3600 100644
--- a/get-started/setup-lightdash/connect-project.mdx
+++ b/get-started/setup-lightdash/connect-project.mdx
@@ -4,27 +4,22 @@ title: "Update your project connection"
To set up your Lightdash connection, you'll need to:
-1. Connect to your data warehouse (bigquery, postgres, redshift, snowflake, databricks)
+1. [Connect to your data warehouse](/get-started/setup-lightdash/connect-project#1-connect-to-a-warehouse)
+2. [Connect to your dbt project](/get-started/setup-lightdash/connect-project#2-import-a-dbt-project)
-2. Connect to your dbt project
## Open up your Lightdash instance to get started.
-**To update an existing connection**, head to:
+**To update an existing connection**, go to **Project settings** by clicking on the gear icon in the top-right navigation.
-1. Settings
-2. Project management
-3. Settings for your project
-
-
-**To create a new project connection**, head to:
+
+
+
-1. Settings
-2. Project management
-4. Create New
+**To create a new project**, go to **Organization settings**, then **All projects** and click **Create new**.
-
+
## 1. Connect to a warehouse
@@ -56,9 +51,8 @@ We always recommend giving read-only permissions to Lightdash, that way you ensu
If you log in at:
-* **app.lightdash.cloud** use **35.245.81.252**
-
-* **eu1.lightdash.cloud** use **34.79.239.130**
+- **app.lightdash.cloud** use **35.245.81.252**
+- **eu1.lightdash.cloud** use **34.79.239.130**
If you log in at a different domain, look for the IP in the project settings page (see image below).
@@ -71,7 +65,7 @@ If you login at a different domain, look for the IP in the project settings page
### Bigquery
-#### Project
+##### Project
This is the project ID from Google Cloud Platform for the data that you want to connect Lightdash to.
@@ -96,15 +90,15 @@ Then, you should see all of the projects and their project IDs in your organizat
For the project you want to connect Lightdash to, just copy its `id` and pop it into the `project` field in the Warehouse Connection form in Lightdash.
-#### Authentication type
+##### Authentication type
You can choose to connect to BigQuery with a user account (using "Sign in with Google") or with a service account using a JSON key file.
-#### User Account (Sign in with Google)
+##### User Account (Sign in with Google)
When you use "Sign in with Google", Lightdash will execute queries against BigQuery with your personal Google user account. This is the simplest way to get connected quickly using your existing account.
-#### Service account (JSON Key File)
+##### Service account (JSON Key File)
To have Lightdash connect to BigQuery with a service account, you need to create the account and JSON key. You can read more about [creating and managing service accounts with Google BigQuery in their docs](https://cloud.google.com/iam/docs/creating-managing-service-accounts). You will need permissions to create service accounts and keys in your Google Project. If you don't have the permissions, use your user account instead.
@@ -122,7 +116,7 @@ If you need to provide access to data across multiple BigQuery projects, the ser
Once you have a service account all ready to go, you'll need to add its JSON key file to Lightdash in the `key file` section of the Warehouse Connection page.
-#### Location
+##### Location
The data location of the dataset in BigQuery where the output of your dbt models is written to.
@@ -135,7 +129,7 @@ You can find the location of the dataset you're using for your dbt project [in y
-#### Timeout in seconds
+##### Timeout in seconds
BigQuery supports query timeouts. By default, the timeout is set to 300 seconds. If a query run by Lightdash takes longer than this timeout to complete, then BigQuery may cancel the query and issue the following error:
@@ -146,14 +140,14 @@ Operation did not complete within the designated timeout.
To change this timeout, use the `Timeout in seconds` configuration.
-#### Priority
+##### Priority
The priority for the BigQuery jobs that Lightdash executes can be configured with the `priority` configuration in your Warehouse Connection settings. The `priority` field can be set to one of `batch` or `interactive`.
For more information on query priority, [check out the BigQuery documentation.](https://cloud.google.com/bigquery/docs/running-queries)
-#### Retries
+##### Retries
The `retries` configuration specifies the number of times Lightdash should retry queries that result in unhandled server errors.
@@ -162,7 +156,7 @@ For example, setting `retries` to 5 means that Lightdash will retry BigQuery que
By default, the number of retries is set to 3.
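The retry behaviour described above can be sketched as follows. This is an illustrative stand-in, not Lightdash's actual implementation; `RuntimeError` here stands in for an unhandled server error:

```python
import time

def run_with_retries(run_query, retries=3, delay_seconds=1.0):
    """Call run_query(), retrying up to `retries` extra times on errors."""
    for attempt in range(retries + 1):
        try:
            return run_query()
        except RuntimeError:
            if attempt == retries:
                raise  # retry budget exhausted: surface the error
            time.sleep(delay_seconds)
```

With `retries=5`, a query is attempted up to six times in total before the error is surfaced, matching the example above.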
-#### Maximum bytes billed
+##### Maximum bytes billed
If a value for the `Maximum bytes billed` is set, then queries executed by Lightdash will fail if they exceed the configured maximum bytes threshold. This configuration should be supplied as an integer number of bytes.
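Because the setting takes a raw integer byte count, it can help to compute it from a friendlier unit. A minimal sketch (the helper name is ours, not a Lightdash API):

```python
def gigabytes_to_bytes(gigabytes):
    """Convert a GB budget to the integer byte count the setting expects."""
    return int(gigabytes * 1_000_000_000)

print(gigabytes_to_bytes(1))  # 1000000000, i.e. a 1 GB budget
```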
@@ -173,12 +167,12 @@ For example, setting this to `1000000000` means if a query would bill more than
```
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
-#### Execution project
+##### Execution project
Here you can specify an execution project to bill for query execution, instead of using the project where your dbt resources are materialized. If you leave this blank, all costs get applied to the project from the top of the connection details.
@@ -189,23 +183,23 @@ Here you can specify an execution project to bill for query execution, instead o
You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile).
-#### Host
+##### Host
This is the host where the database is running.
-#### User
+##### User
This is the database user name.
-#### Password
+##### Password
This is the database user password.
-#### DB name
+##### DB name
This is the database name.
-#### Schema
+##### Schema
This is the default schema used by dbt to compile and run your dbt project. You can find this in the dbt cloud IDE or your local `profiles.yml` file.
@@ -231,39 +225,39 @@ company-name:
schema: [dbt schema] # look for this one!
```
-#### Port
+##### Port
This is the port where the database is running.
-#### Keep alive idle (seconds)
+##### Keep alive idle (seconds)
This specifies the number of seconds with no network activity after which the operating system should send a TCP keepalive message to the client. You can see more details in [postgresqlco documentation](https://postgresqlco.nf/doc/en/param/tcp%5Fkeepalives%5Fidle/).
-#### Search path
+##### Search path
This controls the Postgres "search path". You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile#search%5Fpath).
-#### SSL mode
+##### SSL mode
This controls how dbt connects to Postgres databases using SSL. You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile#sslmode).
-### SSL certificate
+##### SSL certificate
The client certificate used to authenticate your connection to the database. This is only required if you're using SSL mode `verify-full`.
-#### SSL private key
+##### SSL private key
The private key used to authenticate your connection to the database. This is only required if you're using SSL mode `verify-full`.
-#### SSL root certificate
+##### SSL root certificate
The trusted certificate authority (CA) certificate used to verify the database server’s identity. This is only required if you're using SSL mode `verify-ca` or `verify-full`.
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
-#### Use SSH tunnel
+##### Use SSH tunnel
Enable to input your **SSH Remote Host**, **SSH Remote Port**, **SSH Username**, and to generate a public SSH key.
@@ -274,23 +268,23 @@ Enable to input your **SSH Remote Host**, **SSH Remote Port**, **SSH Username**,
You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/redshift-profile).
-#### Host
+##### Host
This is the host where the database is running.
-#### User
+##### User
This is the database user name.
-#### Password
+##### Password
This is the database user password.
-#### DB name
+##### DB name
This is the database name.
-#### Schema
+##### Schema
This is the default schema used by dbt to compile and run your dbt project. You can find this in the dbt cloud IDE or your local `profiles.yml` file.
@@ -316,11 +310,11 @@ company-name:
schema: analytics # look for this one!
```
-#### Port
+##### Port
This is the port where the database is running.
-#### Keep alive idle (seconds)
+##### Keep alive idle (seconds)
This specifies the number of seconds with no network activity after which the operating system should send a TCP keepalive message to the client.
@@ -328,19 +322,19 @@ If the database closes its connection while Lightdash is waiting for data, you m
By default, this value is set to 240 seconds, but can be configured lower (perhaps 120 or 60), at the cost of a chattier network connection.
-#### SSL mode
+##### SSL mode
This controls how dbt connects to Postgres databases using SSL.
-#### RA3 Node
+##### RA3 Node
Allows dbt to use cross-database resources.
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
-#### Use SSH tunnel
+##### Use SSH tunnel
Enable to input your **SSH Remote Host**, **SSH Remote Port**, **SSH Username**, and to generate a public SSH key.
@@ -351,7 +345,7 @@ Enable to input your **SSH Remote Host**, **SSH Remote Port**, **SSH Username**,
You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/snowflake-profile).
-#### Account
+##### Account
This is your Snowflake [account identifier](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#format-1-preferred-account-name-in-your-organization).
@@ -365,7 +359,7 @@ For example in the image below, the user logs in via `https://aaa99827.snowflake
If you don't have access via the browser, you can use the format `organization_name-account_name`, where `organization_name` and `account_name` can be found by following any of the methods listed in [Managing accounts in your organization](https://docs.snowflake.com/en/user-guide/organizations-manage-accounts.html#label-viewing-organization-name).
-#### User
+##### User
This is the login name for your Snowflake user. This is usually the same username you use to log in to Snowflake.
If you're a Snowflake admin you can list all users available in the Snowflake console:
-#### Authentication type
+##### Authentication type
Choose to authenticate using either a user account (using "Sign in with Snowflake") or with a service account using a JSON key file or password.
-#### Sign in with Snowflake
+##### Sign in with Snowflake
This method requires you to configure an OAuth2 flow from your Snowflake warehouse.
@@ -422,17 +416,17 @@ Finally, contact the Lightdash Team support@lightdash.com to setup Sign in with
You can read more about this in the [Snowflake official docs](https://docs.snowflake.com/en/user-guide/oauth-snowflake-overview).
-#### Private Key
+##### Private Key
You can generate a Private Key for a Snowflake user following the guide [here](https://docs.snowflake.com/en/user-guide/key-pair-auth#generate-the-private-key)
Once generated, copy the private key file that was created into Lightdash. If you chose to encrypt your private key, you will also need to supply the Private Key Passphrase.
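For reference, the Snowflake guide linked above generates an unencrypted PKCS#8 key roughly like this (there is also an encrypted variant that prompts for the passphrase you would then enter in Lightdash):

```shell
# Generate a 2048-bit RSA key and write it out as unencrypted PKCS#8 PEM.
# rsa_key.p8 is the file you'd copy into Lightdash's Private Key field.
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
```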
-#### Password
+##### Password
This is the password for your Snowflake user. This is usually the same password you use to log in to Snowflake. Note that due to changes in Snowflake authentication, users that require passwords may also need to enable MFA, which is not compatible with a Lightdash project connection.
-#### Role
+##### Role
This is the security role that you would like to use when running queries as the specified user. The role must have access to any warehouses, databases, schemas, and tables you want to use.
@@ -444,7 +438,7 @@ If you're a Snowflake admin you can list all roles available in the snowflake co
You can configure your role to allow read access to all warehouses, databases, schemas, and tables by following the guide for [Creating custom read-only roles](https://docs.snowflake.com/en/user-guide/security-access-control-configure.html#creating-custom-read-only-roles).
-#### Database
+##### Database
This is the name of your database. The specified user must be granted access to this database. You can see a list of databases available in the Snowflake console:
@@ -452,7 +446,7 @@ This is the name of your database. The specified user must be granted access to
-#### Warehouse
+##### Warehouse
This is the name of the warehouse you would like to use for running queries. The specified user must be granted access to use this warehouse. You can see a list of warehouses available in the Snowflake console:
@@ -464,7 +458,7 @@ This is the name of the warehouse you would like to use for running queries. The
If “Always use this warehouse” is set to yes, this warehouse will be used for all queries, even if the dbt configuration specifies a different warehouse using `snowflake_warehouse`.
-#### Schema
+##### Schema
This is the default schema used by dbt to compile and run your dbt project. You can find this in the dbt cloud IDE or your local `profiles.yml` file.
@@ -491,15 +485,15 @@ my-snowflake-db:
schema: [dbt schema] # Look for this one!
```
-#### Keep client session alive
+##### Keep client session alive
This is intended to keep Snowflake sessions alive beyond the typical 4 hour timeout limit. You can see more details in [dbt documentation](https://docs.getdbt.com/reference/warehouse-profiles/snowflake-profile#client%5Fsession%5Fkeep%5Falive).
-#### Query tag
+##### Query tag
A value with which to tag all queries, for later searching in the [QUERY\_HISTORY view](https://docs.snowflake.com/en/sql-reference/account-usage/query%5Fhistory.html).
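For example, if you tag Lightdash queries with `lightdash`, you can later filter history rows on that tag. A hypothetical sketch of that filter in Python (the dictionary shape is illustrative, not the actual QUERY\_HISTORY schema):

```python
def queries_with_tag(query_history, tag):
    """Keep only rows whose query_tag equals the given tag."""
    return [row for row in query_history if row.get("query_tag") == tag]

history = [
    {"query_id": "a1", "query_tag": "lightdash"},
    {"query_id": "b2", "query_tag": ""},
]
print(queries_with_tag(history, "lightdash"))  # [{'query_id': 'a1', 'query_tag': 'lightdash'}]
```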
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
@@ -521,19 +515,19 @@ The credentials needed to connect to your cluster can be found in the ODBC optio
-#### Server hostname
+##### Server hostname
Follow the instructions above to find this in your ODBC connection details.
-#### HTTP Path
+##### HTTP Path
Follow the instructions above to find this in your ODBC connection details.
-#### Port
+##### Port
Follow the instructions above to find this in your ODBC connection details.
-#### Personal Access Token
+##### Personal Access Token
Your personal access token can be found in your user settings in databricks:
@@ -547,11 +541,11 @@ Your personal access token can be found in your user settings in databricks:
-#### Database
+##### Database
The default database name used by dbt for this connection.
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
@@ -561,13 +555,13 @@ This controls what day is the start of the week in Lightdash. `Auto` sets it to
We only support [LDAP authentication with Trino](https://trino.io/docs/current/security/ldap.html). You can see more details in [dbt's documentation](https://docs.getdbt.com/reference/warehouse-setups/trino-setup#configuration).
-#### Host
+##### Host
The hostname of your cluster. E.g. `mycluster.mydomain.com`
Don't include the http:// or https:// prefix.
-#### User
+##### User
The username (of the account) to log in to your cluster. When connecting to Starburst Galaxy clusters, you must include the role of the user as a suffix to the username.
@@ -575,27 +569,27 @@ Format for Starburst Enterprise or Trino: `user.name` or `user.name@mydomain.com
Format for Starburst Galaxy: `user.name@mydomain.com/role`
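The Galaxy format above (role suffixed with a slash) can be sketched as follows; the helper name and values are ours, for illustration only:

```python
def starburst_galaxy_username(user, role):
    """Suffix the role to the username with a slash, as Galaxy expects."""
    return f"{user}/{role}"

print(starburst_galaxy_username("user.name@mydomain.com", "accountadmin"))  # user.name@mydomain.com/accountadmin
```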
-#### Password
+##### Password
This is the password for authentication.
-#### DB name
+##### DB name
Specify the name of the database that your dbt models are built into. This is the name of a catalog in your cluster.
e.g. `my_postgres_catalog`
-#### Port
+##### Port
The port to connect to your cluster. By default, it's 443 for TLS enabled clusters.
e.g. `443`
-#### SSL mode
+##### SSL mode
This controls how dbt connects to your Trino database using SSL.
-#### Start of week
+##### Start of week
This controls what day is the start of the week in Lightdash. `Auto` sets it to whatever the default is for your data warehouse. Or, you can customize it and select the day of the week from the drop-down menu. This will be taken into account when using the 'WEEK' time interval in Lightdash.
@@ -605,21 +599,22 @@ This controls what day is the start of the week in Lightdash. `Auto` sets it to
Connecting Lightdash to a hosted dbt project means that you'll be able to keep your Lightdash instance in sync with the changes in your dbt project.
-To connect your dbt project, just head to your project connection settings in Lightdash:
+To connect your dbt project, head to your project connection settings in Lightdash:
-
+
Then scroll down to your dbt project connection:
-
+
Pick your repository type and follow the guide below:
+
@@ -628,6 +623,8 @@ Pick your repository type and follow the guide below:
+
+
@@ -638,13 +635,14 @@ Pick your repository type and follow the guide below:
+
### GitHub
-#### Personal access token[](#personal-access-token-1 "Direct link to Personal access token")
+###### OAuth (recommended authorization method)
-You can create a personal access token (classic), or a fine-grained access token (beta).
+We recommend you connect to GitHub using OAuth. This gives Lightdash a direct connection to the repo, so it won't lose access when individuals leave the company, and it can create pull requests, which is required for [dbt write-back features](/references/dbt-write-back). This connection lives at the organization level, so you can map different dbt project repos to each Lightdash project without needing a new access token.
-##### Personal access token (classic)
+###### Personal access token (classic authorization method)
This is used to access your repo. See the [instructions for creating a personal access token here](https://docs.github.com/en/github/authenticating-to-github/keeping-your-account-and-data-secure/creating-a-personal-access-token).
@@ -654,17 +652,15 @@ Select `repo` scope when you're creating the token.
-##### Fine-grained access token (beta)
+###### Fine-grained access token (beta authorization method)
Fine-grained access tokens are newer tokens that can give access only to individual repositories on your GitHub account. You can read more about them in the [GitHub docs](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token#creating-a-fine-grained-personal-access-token).
-1. Go to [settings > Developer access > Personal access tokens > fine-grained token](https://github.com/settings/personal-access-tokens/new)
-
+1. Go to [Settings > Developer access > Personal access tokens > Fine-grained token](https://github.com/settings/personal-access-tokens/new)
2. Add the name, expiration, description and owner (we'll need their username later)
-
3. Add the repositories you want Lightdash to access. You might want to give access only to the repository where your `dbt` project is located.
- 
+
4. On `Repository` permissions, select `Contents` --> `Read and Write` and `Pull Requests` --> `Read and Write`.
@@ -674,17 +670,17 @@ Fine-grained access tokens are new special tokens that can only give access to i
You could also replace your old tokens with new fine-grained tokens in your project settings.
-#### Repository
+##### Repository
This should be in the format `my-org/my-repo`. e.g. `lightdash/lightdash-analytics`
-#### Branch
+##### Branch
This is the branch in your GitHub repo that Lightdash should sync to. e.g. `main`, `master` or `dev`
By default, we've set this to `main` but you can change it to whatever you'd like.
-#### Project directory path
+##### Project directory path
This is the folder where your `dbt_project.yml` file is found in the GitHub repository you entered above.
@@ -692,88 +688,37 @@ This is the folder where your `dbt_project.yml` file is found in the GitHub repo
* Include the path to the sub-folder where your dbt project is if your dbt project is in a sub-folder in your repo. For example, if my project was in lightdash/lightdash-analytics/dbt/dbt\_project.yml, I'd write `/dbt` in this field.
-#### Host domain
+##### Host domain
If you've [customized the domain for your GitHub pages](https://docs.github.com/en/pages/configuring-a-custom-domain-for-your-github-pages-site/about-custom-domains-and-github-pages), you can add the custom domain for your project in here.
By default, this is `github.com`
-#### Target name
-
-`target` contains information about your dbt connection to your warehouse.
-
-It's the dataset/schema in your data warehouse that Lightdash will look for your dbt models. By default, we set this to be the same value as you have as the default in your `profiles.yml` file.
-
-If you want to change this to be something other than the default `target` defined in dbt, you can enter the target of your choice here (for example `dbt_khindson`.)
-
-To read more about dbt targets, [check out the dbt docs here.](https://docs.getdbt.com/reference/dbt-jinja-functions/target)
-
-#### dbt selector
-
-You can filter out models in your dbt project that you don't want to see in Lightdash. This is useful if you have a large
-dbt project and you want to speed up the sync process. Unlike [table selection](https://docs.lightdash.com/guides/adding-tables-to-lightdash#limiting-the-tables-in-lightdash-using-dbt-tags),
-this selector is applied to the dbt models, so it will skip the entire compilation process for the models that you don't want to see in Lightdash.
-
-To do this, you can add a `dbt_selector` to your dbt project. This is a JSON object that contains the models you want to include in Lightdash.
-
-For example, if you only want to include the `my_model` and all models with the `lightdash` tag in Lightdash, you can add the following to your dbt project settings:
-
-```console
-my_model tag:lightdash
-```
-
-We support all the dbt selectors, you can read more about them [on the dbt docs](https://docs.getdbt.com/reference/node-selection/syntax#combining-state-and-result-selectors).
-
-#### Environment variables
-
-If you've used [environment variables in your dbt profiles.yml file](https://docs.getdbt.com/reference/dbt-jinja-functions/env%5Fvar), you can add these to Lightdash here.
-
-For each environment variable, you'll need to add the `key` + `value` pair for the item.
-
-You'll normally find these values in a file called `.env` in your dbt project directory.
-
-For example, I might have something like:
-
-```yaml
-profile:
- target: prod
- outputs:
- prod:
- type: postgres
- host: 127.0.0.1
- user: "{{ env_var('DBT_USER') }}"
- ....
-```
-
-Then a `.env` file like:
-
-```yaml
-export DBT_USER="myspecialuserkey123"
-```
-
-So, in Lightdash, I'd add a new environment variable and put `key` as `DBT_USER` and `value` as `myspecialuserkey123`.
+
+ After adding your GitHub information, fill out the [dbt project details](/references/dbt-projects#dbt-project-settings) and you're all set!
+
***
### GitLab
-#### Personal access token
+##### Personal access token
This is used to access your repo. See the [instructions for creating a personal access token here](https://docs.gitlab.com/ee/user/profile/personal%5Faccess%5Ftokens.html).
Select `read_repository` scope when you're creating the token. The token (if using a **project access token**) or the user (if using a **personal access token**) needs to have permission to download the code. Normally this would be the `Reporter` role.
-#### Repository
+##### Repository
You can find this in the GitLab URL when you're in your repo. This should be in the format `my-org/my-repo`. e.g. if my browser had [`https://gitlab.com/lightdash/lightdash-analytics.gitlab.io`](https://gitlab.com/lightdash/lightdash-analytics.gitlab.io), I'd put in: `lightdash/lightdash-analytics` as my repository in Lightdash.
-#### Branch
+##### Branch
This is the branch in your GitLab repo that Lightdash should sync to. e.g. `main`, `master` or `dev`
By default, we've set this to `main` but you can change it to whatever you'd like.
-#### Project directory path
+##### Project directory path
This is the folder where your `dbt_project.yml` file is found in the GitLab repository you entered above.
@@ -781,94 +726,44 @@ If your `dbt_project.yml` file is in the main folder of your repo (e.g. `lightda
If your dbt project is in a sub-folder in your repo (e.g. `lightdash/lightdash-analytics/dbt/dbt_project.yml`), then you'll need to include the path to the sub-folder where your dbt project is (e.g. `/dbt`).
-#### Host domain
+##### Host domain
If you've [customized the domain for your GitLab pages](https://docs.gitlab.com/ee/user/project/pages/getting%5Fstarted%5Fpart%5Fone.html), you can add the custom domain for your project in here.
By default, this is `gitlab.io`.
-#### Target name
-
-`target` contains information about your dbt connection to your warehouse.
-
-It's the dataset/schema in your data warehouse that Lightdash will look for your dbt models. By default, we set this to be the same value as you have as the default in your `profiles.yml` file.
-
-If you want to change this to be something other than the default `target` defined in dbt, you can enter the target of your choice here (for example `dbt_khindson`.)
-
-To read more about dbt targets, [check out the dbt docs here.](https://docs.getdbt.com/reference/dbt-jinja-functions/target)
-
-#### dbt selector
-
-You can filter out models in your dbt project that you don't want to connect to Lightdash. This is useful if you have a large
-dbt project and you want to speed up the sync process. Unlike [table selection](https://docs.lightdash.com/guides/adding-tables-to-lightdash#limiting-the-tables-in-lightdash-using-dbt-tags),
-this selector is applied to the dbt models, so it will skip the entire compilation process for the models that you don't want to connect to Lightdash.
-
-To do this, you can add a `dbt_selector` to your dbt project. This is a JSON object that contains the models you want to include in Lightdash.
-
-For example, if you only want to include the `my_model` and all models with the `lightdash` tag in Lightdash, you can add the following to your dbt project settings:
-
-```console
-my_model tag:lightdash
-```
-
-We support all the dbt selectors, you can read more about them [on the dbt docs](https://docs.getdbt.com/reference/node-selection/syntax#combining-state-and-result-selectors).
+
+ After adding your GitLab information, fill out the [dbt project details](/references/dbt-projects#dbt-project-settings) and you're all set!
+
-#### Environment variables
-
-If you've used [environment variables in your dbt profiles.yml file](https://docs.getdbt.com/reference/dbt-jinja-functions/env%5Fvar), you can add these to Lightdash here.
-
-For each environment variable, you'll need to add the `key` + `value` pair for the item.
-
-You'll normally find these values in a file called `.env` in your dbt project directory.
-
-For example, I might have something like:
-
-```yaml
-profile:
- target: prod
- outputs:
- prod:
- type: postgres
- host: 127.0.0.1
- user: "{{ env_var('DBT_USER') }}"
- ....
-```
-
-Then a `.env` file like:
-
-```yaml
-export DBT_USER="myspecialuserkey123"
-```
-
-So, in Lightdash, I'd add a new environment variable and put `key` as `DBT_USER` and `value` as `myspecialuserkey123`.
***
### Azure DevOps
-#### Personal access token
+##### Personal access token
This is your secret token used to access Azure DevOps. See the [instructions to create a personal access token](https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops\&tabs=Windows). You must specify at least the Repo:Read scope.
-#### Organization
+##### Organization
This is the name of the organization that owns your repository.
-#### Project
+##### Project
This is the name of the project that owns your repository.
-#### Repository
+##### Repository
This is the name of the repository. For many projects, this is the same as your project name above.
-#### Branch
+##### Branch
This is the branch in your repository that Lightdash should sync to. e.g. `main`, `master` or `dev`
By default, we've set this to `main` but you can change it to whatever you'd like.
-#### Project directory path
+##### Project directory path
This is the folder where your `dbt_project.yml` file is found in the repository you entered above.
@@ -876,81 +771,16 @@ If your `dbt_project.yml` file is in the main folder of your repo (e.g. `lightda
If your dbt project is in a sub-folder in your repo (e.g. `lightdash/lightdash-analytics/dbt/dbt_project.yml`), then you'll need to include the path to the sub-folder where your dbt project is (e.g. `/dbt`).
-#### Target name
-
-`target` contains information about your dbt connection to your warehouse.
+
+ After adding the Azure DevOps details, fill out the [dbt project details](/references/dbt-projects#dbt-project-settings) and you're all set!
+
-It's the dataset/schema in your data warehouse that Lightdash will look for your dbt models. By default, we set this to be the same value as you have as the default in your `profiles.yml` file.
-
-If you want to change this to be something other than the default `target` defined in dbt, you can enter the target of your choice here (for example `dbt_khindson`.)
-
-To read more about dbt targets, [check out the dbt docs here.](https://docs.getdbt.com/reference/dbt-jinja-functions/target)
-
-#### dbt selector
-
-You can filter out models in your dbt project that you don't want to see in Lightdash. This is useful if you have a large
-dbt project and you want to speed up the sync process. Unlike [table selection](https://docs.lightdash.com/guides/adding-tables-to-lightdash#limiting-the-tables-in-lightdash-using-dbt-tags),
-this selector is applied to the dbt models, so it will skip the entire compilation process for the models that you don't want to see in Lightdash.
-
-To do this, you can add a `dbt_selector` to your dbt project. This is a JSON object that contains the models you want to include in Lightdash.
-
-For example, if you only want to include the `my_model` and all models with the `lightdash` tag in Lightdash, you can add the following to your dbt project settings:
-
-```console
-my_model tag:lightdash
-```
-
-We support all the dbt selectors, you can read more about them [on the dbt docs](https://docs.getdbt.com/reference/node-selection/syntax#combining-state-and-result-selectors).
-
-#### Environment variables
-
-If you've used [environment variables in your dbt profiles.yml file](https://docs.getdbt.com/reference/dbt-jinja-functions/env%5Fvar), you can add these to Lightdash here.
-
-For each environment variable, you'll need to add the `key` + `value` pair for the item.
-
-You'll normally find these values in a file called `.env` in your dbt project directory.
-
-For example, I might have something like:
-
-```yaml
-profile:
- target: prod
- outputs:
- prod:
- type: postgres
- host: 127.0.0.1
- user: "{{ env_var('DBT_USER') }}"
- ....
-```
-
-Then a `.env` file like:
-
-```yaml
-export DBT_USER="myspecialuserkey123"
-```
-
-So, in Lightdash, I'd add a new environment variable and put `key` as `DBT_USER` and `value` as `myspecialuserkey123`.
-
-### Local dbt project
-
-
- Unsuitable for production and only available for Lightdash instances installed on your local machine
-
-
-To start Lightdash with the option to connect to a local dbt project, you must specify the directory of the dbt project when you start docker compose:
-
-```yaml
-# Specify the absolute path to your dbt project
-# e.g. export DBT_PROJECT_DIR=/Users/elonmusk/mydbtproject
-export DBT_PROJECT_DIR= # Enter your path here!
-docker compose start
-```
***
### Bitbucket
-#### Username
+##### Username
This is the login name for your Bitbucket user. This is usually the same username you use to login to Bitbucket. You can find your username in Bitbucket by:
@@ -960,7 +790,7 @@ This is the login name for your Bitbucket user. This is usually the same usernam
Alternatively, you can [create a new user through the Bitbucket console](https://confluence.atlassian.com/bitbucketserver/users-and-groups-776640439.html) with a username and password specifically for Lightdash to use.
-#### Http access token
+##### Http access token
Getting a token depends on whether you use Bitbucket Cloud or Bitbucket server:
@@ -970,17 +800,17 @@ Getting a token depends on whether you use Bitbucket Cloud or Bitbucket server:
Select `Project read` and `Repository read` scope when you're creating the token.
-#### Repository
+##### Repository
This should be in the format `my-org/my-repo`. e.g. `lightdash/lightdash-analytics`
-#### Branch
+##### Branch
This is the branch in your Bitbucket repo that Lightdash should sync to. e.g. `main`, `master` or `dev`
By default, we've set this to `main` but you can change it to whatever you'd like.
-#### Project directory path
+##### Project directory path
This is the folder where your `dbt_project.yml` file is found in the Bitbucket repository you entered above.
@@ -988,61 +818,36 @@ This is the folder where your `dbt_project.yml` file is found in the Bitbucket r
* Include the path to the sub-folder where your dbt project lives if it's in a sub-folder of your repo. For example, if my project was in `lightdash/lightdash-analytics/dbt/dbt_project.yml`, I'd write `/dbt` in this field.
-#### Host domain
+##### Host domain
If you've [customized the domain for your Bitbucket server](https://confluence.atlassian.com/bitbucketserver/specify-the-bitbucket-base-url-776640392.html), you can add the custom domain for your project in here.
-#### Target name
-
-`target` contains information about your dbt connection to your warehouse.
-It's the dataset/schema in your data warehouse that Lightdash will look for your dbt models. By default, we set this to be the same value as you have as the default in your `profiles.yml` file.
-
-If you want to change this to be something other than the default `target` defined in dbt, you can enter the target of your choice here (for example `dbt_khindson`.)
-
-To read more about dbt targets, [check out the dbt docs here.](https://docs.getdbt.com/reference/dbt-jinja-functions/target)
-
-#### dbt selector
-
-You can filter out models in your dbt project that you don't want to see in Lightdash. This is useful if you have a large
-dbt project and you want to speed up the sync process. Unlike [table selection](https://docs.lightdash.com/guides/adding-tables-to-lightdash#limiting-the-tables-in-lightdash-using-dbt-tags),
-this selector is applied to the dbt models, so it will skip the entire compilation process for the models that you don't want to see in Lightdash.
-
-To do this, you can add a `dbt_selector` to your dbt project. This is a JSON object that contains the models you want to include in Lightdash.
-
-For example, if you only want to include the `my_model` and all models with the `lightdash` tag in Lightdash, you can add the following to your dbt project settings:
-
-```console
-my_model tag:lightdash
-```
+
+ After adding your Bitbucket information, fill out the [dbt project details](/references/dbt-projects#dbt-project-settings) and you're all set!
+
-We support all the dbt selectors, you can read more about them [on the dbt docs](https://docs.getdbt.com/reference/node-selection/syntax#combining-state-and-result-selectors).
+***
-#### Environment variables
+### CLI
-If you've used [environment variables in your dbt profiles.yml file](https://docs.getdbt.com/reference/dbt-jinja-functions/env%5Fvar), you can add these to Lightdash here.
+The `CLI` connection type is the default for projects created with the CLI's `lightdash deploy --create` command.
-For each environment variable, you'll need to add the `key` + `value` pair for the item.
+Usually, we recommend swapping to a direct connection to your git repo after initial project creation, but if you want to continue managing deployments in the CLI, [read this guide on how to use Lightdash deploy and set up continuous deployment](https://docs.lightdash.com/guides/cli/how-to-use-lightdash-deploy).
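As a sketch, a typical manual deploy from the CLI looks something like this (the instance URL and token below are placeholders for your own details):

```shell
# Log in to your Lightdash instance (URL and token are placeholders)
lightdash login https://app.lightdash.cloud --token <your-personal-access-token>

# From your dbt project directory: compile the project and push it to Lightdash
lightdash deploy
```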
-You'll normally find these values in a file called `.env` in your dbt project directory.
+***
-For example, I might have something like:
+### Local dbt project
-```yaml
-profile:
- target: prod
- outputs:
- prod:
- type: postgres
- host: 127.0.0.1
- user: "{{ env_var('DBT_USER') }}"
- ....
-```
+
+ Unsuitable for production and only available for Lightdash instances installed on your local machine
+
-Then a `.env` file like:
+To start Lightdash with the option to connect to a local dbt project, you must specify the directory of the dbt project when you start docker compose:
```yaml
-export DBT_USER="myspecialuserkey123"
+# Specify the absolute path to your dbt project
+# e.g. export DBT_PROJECT_DIR=/Users/jake/mydbtproject
+export DBT_PROJECT_DIR= # Enter your path here!
+docker compose start
```
-
-So, in Lightdash, I'd add a new environment variable and put `key` as `DBT_USER` and `value` as `myspecialuserkey123`.
\ No newline at end of file
diff --git a/images/get-started/setup-lightdash/dbt-connection-settings-1c28d200a1188db5b9ef010664e6137f.jpg b/images/get-started/setup-lightdash/dbt-connection-settings-1c28d200a1188db5b9ef010664e6137f.jpg
deleted file mode 100644
index 0be8ea1e..00000000
Binary files a/images/get-started/setup-lightdash/dbt-connection-settings-1c28d200a1188db5b9ef010664e6137f.jpg and /dev/null differ
diff --git a/images/get-started/setup-lightdash/project-connection-dbt.png b/images/get-started/setup-lightdash/project-connection-dbt.png
new file mode 100644
index 00000000..bae06e26
Binary files /dev/null and b/images/get-started/setup-lightdash/project-connection-dbt.png differ
diff --git a/images/get-started/setup-lightdash/project-connection-settings-0984b3948e7ab639ecf565cbb3ed0485.jpg b/images/get-started/setup-lightdash/project-connection-settings-0984b3948e7ab639ecf565cbb3ed0485.jpg
deleted file mode 100644
index 75a8c4d2..00000000
Binary files a/images/get-started/setup-lightdash/project-connection-settings-0984b3948e7ab639ecf565cbb3ed0485.jpg and /dev/null differ
diff --git a/images/get-started/setup-lightdash/project-connection-settings.png b/images/get-started/setup-lightdash/project-connection-settings.png
new file mode 100644
index 00000000..467e4fcb
Binary files /dev/null and b/images/get-started/setup-lightdash/project-connection-settings.png differ
diff --git a/images/get-started/setup-lightdash/project-create-new.png b/images/get-started/setup-lightdash/project-create-new.png
new file mode 100644
index 00000000..c9b46bb7
Binary files /dev/null and b/images/get-started/setup-lightdash/project-create-new.png differ
diff --git a/images/references/dbt-refresh.png b/images/references/dbt-refresh.png
new file mode 100644
index 00000000..054edb45
Binary files /dev/null and b/images/references/dbt-refresh.png differ
diff --git a/references/dbt-projects.mdx b/references/dbt-projects.mdx
index fa11dd37..57aa164c 100644
--- a/references/dbt-projects.mdx
+++ b/references/dbt-projects.mdx
@@ -4,271 +4,103 @@ description: "You can easily make changes in dbt and see them updated in your Li
---
-
-Lightdash supports dbt v1.4.0 and above. If you are using an older version of dbt, you will need to upgrade to sync your project to Lightdash
-
+ Lightdash supports dbt v1.4.0 and above. If you are using an older version of dbt, you will need to upgrade to sync your project to Lightdash.
-## 1. Automatically: deploy your changes to Lightdash using a GitHub action
-
-If you've connected Lightdash to GitHub, you can setup a `github action` and get Lightdash to deploy your project automatically. This is the easiest way to keep Lightdash in sync with your changes in dbt.
-
-### Step 1: add the credentials to Github secrets
-We are going to add some secrets and config to GitHub actions, but you don't want those to be public, so the best way to do this is to add them as secrets on Github.
+## Syncing your dbt project to Lightdash
-
-
-If you already have a GitHub action for Lightdash, then you can use the same Lightdash secrets you created for your other action.
-
+You can sync your dbt project code with Lightdash in a few different ways. We recommend everyone set up continuous deployment, but you can also refresh in the Lightdash app or deploy from the CLI.
-Go to your repo, click on `Settings` , on the left sidebar, click on `Secrets` under `Security`. Now click on the `New repository secret`
-
- 
-
+### 1. Set up continuous deployment
+[Read how to do that and check out our example workflow files](/guides/cli/how-to-use-lightdash-deploy#automatically-deploy-your-changes-to-lightdash-using-a-github-action).
-We need to add the following secrets:
-##### `LIGHTDASH_API_KEY`
+### 2. Click "Refresh dbt" in Lightdash
-Create a new personal access token, by going to `Settings` \> `Personal Access Tokens`. This is the token you'll put in for `LIGHTDASH_API_KEY`.
+You'll find the **Refresh dbt** button on the Query from tables page.
-
+ 
-##### `LIGHTDASH_PROJECT`
-
-The UUID for your project. For example, if your URL looks like `https://eu1.lightdash.cloud/projects/3538ab33-dc90-aabb-bc00-e50bba3a5f69/tables`, then `3538ab33-dc90-45f0-aabb-e50bba3a5f69` is your `LIGHTDASH_PROJECT`
+_If you're using a git connection (like GitHub, GitLab or Bitbucket), you'll need to push and merge your changes to the branch that your Lightdash project is connected to before you press `Refresh dbt`._
-##### `LIGHTDASH_URL`
+
+If you've made any changes to the underlying data (for example, adding a new column in your `model.sql` file or changing the SQL logic of a dimension), then you need to run `dbt run -m yourmodel` before you click `Refresh dbt` in Lightdash.
+
-This is either `https://eu1.lightdash.cloud` or `https://app.lightdash.cloud` for Lightdash Cloud users (check the URL to your Lightdash project). If you self-host, this should be your own custom domain.
-##### `DBT_PROFILES`
+### 3. Push code from the CLI
-Some tips for this bit:
+If you're using the [Lightdash CLI](/guides/cli/how-to-install-the-lightdash-cli), you can use `lightdash deploy` to deploy your changes to Lightdash. [Read more about how to use `lightdash deploy`](/guides/cli/how-to-use-lightdash-deploy).
-* You might be able to copy a bunch of the information from your local `profiles.yml` file. You can see what's in there by typing `cat ~/.dbt/profiles.yml` in your terminal.
-* If you have a separate `prod` and `dev` profile, you probably want to use the information from your `prod` profile for your GitHub action.
-* If you want to have different connection settings depending on the user that opened the pull request (dev profiles), then [check out this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
+
+ We don't recommend using `lightdash deploy` from your local environment as the primary way you update Lightdash since small mistakes can lead to production issues.
+
-Find your data warehouse from the list below to get a profiles.yml file template. Fill out this template, and this is your `DBT_PROFILES` secret.
-
-
-BigQuery OAuth:
+## dbt project settings
-Step 1: create a secret called `GOOGLE_APPLICATION_CREDENTIALS`
+For more information about dbt connection types (GitHub, GitLab, Bitbucket, etc.) and the fields required for each type, [read the dbt project section in our connection guide](/get-started/setup-lightdash/connect-project#2-import-a-dbt-project).
-Add the service account credentials (the JSON file) that you want to use for your GitHub action. It should look something like this:
+Below are details about the universal fields for all connected dbt projects.
-```json
-{
- "type": "service_account",
- "project_id": "jaffle_shop",
- "private_key_id": "12345",
- "private_key": "-----BEGIN PRIVATE KEY----- ... -----END PRIVATE KEY-----\n",
- "client_email": "jaffle_shop@jaffle_shop.iam.gserviceaccount.com",
- "client_id": "12345",
- "auth_uri": "https://accounts.google.com/o/oauth2/auth",
- "token_uri": "https://oauth2.googleapis.com/token",
- "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
- "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/jaffle_shop"
-}
-```
+### Target name
-Step 2: create another secret called `DBT_PROFILES`
+**Target** contains information about your dbt connection to your warehouse.
-Copy-paste this template into the secret and fill out the details.
+It's the dataset or schema in your data warehouse where Lightdash will look for your dbt models. By default, we set this to match the default target in your `profiles.yml` file when you run `lightdash deploy` (if that's how you created or most recently deployed your project).
-This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret [in this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
+If you want to update this, you can enter the target of your choice in the project settings (for example, `prod` or `analytics`).
-```yaml
-[my-bigquery-db]: # this is the name of your project
- target: dev
- outputs:
- dev:
- type: bigquery
- method: oauth
- keyfile: keyfile.json # no need to change this! We'll automatically use the keyfile you created in the last step.
- project: [GCP project id]
- dataset: [the name of your dbt dataset]
+[Read more about dbt targets in the dbt docs.](https://docs.getdbt.com/reference/dbt-jinja-functions/target)
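For context, a `profiles.yml` with separate dev and prod targets might look something like this sketch (the profile name, schemas, and connection details here are illustrative):

```yaml
my_dbt_project:
  target: dev # the default target; Lightdash uses this unless you override it
  outputs:
    dev:
      type: postgres
      host: 127.0.0.1
      schema: dbt_khindson # a per-developer schema
      ....
    prod:
      type: postgres
      host: 127.0.0.1
      schema: analytics # the production schema
      ....
```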
-```
-More info in dbt's profiles docs: [https://docs.getdbt.com/reference/warehouse-profiles/bigquery-profile#service-account-file](https://docs.getdbt.com/reference/warehouse-profiles/bigquery-profile#service-account-file)
+### dbt selector
-
+You can filter out models in your dbt project that you don't want to see in Lightdash. This is useful if you have a large
+dbt project and you want to speed up the sync process. Unlike [table selection](/guides/adding-tables-to-lightdash#limiting-the-tables-in-lightdash-using-dbt-tags), this selector is applied to the dbt models, so it will skip the entire compilation process for the models that you don't want to see in Lightdash.
-
-Postgres profile configuration:
+To do this, you can add a **dbt selector** to your project settings. This is a selector string that specifies which models you want to include in Lightdash.
-```yaml
-company-name:
- target: dev
- outputs:
- dev:
- type: postgres
- host: [hostname]
- user: [username]
- password: [password]
- port: [port]
- dbname: [database name]
- schema: [dbt schema]
- threads: [1 or more]
- keepalives_idle: 0
- connect_timeout: 10
- retries: 1
+For example, if you only want to include the `my_model` and all models with the `lightdash` tag in Lightdash, you can add the following to your dbt project settings:
+```console
+my_model tag:lightdash
```
-More info in dbt's profiles docs: [https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile#profile-configuration](https://docs.getdbt.com/reference/warehouse-profiles/postgres-profile#profile-configuration)
-
-This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret [in this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
-
+We support all dbt selectors. [Read more about selectors in the dbt docs](https://docs.getdbt.com/reference/node-selection/syntax#combining-state-and-result-selectors).
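Beyond a space-separated union like the example above, dbt's selector syntax also supports intersections and graph operators, so selectors like these sketches are valid too (the model and tag names are illustrative):

```console
tag:lightdash,tag:core   # intersection: models that have BOTH tags
+my_model                # my_model plus all of its upstream parents
models/marts             # every model under the marts directory
```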
-
-Redshift password-based authentication:
+### Environment variables
-```yaml
-company-name:
- target: dev
- outputs:
- dev:
- type: redshift
- host: [hostname.region.redshift.amazonaws.com]
- user: [username]
- password: [password]
- port: 5439
- dbname: analytics
- schema: analytics
- threads: 4
- keepalives_idle: 240
- connect_timeout: 10
- ra3_node: true # enables cross-database sources
-
-```
+If you've used [environment variables in your dbt `profiles.yml` file](https://docs.getdbt.com/reference/dbt-jinja-functions/env%5Fvar), you can add these to Lightdash here.
-More info in dbt's profiles docs: [https://docs.getdbt.com/reference/warehouse-profiles/redshift-profile#password-based-authentication](https://docs.getdbt.com/reference/warehouse-profiles/redshift-profile#password-based-authentication)
+For each environment variable, you'll need to add the `key` + `value` pair for the item.
-This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret [in this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
-
+You'll normally find these values in a file called `.env` in your dbt project directory.
-
-User / Private Key authentication:
+For example, I might have something like:
```yaml
-my-snowflake-db:
- target: dev
+profile:
+ target: prod
outputs:
- dev:
- type: snowflake
- account: [account id]
-
- # User/private_key auth
- private_key_path: [path/to/private.key]
- private_key_passphrase: [passphrase for the private key, if key is encrypted]
-
- role: [user role]
- database: [database name]
- warehouse: [warehouse name]
- schema: [dbt schema]
- threads: [1 or more]
- client_session_keep_alive: False
- query_tag: [anything]
-
+ prod:
+ type: postgres
+ host: 127.0.0.1
+ user: "{{ env_var('DBT_USER') }}"
+ ....
```
-More info in dbt's profiles docs: [https://docs.getdbt.com/docs/core/connect-data-platform/snowflake-setup#key-pair-authentication](https://docs.getdbt.com/docs/core/connect-data-platform/snowflake-setup#key-pair-authentication)
-
-This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret [in this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
-
-
-
-
-Set up a DataBricks target:
+Then a `.env` file like:
```yaml
-your_profile_name:
- target: dev
- outputs:
- dev:
- type: databricks
- catalog:
- [
- optional catalog name,
- if you are using Unity Catalog,
- only available in dbt-databricks>=1.1.1,
- ]
- schema: [schema name]
- host: [yourorg.databrickshost.com]
- http_path: [/sql/your/http/path]
- token: [dapiXXXXXXXXXXXXXXXXXXXXXXX] # Personal Access Token (PAT)
- threads: [1 or more]
-
+export DBT_USER="myspecialuserkey123"
```
-More info in dbt's profiles docs: [https://docs.getdbt.com/reference/warehouse-profiles/bigquery-profile#service-account-json](https://docs.getdbt.com/reference/warehouse-profiles/bigquery-profile#service-account-json)
-
-This will always use this project connection in your GitHub actions. If you want your preview projects to have different connection settings depending on the user that opened the pull request (dev profiles), then see what you need to add to your secret [in this guide](/guides/cli/how-to-use-lightdash-preview#how-to-use-the-developer-credentials-in-your-preview-project).
-
-
-
-
-
-### Step 2: Create deploy.yml workflow in Github
-
-Go to your repo, click on `Actions` menu.
-
-If you don't have any GitHub actions, you'll just need to click on `Configure`
-
-
-
-
-
-If you have some GitHub actions in your repo already, click on `New workflow`, then select `setup a workflow yourself`.
-
-
- 
-
-
-Now copy [this file](https://github.com/lightdash/cli-actions/blob/main/deploy.yml) from our [cli-actions](https://github.com/lightdash/cli-actions) repo.
-
-Give it a nice name like `deploy-lightdash.yml`
-
-And commit this to your repo by clicking on `Start commit`.
-
-### You're done!
-
-Everytime you make a change to your repo, on the `main` branch, it will automatically deploy your new config into your Lightdash projects
-
-You can see the log on the `Github actions` page
-
-
- 
-
-
-## 2. In the UI: Syncing your dbt changes using `refresh dbt`
-
-Whenever you make changes to your YAML files, you can sync Lightdash and see these changes by clicking the `refresh dbt` button in the Explore view of the app.
-
-
- 
-
-
-If you're using a git connection (like GitHub, Gitlab or Bitbucket), you'll need to push + merge your changes to the branch that your Lightdash project is connected to before you run `refresh dbt`.
-
-## 3. From the command line: Syncing your dbt changes using `lightdash deploy`
-
-If you're using the [Lightdash CLI](/guides/cli/how-to-install-the-lightdash-cli), you can use the `lightdash deploy` command to deploy your changes to your Lightdash project.
-
-To read more about how to use `lightdash deploy`, [check out our docs](/guides/cli/how-to-use-lightdash-deploy).
-
-## Note: If you've made any changes to the underlying data, you need to run dbt first
-
-If you've made any changes to the underlying data (for example, adding a new column in your `model.sql` file or changing the SQL logic of an existing dimension), then you need to run: `dbt run -m yourmodel` before you click `refresh dbt` in Lightdash.
+So, in Lightdash, I'd add a new environment variable and put `key` as `DBT_USER` and `value` as `myspecialuserkey123`.