diff --git a/website/docs/docs/cloud/configure-cloud-cli.md b/website/docs/docs/cloud/configure-cloud-cli.md
index 2874e166a8f..854950f5d8c 100644
--- a/website/docs/docs/cloud/configure-cloud-cli.md
+++ b/website/docs/docs/cloud/configure-cloud-cli.md
@@ -189,3 +189,10 @@ move %USERPROFILE%\Downloads\dbt_cloud.yml %USERPROFILE%\.dbt\dbt_cloud.yml
This command moves the `dbt_cloud.yml` from the `Downloads` folder to the `.dbt` folder. If your `dbt_cloud.yml` file is located elsewhere, adjust the path accordingly.
+
+### How to skip artifacts from being downloaded
+
+By default, [all artifacts](/reference/artifacts/dbt-artifacts) are downloaded when you execute dbt commands from the dbt Cloud CLI. To skip downloading these files, add `--download-artifacts=false` to the command you want to run. This can help improve run-time performance but might break workflows that depend on assets like the [manifest](/reference/artifacts/manifest-json).
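+As a sketch, skipping artifact downloads for a single invocation looks like the following (the `dbt build` command is just one example; the flag works with any command you run from the dbt Cloud CLI):
+
+```shell
+# Skip downloading artifacts (such as manifest.json) for this invocation
+dbt build --download-artifacts=false
+```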
+
+
+
\ No newline at end of file
diff --git a/website/docs/docs/cloud/connect-data-platform/about-connections.md b/website/docs/docs/cloud/connect-data-platform/about-connections.md
index 58e6ece30a7..0149dd65a32 100644
--- a/website/docs/docs/cloud/connect-data-platform/about-connections.md
+++ b/website/docs/docs/cloud/connect-data-platform/about-connections.md
@@ -36,7 +36,6 @@ Up until July 2024, connections were nested under projects. One dbt Cloud projec
We are rolling out an important change that moves connection management to the account level. The following connection management section describes these changes.
This feature is being rolled out in phases over the coming weeks.
-
:::
Warehouse connections are an account-level resource. As such you can find them under **Accounts Settings** > **Connections**:
@@ -53,6 +52,10 @@ As shown in the image, a project with 2 environments can target between 1 and 2
Rolling out account-level connections will not require any interruption of service in your current usage (IDE, CLI, jobs, etc.).
+:::info Why am I prompted to configure a development environment?
+If your project did not previously have a development environment, you may be redirected to the project setup page. Your project is still intact. Choose a connection for your new development environment, and you can view all your environments again.
+:::
+
However, to fully utilize the value of account-level connections, you may have to rethink how you assign and use connections across projects and environments.
diff --git a/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md b/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
index 7ea6e380000..0243bc619b1 100644
--- a/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
+++ b/website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
@@ -4,6 +4,9 @@ id: connect-bigquery
description: "Configure BigQuery connection."
sidebar_label: "Connect BigQuery"
---
+
+## Authentication
+
### JSON keyfile
:::info Uploading a service account JSON keyfile
@@ -48,3 +51,99 @@ As an end user, if your organization has set up BigQuery OAuth, you can link a p
## Configuration
To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [BigQuery-specific configuration](/reference/resource-configs/bigquery-configs).
+
+### Account-level connections and credential management
+
+You can reuse connections across multiple projects with [global connections](/docs/cloud/connect-data-platform/about-connections#migration-from-project-level-connections-to-account-level-connections). Connections are attached at the environment level (formerly the project level), so you can use multiple connections within a single project (to handle dev, staging, production, and so on).
+
+BigQuery connections in dbt Cloud currently expect credentials to be handled at the connection level (this applies only to BigQuery connections). This design originally facilitated creating a new connection by uploading a service account keyfile. This section describes how to override credentials at the environment level, via [extended attributes](/docs/dbt-cloud-environments#extended-attributes), _to allow project administrators to manage credentials independently_ of the account-level connection details used for that environment.
+
+For a project, you will first create an environment variable to store the secret `private_key` value. Then, you will use extended attributes to override the entire service account JSON (you can't override only the private key, due to a constraint of extended attributes).
+
+1. **New environment variable**
+
+ - Create a new _secret_ [environment variable](https://docs.getdbt.com/docs/build/environment-variables#handling-secrets) to handle the private key: `DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY`
+   - Fill in the private key value according to the environment
+
+   To automate your deployment, use the following [admin API request](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Projects%20Environment%20Variables%20Bulk), where `XXXXX` is your account number, `YYYYY` is your project number, and `ZZZZZ` is your [API token](/docs/dbt-cloud-apis/authentication):
+
+ ```shell
+ curl --request POST \
+ --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environment-variables/bulk/ \
+ --header 'Accept: application/json' \
+ --header 'Authorization: Bearer ZZZZZ' \
+ --header 'Content-Type: application/json' \
+ --data '{
+ "env_var": [
+ {
+ "new_name": "DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY",
+ "project": "Value by default for the entire project",
+ "ENVIRONMENT_NAME_1": "Optional, if wanted, value for environment name 1",
+ "ENVIRONMENT_NAME_2": "Optional, if wanted, value for environment name 2"
+ }
+ ]
+ }'
+ ```
+
+2. **Extended attributes**
+
+ In the environment details, complete the [extended attributes](/docs/dbt-cloud-environments#extended-attributes) block with the following payload (replacing `XXX` with your corresponding information):
+
+ ```yaml
+ keyfile_json:
+ type: service_account
+ project_id: xxx
+ private_key_id: xxx
+ private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
+ client_email: xxx
+ client_id: xxx
+ auth_uri: xxx
+ token_uri: xxx
+ auth_provider_x509_cert_url: xxx
+ client_x509_cert_url: xxx
+ ```
+
+ If you require [other fields](/docs/core/connect-data-platform/bigquery-setup#service-account-json) to be overridden at the environment level via extended attributes, please respect the [expected indentation](/docs/dbt-cloud-environments#only-the-top-level-keys-are-accepted-in-extended-attributes) (ordering doesn't matter):
+
+ ```yaml
+ priority: interactive
+ keyfile_json:
+ type: xxx
+ project_id: xxx
+ private_key_id: xxx
+ private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
+ client_email: xxx
+ client_id: xxx
+ auth_uri: xxx
+ token_uri: xxx
+ auth_provider_x509_cert_url: xxx
+ client_x509_cert_url: xxx
+ execution_project: buck-stops-here-456
+ ```
+
+ To automate your deployment, you first need to [create the extended attributes payload](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Extended%20Attributes) for a given project, and then [assign it](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Update%20Environment) to a specific environment. With `XXXXX` as your account number, `YYYYY` as your project number, and `ZZZZZ` as your [API token](/docs/dbt-cloud-apis/authentication):
+
+ ```shell
+ curl --request POST \
+ --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/extended-attributes/ \
+ --header 'Accept: application/json' \
+ --header 'Authorization: Bearer ZZZZZ' \
+ --header 'Content-Type: application/json' \
+ --data '{
+ "id": null,
+   "extended_attributes": {"keyfile_json": {"type":"service_account","project_id":"xxx","private_key_id":"xxx","private_key":"{{ env_var('\''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'\'') }}","client_email":"xxx","client_id":"xxx","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url":"xxx"}},
+ "state": 1
+ }'
+ ```
+   _Make a note of the `id` returned in the response._ It will be used in the following call, where `EEEEE` is the environment ID and `FFFFF` is the extended attributes ID:
+
+ ```shell
+ curl --request POST \
+ --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environments/EEEEE/ \
+ --header 'Accept: application/json' \
+  --header 'Authorization: Bearer ZZZZZ' \
+ --header 'Content-Type: application/json' \
+ --data '{
+ "extended_attributes_id": FFFFF
+ }'
+ ```
diff --git a/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md b/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
index 1cd24c16481..3b3b9c2d870 100644
--- a/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
+++ b/website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
@@ -12,8 +12,26 @@ This guide describes a feature of the dbt Cloud Enterprise plan. If you’re int
dbt Cloud Enterprise supports [OAuth authentication](https://docs.snowflake.net/manuals/user-guide/oauth-intro.html) with Snowflake. When Snowflake OAuth is enabled, users can authorize their Development credentials using Single Sign On (SSO) via Snowflake rather than submitting a username and password to dbt Cloud. If Snowflake is setup with SSO through a third-party identity provider, developers can use this method to log into Snowflake and authorize the dbt Development credentials without any additional setup.
-### Configuring a security integration
-To enable Snowflake OAuth, you will need to create a [security integration](https://docs.snowflake.net/manuals/sql-reference/sql/create-security-integration.html) in Snowflake to manage the OAuth connection between dbt Cloud and Snowflake.
+To set up Snowflake OAuth in dbt Cloud, admins from both dbt Cloud and Snowflake are required for the following steps:
+1. [Locate the redirect URI value](#locate-the-redirect-uri-value) in dbt Cloud.
+2. [Create a security integration](#create-a-security-integration) in Snowflake.
+3. [Configure a connection](#configure-a-connection-in-dbt-cloud) in dbt Cloud.
+
+To use Snowflake in the dbt Cloud IDE, all developers must [authenticate with Snowflake](#authorize-developer-credentials) in their profile credentials.
+
+### Locate the redirect URI value
+
+To get started, copy the connection's redirect URI from dbt Cloud:
+1. Navigate to **Account settings**
+1. Select **Projects** and choose a project from the list
+1. Select the connection to view its details and set the **OAuth method** to "Snowflake SSO"
+1. Copy the **Redirect URI** for use in later steps
+
+
### Create a security integration
@@ -25,7 +43,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
ENABLED = TRUE
OAUTH_CLIENT = CUSTOM
OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
- OAUTH_REDIRECT_URI = 'https://YOUR_ACCESS_URL/complete/snowflake'
+  OAUTH_REDIRECT_URI = 'LOCATED_REDIRECT_URI'
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = 7776000;
```
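+
+After running the statement above, you can confirm the integration was created and review its OAuth settings with a quick check in Snowflake (the `DBT_CLOUD` name must match the integration name you created):
+
+```sql
+-- Describe the security integration to verify its OAuth parameters
+DESC SECURITY INTEGRATION DBT_CLOUD;
+```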
@@ -42,7 +60,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
| ENABLED | Required |
| OAUTH_CLIENT | Required |
| OAUTH_CLIENT_TYPE | Required |
-| OAUTH_REDIRECT_URI | Required. Use the access URL that corresponds to your server [region](/docs/cloud/about-cloud/access-regions-ip-addresses). |
+| OAUTH_REDIRECT_URI | Required. Use the value in the [dbt Cloud account settings](#locate-the-redirect-uri-value). |
| OAUTH_ISSUE_REFRESH_TOKENS | Required |
| OAUTH_REFRESH_TOKEN_VALIDITY | Required. This configuration dictates the number of seconds that a refresh token is valid for. Use a smaller value to force users to re-authenticate with Snowflake more frequently. |
diff --git a/website/docs/docs/core/connect-data-platform/teradata-setup.md b/website/docs/docs/core/connect-data-platform/teradata-setup.md
index 7067104fb94..df32b07bd0e 100644
--- a/website/docs/docs/core/connect-data-platform/teradata-setup.md
+++ b/website/docs/docs/core/connect-data-platform/teradata-setup.md
@@ -67,7 +67,7 @@ To connect to Teradata Vantage from dbt, you'll need to add a [profile](https://
password:
schema:
tmode: ANSI
- threads: 1
+ threads: [optional, 1 or more]
#optional fields
```
diff --git a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
index a83ebfaadfb..35758d46afd 100644
--- a/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
+++ b/website/docs/docs/dbt-versions/upgrade-dbt-version-in-cloud.md
@@ -7,7 +7,7 @@ In dbt Cloud, both [jobs](/docs/deploy/jobs) and [environments](/docs/dbt-cloud-
## Environments
-Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown bar and make your selection. You can select a previous release of dbt Core or go [**Versionless**](#versionless)(recommended). Be sure to save your changes before navigating away.
+Navigate to the settings page of an environment, then click **Edit**. Click the **dbt version** dropdown bar and make your selection. You can select a previous release of dbt Core or go [**Versionless**](#versionless) (recommended). Be sure to save your changes before navigating away.
diff --git a/website/docs/docs/deploy/ci-jobs.md b/website/docs/docs/deploy/ci-jobs.md
index 12e303c3536..4cd8e4b6cf0 100644
--- a/website/docs/docs/deploy/ci-jobs.md
+++ b/website/docs/docs/deploy/ci-jobs.md
@@ -12,7 +12,8 @@ dbt Labs recommends that you create your CI job in a dedicated dbt Cloud [deploy
### Prerequisites
- You have a dbt Cloud account.
-- For the [Concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [Smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/).
+- For the [concurrent CI checks](/docs/deploy/continuous-integration#concurrent-ci-checks) and [smart cancellation of stale builds](/docs/deploy/continuous-integration#smart-cancellation) features, your dbt Cloud account must be on the [Team or Enterprise plan](https://www.getdbt.com/pricing/).
+- For the [compare changes](/docs/deploy/continuous-integration#compare-changes) feature, your dbt Cloud account must have access to Advanced CI. Please ask your [dbt Cloud administrator to enable](/docs/dbt-cloud-environments#account-access-to-advanced-ci-features) this for you.
- Set up a [connection with your Git provider](/docs/cloud/git/git-configuration-in-dbt-cloud). This integration lets dbt Cloud run jobs on your behalf for job triggering.
- If you're using a native [GitLab](/docs/cloud/git/connect-gitlab) integration, you need a paid or self-hosted account that includes support for GitLab webhooks and [project access tokens](https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html). If you're using GitLab Free, merge requests will trigger CI jobs but CI job status updates (success or failure of the job) will not be reported back to GitLab.
@@ -21,40 +22,48 @@ To make CI job creation easier, many options on the **CI job** page are set to d
1. On your deployment environment page, click **Create job** > **Continuous integration job** to create a new CI job.
-2. Options in the **Job settings** section:
+1. Options in the **Job settings** section:
- **Job name** — Specify the name for this CI job.
- **Description** — Provide a description about the CI job.
- - **Environment** — By default, it’s set to the environment you created the CI job from.
+ - **Environment** — By default, it’s set to the environment you created the CI job from. Use the dropdown to change the default setting.
+
+1. Options in the **Git trigger** section:
- **Triggered by pull requests** — By default, it’s enabled. Every time a developer opens up a pull request or pushes a commit to an existing pull request, this job will get triggered to run.
- - **Run on Draft Pull Request** — Enable this option if you want to also trigger the job to run every time a developer opens up a draft pull request or pushes a commit to that draft pull request.
+ - **Run on draft pull request** — Enable this option if you want to also trigger the job to run every time a developer opens up a draft pull request or pushes a commit to that draft pull request.
-3. Options in the **Execution settings** section:
+1. Options in the **Execution settings** section:
- **Commands** — By default, it includes the `dbt build --select state:modified+` command. This informs dbt Cloud to build only new or changed models and their downstream dependents. Importantly, state comparison can only happen when there is a deferred environment selected to compare state to. Click **Add command** to add more [commands](/docs/deploy/job-commands) that you want to be invoked when this job runs.
+ - **Run compare changes** — Enable this option to compare the last applied state of the production environment (if one exists) with the latest changes from the pull request, and identify what those differences are. To enable record-level comparison and primary key analysis, you must add a [primary key constraint](/reference/resource-properties/constraints) or [uniqueness test](/reference/resource-properties/data-tests#unique). Otherwise, you'll receive a "Primary key missing" error message in dbt Cloud.
+
+ To review the comparison report, navigate to the [Compare tab](/docs/deploy/run-visibility#compare-tab) in the job run's details. A summary of the report is also available from the pull request in your Git provider (see the [CI report example](#example-ci-report)).
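+
+   As a sketch, a uniqueness test that enables primary key analysis for comparisons might look like this in a model's YAML file (the `orders` model and `order_id` column names here are hypothetical; use your own):
+
+   ```yaml
+   models:
+     - name: orders # hypothetical model name
+       columns:
+         - name: order_id # the column serving as the primary key
+           data_tests:
+             - unique
+             - not_null
+   ```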
- **Compare changes against an environment (Deferral)** — By default, it’s set to the **Production** environment if you created one. This option allows dbt Cloud to check the state of the code in the PR against the code running in the deferred environment, so as to only check the modified code, instead of building the full table or the entire DAG.
- :::info
- Older versions of dbt Cloud only allow you to defer to a specific job instead of an environment. Deferral to a job compares state against the project code that was run in the deferred job's last successful run. While deferral to an environment is more efficient as dbt Cloud will compare against the project representation (which is stored in the `manifest.json`) of the last successful deploy job run that executed in the deferred environment. By considering _all_ [deploy jobs](/docs/deploy/deploy-jobs) that run in the deferred environment, dbt Cloud will get a more accurate, latest project representation state.
- :::
+ :::info
+ Older versions of dbt Cloud only allow you to defer to a specific job instead of an environment. Deferral to a job compares state against the project code that was run in the deferred job's last successful run. Deferral to an environment is more efficient as dbt Cloud will compare against the project representation (which is stored in the `manifest.json`) of the last successful deploy job run that executed in the deferred environment. By considering _all_ [deploy jobs](/docs/deploy/deploy-jobs) that run in the deferred environment, dbt Cloud will get a more accurate, latest project representation state.
+ :::
+
+ - **Run timeout** — Cancel the CI job if the run time exceeds the timeout value. You can use this option to help ensure that a CI check doesn't consume too much of your warehouse resources. If you enable the **Run compare changes** option, the timeout value defaults to `3600` (one hour) to prevent long-running comparisons.
- - **Generate docs on run** — Enable this option if you want to [generate project docs](/docs/collaborate/build-and-view-your-docs) when this job runs. This option is disabled by default since most teams do not want to test doc generation on every CI check.
-4. (optional) Options in the **Advanced settings** section:
+1. (optional) Options in the **Advanced settings** section:
- **Environment variables** — Define [environment variables](/docs/build/environment-variables) to customize the behavior of your project when this CI job runs. You can specify that a CI job is running in a _Staging_ or _CI_ environment by setting an environment variable and modifying your project code to behave differently, depending on the context. It's common for teams to process only a subset of data for CI runs, using environment variables to branch logic in their dbt project code.
- **Target name** — Define the [target name](/docs/build/custom-target-names). Similar to **Environment Variables**, this option lets you customize the behavior of the project. You can use this option to specify that a CI job is running in a _Staging_ or _CI_ environment by setting the target name and modifying your project code to behave differently, depending on the context.
- - **Run timeout** — Cancel this CI job if the run time exceeds the timeout value. You can use this option to help ensure that a CI check doesn't consume too much of your warehouse resources.
- **dbt version** — By default, it’s set to inherit the [dbt version](/docs/dbt-versions/core) from the environment. dbt Labs strongly recommends that you don't change the default setting. This option to change the version at the job level is useful only when you upgrade a project to the next dbt version; otherwise, mismatched versions between the environment and job can lead to confusing behavior.
- **Threads** — By default, it’s set to 4 [threads](/docs/core/connect-data-platform/connection-profiles#understanding-threads). Increase the thread count to increase model execution concurrency.
+ - **Generate docs on run** — Enable this if you want to [generate project docs](/docs/collaborate/build-and-view-your-docs) when this job runs. This is disabled by default since testing doc generation on every CI check is not a recommended practice.
- **Run source freshness** — Enable this option to invoke the `dbt source freshness` command before running this CI job. Refer to [Source freshness](/docs/deploy/source-freshness) for more details.
-### Examples
+
-- Example of creating a CI job:
-
+### Example of CI check in pull request {#example-ci-check}
+The following is an example of a CI check in a GitHub pull request. The green checkmark means the dbt build and tests were successful. Clicking on the dbt Cloud section takes you to the relevant CI run in dbt Cloud.
-- Example of GitHub pull request. The green checkmark means the dbt build and tests were successful. Clicking on the dbt Cloud section navigates you to the relevant CI run in dbt Cloud.
+
-
+### Example of CI report in pull request {#example-ci-report}
+The following is an example of a CI report in a GitHub pull request, which is shown when the **Run compare changes** option is enabled for the CI job. It displays a high-level summary of the models that changed from the pull request.
+
## Trigger a CI job with the API
diff --git a/website/docs/docs/deploy/continuous-integration.md b/website/docs/docs/deploy/continuous-integration.md
index fbe93e084b6..e033fc16fb7 100644
--- a/website/docs/docs/deploy/continuous-integration.md
+++ b/website/docs/docs/deploy/continuous-integration.md
@@ -30,11 +30,7 @@ dbt Cloud deletes the temporary schema from your w
The [dbt Cloud scheduler](/docs/deploy/job-scheduler) executes CI jobs differently from other deployment jobs in these important ways:
-- **Concurrent CI checks** — CI runs triggered by the same dbt Cloud CI job execute concurrently (in parallel), when appropriate
-- **Smart cancellation of stale builds** — Automatically cancels stale, in-flight CI runs when there are new commits to the PR
-- **Run slot treatment** — CI runs don't consume a run slot
-
-### Concurrent CI checks
+
When you have teammates collaborating on the same dbt project creating pull requests on the same dbt repository, the same CI job will get triggered. Since each run builds into a dedicated, temporary schema that’s tied to the pull request, dbt Cloud can safely execute CI runs _concurrently_ instead of _sequentially_ (differing from what is done with deployment dbt Cloud jobs). Because no one needs to wait for one CI run to finish before another one can start, with concurrent CI checks, your whole team can test and integrate dbt code faster.
@@ -44,12 +40,35 @@ Below describes the conditions when CI checks are run concurrently and when they
- CI runs with the _same_ PR number and _different_ commit SHAs execute serially because they’re building into the same schema. dbt Cloud will run the latest commit and cancel any older, stale commits. For details, refer to [Smart cancellation of stale builds](#smart-cancellation).
- CI runs with the same PR number and same commit SHA, originating from different dbt Cloud projects will execute jobs concurrently. This can happen when two CI jobs are set up in different dbt Cloud projects that share the same dbt repository.
-### Smart cancellation of stale builds {#smart-cancellation}
+
+
+
When you push a new commit to a PR, dbt Cloud enqueues a new CI run for the latest commit and cancels any CI run that is (now) stale and still in flight. This can happen when you’re pushing new commits while a CI build is still in process and not yet done. By cancelling runs in a safe and deliberate way, dbt Cloud helps improve productivity and reduce data platform spend on wasteful CI runs.
-### Run slot treatment
+
+
+
CI runs don't consume run slots. This guarantees a CI check will never block a production run.
+
+
+
+
+
+When a pull request is opened or new commits are pushed, dbt Cloud compares the latest changes from the pull request against the last applied state of the production environment (defaulting to deferral for lower computation costs) for CI jobs that have the **Run compare changes** option enabled. By analyzing these comparisons, you can better understand how your code changes affect your data, helping to ensure you always ship the correct changes to production and create trusted data products.
+
+:::info Beta feature
+
+The compare changes feature is currently in limited beta for select accounts. If you're interested in gaining access or learning more, please stay tuned for updates.
+
+:::
+
+dbt reports the comparison differences:
+
+- **In dbt Cloud** — Shows the changes (if any) to the data's primary keys, rows, and columns. To learn more, refer to the [Compare tab](/docs/deploy/run-visibility#compare-tab) in the [Job run details](/docs/deploy/run-visibility#job-run-details).
+- **In the pull request from your Git provider** — Shows a summary of the changes as a comment in the pull request.
+
+
\ No newline at end of file
diff --git a/website/docs/docs/deploy/run-visibility.md b/website/docs/docs/deploy/run-visibility.md
index ad7aa04986d..f169031790e 100644
--- a/website/docs/docs/deploy/run-visibility.md
+++ b/website/docs/docs/deploy/run-visibility.md
@@ -9,13 +9,13 @@ You can view the history of your runs and the model timing dashboard to help ide
## Run history
-The **Run history** dashboard in dbt Cloud helps you monitor the health of your dbt project. It provides a detailed overview of all of your project's job runs and empowers you with a variety of filters to help you focus on specific aspects. You can also use it to review recent runs, find errored runs, and track the progress of runs in progress. You can access it on the top navigation menu by clicking **Deploy** and then **Run history**.
+The **Run history** dashboard in dbt Cloud helps you monitor the health of your dbt project. It provides a detailed overview of all your project's job runs and empowers you with a variety of filters that enable you to focus on specific aspects. You can also use it to review recent runs, find errored runs, and track the progress of runs in progress. You can access it from the top navigation menu by clicking **Deploy** and then **Run history**.
The dashboard displays your full run history, including job name, status, associated environment, job trigger, commit SHA, schema, and timing info.
dbt Cloud developers can access their run history for the last 365 days through the dbt Cloud user interface (UI) and API.
-We limit self-service retrieval of run history metadata to 365 days to improve dbt Cloud's performance.
+dbt Labs limits self-service retrieval of run history metadata to 365 days to improve dbt Cloud's performance.
@@ -29,16 +29,44 @@ An example of a completed run with a configuration for a [job completion trigger
-### Access logs
+### Run summary tab
You can view or download in-progress and historical logs for your dbt runs. This makes it easier for the team to debug errors more efficiently.
-### Model timing
+### Lineage tab
-The **Model timing** dashboard displays the composition, order, and time taken by each model in a job run. The visualization appears for successful jobs and highlights the top 1% of model durations. This helps you identify bottlenecks in your runs, so you can investigate them and potentially make changes to improve their performance.
+View the lineage graph associated with the job run so you can better understand the dependencies and relationships of the resources in your project. To view a node's metadata directly in [dbt Explorer](/docs/collaborate/explore-projects), select it (double-click) from the graph.
+
+
+
+### Model timing tab
+
+The **Model timing** tab displays the composition, order, and time each model takes in a job run. The visualization appears for successful jobs and highlights the top 1% of model durations. This helps you identify bottlenecks in your runs so you can investigate them and potentially make changes to improve their performance.
You can find the dashboard on the [job's run details](#job-run-details).
+
+### Artifacts tab
+
+The **Artifacts** tab provides a list of the artifacts generated by the job run. The files are saved and available for download.
+
+
+
+### Compare tab
+
+The **Compare** tab is shown for [CI job runs](/docs/deploy/ci-jobs) with the **Run compare changes** setting enabled. It displays details about [the changes from the comparison dbt performed](/docs/deploy/continuous-integration#compare-changes) between what's in your production environment and the pull request. To help you better visualize the differences, dbt Cloud highlights changes to your models in red (deletions) and green (inserts).
+
+From the **Modified** section, you can view the following:
+
+- **Overview** — High-level summary of the changes to the models, such as the number of primary keys that were added or removed.
+- **Primary keys** — Details about the changes to the records.
+- **Modified rows** — Details about the modified rows. Click **Show full preview** to display all columns.
+- **Columns** — Details about the changes to the columns.
+
+To view the dependencies and relationships of the resources in your project more closely, click **View in Explorer** to launch [dbt Explorer](/docs/collaborate/explore-projects).
+
+
+
diff --git a/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md
new file mode 100644
index 00000000000..4165506993c
--- /dev/null
+++ b/website/docs/faqs/Troubleshooting/ide-session-unknown-error.md
@@ -0,0 +1,19 @@
+---
+title: "I'm receiving a 'Your IDE session experienced an unknown error and was terminated. Please contact support' message."
+description: "Add a repository when seeing IDE unknown error"
+sidebar_label: 'Receiving unknown error in the IDE'
+
+---
+
+If you're seeing the following error when you launch the dbt Cloud IDE, it could be due to a few scenarios, but it commonly indicates a missing repository:
+
+```shell
+Your IDE session experienced an unknown error and was terminated. Please contact support.
+```
+
+You can try to resolve this by adding a repository like a [managed repository](/docs/collaborate/git/managed-repository) or your preferred Git account. To add your Git account, navigate to **Project** > **Repository** and select your repository.
+
+
+If you're still running into this error, please contact the Support team at support@getdbt.com for help.
diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md
index 64468f36dd6..ede32a47f7e 100644
--- a/website/docs/guides/sl-snowflake-qs.md
+++ b/website/docs/guides/sl-snowflake-qs.md
@@ -661,7 +661,8 @@ semantic_models:
entities:
- name: order_id
type: primary
- - name: customer_id
+ - name: customer
+ expr: customer_id
type: foreign
```
@@ -686,8 +687,9 @@ semantic_models:
entities:
- name: order_id
type: primary
- - name: customer_id
- type: foreign
+ - name: customer
+ expr: customer_id
+ type: foreign
# Newly added
dimensions:
- name: order_date
@@ -717,7 +719,8 @@ semantic_models:
entities:
- name: order_id
type: primary
- - name: customer_id
+ - name: customer
+ expr: customer_id
type: foreign
dimensions:
- name: order_date
@@ -777,7 +780,8 @@ semantic_models:
entities:
- name: order_id
type: primary
- - name: customer_id
+ - name: customer
+ expr: customer_id
type: foreign
dimensions:
- name: order_date
@@ -825,7 +829,7 @@ metrics:
type_params:
measure: order_count
filter: |
- {{ Dimension('order_id__order_total_dim') }} >= 20
+ {{ Metric('order_total', group_by=['order_id']) }} >= 20
# Ratio type metric
- name: "avg_order_value"
label: "avg_order_value"
diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md
index 8d3e1ae29e8..c38cc2768e1 100644
--- a/website/docs/reference/artifacts/dbt-artifacts.md
+++ b/website/docs/reference/artifacts/dbt-artifacts.md
@@ -28,6 +28,8 @@ Most dbt commands (and corresponding RPC methods) produce artifacts:
- [catalog](catalog-json): produced by `docs generate`
- [sources](/reference/artifacts/sources-json): produced by `source freshness`
+When running commands from the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation), all artifacts are downloaded by default. If you want to change this behavior, refer to [How to skip artifacts from being downloaded](/docs/cloud/configure-cloud-cli#how-to-skip-artifacts-from-being-downloaded).
+
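+For example, a sketch of skipping the artifact download on a single invocation (assuming the dbt Cloud CLI is installed and configured):
+
+```shell
+# Run models without downloading artifacts to the local target directory
+dbt run --download-artifacts=false
+```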
## Where are artifacts produced?
By default, artifacts are written to the `/target` directory of your dbt project. You can configure the location using the [`target-path` flag](/reference/global-configs/json-artifacts).
diff --git a/website/docs/reference/model-configs.md b/website/docs/reference/model-configs.md
index 0746fe92036..3a93c599ea7 100644
--- a/website/docs/reference/model-configs.md
+++ b/website/docs/reference/model-configs.md
@@ -136,8 +136,8 @@ models:
config:
[enabled](/reference/resource-configs/enabled): true | false
[tags](/reference/resource-configs/tags): | []
- [pre-hook](/reference/resource-configs/pre-hook-post-hook): | []
- [post-hook](/reference/resource-configs/pre-hook-post-hook): | []
+ [pre_hook](/reference/resource-configs/pre-hook-post-hook): | []
+ [post_hook](/reference/resource-configs/pre-hook-post-hook): | []
[database](/reference/resource-configs/database):
[schema](/reference/resource-properties/schema):
[alias](/reference/resource-configs/alias):
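+Since the underscore form is what properties files and `config()` blocks expect, here's a minimal sketch of a model-level hook configured this way (the model name and grant statement are illustrative):
+
+```yaml
+models:
+  - name: orders
+    config:
+      materialized: table
+      post_hook:
+        - "grant select on {{ this }} to role reporting"
+```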
diff --git a/website/docs/reference/project-configs/on-run-start-on-run-end.md b/website/docs/reference/project-configs/on-run-start-on-run-end.md
index e1a3d7b761a..74557839f11 100644
--- a/website/docs/reference/project-configs/on-run-start-on-run-end.md
+++ b/website/docs/reference/project-configs/on-run-start-on-run-end.md
@@ -20,7 +20,7 @@ on-run-end: sql-statement | [sql-statement]
A SQL statement (or list of SQL statements) to be run at the start or end of the following commands:
-`on-run-start` and `on-run-end` hooks can also call macros that return SQL statements
+`on-run-start` and `on-run-end` hooks can also [call macros](#call-a-macro-to-grant-privileges) that return SQL statements.
## Usage notes
* The `on-run-end` hook has additional jinja variables available in the context — check out the [docs](/reference/dbt-jinja-functions/on-run-end-context).
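+As a sketch of the macro-calling pattern the linked section describes, an `on-run-end` hook might invoke a grant macro (the `grant_select` macro is assumed to be defined in your project):
+
+```yaml
+# dbt_project.yml
+on-run-end:
+  - "{{ grant_select(schemas) }}"
+```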
diff --git a/website/sidebars.js b/website/sidebars.js
index a3b0cd2d8a4..ae5e05d4aae 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -911,6 +911,7 @@ const sidebarSettings = {
label: "For models",
items: [
"reference/model-properties",
+ "reference/resource-properties/model_name",
"reference/model-configs",
"reference/resource-configs/materialized",
"reference/resource-configs/on_configuration_change",
@@ -933,6 +934,7 @@ const sidebarSettings = {
label: "For snapshots",
items: [
"reference/snapshot-properties",
+ "reference/resource-configs/snapshot_name",
"reference/snapshot-configs",
"reference/resource-configs/check_cols",
"reference/resource-configs/strategy",
diff --git a/website/snippets/_cloud-environments-info.md b/website/snippets/_cloud-environments-info.md
index 166165be855..508a7e79d54 100644
--- a/website/snippets/_cloud-environments-info.md
+++ b/website/snippets/_cloud-environments-info.md
@@ -82,7 +82,7 @@ If you're developing in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in
#### Only the **top-level keys** are accepted in extended attributes
This means that if you want to change a specific sub-key value, you must provide the entire top-level key as a JSON block in your resulting YAML. For example, if you want to customize a particular field within a [service account JSON](/docs/core/connect-data-platform/bigquery-setup#service-account-json) for your BigQuery connection (like 'project_id' or 'client_email'), you need to provide an override for the entire top-level `keyfile_json` main key/attribute using extended attributes. Include the sub-fields as a nested JSON block.
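+As a sketch, overriding just the `project_id` sub-field still means supplying the whole `keyfile_json` top-level key in extended attributes (all values here are placeholders):
+
+```yaml
+keyfile_json:
+  type: service_account
+  project_id: my-gcp-project
+  private_key_id: <private-key-id>
+  private_key: <private-key>
+  client_email: my-service-account@my-gcp-project.iam.gserviceaccount.com
+```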
-### Git repository caching
+### Git repository caching
At the start of every job run, dbt Cloud clones the project's Git repository so it has the latest versions of your project's code and runs `dbt deps` to install your dependencies.
@@ -101,12 +101,6 @@ To enable Git repository caching, select **Account settings** from the gear menu
-:::note
-
-This feature is only available on the dbt Cloud Enterprise plan.
-
-:::
-
### Partial parsing
At the start of every dbt invocation, dbt reads all the files in your project, extracts information, and constructs an internal manifest containing every object (model, source, macro, and so on). Among other things, it uses the `ref()`, `source()`, and `config()` macro calls within models to set properties, infer dependencies, and construct your project's DAG. When dbt finishes parsing your project, it stores the internal manifest in a file called `partial_parse.msgpack`.
@@ -118,3 +112,14 @@ Partial parsing in dbt Cloud requires dbt version 1.4 or newer. The feature does
To enable, select **Account settings** from the gear menu and enable the **Partial parsing** option.
+
+### Account access to Advanced CI features
+
+To help improve data governance and quality, you can set up automation that tests code changes before they're merged into production with [CI jobs](/docs/deploy/ci-jobs). You can also enable Advanced CI features, such as [compare changes](/docs/deploy/continuous-integration#compare-changes), that allow dbt Cloud account members to view details about the changes between what's currently in your production environment and the pull request's latest commit, providing observability into how your data is affected by code changes.
+
+To use Advanced CI features, your dbt Cloud account must have access to them. Ask your dbt Cloud administrator to enable them by selecting **Account settings** from the gear menu and choosing the **Enable account access to Advanced CI** option.
+
+
+
+
+
diff --git a/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png b/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png
index 86bb59e9b90..02e5073fd16 100644
Binary files a/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png and b/website/static/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/choosing-dbt-version/example-environment-settings.png differ
diff --git a/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png
new file mode 100644
index 00000000000..e9313ddaa48
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png differ
diff --git a/website/static/img/docs/dbt-cloud/example-artifacts-tab.png b/website/static/img/docs/dbt-cloud/example-artifacts-tab.png
new file mode 100644
index 00000000000..f039eea2001
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-artifacts-tab.png differ
diff --git a/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png b/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png
new file mode 100644
index 00000000000..2736860df3d
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/example-ci-compare-changes-tab.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png
index ba75a855233..23c18953bf1 100644
Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/create-ci-job.png differ
diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png
new file mode 100644
index 00000000000..8dbfd76994d
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/example-github-ci-report.png differ
diff --git a/website/static/img/docs/deploy/example-account-settings.png b/website/static/img/docs/deploy/example-account-settings.png
index 12b8d9bc49f..d5e6adc2fa6 100644
Binary files a/website/static/img/docs/deploy/example-account-settings.png and b/website/static/img/docs/deploy/example-account-settings.png differ