Merge branch 'current' into docs-serve-host
mirnawong1 authored Aug 28, 2024
2 parents 630aa43 + ef4c8e1 commit 8bd136d
Showing 58 changed files with 440 additions and 177 deletions.
@@ -49,26 +49,26 @@ So far we've been working in new pointing at a staging model to simplify things

```yaml
semantic_models:
  - name: locations
    description: |
      Location dimension table. The grain of the table is one row per location.
    model: ref('stg_locations')
    entities:
      - name: location
        type: primary
        expr: location_id
    dimensions:
      - name: location_name
        type: categorical
      - name: date_trunc('day', opened_at)
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: average_tax_rate
        description: Average tax rate.
        expr: tax_rate
        agg: avg
```
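Once the model parses, you can sanity-check it from the command line. A minimal sketch using the dbt Cloud CLI, assuming a metric named `average_tax_rate` (a hypothetical name) has been defined on top of the measure above:

```shell
# Re-parse the project so MetricFlow picks up the new semantic model
dbt parse

# Validate the semantic graph configuration
dbt sl validate

# Query the hypothetical metric, grouped by the location_name dimension
dbt sl query --metrics average_tax_rate --group-by location__location_name
```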
## Semantic and logical interaction
6 changes: 5 additions & 1 deletion website/docs/docs/build/dimensions.md
@@ -170,6 +170,10 @@ Our supported granularities are:
* second
* minute
* hour
* day
* week
* month
* quarter
* year

Aggregation between metrics with different granularities is possible, with the Semantic Layer returning results at the coarsest granularity by default. For example, when querying two metrics with daily and monthly granularity, the resulting aggregation will be at the monthly level.
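For example, a sketch of such a query through the dbt Cloud CLI, with `daily_orders` and `monthly_revenue` as hypothetical metric names:

```shell
# Two metrics at different grains; results default to the coarsest grain
# (monthly here), which can also be requested explicitly:
dbt sl query --metrics daily_orders,monthly_revenue --group-by metric_time__month
```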

@@ -240,7 +244,7 @@ Here’s an example configuration:
  - name: tier_start # The name of the dimension.
    type: time # The type of dimension (such as time)
    label: "Start date of tier" # A readable label for the dimension
-   expr: start_date # Expression or column name the the dimension represents
+   expr: start_date # Expression or column name the dimension represents
    type_params: # Additional parameters for the dimension type
      time_granularity: day # Specifies the granularity of the time dimension (such as day)
      validity_params: # Defines the validity window
3 changes: 2 additions & 1 deletion website/docs/docs/build/metricflow-time-spine.md
@@ -12,8 +12,9 @@ MetricFlow requires you to define a time spine table as a project level configur

If you already have a date dimension or time spine table in your dbt project, you can point MetricFlow to this table by updating the `model` configuration to use this table in the Semantic Layer. For example, given the following directory structure, you can create two time spine configurations, `time_spine_hourly` and `time_spine_daily`.

-::tip
+:::tip
Previously, you were required to create a model called `metricflow_time_spine` in your dbt project. This is no longer required. However, you can build your time spine model from this table if you don't have another date dimension table you want to use in your project.

:::

<Lightbox src="/img/time_spines.png" title="Time spine directory structure" />
2 changes: 1 addition & 1 deletion website/docs/docs/build/sql-models.md
@@ -266,7 +266,7 @@ You can also document and test models &mdash; skip ahead to the section on [test
<FAQ path="Project/example-projects" alt_header="Are there any example dbt models?" />
<FAQ path="Models/configurable-model-path" />
<FAQ path="Models/model-custom-schemas" />
<FAQ path="Models/unique-model-names" />
<FAQ path="Project/unique-resource-names" />
<FAQ path="Models/removing-deleted-models" />
<FAQ path="Project/structure-a-project" alt_header="As I create more models, how should I keep my project organized? What should I name my models?" />
<FAQ path="Models/insert-records" />
@@ -37,7 +37,7 @@ import Tools from '/snippets/_sl-excel-gsheets.md';

<Tools
  type="Microsoft Excel"
- bullet_1="There's no timeout limit."
+ bullet_1="There's a timeout of 1 minute for queries."
  bullet_2="If you're using this extension, make sure you're signed into Microsoft with the same Excel profile you used to set up the Add-In. Log in with one profile at a time as using multiple profiles at once might cause issues."
/>

7 changes: 7 additions & 0 deletions website/docs/docs/cloud/configure-cloud-cli.md
@@ -189,3 +189,10 @@ move %USERPROFILE%\Downloads\dbt_cloud.yml %USERPROFILE%\.dbt\dbt_cloud.yml
This command moves the `dbt_cloud.yml` from the `Downloads` folder to the `.dbt` folder. If your `dbt_cloud.yml` file is located elsewhere, adjust the path accordingly.
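On macOS or Linux, an equivalent sketch (assuming the file was downloaded to `~/Downloads`):

```shell
mkdir -p ~/.dbt   # create the .dbt folder if it doesn't exist yet
mv ~/Downloads/dbt_cloud.yml ~/.dbt/dbt_cloud.yml
```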
</Expandable>
<Expandable alt_header="How to skip artifacts from being downloaded">
By default, [all artifacts](/reference/artifacts/dbt-artifacts) are downloaded when you execute dbt commands from the dbt Cloud CLI. To skip these files from being downloaded, add `--download-artifacts=false` to the command you want to run. This can help improve run-time performance but might break workflows that depend on assets like the [manifest](/reference/artifacts/manifest-json).
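For example, a minimal sketch of a run that skips the download:

```shell
# Execute models without pulling artifacts such as manifest.json back down
dbt run --download-artifacts=false
```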
</Expandable>
@@ -36,7 +36,6 @@ Up until July 2024, connections were nested under projects. One dbt Cloud projec
We are rolling out an important change that moves connection management to the account level. The following connection management section describes these changes.

-This feature is being rolled out in phases over the coming weeks.

:::

Warehouse connections are an account-level resource. As such, you can find them under **Account settings** > **Connections**:
@@ -53,6 +52,10 @@ As shown in the image, a project with 2 environments can target between 1 and 2

Rolling out account-level connections will not require any interruption of service in your current usage (IDE, CLI, jobs, etc.).

:::info Why am I prompted to configure a development environment?
If your project did not previously have a development environment, you may be redirected to the project setup page. Your project is still intact. Choose a connection for your new development environment, and you can view all your environments again.
:::

However, to fully utilize the value of account-level connections, you may have to rethink how you assign and use connections across projects and environments.

<Lightbox src="/img/docs/dbt-cloud/cloud-configuring-dbt-cloud/connections-post-rollout.png" width="60%" title="Typical connection setup post rollout"/>
99 changes: 99 additions & 0 deletions website/docs/docs/cloud/connect-data-platform/connnect-bigquery.md
@@ -4,6 +4,9 @@ id: connect-bigquery
description: "Configure BigQuery connection."
sidebar_label: "Connect BigQuery"
---

## Authentication

### JSON keyfile

:::info Uploading a service account JSON keyfile
@@ -48,3 +51,99 @@ As an end user, if your organization has set up BigQuery OAuth, you can link a p
## Configuration

To learn how to optimize performance with data platform-specific configurations in dbt Cloud, refer to [BigQuery-specific configuration](/reference/resource-configs/bigquery-configs).

### Account level connections and credential management

You can re-use connections across multiple projects with [global connections](/docs/cloud/connect-data-platform/about-connections#migration-from-project-level-connections-to-account-level-connections). Connections are attached at the environment level (formerly project level), so you can use multiple connections within a single project (to handle dev, staging, production, and so on).

BigQuery connections in dbt Cloud (and only BigQuery connections) currently expect credentials to be handled at the connection level. This was originally designed to facilitate creating a new connection by uploading a service account keyfile. This section describes how to override credentials at the environment level, via [extended attributes](/docs/dbt-cloud-environments#extended-attributes), _to allow project administrators to manage credentials independently_ of the account-level connection details used for that environment.

For a project, you will first create an environment variable to store the secret `private_key` value. Then, you will use extended attributes to override the entire service account JSON (you can't override only the secret key, due to a constraint of extended attributes).

1. **New environment variable**

- Create a new _secret_ [environment variable](https://docs.getdbt.com/docs/build/environment-variables#handling-secrets) to handle the private key: `DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY`
- Fill in the private key value according to the environment

To automate your deployment, use the following [admin API request](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Projects%20Environment%20Variables%20Bulk), where `XXXXX` is your account number, `YYYYY` is your project number, and `ZZZZZ` is your [API token](/docs/dbt-cloud-apis/authentication):

```shell
curl --request POST \
  --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environment-variables/bulk/ \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer ZZZZZ' \
  --header 'Content-Type: application/json' \
  --data '{
    "env_var": [
      {
        "new_name": "DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY",
        "project": "Value by default for the entire project",
        "ENVIRONMENT_NAME_1": "Optional, if wanted, value for environment name 1",
        "ENVIRONMENT_NAME_2": "Optional, if wanted, value for environment name 2"
      }
    ]
  }'
```

2. **Extended attributes**

In the environment details, complete the [extended attributes](/docs/dbt-cloud-environments#extended-attributes) block with the following payload (replacing `XXX` with your corresponding information):

```yaml
keyfile_json:
  type: service_account
  project_id: xxx
  private_key_id: xxx
  private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
  client_email: xxx
  client_id: xxx
  auth_uri: xxx
  token_uri: xxx
  auth_provider_x509_cert_url: xxx
  client_x509_cert_url: xxx
```

If you require [other fields](/docs/core/connect-data-platform/bigquery-setup#service-account-json) to be overridden at the environment level via extended attributes, please respect the [expected indentation](/docs/dbt-cloud-environments#only-the-top-level-keys-are-accepted-in-extended-attributes) (ordering doesn't matter):
```yaml
priority: interactive
keyfile_json:
  type: xxx
  project_id: xxx
  private_key_id: xxx
  private_key: '{{ env_var(''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'') }}'
  client_email: xxx
  client_id: xxx
  auth_uri: xxx
  token_uri: xxx
  auth_provider_x509_cert_url: xxx
  client_x509_cert_url: xxx
execution_project: buck-stops-here-456
```
To automate your deployment, you first need to [create the extended attributes payload](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Create%20Extended%20Attributes) for a given project, and then [assign it](https://docs.getdbt.com/dbt-cloud/api-v3#/operations/Update%20Environment) to a specific environment. With `XXXXX` as your account number, `YYYYY` as your project number, and `ZZZZZ` as your [API token](/docs/dbt-cloud-apis/authentication):
```shell
curl --request POST \
  --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/extended-attributes/ \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer ZZZZZ' \
  --header 'Content-Type: application/json' \
  --data '{
    "id": null,
    "extended_attributes": {"type":"service_account","project_id":"xxx","private_key_id":"xxx","private_key":"{{ env_var('\''DBT_ENV_SECRET_PROJECTXXX_PRIVATE_KEY'\'') }}","client_email":"xxx","client_id":"xxx","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url":"xxx"},
    "state": 1
  }'
```
_Make a note of the `id` returned in the message._ It will be used in the following call, where `EEEEE` is the environment ID and `FFFFF` is the extended attributes ID:
```shell
curl --request POST \
  --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environments/EEEEE/ \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer ZZZZZ' \
  --header 'Content-Type: application/json' \
  --data '{
    "extended_attributes_id": FFFFF
  }'
```
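To confirm the environment now references the extended attributes, one option is to read the environment back; a sketch assuming the corresponding GET endpoint in API v3:

```shell
curl --request GET \
  --url https://cloud.getdbt.com/api/v3/accounts/XXXXX/projects/YYYYY/environments/EEEEE/ \
  --header 'Accept: application/json' \
  --header 'Authorization: Bearer ZZZZZ'
```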
@@ -10,7 +10,6 @@ pagination_prev: null

The dbt Cloud integrated development environment (IDE) is a single web-based interface for building, testing, running, and version-controlling dbt projects. It compiles dbt code into SQL and executes it directly on your database.

-The dbt Cloud IDE offers several [keyboard shortcuts](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) and [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and efficient development and governance:
The dbt Cloud IDE offers several [keyboard shortcuts](/docs/cloud/dbt-cloud-ide/keyboard-shortcuts) and [editing features](/docs/cloud/dbt-cloud-ide/ide-user-interface#editing-features) for faster and efficient development and governance:

- Syntax highlighting for SQL &mdash; Makes it easy to distinguish different parts of your code, reducing syntax errors and enhancing readability.
5 changes: 4 additions & 1 deletion website/docs/docs/cloud/git/connect-github.md
@@ -57,10 +57,13 @@ If you are your GitHub organization owner, you can also configure the dbt Cloud
## Personally authenticate with GitHub

Once the dbt Cloud admin has [set up a connection](/docs/cloud/git/connect-github#installing-dbt-cloud-in-your-github-account) to your organization GitHub account, you need to personally authenticate, which improves the security of dbt Cloud by enabling you to log in using OAuth through GitHub.
-:::infoGitHub profile connection

+:::info GitHub profile connection

- dbt Cloud developers on the [Enterprise plan](https://www.getdbt.com/pricing/) must each connect their GitHub profiles to dbt Cloud. This is because the dbt Cloud IDE verifies every developer's read / write access for the dbt repo.

- dbt Cloud developers on the [Team plan](https://www.getdbt.com/pricing/) don't need to each connect their profiles to GitHub, however, it's still recommended to do so.

:::

To connect a personal GitHub account:
6 changes: 3 additions & 3 deletions website/docs/docs/cloud/manage-access/audit-log.md
@@ -9,12 +9,12 @@ pagination_prev: "docs/cloud/manage-access/about-user-access"

To review actions performed by people in your organization, dbt provides logs of audited user and system events in real time. The audit log appears as events happen and includes details such as who performed the action, what the action was, and when it was performed. You can use these details to troubleshoot access issues, perform security audits, or analyze specific events.

-You must be an **Account Admin** to access the audit log and this feature is only available on Enterprise plans.
+You must be an **Account Admin** or an **Account Viewer** to access the audit log and this feature is only available on Enterprise plans.

The dbt Cloud audit log stores all the events that occurred in your organization in real-time, including:

- For events within 90 days, the dbt Cloud audit log has a selectable date range that lists events triggered.
-- For events beyond 90 days, **Account Admins** can [export all events](#exporting-logs) by using **Export All**.
+- For events beyond 90 days, **Account Admins** and **Account Viewers** can [export all events](#exporting-logs) by using **Export All**.

## Accessing the audit log

@@ -170,6 +170,6 @@ You can use the audit log to export all historical audit results for security, c

- **For events within 90 days** &mdash; dbt Cloud will automatically display the 90-day selectable date range. Select **Export Selection** to download a CSV file of all the events that occurred in your organization within 90 days.

-- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin will receive an email link to download a CSV file of all the events that occurred in your organization.
+- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin or Account Viewer will receive an email link to download a CSV file of all the events that occurred in your organization.

<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-section.jpg" width="95%" title="View audit log export options"/>
2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/external-oauth.md
@@ -172,7 +172,7 @@ Adjust the other settings as needed to meet your organizations configurations in

### Entra ID

-You’ll create two different `apps` in the Azure portal &mdash: A resource server and a client app.
+You’ll create two different `apps` in the Azure portal &mdash; A resource server and a client app.

:::important

26 changes: 22 additions & 4 deletions website/docs/docs/cloud/manage-access/set-up-snowflake-oauth.md
@@ -12,8 +12,26 @@ This guide describes a feature of the dbt Cloud Enterprise plan. If you’re int

dbt Cloud Enterprise supports [OAuth authentication](https://docs.snowflake.net/manuals/user-guide/oauth-intro.html) with Snowflake. When Snowflake OAuth is enabled, users can authorize their Development credentials using Single Sign On (SSO) via Snowflake rather than submitting a username and password to dbt Cloud. If Snowflake is set up with SSO through a third-party identity provider, developers can use this method to log into Snowflake and authorize the dbt Development credentials without any additional setup.

-### Configuring a security integration
-To enable Snowflake OAuth, you will need to create a [security integration](https://docs.snowflake.net/manuals/sql-reference/sql/create-security-integration.html) in Snowflake to manage the OAuth connection between dbt Cloud and Snowflake.
+To set up Snowflake OAuth in dbt Cloud, admins from both dbt Cloud and Snowflake are required for the following steps:
1. [Locate the redirect URI value](#locate-the-redirect-uri-value) in dbt Cloud.
2. [Create a security integration](#create-a-security-integration) in Snowflake.
3. [Configure a connection](#configure-a-connection-in-dbt-cloud) in dbt Cloud.

To use Snowflake in the dbt Cloud IDE, all developers must [authenticate with Snowflake](#authorize-developer-credentials) in their profile credentials.

### Locate the redirect URI value

To get started, copy the connection's redirect URI from dbt Cloud:
1. Navigate to **Account settings**
1. Select **Projects** and choose a project from the list
1. Select the connection to view its details and set the **OAuth method** to "Snowflake SSO"
1. Copy the **Redirect URI** for use in later steps

<Lightbox
src="/img/docs/dbt-cloud/dbt-cloud-enterprise/snowflake-oauth-redirect-uri.png"
title="Locate the Snowflake OAuth redirect URI"
alt="The OAuth method and Redirect URI inputs for a Snowflake connection in dbt Cloud."
/>

### Create a security integration

@@ -25,7 +43,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
  ENABLED = TRUE
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
- OAUTH_REDIRECT_URI = 'https://YOUR_ACCESS_URL/complete/snowflake'
+ OAUTH_REDIRECT_URI = LOCATED_REDIRECT_URI
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE
  OAUTH_REFRESH_TOKEN_VALIDITY = 7776000;
```
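To verify the integration from a terminal, a sketch using SnowSQL (the `dbt_admin` connection name is an assumption):

```shell
# Show the integration's parameters, including the configured redirect URI
snowsql -c dbt_admin -q "DESCRIBE SECURITY INTEGRATION DBT_CLOUD;"
```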
@@ -42,7 +60,7 @@ CREATE OR REPLACE SECURITY INTEGRATION DBT_CLOUD
| ENABLED | Required |
| OAUTH_CLIENT | Required |
| OAUTH_CLIENT_TYPE | Required |
-| OAUTH_REDIRECT_URI | Required. Use the access URL that corresponds to your server [region](/docs/cloud/about-cloud/access-regions-ip-addresses). |
+| OAUTH_REDIRECT_URI | Required. Use the value in the [dbt Cloud account settings](#locate-the-redirect-uri-value). |
| OAUTH_ISSUE_REFRESH_TOKENS | Required |
| OAUTH_REFRESH_TOKEN_VALIDITY | Required. This configuration dictates the number of seconds that a refresh token is valid for. Use a smaller value to force users to re-authenticate with Snowflake more frequently. |
