Commit

Merge branch 'current' into runleonarun-patch-15

runleonarun authored Aug 14, 2024
2 parents 3dd29de + 9a09eb9 commit 5cc1c9e
Showing 14 changed files with 179 additions and 38 deletions.
13 changes: 5 additions & 8 deletions website/docs/docs/build/metricflow-commands.md
@@ -26,13 +26,13 @@ Using MetricFlow with dbt Cloud means you won't need to manage versioning &mdash;

- MetricFlow [commands](#metricflow-commands) are embedded in the dbt Cloud CLI. This means you can immediately run them once you install the dbt Cloud CLI and don't need to install MetricFlow separately.
- You don't need to manage versioning — your dbt Cloud account will automatically manage the versioning for you.

</TabItem>

<TabItem value="cloud ide" label="dbt Cloud IDE">

:::info
You can create metrics using MetricFlow in the dbt Cloud IDE. However, support for running MetricFlow commands in the IDE will be available soon.
You can create metrics using MetricFlow in the dbt Cloud IDE and run the [dbt sl validate](/docs/build/validation#validations-command) command. Support for running more MetricFlow commands in the IDE will be available soon.
:::

</TabItem>
@@ -78,10 +78,9 @@ You can use the `dbt sl` prefix before the command name to execute them in the d
- [`query`](#query) &mdash; Query metrics, saved queries, and dimensions you want to see in the command line interface. Refer to [query examples](#query-examples) to help you get started.
- [`export`](#export) &mdash; Runs exports for a single saved query for testing and generating exports in your development environment. You can also use the `--select` flag to specify particular exports from a saved query.
- [`export-all`](#export-all) &mdash; Runs exports for multiple saved queries at once, saving time and effort.

- [`validate`](#validate) &mdash; Validates semantic model configurations.

<!--below commands aren't supported in dbt cloud yet
- [`validate-configs`](#validate-configs) &mdash; Validates semantic model configurations.
- [`health-checks`](#health-checks) &mdash; Performs data platform health check.
- [`tutorial`](#tutorial) &mdash; Dedicated MetricFlow tutorial to help get you started.
-->
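
As a quick orientation, here's a minimal sketch of the two most common commands above, run from the dbt Cloud CLI (the metric, dimension, saved query, and export names are placeholders):

```bash
# Query a metric grouped by the standard time dimension
dbt sl query --metrics order_total --group-by metric_time

# Run a single export from a saved query, selected by name
dbt sl export --saved-query new_customer_orders --select orders
```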
@@ -218,14 +217,12 @@ The list of available saved queries:
- Export(new_customer_orders, alias=orders, schemas=customer_schema, exportAs=TABLE)
```

### Validate-configs
### Validate

The following command performs validations against the defined semantic model configurations.

Note that in dbt Cloud you don't need to validate the Semantic Layer config separately. Running a dbt command (such as `dbt parse`, `dbt build`, `dbt compile`, or `dbt run`) automatically checks it.

```bash
dbt sl validate # dbt Cloud users
mf validate-configs # dbt Core users

Options:
5 changes: 2 additions & 3 deletions website/docs/docs/build/validation.md
@@ -14,11 +14,10 @@ The code that handles validation [can be found here](https://github.com/dbt-labs

## Validations command

You can run validations against the defined semantic model configurations from the command line with the following [MetricFlow commands](/docs/build/metricflow-commands):

Note, in dbt Cloud you don't need to validate the Semantic Layer config separately. Running a dbt command (such as `dbt parse`, `dbt build`, `dbt compile`, or `dbt run`) automatically checks it.
You can run validations from dbt Cloud or the command line with the following [MetricFlow commands](/docs/build/metricflow-commands). In dbt Cloud, you need developer credentials to run `dbt sl validate` in the IDE or CLI, and deployment credentials to run it in CI.

```bash
dbt sl validate # dbt Cloud users
mf validate-configs # dbt Core users
```

@@ -126,8 +126,8 @@ To complete setup, follow the steps below in the dbt Cloud application.
| Field | Value |
| ----- | ----- |
| **Log&nbsp;in&nbsp;with** | Microsoft Entra ID Single Tenant |
| **Client&nbspID** | Paste the **Application (client) ID** recorded in the steps above |
| **Client&nbsp;Secret** | Paste the **Client Secret** (remember to use the Secret Value instead of the Secret ID) recorded in the steps above; <br />**Note:** When the client secret expires, an Entra ID admin will have to generate a new one to be pasted into dbt Cloud for uninterrupted application access. |
| **Client&nbsp;ID** | Paste the **Application (client) ID** recorded in the steps above |
| **Client&nbsp;Secret** | Paste the **Client Secret** (remember to use the Secret Value instead of the Secret ID) from the steps above; <br />**Note:** When the client secret expires, an Entra ID admin will have to generate a new one to be pasted into dbt Cloud for uninterrupted application access. |
| **Tenant&nbsp;ID** | Paste the **Directory (tenant ID)** recorded in the steps above |
| **Domain** | Enter the domain name for your Azure directory (such as `fishtownanalytics.com`). Only use the primary domain; this won't block access for other domains. |
| **Slug** | Enter your desired login slug. Users will be able to log into dbt Cloud by navigating to `https://YOUR_ACCESS_URL/enterprise-login/LOGIN-SLUG`, replacing `YOUR_ACCESS_URL` with the [appropriate Access URL](/docs/cloud/manage-access/sso-overview#auth0-multi-tenant-uris) for your region and plan. Login slugs must be unique across all dbt Cloud accounts, so pick a slug that uniquely identifies your company. |
25 changes: 21 additions & 4 deletions website/docs/docs/cloud/secure/databricks-privatelink.md
@@ -10,15 +10,15 @@ import SetUpPages from '/snippets/_available-tiers-privatelink.md';

<SetUpPages features={'/snippets/_available-tiers-privatelink.md'}/>

The following steps will walk you through the setup of a Databricks AWS PrivateLink endpoint in the dbt Cloud multi-tenant environment.
The following steps will walk you through the setup of a Databricks AWS PrivateLink or Azure Private Link endpoint in the dbt Cloud multi-tenant environment.

## Configure PrivateLink
## Configure AWS PrivateLink

1. Locate your [Databricks instance name](https://docs.databricks.com/en/workspace/workspace-details.html#workspace-instance-names-urls-and-ids)
- Example: `cust-success.cloud.databricks.com`
2. Add the required information to the template below, and submit your request to [dbt Support](https://docs.getdbt.com/community/resources/getting-help#dbt-cloud-support):
2. Add the required information to the following template and submit your AWS PrivateLink request to [dbt Support](https://docs.getdbt.com/docs/dbt-support#dbt-cloud-support):
```
Subject: New Multi-Tenant PrivateLink Request
Subject: New AWS Multi-Tenant PrivateLink Request
- Type: Databricks
- Databricks instance name:
- Databricks cluster AWS Region (e.g., us-east-1, eu-west-2):
@@ -41,6 +41,23 @@ If using an existing Databricks workspace, all workloads running in the workspac
:::

## Configure Azure Private Link

1. Navigate to your Azure Databricks workspace.
The path format is: `/subscriptions/<subscription_uuid>/resourceGroups/<resource_group_name>/providers/Microsoft.Databricks/workspaces/<workspace_name>`.
2. From the workspace overview, click **JSON view**.
3. Copy the value in the `resource_id` field. (A command-line alternative is sketched after these steps.)
4. Add the required information to the following template and submit your Azure Private Link request to [dbt Support](https://docs.getdbt.com/docs/dbt-support#dbt-cloud-support):
```
Subject: New Azure Multi-Tenant Private Link Request
- Type: Databricks
- Databricks instance name:
- Databricks Azure resource ID:
- dbt Cloud multi-tenant environment: EMEA
```
5. Once our Support team confirms the resources are available in the Azure portal, navigate to the Azure Databricks workspace and browse to **Networking** > **Private Endpoint Connections**. Then, highlight the option named `dbt` and select **Approve**.
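
If you prefer the command line to the portal for steps 2 and 3, the Azure CLI can return the same resource ID (a sketch; assumes the Azure CLI with the `databricks` extension is installed, and the resource group and workspace names are placeholders):

```bash
# Fetch the workspace resource ID (hypothetical names -- replace with your own)
az databricks workspace show \
  --resource-group my-resource-group \
  --name my-workspace \
  --query id \
  --output tsv
```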
## Create Connection in dbt Cloud

Once you've completed the setup in the Databricks environment, you will be able to configure a private endpoint in dbt Cloud:
80 changes: 71 additions & 9 deletions website/docs/docs/cloud/secure/snowflake-privatelink.md
@@ -9,7 +9,7 @@ import SetUpPages from '/snippets/_available-tiers-privatelink.md';

<SetUpPages features={'/snippets/_available-tiers-privatelink.md'}/>

The following steps walk you through the setup of a Snowflake AWS PrivateLink and Azure Private Link endpoint in the dbt Cloud multi-tenant environment.
The following steps walk you through the setup of a Snowflake AWS PrivateLink or Azure Private Link endpoint in a dbt Cloud multi-tenant environment.

:::note Snowflake SSO with PrivateLink
Users connecting to Snowflake using SSO over a PrivateLink connection from dbt Cloud will also require access to a PrivateLink endpoint from their local workstation.
@@ -19,26 +19,37 @@ Users connecting to Snowflake using SSO over a PrivateLink connection from dbt C
- [Snowflake SSO with Private Connectivity](https://docs.snowflake.com/en/user-guide/admin-security-fed-auth-overview#label-sso-private-connectivity)
:::

## Configure PrivateLink
## About private connectivity for Snowflake

1. Open a Support case with Snowflake to allow access from the dbt Cloud AWS account
- Snowflake prefers that the account owner opens the Support case directly, rather than dbt Labs acting on their behalf. For more information, refer to [Snowflake's knowledge base article](https://community.snowflake.com/s/article/HowtosetupPrivatelinktoSnowflakefromCloudServiceVendors)
dbt Cloud supports private connectivity for Snowflake using one of the following services:

- AWS [PrivateLink](#configure-aws-privatelink)
- Azure [Private Link](#configure-azure-private-link)

## Configure AWS PrivateLink

To configure Snowflake instances hosted on AWS for [PrivateLink](https://aws.amazon.com/privatelink):

1. Open a support case with Snowflake to allow access from the dbt Cloud AWS or Entra ID account.
- Snowflake prefers that the account owner opens the support case directly rather than dbt Labs acting on their behalf. For more information, refer to [Snowflake's knowledge base article](https://community.snowflake.com/s/article/HowtosetupPrivatelinktoSnowflakefromCloudServiceVendors).
- Provide them with your dbt Cloud account ID along with any other information requested in the article.
- AWS account ID: `346425330055` - _NOTE: This account ID only applies to dbt Cloud Multi-Tenant environments. For Virtual Private/Single-Tenant account IDs please contact [Support](https://docs.getdbt.com/community/resources/getting-help#dbt-cloud-support)._
- **AWS account ID**: `346425330055` &mdash; _NOTE: This account ID only applies to AWS dbt Cloud multi-tenant environments. For AWS Virtual Private/Single-Tenant account IDs, please contact [Support](https://docs.getdbt.com/docs/dbt-support#dbt-cloud-support)._
- You will need to have `ACCOUNTADMIN` access to the Snowflake instance to submit a Support request.

<Lightbox src="/img/docs/dbt-cloud/snowflakeprivatelink1.png" title="Open snowflake case"/>

2. After Snowflake has granted the requested access, run the Snowflake system function [SYSTEM$GET_PRIVATELINK_CONFIG](https://docs.snowflake.com/en/sql-reference/functions/system_get_privatelink_config.html) and copy the output.

3. Add the required information to the template below, and submit your request to [dbt Support](https://docs.getdbt.com/community/resources/getting-help#dbt-cloud-support):
3. Add the required information to the following template and submit your request to [dbt Support](https://docs.getdbt.com/docs/dbt-support#dbt-cloud-support):

```
Subject: New Multi-Tenant PrivateLink Request
Subject: New Multi-Tenant (Azure or AWS) PrivateLink Request
- Type: Snowflake
- SYSTEM$GET_PRIVATELINK_CONFIG output:
- *Use privatelink-account-url or regionless-privatelink-account-url?:
- dbt Cloud multi-tenant environment (US, EMEA, AU):
- dbt Cloud multi-tenant environment
- AWS: US, EMEA, or AU
- Azure: EMEA only
```
_*By default dbt Cloud will be configured to use `privatelink-account-url` from the provided [SYSTEM$GET_PRIVATELINK_CONFIG](https://docs.snowflake.com/en/sql-reference/functions/system_get_privatelink_config.html) as the PrivateLink endpoint. Upon request, `regionless-privatelink-account-url` can be used instead._
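
If you want to compare the two endpoint URLs before deciding, you can parse them out of the config output (a sketch; the hyphenated keys are quoted JSON paths in Snowflake's output):

```sql
-- Inspect both candidate endpoint URLs from the PrivateLink config
SELECT
  PARSE_JSON(SYSTEM$GET_PRIVATELINK_CONFIG()):"privatelink-account-url"::string AS privatelink_account_url,
  PARSE_JSON(SYSTEM$GET_PRIVATELINK_CONFIG()):"regionless-privatelink-account-url"::string AS regionless_account_url;
```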

@@ -47,6 +58,32 @@ import PrivateLinkSLA from '/snippets/_PrivateLink-SLA.md';

<PrivateLinkSLA />

## Configure Azure Private Link

To configure Snowflake instances hosted on Azure for [Private Link](https://learn.microsoft.com/en-us/azure/private-link/private-link-overview):

1. In your Snowflake account, run the following SQL statements and copy the output:

```sql
USE ROLE ACCOUNTADMIN;
SELECT SYSTEM$GET_PRIVATELINK_CONFIG();
```

2. Add the required information to the following template and submit your request to [dbt Support](https://docs.getdbt.com/docs/dbt-support#dbt-cloud-support):

```
Subject: New Multi-Tenant (Azure or AWS) PrivateLink Request
- Type: Snowflake
- The output from SYSTEM$GET_PRIVATELINK_CONFIG:
- Include the privatelink-pls-id
- dbt Cloud Azure multi-tenant environment:
```

3. dbt Support will provide the resource ID of our private endpoint and the `CIDR` range. Use these to complete the [PrivateLink configuration](https://community.snowflake.com/s/article/HowtosetupPrivatelinktoSnowflakefromCloudServiceVendors) by contacting the Snowflake Support team.

## Create Connection in dbt Cloud

Once dbt Cloud support completes the configuration, you can start creating new connections using PrivateLink.
@@ -57,6 +94,27 @@
4. Configure the remaining data platform details.
5. Test your connection and save it.

## Enable the connection in Snowflake

To complete the setup, follow the remaining steps from the Snowflake setup guides. The instructions vary based on the platform:

- [Snowflake AWS PrivateLink](https://docs.snowflake.com/en/user-guide/admin-security-privatelink)
- [Snowflake Azure Private Link](https://docs.snowflake.com/en/user-guide/privatelink-azure)

Each connection has its own nuances, and you'll need a Snowflake administrator to complete this step. As the Snowflake administrator, call the `SYSTEM$AUTHORIZE_STAGE_PRIVATELINK_ACCESS` function, passing your AWS VPC ID (for AWS PrivateLink) or the `privateEndpointResourceID` value (for Azure Private Link) as the function argument. This authorizes access to the Snowflake internal stage through the private endpoint.

```sql
USE ROLE ACCOUNTADMIN;

-- AWS PrivateLink: pass your AWS VPC ID
SELECT SYSTEM$AUTHORIZE_STAGE_PRIVATELINK_ACCESS('<AWS_VPC_ID>');

-- Azure Private Link: pass your private endpoint resource ID
SELECT SYSTEM$AUTHORIZE_STAGE_PRIVATELINK_ACCESS('<AZURE_PRIVATE_ENDPOINT_RESOURCE_ID>');
```
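
To confirm the authorization took effect, you can list the endpoints Snowflake has authorized (a quick check, run in the same `ACCOUNTADMIN` session):

```sql
SELECT SYSTEM$GET_PRIVATELINK_AUTHORIZED_ENDPOINTS();
```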

## Configuring Network Policies

If your organization uses [Snowflake Network Policies](https://docs.snowflake.com/en/user-guide/network-policies) to restrict access to your Snowflake account, you will need to add a network rule for dbt Cloud.

@@ -84,19 +142,23 @@ Open the Snowflake UI and take the following steps:
<Lightbox src="/img/docs/dbt-cloud/snowflakeprivatelink3.png" title="Update Network Policy"/>

### Using SQL

To set up network rules quickly and repeatably via SQL in Snowflake, the following commands create an access rule for dbt Cloud and add it to your network policy.

1. Create a new network rule with the following SQL:
```sql
CREATE NETWORK RULE allow_dbt_cloud_access
MODE = INGRESS
TYPE = AWSVPCEID
VALUE_LIST = ('<VPCE_ID>'); -- Replace '<VPCE_ID>' with the ID provided by dbt Support
```

2. Add the rule to a network policy with the following SQL:
```sql
ALTER NETWORK POLICY <network_policy_name>
ADD ALLOWED_NETWORK_RULE_LIST = ('allow_dbt_cloud_access');
```
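
If you'd like to double-check the result, describing the policy shows its attached rules (the policy name below is a placeholder):

```sql
DESC NETWORK POLICY <network_policy_name>;
```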
2 changes: 1 addition & 1 deletion website/docs/docs/collaborate/data-tile.md
Original file line number Diff line number Diff line change
Expand Up @@ -19,7 +19,7 @@ The data health tile:
- Provides richer information and makes it easier to debug.
- Revamps the existing, [job-based tiles](#job-based-data-health).

<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-pass.jpg" width="60%" title="Embed data health tiles in your dashboards to distill trust signals for data consumers." />
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tiles.png" width="60%" title="Embed data health tiles in your dashboards to distill trust signals for data consumers." />

## Prerequisites

2 changes: 2 additions & 0 deletions website/docs/docs/collaborate/model-query-history.md
@@ -13,7 +13,9 @@ The model query history tile allows you to:
- Provide data teams with insight, so they can focus their time and infrastructure spend on the most valuable data products.
- Enable analysts to find the most popular models used by others.

:::info Available in beta
Model query history is powered by a single query of the query log table in your data warehouse, aggregated daily. It filters down to `select` statements only to gauge model consumption and excludes dbt model build and test executions.
:::
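
Conceptually, the daily aggregation resembles something like this (an illustrative sketch against Snowflake's `ACCOUNT_USAGE` view, not the actual query dbt Cloud runs):

```sql
-- Illustration only: daily count of SELECT statements from the query log
SELECT
  DATE_TRUNC('day', start_time) AS query_day,
  COUNT(*) AS select_count
FROM snowflake.account_usage.query_history
WHERE query_type = 'SELECT'
GROUP BY 1
ORDER BY 1;
```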

## Prerequisites

31 changes: 30 additions & 1 deletion website/docs/docs/dbt-cloud-apis/sl-manifest.md
@@ -77,7 +77,36 @@ Top-level keys for the semantic manifest are:
],
"metadata": null,
"dsi_package_version": {}
}
},
"saved_queries": [
{
"name": "name of the saved query",
"query_params": {
"metrics": [
"metrics used in the saved query"
],
"group_by": [
"TimeDimension('model_primary_key__date_column', 'day')",
"Dimension('model_primary_key__metric_one')",
"Dimension('model__dimension')"
],
"where": null
},
"description": "Description of the saved query",
"metadata": null,
"label": null,
"exports": [
{
"name": "saved_query_name",
"config": {
"export_as": "view",
"schema_name": null,
"alias": null
}
}
]
}
]
}
]
}
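
As a quick way to explore this file locally, you can pull the saved query names out of a compiled manifest (a sketch; assumes `jq` is installed, `saved_queries` is a top-level key as listed above, and the manifest sits at the conventional `target/semantic_manifest.json` path):

```bash
# List the names of all saved queries defined in the semantic manifest
jq -r '.saved_queries[].name' target/semantic_manifest.json
```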
7 changes: 5 additions & 2 deletions website/docs/docs/dbt-cloud-apis/user-tokens.md
@@ -8,9 +8,12 @@ pagination_next: "docs/dbt-cloud-apis/service-tokens"

:::note Announcement

The [user API tokens](#user-tokens) are being replaced by [account-scoped personal access tokens(PATs)](#account-scoped-personal-access-tokens). We recommend rotating your existing user tokens with PATs. There are no deprecation plans for user API tokens at this time; we will give ample notice when that timeline has been determined.
_User tokens will be deprecated on September 18th, 2024._

Cloud CLI config files do not need to be updated at this time. You will be notified when you need to re-download your configs.
The [user API tokens](/docs/dbt-cloud-apis/user-tokens#user-api-tokens) are being replaced by [account-scoped personal access tokens (PATs)](#account-scoped-personal-access-tokens). If you do not rotate your existing user tokens with PATs by September 18th, the services using the tokens will encounter errors.

Cloud CLI config files need to be re-downloaded before September 18th, 2024.

The current API key is located under **Personal Settings → API Key**.

2 changes: 1 addition & 1 deletion website/snippets/_sl-measures-parameters.md
@@ -2,7 +2,7 @@
| --- | --- | --- |
| [`name`](/docs/build/measures#name) | Provide a name for the measure, which must be unique and can't be repeated across all semantic models in your dbt project. | Required |
| [`description`](/docs/build/measures#description) | Describes the calculated measure. | Optional |
| [`agg`](/docs/build/measures#aggregation) | dbt supports the following aggregations: `sum`, `max`, `min`, `avg`, `median`, `count_distinct`, `percentile`, and `sum_boolean`. | Required |
| [`agg`](/docs/build/measures#aggregation) | dbt supports the following aggregations: `sum`, `max`, `min`, `average`, `median`, `count_distinct`, `percentile`, and `sum_boolean`. | Required |
| [`expr`](/docs/build/measures#expr) | Either reference an existing column in the table or use a SQL expression to create or derive a new one. | Optional |
| [`non_additive_dimension`](/docs/build/measures#non-additive-dimensions) | Non-additive dimensions can be specified for measures that cannot be aggregated over certain dimensions, such as bank account balances, to avoid producing incorrect results. | Optional |
| `agg_params` | Specific aggregation properties, such as a percentile. | Optional |