diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md
index bdeeba47e23..2fb63f0296d 100644
--- a/website/docs/guides/sl-snowflake-qs.md
+++ b/website/docs/guides/sl-snowflake-qs.md
@@ -273,7 +273,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne
 
 Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials.
 
-1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Admin**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.
+1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.
@@ -347,7 +347,11 @@ If you used Partner Connect, you can skip to [initializing your dbt project](#in
 
 ## Initialize your dbt project and start developing
 
-Now that you have a repository configured, you can initialize your project and start development in dbt Cloud:
+This guide assumes you use the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to develop your dbt project and define metrics. However, the dbt Cloud IDE doesn't support using [MetricFlow commands](/docs/build/metricflow-commands) to query or preview metrics (support coming soon).
+
+To query and preview metrics in your development tool, you can use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to run [MetricFlow commands](/docs/build/metricflow-commands).
+
+Now that you have a repository configured, you can initialize your project and start development in dbt Cloud using the IDE:
 
 1. Click **Start developing in the dbt Cloud IDE**. It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse.
 2. Above the file tree to the left, click **Initialize your project**. This builds out your folder structure with example models.
@@ -378,6 +382,8 @@ Name the new branch `build-project`.
 2. Name the file `staging/jaffle_shop/src_jaffle_shop.yml` , then click **Create**.
 3. Copy the following text into the file and click **Save**.
 
+
+
 ```yaml
 version: 2
 
 sources:
   - name: jaffle_shop
     database: raw
     schema: jaffle_shop
     tables:
       - name: customers
@@ -390,6 +396,8 @@ sources:
       - name: orders
 ```
 
+
+
 :::tip
 In your source file, you can also use the **Generate model** button to create a new model file for each source. This creates a new file in the `models` directory with the given source name and fills in the SQL code of the source definition.
 :::
@@ -398,6 +406,8 @@ In your source file, you can also use the **Generate model** button to create a
 5. Name the file `staging/stripe/src_stripe.yml` , then click **Create**.
 6. Copy the following text into the file and click **Save**.
 
+
+
 ```yaml
 version: 2
 
 sources:
   - name: stripe
     database: raw
     schema: stripe
     tables:
       - name: payment
 ```
@@ -408,13 +418,16 @@ sources:
     tables:
       - name: payment
 ```
 
 ### Add staging models
 
 [Staging models](/best-practices/how-we-structure/2-staging) are the first transformation step in dbt. They clean and prepare your raw data, making it ready for more complex transformations and analyses. Follow these steps to add your staging models to your project.
 
-1. Create the file `models/staging/jaffle_shop/stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
+1. In the `jaffle_shop` sub-directory, create the file `stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
 2. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 select
     id as customer_id,
@@ -423,9 +436,13 @@ sources:
 from {{ source('jaffle_shop', 'customers') }}
 ```
 
-3. Create the file `models/staging/jaffle_shop/stg_orders.sql`
+
+
+3. In the same `jaffle_shop` sub-directory, create the file `stg_orders.sql`.
 4. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 select
     id as order_id,
@@ -435,9 +452,13 @@ from {{ source('jaffle_shop', 'customers') }}
 from {{ source('jaffle_shop', 'orders') }}
 ```
 
-5. Create the file `models/staging/stripe/stg_payments.sql`.
+
+
+5. In the `stripe` sub-directory, create the file `stg_payments.sql`.
 6. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 select
     id as payment_id,
@@ -452,6 +473,8 @@ select
 from {{ source('stripe', 'payment') }}
 ```
 
+
+
 7. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models.
 
 ### Add business-defined entities
@@ -463,6 +486,8 @@ This phase is the [marts layer](/best-practices/how-we-structure/1-guide-overvie
 1. Create the file `models/marts/fct_orders.sql`.
 2. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 with orders as (
     select * from {{ ref('stg_orders' )}}
@@ -504,9 +529,13 @@ select * from final
 
 ```
 
-3. Create the file `models/marts/dim_customers.sql`.
+
+
+3. In the `models/marts` directory, create the file `dim_customers.sql`.
 4. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 with customers as (
     select * from {{ ref('stg_customers')}}
@@ -539,18 +568,26 @@ final as (
 
 select * from final
 ```
 
-5. Create the file `packages.yml` in your main directory
+
+
+5. In your main directory, create the file `packages.yml`.
 6. Copy the following text into the file and click **Save**.
 
+
+
 ```sql
 packages:
   - package: dbt-labs/dbt_utils
     version: 1.1.1
 ```
 
-7. Create the file `models/metrics/metricflow_time_spine.sql` in your main directory.
+
+
+7. In the `models` directory, create the file `metrics/metricflow_time_spine.sql`.
 8. Copy the following query into the file and click **Save**.
 
+
+
 ```sql
 {{
     config(
@@ -574,6 +611,8 @@ select * from final
 
 ```
 
+
+
 9. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run message and also see in the run details that dbt has successfully built five models.
 
 ## Create semantic models
@@ -587,9 +626,11 @@ select * from final
 
 In the following steps, semantic models enable you to define how to interpret the data related to orders. It includes entities (like ID columns serving as keys for joining data), dimensions (for grouping or filtering data), and measures (for data aggregations).
 
-1. Create a new file `models/metrics/fct_orders.yml`
+1. In the `metrics` sub-directory, create a new file `fct_orders.yml`.
 2.
Add the following code to that newly created file: + + ```yaml semantic_models: - name: orders @@ -600,6 +641,8 @@ semantic_models: model: ref('fct_orders') ``` + + The following sections explain [dimensions](/docs/build/dimensions), [entities](/docs/build/entities), and [measures](/docs/build/measures) in more detail, showing how they each play a role in semantic models. - [Entities](#entities) act as unique identifiers (like ID columns) that link data together from different tables. @@ -612,6 +655,8 @@ The following sections explain [dimensions](/docs/build/dimensions), [entities]( Add entities to your `fct_orders.yml` semantic model file: + + ```yaml semantic_models: - name: orders @@ -628,12 +673,16 @@ semantic_models: type: foreign ``` + + ### Dimensions [Dimensions](/docs/build/semantic-models#entities) are a way to group or filter information based on categories or time. Add dimensions to your `fct_orders.yml` semantic model file: + + ```yaml semantic_models: - name: orders @@ -655,12 +704,16 @@ semantic_models: time_granularity: day ``` + + ### Measures [Measures](/docs/build/semantic-models#measures) are aggregations performed on columns in your model. Often, you’ll find yourself using them as final metrics themselves. Measures can also serve as building blocks for more complicated metrics. Add measures to your `fct_orders.yml` semantic model file: + + ```yaml semantic_models: - name: orders @@ -701,6 +754,8 @@ semantic_models: use_approximate_percentile: False ``` + + ## Define metrics [Metrics](/docs/build/metrics-overview) are the language your business users speak and measure business performance. They are an aggregation over a column in your warehouse that you enrich with dimensional cuts. @@ -717,6 +772,8 @@ Once you've created your semantic models, it's time to start referencing those m Add metrics to your `fct_orders.yml` semantic model file: + + ```yaml semantic_models: - name: orders @@ -805,6 +862,8 @@ metrics: - name: order_count ``` + + ## Add second semantic model to your project Great job, you've successfully built your first semantic model! It has all the required elements: entities, dimensions, measures, and metrics. @@ -813,9 +872,11 @@ Let’s expand your project's analytical capabilities by adding another semantic After setting up your orders model: -1. Create the file `models/metrics/dim_customers.yml`. +1. In the `metrics` sub-directory, create the file `dim_customers.yml`. 2. Copy the following query into the file and click **Save**. + + ```yaml semantic_models: - name: customers @@ -862,6 +923,9 @@ metrics: measure: customers ``` + + + This semantic model uses simple metrics to focus on customer metrics and emphasizes customer dimensions like name, type, and order dates. It uniquely analyzes customer behavior, lifetime value, and order patterns. ## Test and query metrics diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md index 936f4804f9f..ef1cc55cbe1 100644 --- a/website/snippets/_sl-test-and-query-metrics.md +++ b/website/snippets/_sl-test-and-query-metrics.md @@ -1,16 +1,16 @@ To work with metrics in dbt, you have several tools to validate or run commands. Here's how you can test and query metrics depending on your setup: -- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) — Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. 
You can still validate metrics using the **Preview** or **Compile** options, or visually through the DAG for semantic checks. This ensures your metrics are correctly defined without directly running commands.
-- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) — The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) for direct interaction with metrics.
-- **dbt Core users** — Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a Team or Enterprise account.
+- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) — Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can view metrics visually through the DAG in the **Lineage** tab without directly running commands.
+- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) — The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) to query and preview metrics directly in your command line interface.
+- **dbt Core users** — Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a [Team or Enterprise account](https://www.getdbt.com/).
 
 Alternatively, you can run commands with SQL client tools like DataGrip, DBeaver, or RazorSQL.
 
 ### dbt Cloud IDE users
 
-You can validate your metrics in the dbt Cloud IDE by selecting the metric you want to validate and viewing it in the **Lineage** tab.
+You can view your metrics in the dbt Cloud IDE through the **Lineage** tab.
 
 The dbt Cloud IDE **Status button** (located in the bottom right of the editor) displays an **Error** status if there's an error in your metric or semantic model definition. You can click the button to see the specific issue and resolve it.
 
-Once validated, make sure you commit and merge your changes in your project.
+Once you've reviewed your metrics, make sure you commit and merge your changes in your project.
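
As an illustrative example of the dbt Cloud CLI workflow this snippet points to, a minimal sketch is shown below. The metric name `order_count` and the `metric_time` dimension are assumptions based on the definitions added earlier in the guide; substitute your own metric and dimension names as needed.

```bash
# List the metrics MetricFlow can see in your project
dbt sl list metrics

# Preview a metric grouped by its time dimension, limited to a few rows
dbt sl query --metrics order_count --group-by metric_time --limit 10
```

If these commands return results, the semantic models and metrics defined in the guide are queryable, and you can commit and merge your changes as described above.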