Commit fbac018

Merge branch 'current' into dbeatty10-patch-1

mirnawong1 authored Sep 6, 2024
2 parents e945d4a + ae3d012 commit fbac018
Showing 10 changed files with 140 additions and 69 deletions.
@@ -20,7 +20,7 @@ import AvailIntegrations from '/snippets/_sl-partner-links.md';
### Custom integration

- [Exports](/docs/use-dbt-semantic-layer/exports) enable custom integration with additional tools that don't natively connect with the dbt Semantic Layer, such as PowerBI.
- Develop custom integrations using different languages and tools, supported through JDBC, ADBC, and GraphQL APIs. For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/).
- [Consume metrics](/docs/use-dbt-semantic-layer/consume-metrics) and develop custom integrations using different languages and tools, supported through the [JDBC](/docs/dbt-cloud-apis/sl-jdbc), ADBC, and [GraphQL](/docs/dbt-cloud-apis/sl-graphql) APIs, as well as the [Python SDK library](/docs/dbt-cloud-apis/sl-python). For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/).
- Connect to any tool that supports SQL queries. These tools must meet one of two criteria:
  - Offers a generic JDBC driver option (such as DataGrip) or
  - Is compatible with the Arrow Flight SQL JDBC driver version 12.0.0 or higher.
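As a concrete illustration, an Arrow Flight SQL JDBC connection string typically encodes the host, environment ID, and service token as URL parameters. The sketch below assembles such a URL in Python; the host and parameter names are assumptions for illustration, so confirm the exact values against your dbt Cloud project's Semantic Layer settings.

```python
def sl_jdbc_url(environment_id: int, token: str,
                host: str = "semantic-layer.cloud.getdbt.com") -> str:
    """Assemble an Arrow Flight SQL JDBC URL for the dbt Semantic Layer.

    The host and query-parameter names here are illustrative placeholders;
    check your dbt Cloud account for the values your region uses.
    """
    return (
        f"jdbc:arrow-flight-sql://{host}:443"
        f"?useEncryption=true&environmentId={environment_id}&token={token}"
    )

# Example: build a URL for a hypothetical environment and service token
print(sl_jdbc_url(123456, "dbtc_<service-token>"))
```

You would paste the resulting URL into any tool that accepts a generic JDBC connection string, such as DataGrip.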
1 change: 1 addition & 0 deletions website/docs/docs/dbt-versions/release-notes.md
@@ -25,6 +25,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
- **Behavior change:** GitHub is no longer supported for OAuth login to dbt Cloud. Use a supported [SSO or OAuth provider](/docs/cloud/manage-access/sso-overview) to securely manage access to your dbt Cloud account.

## July 2024
- **Behavior change:** `target_schema` is no longer a required configuration for [snapshots](/docs/build/snapshots). You can now target different schemas for snapshots across development and deployment environments using the [schema config](/reference/resource-configs/schema).
- **New:** [Connections](/docs/cloud/connect-data-platform/about-connections#connection-management) are now available under **Account settings** as a global setting. Previously, they were found under **Project settings**. This is being rolled out in phases over the coming weeks.
- **New:** Admins can now assign [environment-level permissions](/docs/cloud/manage-access/environment-permissions) to groups for specific roles.
- **New:** [Merge jobs](/docs/deploy/merge-jobs) for implementing [continuous deployment (CD)](/docs/deploy/continuous-deployment) workflows are now GA in dbt Cloud. Previously, you had to either set up a custom GitHub action or manually build the changes every time a pull request is merged.
38 changes: 38 additions & 0 deletions website/docs/docs/use-dbt-semantic-layer/consume-metrics.md
@@ -0,0 +1,38 @@
---
title: "Consume metrics from your Semantic Layer"
description: "Learn how to query and consume metrics from your deployed dbt Semantic Layer using various tools and APIs."
sidebar_label: "Consume your metrics"
tags: [Semantic Layer]
pagination_next: "docs/use-dbt-semantic-layer/sl-faqs"
---

After [deploying](/docs/use-dbt-semantic-layer/deploy-sl) your dbt Semantic Layer, the next important (and fun!) step is querying and consuming the metrics you’ve defined. This page links to key resources that guide you through consuming metrics across different integrations, APIs, and tools, using various [query syntaxes](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).

Once your Semantic Layer is deployed, you can start querying your metrics using a variety of tools and APIs. Here are the main resources to get you started:

### Available integrations

Integrate the dbt Semantic Layer with a variety of business intelligence (BI) tools and data platforms, enabling seamless metric queries within your existing workflows. Explore the following integrations:

- [Available integrations](/docs/cloud-integrations/avail-sl-integrations) — Review a wide range of partners such as Tableau, Google Sheets, Microsoft Excel, and more, where you can query your metrics directly from the dbt Semantic Layer.

### Query with APIs

To leverage the full power of the dbt Semantic Layer, you can use the dbt Semantic Layer APIs for querying metrics programmatically:
- [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) — Learn how to use the dbt Semantic Layer APIs to query metrics in downstream tools, ensuring consistent and reliable data metrics.
- [JDBC API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) — Dive into the syntax for querying metrics with the JDBC API, with examples and detailed instructions.
- [GraphQL API query syntax](/docs/dbt-cloud-apis/sl-graphql#querying) — Learn the syntax for querying metrics via the GraphQL API, including examples and detailed instructions.
- [Python SDK](/docs/dbt-cloud-apis/sl-python#usage-examples) — Use the Python SDK library to query metrics programmatically with Python.
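To make the JDBC query syntax concrete, the sketch below renders a Semantic Layer query string in Python using the `semantic_layer.query(...)` templating form. The exact argument spelling (for example, whether group-by entries need `Dimension(...)` wrappers) can vary, so treat the rendered string as an illustration rather than a canonical query, and check the JDBC API page linked above for the authoritative syntax.

```python
from typing import Iterable, Optional

def jdbc_metric_query(metrics: Iterable[str],
                      group_by: Optional[Iterable[str]] = None,
                      limit: Optional[int] = None) -> str:
    """Render a dbt Semantic Layer JDBC query string (illustrative sketch)."""
    args = ["metrics=[{}]".format(",".join(metrics))]
    if group_by:
        args.append("group_by=[{}]".format(",".join(group_by)))
    if limit is not None:
        args.append(f"limit={limit}")
    # Doubled braces render the literal {{ ... }} wrapper the JDBC API expects
    return "select * from {{{{ semantic_layer.query({}) }}}}".format(",".join(args))

# Example with placeholder metric and dimension names
print(jdbc_metric_query(["order_total"], group_by=["metric_time"], limit=10))
```

The same metric and group-by parameters map directly onto the GraphQL API's query fields and the Python SDK's query method, so a helper like this is mostly useful for JDBC-based tools.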

### Query during development

For developers working within the dbt ecosystem, it’s essential to understand how to query metrics during the development phase using MetricFlow commands:
- [MetricFlow commands](/docs/build/metricflow-commands) — Learn how to use MetricFlow commands to query metrics directly during the development process, ensuring your metrics are correctly defined and working as expected.
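For example, a development-time session with the MetricFlow CLI might look like the following sketch; the metric and dimension names are placeholders from an imagined project:

```
# List the metrics MetricFlow knows about in this project
mf list metrics

# Sanity-check your semantic model and metric configs
mf validate-configs

# Query a metric grouped by time
mf query --metrics order_total --group-by metric_time
```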

## Next steps

After understanding the basics of querying metrics, consider optimizing your setup and ensuring the integrity of your metric definitions:

- [Optimize querying performance](/docs/use-dbt-semantic-layer/sl-cache) — Improve query speed and efficiency by using declarative caching techniques.
- [Validate semantic nodes in CI](/docs/deploy/ci-jobs#semantic-validations-in-ci) — Ensure that any changes to dbt models don’t break your metrics by validating semantic nodes in Continuous Integration (CI) jobs.
- [Build your metrics and semantic models](/docs/build/build-metrics-intro) — If you haven’t already, learn how to define and build your metrics and semantic models using your preferred development tool.
67 changes: 21 additions & 46 deletions website/docs/docs/use-dbt-semantic-layer/dbt-sl.md
@@ -4,7 +4,7 @@ id: dbt-sl
description: "Learn how the dbt Semantic Layer enables data teams to centrally define and query metrics."
sidebar_label: "About the dbt Semantic Layer"
tags: [Semantic Layer]
hide_table_of_contents: true
hide_table_of_contents: false
pagination_next: "guides/sl-snowflake-qs"
pagination_prev: null
---
@@ -15,7 +15,8 @@ Moving metric definitions out of the BI layer and into the modeling layer allows

Refer to the [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) or [Why we need a universal semantic layer](https://www.getdbt.com/blog/universal-semantic-layer/) blog post to learn more.

## Explore the dbt Semantic Layer
## Get started with the dbt Semantic Layer

<!-- this partial lives here: https://github.com/dbt-labs/docs.getdbt.com/website/snippets/_sl-plan-info. Use it on diff pages and to tailor the message depending which instance can access the SL and what product lifecycle we're in. -->

import Features from '/snippets/_sl-plan-info.md'
@@ -25,54 +25,28 @@ product="dbt Semantic Layer"
plan="dbt Cloud Team or Enterprise"
/>

<div className="grid--3-col">

<Card
title="Quickstart with the dbt Cloud Semantic Layer"
body="Build and define metrics, set up the dbt Semantic Layer, and query them using our first-class integrations."
link="/guides/sl-snowflake-qs"
icon="dbt-bit"/>

<Card
title="Set up the dbt Semantic Layer"
body="Set up the dbt Semantic Layer in dbt Cloud using intuitive navigation."
link="/docs/use-dbt-semantic-layer/setup-sl"
icon="dbt-bit"/>
This page points to various resources available to help you understand, configure, deploy, and integrate the dbt Semantic Layer. The following sections contain links to specific pages that explain each aspect in detail. Use these links to navigate directly to the information you need, whether you're setting up the Semantic Layer for the first time, deploying metrics, or integrating with downstream tools.

<Card
title="Architecture"
body="Learn about the powerful components that make up the dbt Semantic Layer."
link="/docs/use-dbt-semantic-layer/sl-architecture"
icon="dbt-bit"/>

<Card
title="Write queries with exports"
body="Use exports to write commonly used queries directly within your data platform, on a schedule."
link="/docs/use-dbt-semantic-layer/exports"
icon="dbt-bit"/>
Refer to the following resources to get started with the dbt Semantic Layer:
- [Quickstart with the dbt Cloud Semantic Layer](/guides/sl-snowflake-qs) &mdash; Build and define metrics, set up the dbt Semantic Layer, and query them using our first-class integrations.
- [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) &mdash; Discover answers to frequently asked questions about the dbt Semantic Layer, such as availability, integrations, and more.

<Card
title="Cache common queries"
body="Leverage result caching and declarative caching for common queries to speed up performance and reduce query computation."
link="/docs/use-dbt-semantic-layer/sl-cache"
icon="dbt-bit"/>
## Configure the dbt Semantic Layer

<Card
title="dbt Semantic Layer FAQs"
body="Discover answers to frequently asked questions about the dbt Semantic Layer, such as availability, integrations, and more."
link="/docs/use-dbt-semantic-layer/sl-faqs"
icon="dbt-bit"/>
The following resources provide information on how to configure the dbt Semantic Layer:
- [Set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) &mdash; Learn how to set up the dbt Semantic Layer in dbt Cloud using intuitive navigation.
- [Architecture](/docs/use-dbt-semantic-layer/sl-architecture) &mdash; Explore the powerful components that make up the dbt Semantic Layer.

<Card
title="Available integrations"
body="Review a wide range of partners you can integrate and query with the dbt Semantic Layer."
link="/docs/cloud-integrations/avail-sl-integrations"
icon="dbt-bit"/>
## Deploy metrics
This section provides information on how to deploy the dbt Semantic Layer and materialize your metrics:
- [Deploy your Semantic Layer](/docs/use-dbt-semantic-layer/deploy-sl) &mdash; Run a dbt Cloud job to deploy the dbt Semantic Layer and materialize your metrics.
- [Write queries with exports](/docs/use-dbt-semantic-layer/exports) &mdash; Use exports to write commonly used queries directly within your data platform, on a schedule.
- [Cache common queries](/docs/use-dbt-semantic-layer/sl-cache) &mdash; Leverage result caching and declarative caching for common queries to speed up performance and reduce query computation.

<Card
title="dbt Semantic Layer APIs"
body="Use the dbt Semantic Layer APIs to query metrics in downstream tools for consistent, reliable data metrics."
link="/docs/dbt-cloud-apis/sl-api-overview"
icon="dbt-bit"/>
## Consume metrics and integrate
Consume metrics and integrate the dbt Semantic Layer with downstream tools and applications:
- [Consume metrics](/docs/use-dbt-semantic-layer/consume-metrics) &mdash; Query and consume metrics in downstream tools and applications using the dbt Semantic Layer.
- [Available integrations](/docs/cloud-integrations/avail-sl-integrations) &mdash; Review a wide range of partners you can integrate and query with the dbt Semantic Layer.
- [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) &mdash; Use the dbt Semantic Layer APIs to query metrics in downstream tools for consistent, reliable data metrics.

</div>
29 changes: 29 additions & 0 deletions website/docs/docs/use-dbt-semantic-layer/deploy-sl.md
@@ -0,0 +1,29 @@
---
title: "Deploy your metrics"
id: deploy-sl
description: "Deploy the dbt Semantic Layer in dbt Cloud by running a job to materialize your metrics."
sidebar_label: "Deploy your metrics"
tags: [Semantic Layer]
pagination_next: "docs/use-dbt-semantic-layer/exports"
---

<!-- The below snippet can be found in the following file locations in the docs code repository)
https://github.com/dbt-labs/docs.getdbt.com/blob/current/website/snippets/_sl-run-prod-job.md
-->

import RunProdJob from '/snippets/_sl-run-prod-job.md';

<RunProdJob/>

## Next steps
After you've executed a job and deployed your Semantic Layer:
- [Set up your Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) in dbt Cloud.
- Discover the [available integrations](/docs/cloud-integrations/avail-sl-integrations), such as Tableau, Google Sheets, Microsoft Excel, and more.
- Start querying your metrics with the [API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).


## Related docs
- [Optimize querying performance](/docs/use-dbt-semantic-layer/sl-cache) using declarative caching.
- [Validate semantic nodes in CI](/docs/deploy/ci-jobs#semantic-validations-in-ci) to ensure code changes made to dbt models don't break these metrics.
- If you haven't already, learn how to [build your metrics and semantic models](/docs/build/build-metrics-intro) in your development tool of choice.
2 changes: 1 addition & 1 deletion website/docs/docs/use-dbt-semantic-layer/setup-sl.md
@@ -2,7 +2,7 @@
title: "Set up the dbt Semantic Layer"
id: setup-sl
description: "Seamlessly set up the dbt Semantic Layer in dbt Cloud using intuitive navigation."
sidebar_label: "Set up your Semantic Layer"
sidebar_label: "Set up the Semantic Layer"
tags: [Semantic Layer]
---

3 changes: 2 additions & 1 deletion website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
@@ -4,7 +4,8 @@ id: sl-architecture
description: "dbt Semantic Layer product architecture and related questions."
sidebar_label: "Semantic Layer architecture"
tags: [Semantic Layer]
pagination_next: null
pagination_next: "docs/use-dbt-semantic-layer/setup-sl"
pagination_prev: "guides/sl-snowflake-qs"
---

The dbt Semantic Layer allows you to define metrics and use various interfaces to query them. The Semantic Layer does the heavy lifting to find where the queried data exists in your data platform and generates the SQL to make the request (including performing joins).
9 changes: 0 additions & 9 deletions website/docs/guides/sl-snowflake-qs.md
@@ -955,15 +955,6 @@ https://github.com/dbt-labs/docs.getdbt.com/blob/current/website/snippets/_sl-ru
<RunProdJob/>
<details>
<summary>What’s happening internally?</summary>
- Merging the code into your main branch allows dbt Cloud to pull those changes and build the definition in the manifest produced by the run. <br />
- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date.<br />
- The Semantic Layer APIs pull in the most recent manifest and enable your integration to extract metadata from it.
</details>
## Set up dbt Semantic Layer
31 changes: 27 additions & 4 deletions website/sidebars.js
@@ -586,10 +586,33 @@ const sidebarSettings = {
label: "Quickstart with the dbt Cloud Semantic Layer",
href: `/guides/sl-snowflake-qs`,
},
"docs/use-dbt-semantic-layer/setup-sl",
"docs/use-dbt-semantic-layer/sl-architecture",
"docs/use-dbt-semantic-layer/exports",
"docs/use-dbt-semantic-layer/sl-cache",
{
type: "category",
label: "Configure",
link: { type: "doc", id: "docs/use-dbt-semantic-layer/setup-sl" },
items: [
"docs/use-dbt-semantic-layer/setup-sl",
"docs/use-dbt-semantic-layer/sl-architecture",
]
},
{
type: "category",
label: "Deploy metrics",
link: { type: "doc", id: "docs/use-dbt-semantic-layer/deploy-sl" },
items: [
"docs/use-dbt-semantic-layer/deploy-sl",
"docs/use-dbt-semantic-layer/exports",
"docs/use-dbt-semantic-layer/sl-cache"
]
},
{
type: "category",
label: "Consume",
link: { type: "doc", id: "docs/use-dbt-semantic-layer/consume-metrics" },
items: [
"docs/use-dbt-semantic-layer/consume-metrics",
]
},
"docs/use-dbt-semantic-layer/sl-faqs",
],
},
27 changes: 20 additions & 7 deletions website/snippets/_sl-run-prod-job.md
@@ -1,9 +1,22 @@
Once you’ve committed and merged your metric changes in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer currently.
This section explains how you can perform a job run in your deployment environment in dbt Cloud to materialize and deploy your metrics. Currently, only the deployment environment is supported.

1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher.
1. Once you’ve [defined your semantic models and metrics](/guides/sl-snowflake-qs?step=10), commit and merge your metric changes in your dbt project.
2. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher.
* Note &mdash; Only the deployment environment is currently supported (_development experience coming soon_).
2. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**.
3. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment.
4. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**.
5. Set the job to run a `dbt build` and select the **Generate docs on run** checkbox.
6. Run the job and make sure it runs successfully.
3. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**.
4. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment.
5. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**.
6. Set the job to run `dbt parse` to parse your project and generate a [`semantic_manifest.json` artifact](/docs/dbt-cloud-apis/sl-manifest) file. Although running `dbt build` isn't required, you can choose to do so if needed.
7. Run the job by clicking the **Run now** button. Monitor the job's progress in real-time through the **Run summary** tab.

Once the job completes successfully, your dbt project, including the generated documentation, will be fully deployed and available for use in your production environment. If any issues arise, review the logs to diagnose and address any errors.
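Put together, the deploy job's command list can be as small as a single step; the sketch below shows the minimal version, with `dbt build` left in as an optional extra if you also want the underlying models materialized in the same run:

```
# Deploy job commands (sketch): parse generates semantic_manifest.json
dbt parse

# Optional: also materialize the models your metrics depend on
dbt build
```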

<details>

<summary>What’s happening internally?</summary>

- Merging the code into your main branch allows dbt Cloud to pull those changes and build the definition in the manifest produced by the run. <br />
- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date.<br />
- The Semantic Layer APIs pull in the most recent manifest and enable your integration to extract metadata from it.

</details>
