diff --git a/website/docs/docs/cloud-integrations/avail-sl-integrations.md b/website/docs/docs/cloud-integrations/avail-sl-integrations.md
index eea93c92b93..04d9d55acb4 100644
--- a/website/docs/docs/cloud-integrations/avail-sl-integrations.md
+++ b/website/docs/docs/cloud-integrations/avail-sl-integrations.md
@@ -20,7 +20,7 @@ import AvailIntegrations from '/snippets/_sl-partner-links.md';
### Custom integration
- [Exports](/docs/use-dbt-semantic-layer/exports) enable custom integration with additional tools that don't natively connect with the dbt Semantic Layer, such as PowerBI.
-- Develop custom integrations using different languages and tools, supported through JDBC, ADBC, and GraphQL APIs. For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/).
+- [Consume metrics](/docs/use-dbt-semantic-layer/consume-metrics) and develop custom integrations using different languages and tools, supported through the [JDBC](/docs/dbt-cloud-apis/sl-jdbc), ADBC, and [GraphQL](/docs/dbt-cloud-apis/sl-graphql) APIs, as well as the [Python SDK library](/docs/dbt-cloud-apis/sl-python). For more info, check out [our examples on GitHub](https://github.com/dbt-labs/example-semantic-layer-clients/).
- Connect to any tool that supports SQL queries. These tools must meet one of the two criteria:
- Offers a generic JDBC driver option (such as DataGrip) or
- Is compatible with the Arrow Flight SQL JDBC driver version 12.0.0 or higher.
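As a rough sketch of what a custom JDBC integration sends, the Semantic Layer's JDBC interface accepts queries written with the `semantic_layer.query()` syntax described in the [JDBC API docs](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata). The helper below is hypothetical (not part of any SDK), and the metric and dimension names are placeholders:

```python
def build_sl_jdbc_query(metrics, group_by=None, limit=None):
    """Assemble a dbt Semantic Layer JDBC query string.

    Hypothetical helper for illustration only; verify the exact
    semantic_layer.query() syntax against the JDBC API docs.
    """
    def quoted(names):
        return ",".join(f"'{n}'" for n in names)

    args = [f"metrics=[{quoted(metrics)}]"]
    if group_by:
        args.append(f"group_by=[{quoted(group_by)}]")
    if limit is not None:
        args.append(f"limit={limit}")
    return "select * from {{ semantic_layer.query(" + ", ".join(args) + ") }}"

# Placeholder metric/dimension names:
print(build_sl_jdbc_query(["food_order_amount"], group_by=["metric_time__month"]))
```

A tool that can send arbitrary SQL over the Arrow Flight SQL JDBC driver would submit the resulting string as-is.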
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md
index eb36037cab4..2530c49c1b8 100644
--- a/website/docs/docs/dbt-versions/release-notes.md
+++ b/website/docs/docs/dbt-versions/release-notes.md
@@ -25,6 +25,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
- **Behavior change:** GitHub is no longer supported for OAuth login to dbt Cloud. Use a supported [SSO or OAuth provider](/docs/cloud/manage-access/sso-overview) to securely manage access to your dbt Cloud account.
## July 2024
+- **Behavior change:** `target_schema` is no longer a required configuration for [snapshots](/docs/build/snapshots). You can now target different schemas for snapshots across development and deployment environments using the [schema config](/reference/resource-configs/schema).
- **New:** [Connections](/docs/cloud/connect-data-platform/about-connections#connection-management) are now available under **Account settings** as a global setting. Previously, they were found under **Project settings**. This is being rolled out in phases over the coming weeks.
- **New:** Admins can now assign [environment-level permissions](/docs/cloud/manage-access/environment-permissions) to groups for specific roles.
- **New:** [Merge jobs](/docs/deploy/merge-jobs) for implementing [continuous deployment (CD)](/docs/deploy/continuous-deployment) workflows are now GA in dbt Cloud. Previously, you had to either set up a custom GitHub action or manually build the changes every time a pull request is merged.
diff --git a/website/docs/docs/use-dbt-semantic-layer/consume-metrics.md b/website/docs/docs/use-dbt-semantic-layer/consume-metrics.md
new file mode 100644
index 00000000000..c55b4bcb632
--- /dev/null
+++ b/website/docs/docs/use-dbt-semantic-layer/consume-metrics.md
@@ -0,0 +1,38 @@
+---
+title: "Consume metrics from your Semantic Layer"
+description: "Learn how to query and consume metrics from your deployed dbt Semantic Layer using various tools and APIs."
+sidebar_label: "Consume your metrics"
+tags: [Semantic Layer]
+pagination_next: "docs/use-dbt-semantic-layer/sl-faqs"
+---
+
+After [deploying](/docs/use-dbt-semantic-layer/deploy-sl) your dbt Semantic Layer, the next important (and fun!) step is querying and consuming the metrics you’ve defined. This page links to key resources that guide you through consuming metrics across different integrations, APIs, and tools, using various [query syntaxes](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).
+
+Once your Semantic Layer is deployed, you can start querying your metrics using a variety of tools and APIs. Here are the main resources to get you started:
+
+### Available integrations
+
+Integrate the dbt Semantic Layer with a variety of business intelligence (BI) tools and data platforms, enabling seamless metric queries within your existing workflows. Explore the following integrations:
+
+- [Available integrations](/docs/cloud-integrations/avail-sl-integrations) — Review a wide range of partners such as Tableau, Google Sheets, Microsoft Excel, and more, where you can query your metrics directly from the dbt Semantic Layer.
+
+### Query with APIs
+
+To leverage the full power of the dbt Semantic Layer, you can use the dbt Semantic Layer APIs for querying metrics programmatically:
+- [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) — Learn how to use the dbt Semantic Layer APIs to query metrics in downstream tools, ensuring consistent and reliable data metrics.
+ - [JDBC API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) — Dive into the syntax for querying metrics with the JDBC API, with examples and detailed instructions.
+ - [GraphQL API query syntax](/docs/dbt-cloud-apis/sl-graphql#querying) — Learn the syntax for querying metrics via the GraphQL API, including examples and detailed instructions.
+ - [Python SDK](/docs/dbt-cloud-apis/sl-python#usage-examples) — Use the Python SDK library to query metrics programmatically with Python.
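As a rough illustration of the GraphQL path, a client posts a mutation such as `createQuery` with an environment ID plus metric and group-by names, then polls for results. The sketch below only assembles the request payload; the mutation shape and field names are assumptions based on the GraphQL API docs, and the environment ID and metric names are placeholders:

```python
import json

def create_query_payload(environment_id, metrics, group_by=()):
    """Build a GraphQL request body for the Semantic Layer createQuery mutation.

    Sketch only: verify the mutation and input field names against the
    GraphQL API reference before using this against a real endpoint.
    """
    mutation = """
    mutation CreateQuery($environmentId: BigInt!, $metrics: [MetricInput!]!, $groupBy: [GroupByInput!]) {
      createQuery(environmentId: $environmentId, metrics: $metrics, groupBy: $groupBy) {
        queryId
      }
    }
    """
    variables = {
        "environmentId": environment_id,
        "metrics": [{"name": m} for m in metrics],
        "groupBy": [{"name": g} for g in group_by],
    }
    return json.dumps({"query": mutation, "variables": variables})

# Placeholder environment ID and names:
payload = create_query_payload(123, ["food_order_amount"], ["metric_time__month"])
```

The returned `queryId` would then be used in a follow-up `query` call to fetch results once they're ready.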
+
+### Query during development
+
+For developers working within the dbt ecosystem, it’s essential to understand how to query metrics during the development phase using MetricFlow commands:
+- [MetricFlow commands](/docs/build/metricflow-commands) — Learn how to use MetricFlow commands to query metrics directly during the development process, ensuring your metrics are correctly defined and working as expected.
+
+## Next steps
+
+After understanding the basics of querying metrics, consider optimizing your setup and ensuring the integrity of your metric definitions:
+
+- [Optimize querying performance](/docs/use-dbt-semantic-layer/sl-cache) — Improve query speed and efficiency by using declarative caching techniques.
+- [Validate semantic nodes in CI](/docs/deploy/ci-jobs#semantic-validations-in-ci) — Ensure that any changes to dbt models don’t break your metrics by validating semantic nodes in Continuous Integration (CI) jobs.
+- [Build your metrics and semantic models](/docs/build/build-metrics-intro) — If you haven’t already, learn how to define and build your metrics and semantic models using your preferred development tool.
diff --git a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md
index 73e39589587..e09a68b97c4 100644
--- a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md
+++ b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md
@@ -4,7 +4,7 @@ id: dbt-sl
description: "Learn how the dbt Semantic Layer enables data teams to centrally define and query metrics."
sidebar_label: "About the dbt Semantic Layer"
tags: [Semantic Layer]
-hide_table_of_contents: true
+hide_table_of_contents: false
pagination_next: "guides/sl-snowflake-qs"
pagination_prev: null
---
@@ -15,7 +15,8 @@ Moving metric definitions out of the BI layer and into the modeling layer allows
Refer to the [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) or [Why we need a universal semantic layer](https://www.getdbt.com/blog/universal-semantic-layer/) blog post to learn more.
-## Explore the dbt Semantic Layer
+## Get started with the dbt Semantic Layer
+
import Features from '/snippets/_sl-plan-info.md'
@@ -25,54 +26,28 @@ product="dbt Semantic Layer"
plan="dbt Cloud Team or Enterprise"
/>
-
-
-
-
-
+This page points to resources that help you understand, configure, deploy, and integrate the dbt Semantic Layer. Use the links in the following sections to navigate directly to the information you need, whether you're setting up the Semantic Layer for the first time, deploying metrics, or integrating with downstream tools.
-
-
+Refer to the following resources to get started with the dbt Semantic Layer:
+- [Quickstart with the dbt Cloud Semantic Layer](/guides/sl-snowflake-qs) — Build and define metrics, set up the dbt Semantic Layer, and query them using our first-class integrations.
+- [dbt Semantic Layer FAQs](/docs/use-dbt-semantic-layer/sl-faqs) — Discover answers to frequently asked questions about the dbt Semantic Layer, such as availability, integrations, and more.
-
+## Configure the dbt Semantic Layer
-
+The following resources provide information on how to configure the dbt Semantic Layer:
+- [Set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) — Learn how to set up the dbt Semantic Layer in dbt Cloud using intuitive navigation.
+- [Architecture](/docs/use-dbt-semantic-layer/sl-architecture) — Explore the powerful components that make up the dbt Semantic Layer.
-
+## Deploy metrics
+This section provides information on how to deploy the dbt Semantic Layer and materialize your metrics:
+- [Deploy your Semantic Layer](/docs/use-dbt-semantic-layer/deploy-sl) — Run a dbt Cloud job to deploy the dbt Semantic Layer and materialize your metrics.
+- [Write queries with exports](/docs/use-dbt-semantic-layer/exports) — Use exports to write commonly used queries directly within your data platform, on a schedule.
+- [Cache common queries](/docs/use-dbt-semantic-layer/sl-cache) — Leverage result caching and declarative caching for common queries to speed up performance and reduce query computation.
-
+## Consume metrics and integrate
+Consume metrics and integrate the dbt Semantic Layer with downstream tools and applications:
+- [Consume metrics](/docs/use-dbt-semantic-layer/consume-metrics) — Query and consume metrics in downstream tools and applications using the dbt Semantic Layer.
+- [Available integrations](/docs/cloud-integrations/avail-sl-integrations) — Review a wide range of partners you can integrate and query with the dbt Semantic Layer.
+- [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) — Use the dbt Semantic Layer APIs to query metrics in downstream tools for consistent, reliable data metrics.
-
diff --git a/website/docs/docs/use-dbt-semantic-layer/deploy-sl.md b/website/docs/docs/use-dbt-semantic-layer/deploy-sl.md
new file mode 100644
index 00000000000..637fa41a3c3
--- /dev/null
+++ b/website/docs/docs/use-dbt-semantic-layer/deploy-sl.md
@@ -0,0 +1,29 @@
+---
+title: "Deploy your metrics"
+id: deploy-sl
+description: "Deploy the dbt Semantic Layer in dbt Cloud by running a job to materialize your metrics."
+sidebar_label: "Deploy your metrics"
+tags: [Semantic Layer]
+pagination_next: "docs/use-dbt-semantic-layer/exports"
+---
+
+
+
+import RunProdJob from '/snippets/_sl-run-prod-job.md';
+
+
+
+## Next steps
+After you've executed a job and deployed your Semantic Layer:
+- [Set up your Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) in dbt Cloud.
+- Discover the [available integrations](/docs/cloud-integrations/avail-sl-integrations), such as Tableau, Google Sheets, Microsoft Excel, and more.
+- Start querying your metrics with the [API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).
+
+
+## Related docs
+- [Optimize querying performance](/docs/use-dbt-semantic-layer/sl-cache) using declarative caching.
+- [Validate semantic nodes in CI](/docs/deploy/ci-jobs#semantic-validations-in-ci) to ensure code changes made to dbt models don't break these metrics.
+- If you haven't already, learn how to [build your metrics and semantic models](/docs/build/build-metrics-intro) in your development tool of choice.
diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md
index adad5bd9fd1..4f1519280c4 100644
--- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md
+++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md
@@ -2,7 +2,7 @@
title: "Set up the dbt Semantic Layer"
id: setup-sl
description: "Seamlessly set up the dbt Semantic Layer in dbt Cloud using intuitive navigation."
-sidebar_label: "Set up your Semantic Layer"
+sidebar_label: "Set up the Semantic Layer"
tags: [Semantic Layer]
---
diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
index 2062f9e405e..ef85a92a338 100644
--- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
+++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md
@@ -4,7 +4,8 @@ id: sl-architecture
description: "dbt Semantic Layer product architecture and related questions."
sidebar_label: "Semantic Layer architecture"
tags: [Semantic Layer]
-pagination_next: null
+pagination_next: "docs/use-dbt-semantic-layer/setup-sl"
+pagination_prev: "guides/sl-snowflake-qs"
---
The dbt Semantic Layer allows you to define metrics and use various interfaces to query them. The Semantic Layer does the heavy lifting to find where the queried data exists in your data platform and generates the SQL to make the request (including performing joins).
diff --git a/website/docs/guides/sl-snowflake-qs.md b/website/docs/guides/sl-snowflake-qs.md
index 7d42aecabc2..fb72ee0057e 100644
--- a/website/docs/guides/sl-snowflake-qs.md
+++ b/website/docs/guides/sl-snowflake-qs.md
@@ -955,15 +955,6 @@ https://github.com/dbt-labs/docs.getdbt.com/blob/current/website/snippets/_sl-ru
-
-
-What’s happening internally?
-
-- Merging the code into your main branch allows dbt Cloud to pull those changes and build the definition in the manifest produced by the run.
-- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date.
-- The Semantic Layer APIs pull in the most recent manifest and enables your integration to extract metadata from it.
-
-
## Set up dbt Semantic Layer
diff --git a/website/sidebars.js b/website/sidebars.js
index 69ba286228d..d52be178820 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -586,10 +586,33 @@ const sidebarSettings = {
label: "Quickstart with the dbt Cloud Semantic Layer",
href: `/guides/sl-snowflake-qs`,
},
- "docs/use-dbt-semantic-layer/setup-sl",
- "docs/use-dbt-semantic-layer/sl-architecture",
- "docs/use-dbt-semantic-layer/exports",
- "docs/use-dbt-semantic-layer/sl-cache",
+ {
+ type: "category",
+ label: "Configure",
+ link: { type: "doc", id: "docs/use-dbt-semantic-layer/setup-sl" },
+ items: [
+ "docs/use-dbt-semantic-layer/setup-sl",
+ "docs/use-dbt-semantic-layer/sl-architecture",
+ ]
+ },
+ {
+ type: "category",
+ label: "Deploy metrics",
+ link: { type: "doc", id: "docs/use-dbt-semantic-layer/deploy-sl" },
+ items: [
+ "docs/use-dbt-semantic-layer/deploy-sl",
+ "docs/use-dbt-semantic-layer/exports",
+ "docs/use-dbt-semantic-layer/sl-cache"
+ ]
+ },
+ {
+ type: "category",
+ label: "Consume",
+ link: { type: "doc", id: "docs/use-dbt-semantic-layer/consume-metrics" },
+ items: [
+ "docs/use-dbt-semantic-layer/consume-metrics",
+ ]
+ },
"docs/use-dbt-semantic-layer/sl-faqs",
],
},
diff --git a/website/snippets/_sl-run-prod-job.md b/website/snippets/_sl-run-prod-job.md
index 8eb4049efc8..f820b7f3f79 100644
--- a/website/snippets/_sl-run-prod-job.md
+++ b/website/snippets/_sl-run-prod-job.md
@@ -1,9 +1,22 @@
-Once you’ve committed and merged your metric changes in your dbt project, you can perform a job run in your deployment environment in dbt Cloud to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer currently.
+This section explains how you can perform a job run in your deployment environment in dbt Cloud to materialize and deploy your metrics. Currently, only the deployment environment is supported for the dbt Semantic Layer.
-1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher.
+1. Once you’ve [defined your semantic models and metrics](/guides/sl-snowflake-qs?step=10), commit and merge your metric changes in your dbt project.
+2. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher.
* Note — Deployment environment is currently supported (_development experience coming soon_)
-2. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**.
-3. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment.
-4. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**.
-5. Set the job to run a `dbt build` and select the **Generate docs on run** checkbox.
-6. Run the job and make sure it runs successfully.
+3. To create a new environment, navigate to **Deploy** in the navigation menu, select **Environments**, and then select **Create new environment**.
+4. Fill in your deployment credentials with your Snowflake username and password. You can name the schema anything you want. Click **Save** to create your new production environment.
+5. [Create a new deploy job](/docs/deploy/deploy-jobs#create-and-schedule-jobs) that runs in the environment you just created. Go back to the **Deploy** menu, select **Jobs**, select **Create job**, and click **Deploy job**.
+6. Set the job to run `dbt parse`, which parses your project and generates a [`semantic_manifest.json` artifact](/docs/dbt-cloud-apis/sl-manifest) file. Although running `dbt build` isn't required, you can choose to do so if needed.
+7. Run the job by clicking the **Run now** button. Monitor the job's progress in real-time through the **Run summary** tab.
+
+   Once the job completes successfully, your dbt project will be fully deployed and available for use in your production environment. If any issues arise, review the logs to diagnose and address the errors.
+
+
+
+What’s happening internally?
+
+- Merging the code into your main branch allows dbt Cloud to pull those changes and build the definition in the manifest produced by the run.
+- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date.
+- The Semantic Layer APIs pull in the most recent manifest and enable your integration to extract metadata from it.
+
+
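The `dbt parse` step above writes `target/semantic_manifest.json`. As a minimal sketch of inspecting that artifact (assuming its top-level `metrics` entries each carry a `name` field; check the manifest docs for the exact schema), you can list the deployed metric names like this:

```python
import json
from pathlib import Path

def metric_names(manifest: dict) -> list:
    """Return metric names from a parsed semantic_manifest.json payload."""
    return [m["name"] for m in manifest.get("metrics", [])]

# Load the artifact produced by `dbt parse` (path relative to the project root):
# manifest = json.loads(Path("target/semantic_manifest.json").read_text())
# print(metric_names(manifest))
```

This kind of check is a quick way to confirm that a deploy job picked up newly merged metric definitions.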