diff --git a/website/docs/docs/collaborate/data-tile.md b/website/docs/docs/collaborate/data-tile.md
index f40f21ebe18..446922acb92 100644
--- a/website/docs/docs/collaborate/data-tile.md
+++ b/website/docs/docs/collaborate/data-tile.md
@@ -9,9 +9,11 @@ image: /img/docs/collaborate/dbt-explorer/data-tile-pass.jpg
# Embed data health tile in dashboards
With data health tiles, stakeholders will get an at-a-glance confirmation on whether the data they’re looking at is stale or degraded. This trust signal allows teams to immediately go back into Explorer to see more details and investigate issues.
+
:::info Available in beta
Data health tile is currently available in open beta.
:::
+
The data health tile:
- Distills trust signals for data consumers.
@@ -19,6 +21,8 @@ The data health tile:
- Provides richer information and makes it easier to debug.
- Revamps the existing, [job-based tiles](#job-based-data-health).
+Data health tiles rely on [exposures](/docs/build/exposures) to surface trust signals in your dashboards. When you configure exposures in your dbt project, you explicitly define how specific outputs, like dashboards or reports, depend on your data models.
+
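+For reference, a minimal exposure definition in a `.yml` file looks like the following sketch (the name, URL, owner, and model references here are illustrative placeholders):
+
+```yaml
+exposures:
+  - name: weekly_revenue_dashboard   # illustrative name
+    type: dashboard
+    maturity: high
+    url: https://bi.example.com/dashboards/42   # placeholder URL
+    description: Revenue dashboard consumed by the finance team.
+    owner:
+      name: Data Team
+      email: data@example.com
+    depends_on:
+      - ref('fct_orders')
+      - ref('dim_customers')
+```
+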
## Prerequisites
@@ -34,43 +38,45 @@ First, be sure to enable [source freshness](/docs/deploy/source-freshness) in
1. Navigate to dbt Explorer by clicking on the **Explore** link in the navigation.
2. In the main **Overview** page, go to the left navigation.
-3. Under the **Resources** tab, click on **Exposures** to view the exposures list.
+3. Under the **Resources** tab, click on **Exposures** to view the [exposures](/docs/build/exposures) list.
4. Select a dashboard exposure and go to the **General** tab to view the data health information.
-5. In this tab, you’ll see:
- - Data health status: Data freshness passed, Data quality passed, Data may be stale, Data quality degraded
- - Name of the exposure.
+5. In this tab, you’ll see:
+ - Name of the exposure.
+ - Data health status: Data freshness passed, Data quality passed, Data may be stale, Data quality degraded.
- Resource type (model, source, and so on).
- Dashboard status: Failure, Pass, Stale.
- You can also see the last check completed, the last check time, and the last check duration.
-6. You can also click the **Open Dashboard** button on the upper right to immediately view this in your analytics tool.
+6. You can click the **Open Dashboard** button on the upper right to immediately view this in your analytics tool.
## Embed in your dashboard
-Once you’ve navigated to the auto-exposure in dbt Explorer, you’ll need to set up your dashboard status tile and [service token](/docs/dbt-cloud-apis/service-tokens):
+Once you’ve navigated to the auto-exposure in dbt Explorer, you’ll need to set up your data health tile and [service token](/docs/dbt-cloud-apis/service-tokens). You can embed the data health tile in any analytics tool that supports URL or iFrame embedding.
+
+Follow these steps to set up your data health tile:
1. Go to **Account settings** in dbt Cloud.
2. Select **API tokens** in the left sidebar and then **Service tokens**.
3. Click on **Create service token** and give it a name.
-4. Select the [**Metadata Only** permission](/docs/dbt-cloud-apis/service-tokens). This token will be used to embed the exposure tile in your dashboard in the later steps.
+4. Select the [**Metadata Only**](/docs/dbt-cloud-apis/service-tokens) permission. This token will be used to embed the tile in your dashboard in the later steps.
-5. Copy the **Metadata Only token** and save it in a secure location. You'll need it token in the next steps.
+5. Copy the **Metadata Only** token and save it in a secure location. You'll need this token in the next steps.
6. Navigate back to dbt Explorer and select an exposure.
7. Below the **Data health** section, expand on the toggle for instructions on how to embed the exposure tile (if you're an account admin with develop permissions).
8. In the expanded toggle, you'll see a text field where you can paste your **Metadata Only token**.
-9. Once you’ve pasted your token, you can select either **URL** or **iFrame** depending on which you need to install into your dashboard.
+9. Once you’ve pasted your token, you can select either **URL** or **iFrame** depending on which you need to add to your dashboard.
If your analytics tool supports iFrames, you can embed the dashboard tile within it.
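+
+As a generic illustration of what iFrame embedding looks like in an HTML-based dashboard (the `src` URL, dimensions, and token below are placeholders; the actual snippet comes from dbt Explorer):
+
+```html
+<!-- Generic sketch: replace src with the embed snippet from dbt Explorer -->
+<iframe
+  src="https://example.com/your-exposure-tile-url"
+  title="Data health tile"
+  width="400"
+  height="200"
+  frameborder="0">
+</iframe>
+```
+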
-### Embed data health tile in Tableau
-To embed the data health tile in Tableau, follow these steps:
+#### Tableau example
+Here’s an example with Tableau, where you can embed the iFrame in a web page object:
-1. Ensure you've copied the embed iFrame content in dbt Explorer.
-2. For the revamped environment-based exposure tile you can insert these fields into the following iFrame, and then embed them with your dashboard. This is the iFrame that is available from the **Exposure details** page in dbt Explorer.
+- Ensure you've copied the embed iFrame snippet from the dbt Explorer **Data health** section.
+- **For the revamped environment-based exposure tile** — Insert these fields into the following iFrame, and then embed it in your dashboard. This is the iFrame available from the **Exposure details** page in dbt Explorer.
``
@@ -82,7 +88,7 @@ To embed the data health tile in Tableau, follow these steps:
-3. For the job-based exposure tile you can insert these three fields into the following iFrame, and then embed them with your dashboard. The next section will have more details on the job-based exposure tile.
+- **For the job-based exposure tile** — Insert these fields into the following iFrame, and then embed it in your dashboard. The next [section](#job-based-data-health) has more details on the job-based exposure tile.
``
diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md
index e60d019bf2e..1c469409e4f 100644
--- a/website/docs/docs/collaborate/explore-projects.md
+++ b/website/docs/docs/collaborate/explore-projects.md
@@ -29,7 +29,7 @@ Navigate the dbt Explorer overview page to access your project's resources and m
- **Lineage graph** — Explore your project's or account's [lineage graph](#project-lineage) to visualize the relationships between resources.
- **Latest updates** — View the latest changes or issues related to your project's resources, including the most recent job runs, changed properties, lineage, and issues.
- **Marts and public models** — View the [marts](/best-practices/how-we-structure/1-guide-overview#guide-structure-overview) and [public models](/docs/collaborate/govern/model-access#access-modifiers) in your project.
-- **Model query history** — Use [model query history](/docs/collaborate/model-query-history) to track the history of queries on your models for deeper insights.
+- **Model query history** — Use [model query history](/docs/collaborate/model-query-history) to track consumption queries on your models for deeper insights.
- **Auto-exposures** — [Set up and view auto-exposures](/docs/collaborate/auto-exposures) to automatically expose relevant data models from Tableau to enhance visibility.
diff --git a/website/docs/docs/collaborate/model-query-history.md b/website/docs/docs/collaborate/model-query-history.md
index ee7695e3ab9..d8e08bf63da 100644
--- a/website/docs/docs/collaborate/model-query-history.md
+++ b/website/docs/docs/collaborate/model-query-history.md
@@ -7,14 +7,18 @@ image: /img/docs/collaborate/dbt-explorer/model-query-queried-models.jpg
# About model query history
-The model query history tile allows you to:
+Model query history allows you to:
-- View the query count for a model based on the data warehouse's query logs.
+- View the count of consumption queries for a model based on the data warehouse's query logs.
- Provide data teams insight so they can focus their time and infrastructure spend on the most worthwhile, widely used data products.
- Enable analysts to find the most popular models used by other people.
-:::info Available in beta
-Model query history is powered by a single query of the query log table in your data warehouse aggregated on a daily basis. It filters down to `select` statements only to gauge model consumption and excludes dbt model build and test executions.
+Model query history is powered by a single query of the query log table in your data warehouse, aggregated on a daily basis.
+
+:::info What is a consumption query?
+A consumption query is a metric that counts how many times a model was queried within a given time period. It filters down to `select` statements only to gauge model consumption, and excludes dbt model build and test executions.
+
+For example, if `model_super_santi` was queried 10 times in the past week, it counts as having 10 consumption queries for that time period.
:::
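+
+As a rough, warehouse-agnostic sketch of the kind of filtering this involves (not the exact query dbt Cloud runs; the table and column names are hypothetical and vary by warehouse):
+
+```sql
+-- Hypothetical sketch: count daily consumption queries for one model.
+-- Keeps select statements only and excludes dbt-issued queries.
+select
+    date_trunc('day', start_time) as query_date,
+    count(*) as consumption_queries
+from query_history                                  -- warehouse query log (name varies)
+where query_type = 'SELECT'
+  and query_text ilike '%model_super_santi%'        -- the model of interest
+  and query_text not ilike '%"app": "dbt"%'         -- skip dbt build/test executions
+group by 1
+order by 1
+```
+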
## Prerequisites
@@ -72,31 +76,35 @@ During beta, the dbt Labs team will manually enable query history for your dbt C
## View query history in Explorer
-To enhance your discovery, you can view your model query history in various locations within dbt Explorer. For details on how to access model query history in each location, expand the following toggles:
+To enhance your discovery, you can view your model query history in various locations within dbt Explorer:
+- [View from Performance charts](#view-from-performance-charts)
+- [View from Project lineage](#view-from-project-lineage)
+- [View from Model list](#view-from-model-list)
### View from Performance charts
1. Navigate to dbt Explorer by clicking on the **Explore** link in the navigation.
-2. In the main **Overview** page, under **Project** click **Performance** and scroll down to view the most queried models
+2. In the main **Overview** page, click on **Performance** under the **Project details** section. Scroll down to view the **Most consumed models**.
3. Use the dropdown menu on the right to select the desired time period, with options available for up to the past 3 months.
-
+
-4. In the model performance tab, open the **Usage** chart to see queries over time for that model.
-
+4. Click on a model for more details and go to the **Performance** tab.
+5. On the **Performance** tab, scroll down to the **Model performance** section.
+6. Select the **Consumption queries** tab to view the consumption queries over a given time for that model.
+
### View from Project lineage
1. To view your model in your project lineage, go to the main **Overview page** and click on **Project lineage.**
-2. In the lower left of your lineage, click on **Lenses** and select **Usage queries**.
-
+2. In the lower left of your lineage, click on **Lenses** and select **Consumption queries**.
+
-3. Your lineage should display a small red box above each model, indicating the usage query number for each model. The query number for each model represents the query history over the last 30 days.
+3. Your lineage should display a small red box above each model, indicating the consumption query number. The number for each model represents the model consumption over the last 30 days.
### View from Model list
-1. To view your model in your project lineage, go to the main **Overview page**.
+1. To view a list of models, go to the main **Overview page**.
2. In the left navigation, go to the **Resources** tab and click on **Models** to view the models list.
-3. You can view the usage query count for the models and sort by most or least queried. The query number for each model represents the query history over the last 30 days.
-
-
+3. You can view the consumption query count for the models and sort by most or least consumed. The consumption query number for each model represents the consumption over the last 30 days.
+
diff --git a/website/docs/docs/dbt-versions/release-notes.md b/website/docs/docs/dbt-versions/release-notes.md
index 1ab09b3839c..eb36037cab4 100644
--- a/website/docs/docs/dbt-versions/release-notes.md
+++ b/website/docs/docs/dbt-versions/release-notes.md
@@ -19,8 +19,10 @@ Release notes are grouped by month for both multi-tenant and virtual private clo
\* The official release date for this new format of release notes is May 15th, 2024. Historical release notes for prior dates may not reflect all available features released earlier this year or their tenancy availability.
## August 2024
-- **Behavior change:** GitHub is no longer supported for OAuth login to dbt Cloud. Use a supported [SSO or OAuth provider](/docs/cloud/manage-access/sso-overview) to securely manage access to your dbt Cloud account.
+- **New**: Configure metrics at finer time grains, such as an hour, minute, or even by the second. This is particularly useful for more detailed analysis and for datasets where high-resolution time data is required, such as minute-by-minute event tracking. Refer to [dimensions](/docs/build/dimensions) for more information about time granularity.
-- **New**: You can now configure metrics at granularities at finer time grains, such as hour, minute, or even by the second. This is particularly useful for more detailed analysis and for datasets where high-resolution time data is required, such as minute-by-minute event tracking. Refer to [dimensions](/docs/build/dimensions) for more information about time granularity.
+- **Enhancement**: Microsoft Excel now supports [saved selections](/docs/cloud-integrations/semantic-layer/excel#using-saved-selections) and [saved queries](/docs/cloud-integrations/semantic-layer/excel#using-saved-queries). Use saved selections to save your query selections within the Excel application. The application also clears stale data in [trailing rows](/docs/cloud-integrations/semantic-layer/excel#other-settings) by default. To return your results and keep any previously selected data intact, deselect the **Clear trailing rows** option.
+- **Behavior change:** GitHub is no longer supported for OAuth login to dbt Cloud. Use a supported [SSO or OAuth provider](/docs/cloud/manage-access/sso-overview) to securely manage access to your dbt Cloud account.
## July 2024
- **New:** [Connections](/docs/cloud/connect-data-platform/about-connections#connection-management) are now available under **Account settings** as a global setting. Previously, they were found under **Project settings**. This is being rolled out in phases over the coming weeks.
diff --git a/website/docs/guides/core-cloud-2.md b/website/docs/guides/core-cloud-2.md
index 93e9e92bfa4..fcc88850b55 100644
--- a/website/docs/guides/core-cloud-2.md
+++ b/website/docs/guides/core-cloud-2.md
@@ -182,6 +182,7 @@ This guide should now have given you some insight and equipped you with a framew
+
Congratulations on finishing this guide. We hope it's given you insight into the considerations you need to take to best plan your move to dbt Cloud.
For the next steps, you can continue exploring our 3-part-guide series on moving from dbt Core to dbt Cloud:
diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md
index ed0fa86f8b2..b9c64bc36f6 100644
--- a/website/snippets/_new-sl-setup.md
+++ b/website/snippets/_new-sl-setup.md
@@ -13,32 +13,41 @@ Select the environment where you want to enable the Semantic Layer:
2. On the **Settings** left sidebar, select the specific project you want to enable the Semantic Layer for.
3. In the **Project details** page, navigate to the **Semantic Layer** section. Select **Configure Semantic Layer**.
-
+
4. In the **Set Up Semantic Layer Configuration** page, select the deployment environment you want for the Semantic Layer and click **Save**. This provides administrators with the flexibility to choose the environment where the Semantic Layer will be enabled.
-:::tip dbt Cloud Enterprise can skip to [Add more credentials](#4-add-more-credentials)
-dbt Cloud Enterprise plans can add multiple credentials and have a different set up. Skip to [Add more credentials](#4-add-more-credentials) for more configuration details.
-:::
+
### 2. Add a credential and create service tokens
-The dbt Semantic Layer uses [service tokens](/docs/dbt-cloud-apis/service-tokens) for authentication which are tied to an underlying data platform credential that you configure. The credential configured is used to execute queries that the Semantic Layer issues against your data platform. This credential controls the physical access to underlying data accessed by the Semantic Layer, and all access policies set in the data platform for this credential will be respected.
+The dbt Semantic Layer uses [service tokens](/docs/dbt-cloud-apis/service-tokens) for authentication, which are tied to an underlying data platform credential that you configure. dbt Cloud uses the configured credential to execute the queries that the Semantic Layer issues against your data platform.
+
+This credential controls the physical access to underlying data accessed by the Semantic Layer, and all access policies set in the data platform for this credential will be respected.
+
+| Feature | Team plan | Enterprise plan |
+| --- | :---: | :---: |
+| Service tokens | Can create multiple service tokens linked to one credential. | Can use multiple credentials and link multiple service tokens to each credential. Note that you cannot link a single service token to more than one credential. |
+| Credentials per project | One credential per project. | Can [add multiple](#4-add-more-credentials) credentials per project. |
+| Link multiple service tokens to a single credential | ✅ | ✅ |
-dbt Cloud Enterprise plans can add multiple credentials and map those to service tokens. Refer to [Add more credentials](#4-add-more-credentials) for more information.
+*If you're on a Team plan and need to add more credentials, consider upgrading to our [Enterprise plan](https://www.getdbt.com/contact). Enterprise users can refer to [Add more credentials](#4-add-more-credentials) for detailed steps on adding multiple credentials.*
-1. In the **Set Up Semantic Layer Configuration** page, enter the credentials specific to your data platform that you want the Semantic Layer to use.
+1. After selecting the deployment environment, you should see the **Credentials & service tokens** page.
+2. Click the **Add Semantic Layer credential** button.
+3. In the **1. Add credentials** section, enter the credentials specific to your data platform that you want the Semantic Layer to use.
- Use credentials with minimal privileges. The Semantic Layer requires read access to the schema(s) containing the dbt models used in your semantic models for downstream applications
- Note, environment variables such as `{{env_var('DBT_WAREHOUSE') }}`, aren't supported in the dbt Semantic Layer yet. You must use the actual credentials.
-
-1. Create a **Service Token** after you add the credential.
- * Enterprise plans: Name and generate a service token on the credential page directly.
- * Team plans: You can return to the **Project Details** page and click the **Generate a Service Token** button.
-2. Name the token and save it. Once the token is generated, you won't be able to view this token again so make sure to record it somewhere safe.
+
+
+4. After adding credentials, scroll to **2. Map new service token**.
+5. Name the token and ensure the permission set includes **Semantic Layer Only** and **Metadata Only**.
+6. Click **Save**. Once the token is generated, you won't be able to view this token again so make sure to record it somewhere safe.
:::info
-Teams plans can create multiple service tokens that map to one underlying credential. Adding [multiple credentials](#4-add-more-credentials) for tailored access is available for Enterprise plans.
+- Team plans can create multiple service tokens that link to a single underlying credential, but each project can only have one credential.
+- Enterprise plans can [add multiple credentials](#4-add-more-credentials) and map those to service tokens for tailored access.
Book a free live demo to discover the full potential of dbt Cloud Enterprise.
:::
@@ -63,20 +72,35 @@ Note that:
To add multiple credentials and map them to service tokens:
-1. After configuring your environment, on the **Credentials & service tokens** page click the **Add Semantic Layer credential** button to configure the credential for your data platform.
-2. On the **Create New Semantic Layer Credential** page, you can create multiple credentials and map them to a service token.
-3. In the **Add credentials** section, fill in the data platform's credential fields. We recommend using “read-only” credentials.
+1. After configuring your environment, on the **Credentials & service tokens** page, click the **Add Semantic Layer credential** button to create multiple credentials and map them to a service token.
+2. In the **1. Add credentials** section, fill in the data platform's credential fields. We recommend using “read-only” credentials.
-4. In the **Map new service token** section, map a service token to the credential you configured in the previous step. dbt Cloud automatically selects the service token permission set you need (Semantic Layer Only and Metadata Only).
- - To add another service token, click **Add service token** under the **Linked service tokens** section.
-5. Click **Save** to link the service token to the credential. Remember to copy and save the service token securely, as it won't be viewable again after generation.
-
+3. In the **2. Map new service token** section, map a service token to the credential you configured in the previous step. dbt Cloud automatically selects the service token permission set you need (Semantic Layer Only and Metadata Only).
+
+4. To add another service token during configuration, click **Add Service Token**.
+5. You can link more service tokens to the same credential later on in the **Semantic Layer Configuration Details** page. To add another service token to an existing Semantic Layer configuration, click **Add service token** under the **Linked service tokens** section.
+6. Click **Save** to link the service token to the credential. Remember to copy and save the service token securely, as it won't be viewable again after generation.
+
+
+7. To delete a credential, go back to the **Credentials & service tokens** page.
+8. Under **Linked service tokens**, click **Edit**, then select **Delete Credential** to remove the credential.
-6. To delete a credential, go back to the **Semantic Layer & Credential**s page. Select **Delete credential** to remove a credential and click **Save**.
-
When you delete a credential, any service tokens mapped to that credential in the project will no longer work and will break for any end users.
+## Delete configuration
+You can delete the entire Semantic Layer configuration for a project. Note that deleting the Semantic Layer configuration removes all credentials, unlinks all service tokens from the project, and causes all queries to the Semantic Layer to fail.
+
+Follow these steps to delete the Semantic Layer configuration for a project:
+
+1. Navigate to the **Project details** page.
+2. In the **Semantic Layer** section, select **Delete Semantic Layer**.
+3. Confirm the deletion by clicking **Yes, delete semantic layer** in the confirmation pop up.
+
+To re-enable the dbt Semantic Layer setup in the future, you will need to recreate your setup configurations by following the [previous steps](#set-up-dbt-semantic-layer). If your semantic models and metrics are still in your project, no changes are needed. If you've removed them, you'll need to set up the YAML configs again.
+
+
+
## Additional configuration
The following are the additional flexible configurations for Semantic Layer credentials.
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg
new file mode 100644
index 00000000000..9bf6c7ca0e3
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-lenses.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg
new file mode 100644
index 00000000000..653fe7a2f43
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/model-consumption-list.jpg differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-query-lenses.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-query-lenses.jpg
deleted file mode 100644
index caa0cc72d67..00000000000
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-query-lenses.jpg and /dev/null differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-query-list.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-query-list.jpg
deleted file mode 100644
index 14c5c1ceb9c..00000000000
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-query-list.jpg and /dev/null differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-query-queried-models.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-query-queried-models.jpg
deleted file mode 100644
index 6b20b501880..00000000000
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-query-queried-models.jpg and /dev/null differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/model-query-usage-queries.jpg b/website/static/img/docs/collaborate/dbt-explorer/model-query-usage-queries.jpg
deleted file mode 100644
index 41857b3a482..00000000000
Binary files a/website/static/img/docs/collaborate/dbt-explorer/model-query-usage-queries.jpg and /dev/null differ
diff --git a/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg b/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg
new file mode 100644
index 00000000000..9e14db15f90
Binary files /dev/null and b/website/static/img/docs/collaborate/dbt-explorer/most-consumed-models.jpg differ
diff --git a/website/static/img/docs/collaborate/model-consumption-queries.jpg b/website/static/img/docs/collaborate/model-consumption-queries.jpg
new file mode 100644
index 00000000000..7fe9b23866c
Binary files /dev/null and b/website/static/img/docs/collaborate/model-consumption-queries.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-add-credential.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-add-credential.jpg
index b2139da47b0..30baa7acf31 100644
Binary files a/website/static/img/docs/dbt-cloud/semantic-layer/sl-add-credential.jpg and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-add-credential.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-configure-sl.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-configure-sl.jpg
deleted file mode 100644
index fc44f409efe..00000000000
Binary files a/website/static/img/docs/dbt-cloud/semantic-layer/sl-configure-sl.jpg and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-create-service-token-page.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-create-service-token-page.jpg
index 8e288183be2..da7a57a3d99 100644
Binary files a/website/static/img/docs/dbt-cloud/semantic-layer/sl-create-service-token-page.jpg and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-create-service-token-page.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-credential-created.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-credential-created.jpg
deleted file mode 100644
index 8c0081129fa..00000000000
Binary files a/website/static/img/docs/dbt-cloud/semantic-layer/sl-credential-created.jpg and /dev/null differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-credentials-service-token.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-credentials-service-token.jpg
new file mode 100644
index 00000000000..7d302201e1f
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-credentials-service-token.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-delete-config.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-delete-config.jpg
new file mode 100644
index 00000000000..c53c3e9d302
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-delete-config.jpg differ
diff --git a/website/static/img/docs/dbt-cloud/semantic-layer/sl-select-env.jpg b/website/static/img/docs/dbt-cloud/semantic-layer/sl-select-env.jpg
new file mode 100644
index 00000000000..f19cb22f2cf
Binary files /dev/null and b/website/static/img/docs/dbt-cloud/semantic-layer/sl-select-env.jpg differ