Commit a2a4bdf: Merge branch 'current' into privatelink-troubleshooting

matthewshaver authored Sep 10, 2024
2 parents e05165f + 04bafea
Showing 14 changed files with 101 additions and 27 deletions.
71 changes: 57 additions & 14 deletions website/docs/docs/collaborate/data-tile.md
@@ -23,7 +23,10 @@ The data health tile:

Data health tiles rely on [exposures](/docs/build/exposures) to surface trust signals in your dashboards. When you configure exposures in your dbt project, you are explicitly defining how specific outputs—like dashboards or reports—depend on your data models.
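For reference, a minimal exposure definition in a properties `.yml` file might look like the following sketch. The exposure name, URL, owner, and `depends_on` models here are illustrative placeholders rather than values from a real project:

```yaml
exposures:
  - name: order_quality_dashboard
    label: Order Quality Dashboard
    type: dashboard
    maturity: high
    url: https://bi.example.com/dashboards/order-quality   # placeholder dashboard URL
    description: Tracks order quality KPIs for the operations team.
    depends_on:
      - ref('fct_orders')      # illustrative upstream models
      - ref('dim_customers')
    owner:
      name: Data Team
      email: data@example.com
```

Once an exposure like this exists, dbt Explorer can surface its health (source freshness and test status) as a tile you embed in the downstream dashboard.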

<DocCarousel slidesPerView={1}>
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-pass.jpg" width="60%" title="Example of passing Data health tile in your dashboard." />
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tiles.png" width="60%" title="Embed data health tiles in your dashboards to distill trust signals for data consumers." />
</DocCarousel>

## Prerequisites

@@ -52,7 +55,7 @@ First, be sure to enable [source freshness](/docs/deploy/source-freshness) in

## Embed in your dashboard

Once you’ve navigated to the auto-exposure in dbt Explorer, you’ll need to set up your data health tile and [service token](/docs/dbt-cloud-apis/service-tokens). You can embed data health tile to any analytics tool that supports URL or iFrame embedding.
Once you’ve navigated to the exposure in dbt Explorer, you’ll need to set up your data health tile and [service token](/docs/dbt-cloud-apis/service-tokens). You can embed the data health tile in any analytics tool that supports URL or iFrame embedding.

Follow these steps to set up your data health tile:

@@ -72,28 +75,68 @@ Follow these steps to set up your data health tile:

If your analytics tool supports iFrames, you can embed the dashboard tile within it.

#### Tableau example
Here’s an example with Tableau, where you can embed the iFrame in a web page object:
### Examples
The following examples show how to embed the data health tile in PowerBI and Tableau.

- Ensure you've copied the embed iFrame snippet from the dbt Explorer **Data health** section.
- **For the revamped environment-based exposure tile** &mdash; Insert the following fields into the following iFrame. Then embed them with your dashboard. This is the iFrame available from the **Exposure details** page in dbt Explorer.
<Tabs>

`<iframe src='https://metadata.YOUR_ACCESS_URL/exposure-tile?uniqueId=<exposure_unique_id>&environmentType=production&environmentId=<environment_id>&token=<metadata_token>' />`
<TabItem value="powerbi" label="PowerBI example">

*Note, replace the placeholders with your actual values.*
You can embed the data health tile iFrame in PowerBI using PowerBI Pro Online, Fabric PowerBI, or PowerBI Desktop.

<DocCarousel slidesPerView={1}>
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-iframe.jpg" width="70%" title="Example of embedded iFrame" />
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-pass.jpg" width="60%" title="Example of passing Data health tile in your dashboard." />
<Lightbox src="/img/docs/collaborate/dbt-explorer/data-tile-stale.jpg" width="60%" title="Example of stale of degraded Data health tile in your dashboard." />
</DocCarousel>
<Lightbox src="/img/docs/collaborate/dbt-explorer/power-bi.png" width="80%" title="Embed data health tile iFrame in PowerBI"/>

Follow these steps to embed the data health tile in PowerBI:

1. Create a dashboard in PowerBI and connect to your database to pull in the data.
2. Create a new PowerBI measure by right-clicking on your **Data**, **More options**, and then **New measure**.
<Lightbox src="/img/docs/collaborate/dbt-explorer/power-bi-measure.png" width="80%" title="Create a new PowerBI measure."/>

3. Navigate to dbt Explorer, select the exposure, and expand the [**Embed data health into your dashboard**](/docs/collaborate/data-tile#embed-in-your-dashboard) toggle.
4. Go to the **iFrame** tab and copy the iFrame code. Make sure the Metadata Only token is already set up.
5. In PowerBI, paste the iFrame code you copied into your measure calculation window. The iFrame code should look like this:

```html
Website =
"<iframe src='https://1234.metadata.us1.dbt.com/exposure-tile?uniqueId=exposure.jaffle_shop.OrderQualityDashboard&environmentType=staging&environmentId=123456789&token=YOUR_METADATA_TOKEN' title='Exposure status tile' height='400'></iframe>"
```

<Lightbox src="/img/docs/collaborate/dbt-explorer/power-bi-measure-tools.png" width="90%" title="In the 'Measure tools' tab, replace your values with the iFrame code."/>

- **For job-based exposure tile** &mdash; Insert the following fields into the following iFrame. Then embed them with your dashboard. The next [section](#job-based-data-health) will have more details on the job-based exposure tile.
6. PowerBI Desktop doesn't support HTML rendering by default, so you need to install an HTML component from the PowerBI Visuals Store.
7. To do this, go to **Build visuals** and then **Get more visuals**.
8. Log in with your PowerBI account.
9. There are several third-party HTML visuals. The one tested for this guide is [HTML content](https://appsource.microsoft.com/en-us/product/power-bi-visuals/WA200001930?tab=Overview). Install it, but please keep in mind it's a third-party plugin not created or supported by dbt Labs.
10. Drag the metric with the iFrame code into the HTML content widget in PowerBI. This should now display your data health tile.

`<iframe src='https://metadata.YOUR_ACCESS_URL/exposure-tile?name=<exposure_name>&environment_id=<environment_id>&token=<metadata_token>' />`
<Lightbox src="/img/docs/collaborate/dbt-explorer/power-bi-final.png" width="80%" title="Drag the metric with the iFrame code into the HTML content widget in PowerBI. This should now display your data health tile."/>

*Refer to [this tutorial](https://www.youtube.com/watch?v=SUm9Hnq8Th8) for additional information on embedding a website into your Power BI report.*

</TabItem>

<TabItem value="tableau" label="Tableau example">

Follow these steps to embed the data health tile in Tableau:

<Lightbox src="/img/docs/collaborate/dbt-explorer/tableau-example.png" width="80%" title="Embed data health tile iFrame in Tableau"/>

1. Create a dashboard in Tableau and connect to your database to pull in the data.
2. Ensure you've copied the iFrame snippet available in dbt Explorer's **Data health** section, under the **Embed data health into your dashboard** toggle.
3. Embed the snippet in your dashboard.

`<iframe src='https://metadata.YOUR_ACCESS_URL/exposure-tile?uniqueId=<exposure_unique_id>&environmentType=production&environmentId=<environment_id>&token=<metadata_token>' />`

*Note: Replace the placeholders with your actual values.*

- **For the job-based exposure tile** &mdash; Insert your values into the following iFrame, then embed it in your dashboard. The next [section](#job-based-data-health) has more details on the job-based exposure tile.
- `<iframe src='https://metadata.YOUR_ACCESS_URL/exposure-tile?name=<exposure_name>&environment_id=<environment_id>&token=<metadata_token>' />`

*Note: Replace the placeholders with your actual values.*

</TabItem>
</Tabs>

## Job-based data health <Lifecycle status="Legacy"/>

The default experience is the [environment-based data health tile](#view-exposure-in-dbt-explorer) with dbt Explorer.
4 changes: 2 additions & 2 deletions website/docs/docs/dbt-cloud-apis/sl-python-sdk.md
@@ -24,7 +24,7 @@ Sync installation means your program waits for each task to finish before moving
It's simpler, easier to understand, and suitable for smaller tasks or when your program doesn't need to handle many tasks at the same time.

```bash
pip install dbt-sl-sdk[sync]
pip install "dbt-sl-sdk[sync]"
```
If you're using async frameworks like [FastAPI](https://fastapi.tiangolo.com/) or [Strawberry](https://github.com/strawberry-graphql/strawberry), installing the sync version of the SDK will block your event loop and can significantly slow down your program. In this case, we strongly recommend using async installation.
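For illustration, a minimal query with the sync client might look like the following sketch. It assumes the `SemanticLayerClient` interface documented in the SDK repository; the environment ID, token, host, and metric name are placeholders to replace with your own values.

```python
from dbtsl import SemanticLayerClient

# Placeholder connection details -- replace with your environment ID,
# service token, and the Semantic Layer host for your access URL.
client = SemanticLayerClient(
    environment_id=123456789,
    auth_token="YOUR_SERVICE_TOKEN",
    host="semantic-layer.cloud.getdbt.com",
)

def main():
    # All Semantic Layer calls happen inside a session; each call blocks
    # until the server responds.
    with client.session():
        table = client.query(
            metrics=["order_total"],   # illustrative metric name
            group_by=["metric_time"],
        )
        print(table)

main()
```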

@@ -37,7 +37,7 @@ Async installation means your program can start a task and then move on to other
For more details, refer to [asyncio](https://docs.python.org/3/library/asyncio.html).

```bash
pip install dbt-sl-sdk[async]
pip install "dbt-sl-sdk[sync]"
```

Since the [Python ADBC driver](https://github.com/apache/arrow-adbc/tree/main/python/adbc_driver_manager) doesn't yet support asyncio natively, `dbt-sl-sdk` uses a [`ThreadPoolExecutor`](https://github.com/dbt-labs/semantic-layer-sdk-python/blob/5e52e1ca840d20a143b226ae33d194a4a9bc008f/dbtsl/api/adbc/client/asyncio.py#L62) to run `query` and `list dimension-values` (all operations that are done with ADBC). This is why you might see multiple Python threads spawning.
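As a rough sketch of the async flavor (again assuming the client interface documented in the SDK repository, with placeholder credentials and metric name), the same query awaits each call instead of blocking:

```python
import asyncio

from dbtsl.asyncio import AsyncSemanticLayerClient

# Placeholder connection details -- replace with your environment ID,
# service token, and the Semantic Layer host for your access URL.
client = AsyncSemanticLayerClient(
    environment_id=123456789,
    auth_token="YOUR_SERVICE_TOKEN",
    host="semantic-layer.cloud.getdbt.com",
)

async def main():
    # Awaiting each call keeps the event loop free; the ADBC work still runs
    # on the SDK's internal thread pool as described above.
    async with client.session():
        table = await client.query(
            metrics=["order_total"],   # illustrative metric name
            group_by=["metric_time"],
        )
        print(table)

asyncio.run(main())
```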
2 changes: 1 addition & 1 deletion website/docs/docs/use-dbt-semantic-layer/deploy-sl.md
@@ -18,7 +18,7 @@ import RunProdJob from '/snippets/_sl-run-prod-job.md';

## Next steps
After you've executed a job and deployed your Semantic Layer:
- [Set up your Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) in dbt Cloud.
- [Set up your Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) in dbt Cloud.
- Discover the [available integrations](/docs/cloud-integrations/avail-sl-integrations), such as Tableau, Google Sheets, Microsoft Excel, and more.
- Start querying your metrics with the [API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata).

14 changes: 14 additions & 0 deletions website/docs/guides/core-cloud-2.md
@@ -20,6 +20,20 @@ import CoretoCloudTable from '/snippets/_core-to-cloud-guide-table.md';

<CoretoCloudTable/>

<Expandable alt_header="What is dbt Cloud and dbt Core?">

- dbt Cloud is the fastest and most reliable way to deploy dbt. It enables you to develop, test, deploy, and explore data products using a single, fully managed service. It also supports:
- Development experiences tailored to multiple personas ([dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [dbt Cloud CLI](/docs/cloud/cloud-cli-installation))
- Out-of-the-box [CI/CD workflows](/docs/deploy/ci-jobs)
- The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) for consistent metrics
- Domain ownership of data with multi-project [dbt Mesh](/best-practices/how-we-mesh/mesh-1-intro) setups
- [dbt Explorer](/docs/collaborate/explore-projects) for easier data discovery and understanding

Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features).
- dbt Core is an open-source tool that enables data teams to define and execute data transformations in a cloud data warehouse following analytics engineering best practices. While this can work well for ‘single players’ and small technical teams, all development happens on a command-line interface, and production deployments must be self-hosted and maintained. This requires significant, costly work that adds up over time to maintain and scale.

</Expandable>

## What you'll learn
Today thousands of companies, with data teams ranging in size from 2 to 2,000, rely on dbt Cloud to accelerate data work, increase collaboration, and win the trust of the business. Understanding what you'll need to do to move from your current dbt Core deployment to dbt Cloud will help you strategize and plan your move.

24 changes: 15 additions & 9 deletions website/docs/guides/core-to-cloud-1.md
@@ -24,17 +24,19 @@ import CoretoCloudTable from '/snippets/_core-to-cloud-guide-table.md';

<CoretoCloudTable/>

<Expandable alt_header="What is dbt Cloud and dbt Core?">

dbt Cloud is the fastest and most reliable way to deploy dbt. It enables you to develop, test, deploy, and explore data products using a single, fully managed service. It also supports:
- Development experiences tailored to multiple personas ([dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [dbt Cloud CLI](/docs/cloud/cloud-cli-installation))
- Out-of-the-box [CI/CD workflows](/docs/deploy/ci-jobs)
- The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) for consistent metrics
- Domain ownership of data with multi-project [dbt Mesh](/best-practices/how-we-mesh/mesh-1-intro) setups
- [dbt Explorer](/docs/collaborate/explore-projects) for easier data discovery and understanding
- dbt Cloud is the fastest and most reliable way to deploy dbt. It enables you to develop, test, deploy, and explore data products using a single, fully managed service. It also supports:
- Development experiences tailored to multiple personas ([dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) or [dbt Cloud CLI](/docs/cloud/cloud-cli-installation))
- Out-of-the-box [CI/CD workflows](/docs/deploy/ci-jobs)
- The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) for consistent metrics
- Domain ownership of data with multi-project [dbt Mesh](/best-practices/how-we-mesh/mesh-1-intro) setups
- [dbt Explorer](/docs/collaborate/explore-projects) for easier data discovery and understanding

Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features).
Learn more about [dbt Cloud features](/docs/cloud/about-cloud/dbt-cloud-features).
- dbt Core is an open-source tool that enables data teams to define and execute data transformations in a cloud data warehouse following analytics engineering best practices. While this can work well for ‘single players’ and small technical teams, all development happens on a command-line interface, and production deployments must be self-hosted and maintained. This requires significant, costly work that adds up over time to maintain and scale.

dbt Core is an open-source tool that enables data teams to define and execute data transformations in a cloud data warehouse following analytics engineering best practices. While this can work well for ‘single players’ and small technical teams, all development happens on a command-line interface, and production deployments must be self-hosted and maintained. This requires significant, costly work that adds up over time to maintain and scale.
</Expandable>

## What you'll learn

@@ -57,7 +59,7 @@ This guide outlines the steps you need to take to move from dbt Core to dbt Cloud
## Prerequisites

- You have an existing dbt Core project connected to a Git repository and data platform supported in [dbt Cloud](/docs/cloud/connect-data-platform/about-connections).
- A [supported version](/docs/dbt-versions/core) of dbt or select [**Versionless**](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless) of dbt. <Lifecycle status="Preview"/>
- A [supported version](/docs/dbt-versions/core) of dbt or select [**Versionless**](/docs/dbt-versions/upgrade-dbt-version-in-cloud#versionless) of dbt.
- You have a dbt Cloud account. **[Don't have one? Start your free trial today](https://www.getdbt.com/signup)**!

## Account setup
@@ -84,8 +86,10 @@ This section outlines the considerations and methods to connect your data platform

1. In dbt Cloud, set up your [data platform connections](/docs/cloud/connect-data-platform/about-connections) and [environment variables](/docs/build/environment-variables). dbt Cloud can connect with a variety of data platform providers including:
- [AlloyDB](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb)
- [Amazon Athena](/docs/cloud/connect-data-platform/connect-amazon-athena) (beta)
- [Amazon Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb)
- [Apache Spark](/docs/cloud/connect-data-platform/connect-apache-spark)
- [Azure Synapse Analytics](/docs/cloud/connect-data-platform/connect-azure-synapse-analytics)
- [Databricks](/docs/cloud/connect-data-platform/connect-databricks)
- [Google BigQuery](/docs/cloud/connect-data-platform/connect-bigquery)
- [Microsoft Fabric](/docs/cloud/connect-data-platform/connect-microsoft-fabric)
@@ -230,6 +234,8 @@ Explore these additional configurations to optimize your dbt Cloud orchestration

Building a custom solution to efficiently check code upon pull requests is complicated. With dbt Cloud, you can enable [continuous integration / continuous deployment (CI/CD)](/docs/deploy/continuous-integration) and configure dbt Cloud to run your dbt projects in a temporary schema when new commits are pushed to open pull requests.

<Lightbox src="/img/docs/dbt-cloud/using-dbt-cloud/ci-workflow.png" width="90%" title="Workflow of continuous integration in dbt Cloud"/>

This build-on-PR functionality is a great way to catch bugs before deploying to production, and an essential tool for data practitioners.

1. Set up an integration with a native Git application (such as Azure DevOps, GitHub, GitLab) and a CI environment in dbt Cloud.
2 changes: 1 addition & 1 deletion website/docs/reference/dbt-jinja-functions/model.md
@@ -11,7 +11,7 @@ description: "`model` is the dbt graph object (or node) for the current model."

For example:
```jinja
{% if model.config.materialization == 'view' %}
{% if model.config.materialized == 'view' %}
{{ log(model.name ~ " is a view.", info=True) }}
{% endif %}
```
11 changes: 11 additions & 0 deletions website/snippets/_core-to-cloud-guide-table.md
@@ -3,3 +3,14 @@
| [Move from dbt Core to dbt Cloud: What you need to know](/guides/core-cloud-2) | Understand the considerations and methods needed in your move from dbt Core to dbt Cloud. | Team leads <br /> Admins |
| [Move from dbt Core to dbt Cloud: Get started](/guides/core-to-cloud-1?step=1) | Learn the steps needed to move from dbt Core to dbt Cloud. | Developers <br /> Data engineers <br /> Data analysts |
| [Move from dbt Core to dbt Cloud: Optimization tips](/guides/core-to-cloud-3) | Learn how to optimize your dbt Cloud experience with common scenarios and useful tips. | Everyone |

### Why move to dbt Cloud?
If your team is using dbt Core today, you could be reading this guide because:

- You’ve realized the burden of maintaining that deployment.
- The person who set it up has since left.
- You’re interested in what dbt Cloud could do to better manage the complexity of your dbt deployment, democratize access to more contributors, or improve security and governance practices.

Moving from dbt Core to dbt Cloud simplifies workflows by providing a fully managed environment that improves collaboration, security, and orchestration. With dbt Cloud, you gain access to features like cross-team collaboration ([dbt Mesh](/best-practices/how-we-mesh/mesh-1-intro)), version management, streamlined CI/CD, [dbt Explorer](/docs/collaborate/explore-projects) for comprehensive insights, and more &mdash; making it easier to manage complex dbt deployments and scale your data workflows efficiently.

It's ideal for teams looking to reduce the burden of maintaining their own infrastructure while enhancing governance and productivity.
The remaining 7 changed files are binary image files and are not shown.
