Merge branch 'current' into update/enterprise-permissions-table
mirnawong1 authored Jul 16, 2024
2 parents b3a148f + 09825cf commit 0759117
Showing 9 changed files with 44 additions and 15 deletions.
2 changes: 1 addition & 1 deletion website/docs/docs/build/unit-tests.md
@@ -51,7 +51,7 @@ You should unit test a model:

dbt Labs strongly recommends only running unit tests in development or CI environments. Since the inputs of the unit tests are static, there's no need to use additional compute cycles running them in production. Use them in development for a test-driven approach and CI to ensure changes don't break them.

- Use the [resource type](/reference/global-configs/resource-type) flag `--exclude-resource-type` or the `DBT_EXCLUDE_RESOURCE_TYPE` environment variable to exclude unit tests from your production builds and save compute.
+ Use the [resource type](/reference/global-configs/resource-type) flag `--exclude-resource-type` or the `DBT_EXCLUDE_RESOURCE_TYPES` environment variable to exclude unit tests from your production builds and save compute.

## Unit testing a model

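To illustrate the corrected variable, a minimal sketch of excluding unit tests from a production job (assuming dbt v1.8 or later, where the `unit_test` resource type exists):

```shell
# Exclude unit tests via the CLI flag
dbt build --exclude-resource-type unit_test

# Or via the environment variable (note the plural form)
DBT_EXCLUDE_RESOURCE_TYPES=unit_test dbt build
```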
@@ -14,6 +14,7 @@ dbt Cloud is [hosted](/docs/cloud/about-cloud/architecture) in multiple regions
|--------|----------|------------|--------------|----------------|-----------|-----------------|
| North America [^1] | AWS us-east-1 (N. Virginia) | **Multi-tenant:** cloud.getdbt.com <br /> **Cell based:** ACCOUNT_PREFIX.us1.dbt.com | 52.45.144.63 <br /> 54.81.134.249 <br />52.22.161.231 <br />52.3.77.232 <br />3.214.191.130 <br />34.233.79.135 ||||
| EMEA [^1] | AWS eu-central-1 (Frankfurt) | emea.dbt.com | 3.123.45.39 <br /> 3.126.140.248 <br /> 3.72.153.148 ||||
+ | EMEA [^1] | Azure <br /> North Europe (Ireland) | **Cell based:** ACCOUNT_PREFIX.eu2.dbt.com (beta invite only) | 20.13.190.192/26 ||||
| APAC [^1] | AWS ap-southeast-2 (Sydney)| au.dbt.com | 52.65.89.235 <br /> 3.106.40.33 <br /> 13.239.155.206 <br />||||
| Virtual Private dbt or Single tenant | Customized | Customized | Ask [Support](/community/resources/getting-help#dbt-cloud-support) for your IPs ||||

7 changes: 0 additions & 7 deletions website/docs/docs/cloud/manage-access/audit-log.md
@@ -173,10 +173,3 @@ You can use the audit log to export all historical audit results for security, c
- **For events beyond 90 days** &mdash; Select **Export All**. The Account Admin will receive an email link to download a CSV file of all the events that occurred in your organization.

<Lightbox src="/img/docs/dbt-cloud/dbt-cloud-enterprise/audit-log-section.jpg" width="95%" title="View audit log export options"/>

- ### Azure Single-tenant
- 
- For users deployed in [Azure single tenant](/docs/cloud/about-cloud/tenancy), while the **Export All** button isn't available, you can conveniently use specific APIs to access all events:
- 
- - [Get recent audit log events CSV](/dbt-cloud/api-v3#/operations/Get%20Recent%20Audit%20Log%20Events%20CSV) &mdash; This API returns all events in a single CSV without pagination.
- - [List recent audit log events](/dbt-cloud/api-v3#/operations/List%20Recent%20Audit%20Log%20Events) &mdash; This API returns a limited number of events at a time, which means you will need to paginate the results.
6 changes: 3 additions & 3 deletions website/docs/docs/dbt-cloud-apis/authentication.md
@@ -33,7 +33,7 @@ pagination_prev: null

You should use service tokens broadly for any production workflow where you need a service account. You should use PATs only for developmental workflows _or_ dbt Cloud client workflows that require user context. The following examples show you when to use a personal access token (PAT) or a service token:

- * **Connecting a partner integration to dbt Cloud** &mdash; Some examples include Hightouch, Datafold, a custom app you’ve created, etc. These types of integrations should use a service token instead of a PAT because service tokens give you visibility, and you can scope them to only what the integration needs and ensure the least privilege. We highly recommend switching to a service token if you’re using a user API key for these integrations today.
+ * **Connecting a partner integration to dbt Cloud** &mdash; Some examples include the [dbt Semantic Layer Google Sheets integration](/docs/cloud-integrations/avail-sl-integrations), Hightouch, Datafold, a custom app you’ve created, etc. These types of integrations should use a service token instead of a PAT because service tokens give you visibility, and you can scope them to only what the integration needs and ensure the least privilege. We highly recommend switching to a service token if you’re using a user API key for these integrations today.
* **Production Terraform** &mdash; Use a service token since this is a production workflow and is acting as a service account and not a user account.
- * **Cloud CLI and Semantic Layer Sheets Integration** &mdash; Use a PAT since both the dbt Cloud CLI and Semantic Layer Google Sheets integrations work within the context of a user (the user is making the requests and has to operate within the context of their user account).
- * **Testing a custom script and staging Terraform or Postman** &mdash; We recommend using a PAT as this is a developmental workflow and is scoped to the user making the changes. When you push this script or Terraform into production, use a service token instead.
+ * **Cloud CLI** &mdash; Use a PAT since the dbt Cloud CLI works within the context of a user (the user is making the requests and has to operate within the context of their user account).
+ * **Testing a custom script and staging Terraform or Postman** &mdash; We recommend using a PAT as this is a developmental workflow and is scoped to the user making the changes. When you push this script or Terraform into production, use a service token instead.
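Whichever token type you choose, API requests authenticate the same way. A minimal sketch, assuming the multi-tenant `cloud.getdbt.com` host and a token stored in an environment variable:

```shell
# Works with either a personal access token or a service token
curl -s -H "Authorization: Token $DBT_CLOUD_API_TOKEN" \
  "https://cloud.getdbt.com/api/v2/accounts/"
```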
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/service-tokens.md
@@ -17,7 +17,7 @@ Service account tokens enable you to securely authenticate with the dbt Cloud AP
You can use service account tokens for system-level integrations that do not run on behalf of any one user. Assign any permission sets available in dbt Cloud to your service account token, which can vary slightly depending on your plan:

* Enterprise plans can apply any permission sets available to service tokens.
- * Team plans can apply Account Admin, Member, Job Admin, Read-Only, and Metadata permissions set to service tokens.
+ * Team plans can apply Account Admin, Member, Job Admin, Read-Only, Metadata, and Semantic Layer permission sets to service tokens.

You can assign as many permission sets as needed to one token. For more on permission sets, see "[Enterprise Permissions](/docs/cloud/manage-access/enterprise-permissions)."

4 changes: 4 additions & 0 deletions website/docs/docs/deploy/job-scheduler.md
@@ -102,6 +102,10 @@ Example of deactivation banner on job's page:

<Lightbox src="/img/docs/dbt-cloud/deployment/example-deactivated-deploy-job.png" title="Example of deactivation banner on job's page"/>

+ ## FAQs
+ 
+ <FAQ path="Troubleshooting/job-memory-limits" />

## Related docs
- [dbt Cloud architecture](/docs/cloud/about-cloud/architecture#dbt-cloud-features-architecture)
- [Job commands](/docs/deploy/job-commands)
26 changes: 26 additions & 0 deletions website/docs/faqs/Troubleshooting/job-memory-limits.md
@@ -0,0 +1,26 @@
---
title: "I'm receiving a 'This run exceeded your account's run memory limits' error in my failed job"
description: "Use incremental models or optimize queries for job failures due to exceeded memory limits."
sidebar_label: 'Job failures due to exceeded memory limits'
---

If you're receiving a `This run exceeded your account's run memory limits` error in your failed job, it means the job exceeded the [memory limits](/docs/deploy/job-scheduler#job-memory) set for your account. All dbt Cloud accounts have a pod memory of 600MiB, and memory limits apply on a per-run basis. Memory usage is typically driven by the amount of result data that dbt has to ingest and process, which is usually small but can become bloated unexpectedly by project design choices.

## Common reasons

Some common reasons for higher memory usage are:

- `dbt run`/`dbt build`: Macros that capture large result sets from `run_query` may not all be necessary and may be memory inefficient.
- `dbt docs generate`: Source or model schemas with large numbers of tables (even if those tables aren't all used by dbt) cause dbt to ingest very large results for catalog queries.
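For illustration, a minimal sketch of the first pattern &mdash; `run_query` materializes the entire result set in the dbt process rather than the warehouse, so an unfiltered query like this one (the table and column names are hypothetical) can consume significant memory:

```sql
{% macro get_all_order_ids() %}
  {# Anti-pattern sketch: loads every row into dbt's memory #}
  {% set results = run_query("select order_id from analytics.fct_orders") %}
  {% if execute %}
    {{ return(results.columns[0].values()) }}
  {% endif %}
{% endmacro %}
```

Filtering or aggregating in the warehouse before calling `run_query` keeps the returned result set small.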

## Resolution

Try the following to resolve this:

1. **Use incremental models**: Try using [incremental models](/docs/build/incremental-models-overview) to reduce the amount of data being processed in each run. Incremental models only process new or updated data, which can help reduce the memory usage of your jobs.
2. **Refactor your data model**: Review your data models to see if there are any opportunities to optimize or refactor them. For example, you can reduce the number of columns being selected, use `where` clauses to filter data early in the query, or use `limit` clauses to reduce the amount of data being processed.
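As a minimal sketch of the first suggestion (model and column names are hypothetical), an incremental model only processes rows that changed since the last run:

```sql
{{ config(materialized='incremental', unique_key='order_id') }}

select order_id, status, updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows updated since the last successful run
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```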

If you've tried these suggestions and your job runs still fail with this memory-limit error, please [reach out to support](mailto:[email protected]) and we can try increasing your account's memory. We're happy to help!

## Additional resources
- [Blog post on how we shaved 90 mins off](https://docs.getdbt.com/blog/how-we-shaved-90-minutes-off-model)
6 changes: 3 additions & 3 deletions website/docs/guides/mesh-qs.md
@@ -460,10 +460,10 @@ models:
### Set up model versions
In this section, you will set up model versions by the Data Analytics team as they upgrade the `fct_orders` model while offering backward compatibility and a migration notice to the downstream Finance team.

- 1. Rename the existing model file from` models/core/fct_orders.sql` to `models/core/fct_orders_v1.sql`.
+ 1. Rename the existing model file from `models/core/fct_orders.sql` to `models/core/fct_orders_v1.sql`.
2. Create a new file `models/core/fct_orders_v2.sql` and adjust the schema:
-    - Comment out `orders.status`
-    - Add a new field, `is_return` to indicate if an order was returned.
+    - Comment out `o.status` in the `final` CTE.
+    - Add a new field, `case when o.status = 'returned' then true else false end as is_return` to indicate if an order was returned.
3. Then, add the following to your `models/core/core.yml` file:
- The `is_return` column
- The two model `versions`
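For reference, a minimal sketch of what the `versions` entry in `core.yml` can look like &mdash; the guide's full YAML may differ, and the `include`/`exclude` column spec shown here is one common way to express that `v1` lacks the new column:

```yaml
models:
  - name: fct_orders
    latest_version: 2
    columns:
      - name: is_return
        description: "Whether the order was returned."
    versions:
      - v: 2
      - v: 1
        columns:
          - include: "*"
            exclude: [is_return]
```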
5 changes: 5 additions & 0 deletions website/src/components/expandable/styles.module.css
@@ -141,3 +141,8 @@
.expandableContainer {
margin-bottom: 5px; /* Adjust this value as needed to create space */
}

+ .headerText {
+   display: flex;
+   align-items: center;
+ }
