Merge branch 'main' into feat/add-windows-os-to-github-workflows
glsdown authored Sep 13, 2023
2 parents ad28fd5 + 62554f3 commit 4afd54a
Showing 21 changed files with 79 additions and 43 deletions.
1 change: 1 addition & 0 deletions .github/pull_request_template.md
@@ -13,6 +13,7 @@
- [ ] New features (breaking change)
- [ ] Other (non-breaking change)
- [ ] Other (breaking change)
+- [ ] Release preparation

## What does this solve?

8 changes: 7 additions & 1 deletion .github/workflows/ci_lint_package.yml
@@ -43,8 +43,14 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.sha }} # Check out the code of the PR

+- name: Setup Python
+uses: actions/setup-python@v4
+with:
+python-version: "3.9.x"
+architecture: "x64"

- name: Install Python packages
-run: python -m pip install dbt-snowflake~=1.5.0 sqlfluff-templater-dbt
+run: python -m pip install dbt-snowflake~=1.6.0 sqlfluff-templater-dbt~=2.3.2

- name: Test database connection
run: dbt debug
4 changes: 2 additions & 2 deletions .github/workflows/ci_test_package.yml
@@ -21,7 +21,7 @@ env:
DBT_ENV_SECRET_DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
DBT_ENV_SECRET_GCP_PROJECT: ${{ secrets.GCP_PROJECT }}
# Env var to test version
-LAST_RELEASE_SUPPORTED_DBT_VERSION: 1_4_0 # A dbt version supported by both the last release and this one
+LAST_RELEASE_SUPPORTED_DBT_VERSION: 1_6_0 # A dbt version supported by both the last release and this one
# Env vars to test invocations model
DBT_CLOUD_PROJECT_ID: 123
DBT_CLOUD_JOB_ID: ABC
@@ -104,7 +104,7 @@ jobs:
matrix:
warehouse: ["snowflake", "bigquery"]
# When supporting a new version, update the list here
version: ["1_3_0", "1_4_0", "1_5_0"]
version: ["1_3_0", "1_4_0", "1_5_0", "1_6_0"]
os: [ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
environment:
8 changes: 7 additions & 1 deletion .github/workflows/main_lint_package.yml
@@ -39,8 +39,14 @@ jobs:
- name: Checkout branch
uses: actions/checkout@v3

+- name: Setup Python
+uses: actions/setup-python@v4
+with:
+python-version: "3.9.x"
+architecture: "x64"

- name: Install Python packages
-run: python -m pip install dbt-snowflake~=1.5.0 sqlfluff-templater-dbt
+run: python -m pip install dbt-snowflake~=1.6.0 sqlfluff-templater-dbt~=2.3.2

- name: Test database connection
run: dbt debug
2 changes: 1 addition & 1 deletion .github/workflows/main_test_package.yml
@@ -35,7 +35,7 @@ jobs:
strategy:
matrix:
warehouse: ["snowflake", "bigquery"]
version: ["1_3_0", "1_4_0", "1_5_0"]
version: ["1_3_0", "1_4_0", "1_5_0", "1_6_0"]
os: [ubuntu-latest, windows-latest]
runs-on: ${{ matrix.os }}
permissions:
2 changes: 1 addition & 1 deletion .github/workflows/publish_docs_on_release.yml
@@ -39,7 +39,7 @@ jobs:
uses: actions/checkout@v3

- name: Install Python packages
-run: python -m pip install dbt-snowflake~=1.3.0
+run: python -m pip install dbt-snowflake~=1.6.0

- name: Test database connection
run: dbt debug
8 changes: 4 additions & 4 deletions README.md
@@ -45,7 +45,7 @@ See the generated [dbt docs site](https://brooklyn-data.github.io/dbt_artifacts/
```
packages:
- package: brooklyn-data/dbt_artifacts
-version: 2.4.2
+version: 2.5.0
```

:construction_worker: Make sure to fix at least the **minor** version, to avoid issues when a new release is open. See the notes on upgrading below for more detail.
@@ -55,15 +55,15 @@ packages:
3. Add an on-run-end hook to your `dbt_project.yml`

```yml
-`on-run-end:
-- "{{ dbt_artifacts.upload_results(results) }}"`
+on-run-end:
+- "{{ dbt_artifacts.upload_results(results) }}"
```
We recommend adding a conditional here so that the upload only occurs in your production environment, such as:
```yml
on-run-end:
- "{% if target.name == 'prod' %}{{ dbt_artifacts.upload_results(results) }}{% endif %}"`)
- "{% if target.name == 'prod' %}{{ dbt_artifacts.upload_results(results) }}{% endif %}"
```
4. Run the tables!
4 changes: 2 additions & 2 deletions dbt_project.yml
@@ -1,7 +1,7 @@
name: "dbt_artifacts"
version: "2.4.2"
version: "2.5.0"
config-version: 2
require-dbt-version: [">=1.3.0", "<1.6.0"]
require-dbt-version: [">=1.3.0", "<1.7.0"]
profile: "dbt_artifacts"

clean-targets: # folders to be removed by `dbt clean`
2 changes: 1 addition & 1 deletion macros/parse_json.sql
@@ -11,6 +11,6 @@
{%- endmacro %}

{% macro bigquery__parse_json(field) -%}
-parse_json({{ field }})
+safe.parse_json("""{{ field }}""", wide_number_mode=>'round')
{%- endmacro %}
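
The call sites throughout this diff replace hard-coded `parse_json(...)` with `{{ adapter.dispatch('parse_json', 'dbt_artifacts')(...) }}`, so each warehouse resolves to its own implementation; only the BigQuery override appears above. A minimal sketch of how the dispatch resolves, assuming the default implementation follows dbt's usual `default__`/`<adapter>__` naming (it is not shown in this diff):

```sql
{# A call site renders the warehouse-specific macro via dispatch, e.g.: #}
{# {{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model.config.meta)) }} #}

{# Assumed default implementation (not shown in this diff): a plain parse_json
   over the rendered string literal. #}
{% macro default__parse_json(field) -%}
    parse_json('{{ field }}')
{%- endmacro %}

{# BigQuery override from this commit: safe.parse_json returns NULL instead of
   erroring on malformed input, and wide_number_mode=>'round' rounds numbers
   that cannot be represented exactly in JSON rather than failing. #}
{% macro bigquery__parse_json(field) -%}
    safe.parse_json("""{{ field }}""", wide_number_mode=>'round')
{%- endmacro %}
```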

4 changes: 2 additions & 2 deletions macros/upload_exposures.sql
@@ -62,15 +62,15 @@
'{{ run_started_at }}', {# run_started_at #}
'{{ exposure.name | replace("'","\\'") }}', {# name #}
'{{ exposure.type }}', {# type #}
-parse_json('{{ tojson(exposure.owner) | replace("'","\\'") }}'), {# owner #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(exposure.owner) | replace("'","\\'")) }}, {# owner #}
'{{ exposure.maturity }}', {# maturity #}
'{{ exposure.original_file_path | replace('\\', '\\\\') }}', {# path #}
"""{{ exposure.description | replace("'","\\'") }}""", {# description #}
'{{ exposure.url }}', {# url #}
'{{ exposure.package_name }}', {# package_name #}
{{ tojson(exposure.depends_on.nodes) }}, {# depends_on_nodes #}
{{ tojson(exposure.tags) }}, {# tags #}
-parse_json('{{ tojson(exposure) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', wide_number_mode=>'round') {# all_results #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(exposure) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# all_results #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
10 changes: 5 additions & 5 deletions macros/upload_invocations.sql
@@ -83,7 +83,7 @@
null, {# dbt_vars #}
{% endif %}

-'{{ tojson(invocation_args_dict) | replace('\\', '\\\\') }}', {# invocation_args #}
+'{{ tojson(invocation_args_dict) | replace('\\', '\\\\') | replace("'", "\\'") }}', {# invocation_args #}

{% set metadata_env = {} %}
{% for key, value in dbt_metadata_envs.items() %}
@@ -122,7 +122,7 @@
{% for env_variable in var('env_vars') %}
{% do env_vars_dict.update({env_variable: (env_var(env_variable, ''))}) %}
{% endfor %}
-parse_json('''{{ tojson(env_vars_dict) }}'''), {# env_vars #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(env_vars_dict)) }}, {# env_vars #}
{% else %}
null, {# env_vars #}
{% endif %}
@@ -132,7 +132,7 @@
{% for dbt_var in var('dbt_vars') %}
{% do dbt_vars_dict.update({dbt_var: (var(dbt_var, ''))}) %}
{% endfor %}
-parse_json('''{{ tojson(dbt_vars_dict) }}'''), {# dbt_vars #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(dbt_vars_dict)) }}, {# dbt_vars #}
{% else %}
null, {# dbt_vars #}
{% endif %}
@@ -146,13 +146,13 @@
{% endif %}
{% endif %}

-safe.parse_json('''{{ tojson(invocation_args_dict) }}'''), {# invocation_args #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(invocation_args_dict) | replace("'", "\\'")) }}, {# invocation_args #}

{% set metadata_env = {} %}
{% for key, value in dbt_metadata_envs.items() %}
{% do metadata_env.update({key: value}) %}
{% endfor %}
-parse_json('''{{ tojson(metadata_env) | replace('\\', '\\\\') }}''') {# dbt_custom_envs #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(metadata_env) | replace('\\', '\\\\')) }} {# dbt_custom_envs #}

)
{% endset %}
4 changes: 2 additions & 2 deletions macros/upload_model_executions.sql
@@ -130,8 +130,8 @@
'{{ model.node.schema }}', {# schema #}
'{{ model.node.name }}', {# name #}
'{{ model.node.alias }}', {# alias #}
-'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', {# message #}
-parse_json('{{ tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}') {# adapter_response #}
+'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') | replace("\n", "\\n") }}', {# message #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# adapter_response #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
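
A note on the other change repeated across the execution macros (models above; seeds, snapshots, and tests below): the `message` value now also escapes embedded newlines, so multi-line adapter messages stay inside a single quoted SQL literal. A hypothetical standalone helper, named here only for illustration (the package applies the filters inline as shown above), would read:

```sql
{# Hypothetical helper mirroring the inline filter chain used for `message`:
   escape backslashes first, then single and double quotes, then newlines,
   so the value can be embedded in a single-quoted SQL string literal. #}
{% macro escape_message(message) -%}
    {{ message
        | replace("\\", "\\\\")
        | replace("'", "\\'")
        | replace('"', '\\"')
        | replace("\n", "\\n") }}
{%- endmacro %}
```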
4 changes: 2 additions & 2 deletions macros/upload_models.sql
@@ -69,9 +69,9 @@
'{{ model.checksum.checksum }}', {# checksum #}
'{{ model.config.materialized }}', {# materialization #}
{{ tojson(model.tags) }}, {# tags #}
-parse_json('''{{ tojson(model.config.meta) }}'''), {# meta #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model.config.meta)) }}, {# meta #}
'{{ model.alias }}', {# alias #}
-parse_json('{{ tojson(model) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"') }}', wide_number_mode=>'round') {# all_results #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"')) }} {# all_results #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
4 changes: 2 additions & 2 deletions macros/upload_seed_executions.sql
@@ -128,8 +128,8 @@
'{{ model.node.schema }}', {# schema #}
'{{ model.node.name }}', {# name #}
'{{ model.node.alias }}', {# alias #}
-'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', {# message #}
-parse_json('{{ tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}') {# adapter_response #}
+'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') | replace("\n", "\\n") }}', {# message #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# adapter_response #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
4 changes: 2 additions & 2 deletions macros/upload_seeds.sql
@@ -62,9 +62,9 @@
'{{ seed.package_name }}', {# package_name #}
'{{ seed.original_file_path | replace('\\', '\\\\') }}', {# path #}
'{{ seed.checksum.checksum }}', {# checksum #}
-parse_json('''{{ tojson(seed.config.meta) }}'''), {# meta #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(seed.config.meta)) }}, {# meta #}
'{{ seed.alias }}', {# alias #}
-parse_json('{{ tojson(seed) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"') }}', wide_number_mode=>'round') {# all_results #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(seed) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"')) }} {# all_results #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
4 changes: 2 additions & 2 deletions macros/upload_snapshot_executions.sql
@@ -128,8 +128,8 @@
'{{ model.node.schema }}', {# schema #}
'{{ model.node.name }}', {# name #}
'{{ model.node.alias }}', {# alias #}
-'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', {# message #}
-parse_json('{{ tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}') {# adapter_response #}
+'{{ model.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') | replace("\n", "\\n") }}', {# message #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(model.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# adapter_response #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
4 changes: 2 additions & 2 deletions macros/upload_snapshots.sql
@@ -69,9 +69,9 @@
'{{ snapshot.original_file_path | replace('\\', '\\\\') }}', {# path #}
'{{ snapshot.checksum.checksum }}', {# checksum #}
'{{ snapshot.config.strategy }}', {# strategy #}
-parse_json('''{{ tojson(snapshot.config.meta) }}'''), {# meta #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(snapshot.config.meta)) }}, {# meta #}
'{{ snapshot.alias }}', {# alias #}
-parse_json('{{ tojson(snapshot) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"') }}', wide_number_mode=>'round') {# all_results #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(snapshot) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"')) }} {# all_results #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
4 changes: 2 additions & 2 deletions macros/upload_sources.sql
@@ -59,8 +59,8 @@
'{{ source.name }}', {# name #}
'{{ source.identifier }}', {# identifier #}
'{{ source.loaded_at_field | replace("'","\\'") }}', {# loaded_at_field #}
-parse_json('{{ tojson(source.freshness) | replace("'","\\'") }}'), {# freshness #}
-parse_json('{{ tojson(source) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', wide_number_mode=>'round') {# all_results #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(source.freshness) | replace("'","\\'")) }}, {# freshness #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(source) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# all_results #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
5 changes: 2 additions & 3 deletions macros/upload_test_executions.sql
@@ -119,9 +119,8 @@
{{ test.execution_time }}, {# total_node_runtime #}
null, {# rows_affected not available in Databricks #}
{{ 'null' if test.failures is none else test.failures }}, {# failures #}
-'{{ test.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}', {# message #}
-parse_json('{{ tojson(test.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') }}') {# adapter_response #}
-
+'{{ test.message | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"') | replace("\n", "\\n") }}', {# message #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(test.adapter_response) | replace("\\", "\\\\") | replace("'", "\\'") | replace('"', '\\"')) }} {# adapter_response #}
)
{%- if not loop.last %},{%- endif %}

2 changes: 1 addition & 1 deletion macros/upload_tests.sql
@@ -51,7 +51,7 @@
'{{ test.package_name }}', {# package_name #}
'{{ test.original_file_path | replace('\\', '\\\\') }}', {# test_path #}
{{ tojson(test.tags) }}, {# tags #}
-parse_json('{{ tojson(test) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"') }}') {# all_fields #}
+{{ adapter.dispatch('parse_json', 'dbt_artifacts')(tojson(test) | replace("\\", "\\\\") | replace("'","\\'") | replace('"', '\\"')) }} {# all_fields #}
)
{%- if not loop.last %},{%- endif %}
{%- endfor %}
34 changes: 29 additions & 5 deletions tox.ini
@@ -36,7 +36,7 @@ rules = LT01,LT02,LT03,CP01,AL01,AL02,CP02,ST08,LT06,LT07,AM01,LT08,AL05,RF02,RF

deps =
sqlfluff-templater-dbt~=2.0.2
-dbt-snowflake~=1.3.0
+dbt-snowflake~=1.6.0

[sqlfluff:indentation]
indent_unit = space
@@ -114,13 +114,13 @@ commands = sqlfluff fix models --ignore parsing

# Generate docs
[testenv:generate_docs]
-deps = dbt-snowflake~=1.5.0
+deps = dbt-snowflake~=1.6.0
commands = dbt docs generate --profiles-dir integration_test_project

# Snowflake integration tests
[testenv:integration_snowflake]
changedir = integration_test_project
-deps = dbt-snowflake~=1.4.0
+deps = dbt-snowflake~=1.6.0
commands =
dbt clean
dbt deps
@@ -151,10 +151,18 @@ commands =
dbt deps
dbt build --target snowflake

+[testenv:integration_snowflake_1_6_0]
+changedir = integration_test_project
+deps = dbt-snowflake~=1.6.0
+commands =
+dbt clean
+dbt deps
+dbt build --target snowflake

# Databricks integration tests
[testenv:integration_databricks]
changedir = integration_test_project
-deps = dbt-databricks~=1.4.0
+deps = dbt-databricks~=1.6.0
commands =
dbt clean
dbt deps
@@ -184,10 +192,18 @@ commands =
dbt deps
dbt build --target databricks

+[testenv:integration_databricks_1_6_0]
+changedir = integration_test_project
+deps = dbt-databricks~=1.6.0
+commands =
+dbt clean
+dbt deps
+dbt build --target databricks

# Bigquery integration tests
[testenv:integration_bigquery]
changedir = integration_test_project
-deps = dbt-bigquery~=1.4.0
+deps = dbt-bigquery~=1.6.0
commands =
dbt clean
dbt deps
@@ -217,6 +233,14 @@ commands =
dbt deps
dbt build --target bigquery --vars '"my_var": "my value"'

+[testenv:integration_bigquery_1_6_0]
+changedir = integration_test_project
+deps = dbt-bigquery~=1.6.0
+commands =
+dbt clean
+dbt deps
+dbt build --target bigquery --vars '"my_var": "my value"'

# Spark integration test (disabled)
[testenv:integration_spark]
changedir = integration_test_project
