v0.4.1 #149

Merged · 23 commits · Oct 4, 2024

Commits (23)
bb14cc8
Revert "Merge pull request #137 from ssi-dk/feature/simplify_update_s…
RasmusSkytte Jul 23, 2024
ceb86f9
Revert "Merge pull request #141 from ssi-dk/feature/complexity_benchm…
RasmusSkytte Jul 23, 2024
ce6b09c
Revert "Merge pull request #131 from ssi-dk/benchmark"
RasmusSkytte Jul 23, 2024
bff2007
Revert "Merge pull request #138 from ssi-dk/feature/benchmark_vignette"
RasmusSkytte Jul 23, 2024
7365a1c
chore: Update pak.lock
RasmusSkytte Jul 23, 2024
072e29b
docs(db_joins): Provide package anchor to `dplyr::show_query`
RasmusSkytte Jul 23, 2024
2edb4c5
docs(NEWS): Add entry on `check_from` argument and improved testing
RasmusSkytte Jul 23, 2024
d15e4f2
feat: Increment version number to 0.4.1
RasmusSkytte Jul 23, 2024
a3eaffc
chore: Update pak.lock
RasmusSkytte Jul 23, 2024
37c4406
test(Logger): Check that catalog is written on backends that support it
RasmusSkytte Jul 26, 2024
10149ac
chore: Update pak.lock
RasmusSkytte Jul 26, 2024
ce161e4
fix(Logger): Write catalog to logs if it is supported by the backend
RasmusSkytte Jul 26, 2024
ee41a44
docs(NEWS): Add entry on `Logger` catalog fix
RasmusSkytte Jul 26, 2024
ce1c584
fix(get_table): Skip the `table_exists` call
RasmusSkytte Sep 17, 2024
cfd39f2
chore: Update pak.lock
RasmusSkytte Sep 17, 2024
79eaa93
fix(all-workflows): Disable PostgreSQL log error checking
RasmusSkytte Sep 18, 2024
ab1f266
chore: Update pak.lock
RasmusSkytte Sep 26, 2024
b407bf3
chore: Update pak.lock
RasmusSkytte Sep 26, 2024
587a839
Merge pull request #152 from ssi-dk/fix/get_table-robust
RasmusSkytte Sep 26, 2024
24323e2
Merge branch 'main' into rc-v0.4.1
RasmusSkytte Sep 26, 2024
c9f37fc
docs(db_joins): Copy return value instead of inheriting
RasmusSkytte Sep 27, 2024
dfcb402
chore: Update pak.lock
RasmusSkytte Sep 27, 2024
cdfd203
docs: update documentation
RasmusSkytte Sep 27, 2024
1 change: 0 additions & 1 deletion .Rbuildignore
@@ -14,4 +14,3 @@ man-roxygen/*
 ^Meta$
 ^README.Rmd$
 ^revdep$
-^data-raw$
1 change: 1 addition & 0 deletions .github/workflows/all-workflows.yaml
@@ -34,4 +34,5 @@ jobs:

   # code-coverage creates data bases for the tests. Here you can specify the schemas you need for the workflow
   schemas: test,test.one
+  check_postgres_logs: false
   secrets: inherit
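The new `check_postgres_logs: false` line suggests that `all-workflows.yaml` is a reusable workflow that gained a boolean input for toggling PostgreSQL log error checking. A hedged sketch of how a caller might pass it (the `uses:` path and input names are assumptions based on this diff, not confirmed by the source):

```yaml
# Hypothetical caller sketch: assumes all-workflows.yaml is a reusable
# workflow exposing `schemas` and a boolean `check_postgres_logs` input.
jobs:
  checks:
    uses: ssi-dk/SCDB/.github/workflows/all-workflows.yaml@main
    with:
      schemas: test,test.one
      check_postgres_logs: false   # skip scanning PostgreSQL logs for errors
    secrets: inherit
```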
225 changes: 0 additions & 225 deletions .github/workflows/benchmark.yaml

This file was deleted.

8 changes: 1 addition & 7 deletions DESCRIPTION
@@ -1,7 +1,7 @@
 Package: SCDB
 Type: Package
 Title: Easily Access and Maintain Time-Based Versioned Data (Slowly-Changing-Dimension)
-Version: 0.4.0.9000
+Version: 0.4.1
 Authors@R:
     c(person("Rasmus Skytte", "Randl\U00F8v", , "[email protected]",
       role = c("aut", "cre", "rev"),
@@ -21,8 +21,6 @@ License: GPL-3
 Encoding: UTF-8
 RoxygenNote: 7.3.2
 Roxygen: list(markdown = TRUE, r6 = TRUE)
-Depends:
-    R (>= 3.5.0)
 Imports:
     checkmate,
     DBI,
@@ -44,14 +42,10 @@ Suggests:
     callr,
     conflicted,
     duckdb,
-    ggplot2,
-    here,
     jsonlite,
     knitr,
     lintr,
-    microbenchmark,
     odbc,
-    pak,
     rmarkdown,
     roxygen2,
     pkgdown,
4 changes: 0 additions & 4 deletions NAMESPACE
@@ -1,9 +1,6 @@
 # Generated by roxygen2: do not edit by hand

 S3method(as.character,Id)
-S3method(create_index,DBIConnection)
-S3method(create_index,PqConnection)
-S3method(create_index,SQLiteConnection)
 S3method(db_timestamp,"NULL")
 S3method(db_timestamp,SQLiteConnection)
 S3method(db_timestamp,default)
@@ -57,7 +54,6 @@ S3method(tidyr::unite,tbl_dbi)
 export(Logger)
 export(LoggerNull)
 export(close_connection)
-export(create_index)
 export(create_logs_if_missing)
 export(create_table)
 export(db_timestamp)
14 changes: 6 additions & 8 deletions NEWS.md
@@ -1,20 +1,18 @@
-# SCDB (development version)
-
-## New features
-
-* Added function `create_index` to allow easy creating of an index on a table (#137).
+# SCDB 0.4.1

 ## Improvements and Fixes

-* `update_snapshot()` has been optimized and now runs faster on all the supported backends (#137).
+* `Logger` now correctly writes to the "catalog" field on backends that support it (#149).

 * `get_schema()` now correctly returns the temporary schema on PostgreSQL backends (#139).

 * `get_tables()` now returns catalog on DuckDB backends (#145).

-## Documentation
+* Deprecated `check_from` argument no longer used in `dbplyr` calls (#136).

 ## Testing

-* A vignette including benchmarks of `update_snapshot()` across various backends is added (#138).
 * Improved tests for `get_tables()` (#145).

 # SCDB 0.4.0
16 changes: 12 additions & 4 deletions R/Logger.R
@@ -330,10 +330,18 @@ Logger <- R6::R6Class(
       assert_timestamp_like(self$start_time, add = coll)
       checkmate::reportAssertions(coll)

-      patch <- purrr::pluck(private$db_table, "name") |>
-        tibble::enframe() |>
-        tidyr::pivot_wider() |>
-        dplyr::mutate(log_file = self$log_filename) |>
+      patch <- data.frame(
+        log_file = self$log_filename,
+        schema = purrr::pluck(private$db_table, "name", "schema"),
+        table = purrr::pluck(private$db_table, "name", "table")
+      )
+
+      # Add catalog if it exists in the Id
+      if ("catalog" %in% purrr::pluck(private$db_table, "name", names)) {
+        patch <- dplyr::mutate(patch, catalog = purrr::pluck(private$db_table, "name", "catalog"), .before = "schema")
+      }
+
+      patch <- patch |>
         dplyr::copy_to(
           dest = private$log_conn,
           df = _,
8 changes: 0 additions & 8 deletions R/connection.R
@@ -155,14 +155,6 @@ get_connection.OdbcDriver <- function(
   checkmate::assert_choice(timezone_out, OlsonNames(), null.ok = TRUE, add = coll)
   checkmate::reportAssertions(coll)

-  # Recommend batch processing for ODBC connections
-  if (is.null(getOption("odbc.batch_rows"))) {
-    message(
-      "Transfer of large data sets may be slow. ",
-      "Consider using options(\"odbc.batch_rows\" = 1000) to speed up transfer."
-    )
-  }
-
   # Check if connection can be established given these settings
   status <- do.call(DBI::dbCanConnect, args = args)
   if (!status) stop(attr(status, "reason"))
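The deleted block only printed a recommendation; users who want batched ODBC transfers can still opt in themselves. A small sketch of doing that manually (the value 1000 comes from the removed message; whether it helps depends on the driver):

```r
# Opt in to batched ODBC row transfer manually, mirroring the hint that
# the deleted message used to print. Only set it if the user has not
# already configured the option.
if (is.null(getOption("odbc.batch_rows"))) {
  options(odbc.batch_rows = 1000)  # transfer rows in batches of 1000
}
getOption("odbc.batch_rows")
```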