Commit

Merge pull request #149 from ssi-dk/rc-v0.4.1
v0.4.1
RasmusSkytte authored Oct 4, 2024
2 parents e1c9153 + cdfd203 commit ce303b2
Showing 24 changed files with 263 additions and 1,302 deletions.
1 change: 0 additions & 1 deletion .Rbuildignore
@@ -14,4 +14,3 @@ man-roxygen/*
^Meta$
^README.Rmd$
^revdep$
^data-raw$
1 change: 1 addition & 0 deletions .github/workflows/all-workflows.yaml
@@ -34,4 +34,5 @@ jobs:

# code-coverage creates data bases for the tests. Here you can specify the schemas you need for the workflow
schemas: test,test.one
check_postgres_logs: false
secrets: inherit
225 changes: 0 additions & 225 deletions .github/workflows/benchmark.yaml

This file was deleted.

8 changes: 1 addition & 7 deletions DESCRIPTION
@@ -1,7 +1,7 @@
Package: SCDB
Type: Package
Title: Easily Access and Maintain Time-Based Versioned Data (Slowly-Changing-Dimension)
Version: 0.4.0.9000
Version: 0.4.1
Authors@R:
c(person("Rasmus Skytte", "Randl\U00F8v", , "[email protected]",
role = c("aut", "cre", "rev"),
@@ -21,8 +21,6 @@ License: GPL-3
Encoding: UTF-8
RoxygenNote: 7.3.2
Roxygen: list(markdown = TRUE, r6 = TRUE)
Depends:
R (>= 3.5.0)
Imports:
checkmate,
DBI,
@@ -44,14 +42,10 @@ Suggests:
callr,
conflicted,
duckdb,
ggplot2,
here,
jsonlite,
knitr,
lintr,
microbenchmark,
odbc,
pak,
rmarkdown,
roxygen2,
pkgdown,
4 changes: 0 additions & 4 deletions NAMESPACE
@@ -1,9 +1,6 @@
# Generated by roxygen2: do not edit by hand

S3method(as.character,Id)
S3method(create_index,DBIConnection)
S3method(create_index,PqConnection)
S3method(create_index,SQLiteConnection)
S3method(db_timestamp,"NULL")
S3method(db_timestamp,SQLiteConnection)
S3method(db_timestamp,default)
@@ -57,7 +54,6 @@ S3method(tidyr::unite,tbl_dbi)
export(Logger)
export(LoggerNull)
export(close_connection)
export(create_index)
export(create_logs_if_missing)
export(create_table)
export(db_timestamp)
14 changes: 6 additions & 8 deletions NEWS.md
@@ -1,20 +1,18 @@
# SCDB (development version)

## New features

* Added function `create_index` to allow easy creating of an index on a table (#137).
# SCDB 0.4.1

## Improvements and Fixes

* `update_snapshot()` has been optimized and now runs faster on all the supported backends (#137).
* `Logger` now correctly writes to the "catalog" field on backends that support it (#149).

* `get_schema()` now correctly returns the temporary schema on PostgreSQL backends (#139).

* `get_tables()` now returns catalog on DuckDB backends (#145).

## Documentation
* Deprecated `check_from` argument no longer used in `dbplyr` calls (#136).

## Testing

* A vignette including benchmarks of `update_snapshot()` across various backends is added (#138).
* Improved tests for `get_tables()` (#145).

# SCDB 0.4.0

16 changes: 12 additions & 4 deletions R/Logger.R
@@ -330,10 +330,18 @@ Logger <- R6::R6Class(
assert_timestamp_like(self$start_time, add = coll)
checkmate::reportAssertions(coll)

patch <- purrr::pluck(private$db_table, "name") |>
tibble::enframe() |>
tidyr::pivot_wider() |>
dplyr::mutate(log_file = self$log_filename) |>
patch <- data.frame(
log_file = self$log_filename,
schema = purrr::pluck(private$db_table, "name", "schema"),
table = purrr::pluck(private$db_table, "name", "table")
)

# Add catalog if it exists in the Id
if ("catalog" %in% purrr::pluck(private$db_table, "name", names)) {
patch <- dplyr::mutate(patch, catalog = purrr::pluck(private$db_table, "name", "catalog"), .before = "schema")
}

patch <- patch |>
dplyr::copy_to(
dest = private$log_conn,
df = _,
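The catalog check in the hunk above relies on `purrr::pluck()` accepting a bare function as its final accessor. A minimal standalone sketch of that idiom (the `db_table` list here is a hypothetical stand-in for the private field holding a `DBI::Id`, not the package's actual data):

```r
library(purrr)

# Hypothetical contents of the Id's "name" component
db_table <- list(name = list(catalog = "testdb", schema = "test", table = "logs"))

# pluck() applies a function supplied as the last accessor, so this
# evaluates names(db_table$name) and checks for a "catalog" entry
has_catalog <- "catalog" %in% purrr::pluck(db_table, "name", names)

# Only reference the catalog when the Id actually carries one
if (has_catalog) {
  catalog <- purrr::pluck(db_table, "name", "catalog")
}
```

This is why the patch can add the `catalog` column conditionally: an `Id` without a catalog component simply yields no `"catalog"` name, and the `mutate()` step is skipped.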
8 changes: 0 additions & 8 deletions R/connection.R
@@ -155,14 +155,6 @@ get_connection.OdbcDriver <- function(
checkmate::assert_choice(timezone_out, OlsonNames(), null.ok = TRUE, add = coll)
checkmate::reportAssertions(coll)

# Recommend batch processing for ODBC connections
if (is.null(getOption("odbc.batch_rows"))) {
message(
"Transfer of large data sets may be slow. ",
"Consider using options(\"odbc.batch_rows\" = 1000) to speed up transfer."
)
}

# Check if connection can be established given these settings
status <- do.call(DBI::dbCanConnect, args = args)
if (!status) stop(attr(status, "reason"))
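The startup message deleted above can still be applied manually by users who want batched ODBC transfers. A sketch of the removed recommendation (1000 is the batch size the old message suggested, not a tuned value):

```r
# Opt in to batched row transfer for ODBC connections, as the removed
# message recommended, but only if the option is not already set
if (is.null(getOption("odbc.batch_rows"))) {
  options("odbc.batch_rows" = 1000)
}
```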