Render directly to AWS S3? #439
-
Hi there, thanks for all your great work! I'm a novice at this, but I'm wondering if it is possible to render an R Markdown report directly to an AWS S3 bucket.
To do this, I'm starting by writing a complete target factory that operates on a single data source, which I will then map over all the data sources in a given case study, and then map that process over all case studies. My basic data source factory looks something like this:
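(A heavily simplified sketch; the file names and the read_data() helper are placeholders rather than my real code.)

    # _targets.R (simplified): track one data source, read it, render its report.
    library(targets)
    library(tarchetypes)

    read_data <- function(path) {
      read.csv(path)  # placeholder for the real ingest logic
    }

    list(
      # Track the raw data file so upstream changes trigger a rebuild.
      tar_target(data_source_file, "data/source_01.csv", format = "file"),
      # Load the single data source.
      tar_target(data_source_data, read_data(data_source_file)),
      # Render the per-source report; report.Rmd reads data_source_data via tar_read().
      tar_render(data_source_report, "report.Rmd")
    )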
There are many of these data source reports, and I would like to avoid rendering / tracking the reports locally, since I'm typically doing remote development on an EC2 instance: when I terminate the instance, the local targets break and I need to rerun the pipeline. One option is saving the intermediate reports in an S3 bucket. Hence my question: is it possible to render (or at least store) these reports directly on AWS S3?
It also occurs to me that the workflow I'm imagining here isn't optimal for my use case, so I would welcome advice on that front as well. Thanks for your time.
Replies: 1 comment 4 replies
-
Interesting use case. I can see how the ability to send rendered reports to an S3 bucket would be appealing. The aws_file format can only accept a single file, and it sends that file to a bucket at _targets/objects/target_name (no file extension). By contrast, tar_render() and tar_render_raw() return multiple files, including the source Rmd, which I gather you don't want whisked away to the cloud. So in your pipeline, I recommend including an aws_file target that copies the HTML file from the previous step and uploads the copy. Sketch:

    tar_target_raw(
      "data_upload",
      quote(copy_html_to_tmp(data_source_report)),
      format = "aws_file"
    )

    copy_html_to_tmp <- function(path) {
      # path is the vector of file paths returned by tar_render();
      # keep only the rendered HTML and copy it to a temp file for upload.
      tmp <- tempfile()
      html <- grep("html$", path, value = TRUE)
      file.copy(html, tmp)
      tmp
    }

I would generally recommend against
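To see how the pieces might fit together, here is one possible _targets.R combining the render target with the upload target. This is an assembled illustration rather than code from the thread: the bucket name is a placeholder, and it assumes a version of targets where the bucket for aws_* formats is supplied through tar_resources_aws() (older releases configure the bucket differently).

    # _targets.R (illustrative; bucket and file names are placeholders)
    library(targets)
    library(tarchetypes)

    tar_option_set(
      resources = tar_resources(
        aws = tar_resources_aws(bucket = "my-report-bucket")  # placeholder bucket
      )
    )

    # copy_html_to_tmp() is the helper from the sketch above.

    list(
      # Render the report locally; the target's value is the vector of output paths.
      tar_render(data_source_report, "report.Rmd"),
      # Upload a copy of the rendered HTML to the S3 bucket.
      tar_target(
        data_upload,
        copy_html_to_tmp(data_source_report),
        format = "aws_file"
      )
    )

Note that tar_render() still writes the report files to the local working directory first; the aws_file target only uploads the copied HTML afterwards.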