Replies: 2 comments 2 replies
You might try to convince the reviewer to check the existing results in the bucket rather than rerun everything. If the reviewer does need to fully reproduce the analysis, then they will have to rerun the pipeline from scratch anyway, in which case local storage is a good default.
Let me see if I understand correctly. If a reviewer opens the report notebook, they would be able to run it if it uses commands such as tar_read() or tar_load(), as long as the remote storage has public read access?
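If that is the intended pattern, a report chunk might look like the sketch below. The target names (`fit`, `raw_data`) are hypothetical, and whether anonymous reads actually succeed depends on how the bucket and the paws client are configured, so treat this as an assumption rather than confirmed behavior:

```r
# Hypothetical report chunk: read finished targets from the shared store
# without rerunning the pipeline. Assumes the pipeline's _targets.R points
# at the bucket and that anonymous (public) reads are actually permitted.
library(targets)

fit <- tar_read(fit)  # return a stored target's value
tar_load(raw_data)    # or load it into the environment by name
summary(fit)
```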
Help
Description
I am wondering what the best workflow is for projects that I plan to share publicly and want other people to be able to reproduce. I am planning to use an S3 bucket for myself and collaborators and also share the code as a GitHub repository. In that case, if a colleague or a reviewer wants to reproduce the analyses, it seems the code will error because they don't have write access to the bucket. I'm considering setting an environment variable like "WITH_S3", and then a conditional statement in tar_option_set() which defaults to local storage if people don't have this variable set up. This seems like a straightforward solution, but the computations take a long time, so I was planning to make the bucket publicly readable. Is there a way to set it up so that anyone who attempts to replicate the analysis would get the targets from the cloud storage, even if they aren't allowed to write to it?
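For concreteness, here is a minimal sketch of the conditional setup described above, as it might appear in _targets.R. The "WITH_S3" variable is the one proposed in the question; the bucket name, prefix, and target definitions are hypothetical placeholders:

```r
# _targets.R
library(targets)

# Collaborators with bucket credentials set WITH_S3 (e.g. WITH_S3=1);
# everyone else silently falls back to local storage.
use_s3 <- nzchar(Sys.getenv("WITH_S3"))

tar_option_set(
  repository = if (use_s3) "aws" else "local",
  resources = tar_resources(
    aws = tar_resources_aws(
      bucket = "my-project-bucket",  # hypothetical bucket
      prefix = "_targets"
    )
  )
)

# Hypothetical pipeline targets.
list(
  tar_target(raw_data, read.csv("data.csv")),
  tar_target(fit, lm(y ~ x, data = raw_data))
)
```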