Replies: 2 comments 2 replies
-
That's a tough one. It probably requires a conversation with your sysadmin. There is precedent, though: S3 usage in Nextflow is such a standard pattern these days for non-…
-
If I recall correctly, @cboettig created https://github.com/boettiger-lab/earthdatalogin/ to tackle a very similar problem: only temporary credentials are available for NASA data stored on AWS. The package has functions to automate fetching and setting these credentials (https://github.com/boettiger-lab/earthdatalogin/blob/main/R/edl_s3_token.R). I think he originally set it up as a background process so the environment variables would be refreshed regularly, allowing out-of-R tools like GDAL to work properly. This is mostly a hack, and one would have to think about how to propagate updated environment variables to the various…
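The refresh-before-expiry part of that hack can be sketched in a few lines of R. Note that `fetch_temp_credentials()` is a hypothetical stand-in for whatever API call returns short-lived keys, and the field names, state handling, and five-minute margin are all assumptions, not anything earthdatalogin actually does.

```r
# Sketch only: refresh AWS env vars in the current R session shortly
# before the temporary credentials expire. fetch_temp_credentials() is
# a hypothetical placeholder for the real credential API.
.cred_state <- new.env(parent = emptyenv())

creds_stale <- function(expiry, now = Sys.time(), margin_sec = 300) {
  # Treat credentials as stale when missing or within margin_sec of expiry.
  is.null(expiry) ||
    as.numeric(difftime(expiry, now, units = "secs")) < margin_sec
}

ensure_credentials <- function(fetch_temp_credentials) {
  if (creds_stale(.cred_state$expiry)) {
    creds <- fetch_temp_credentials()
    Sys.setenv(
      AWS_ACCESS_KEY_ID     = creds$access_key,
      AWS_SECRET_ACCESS_KEY = creds$secret_key,
      AWS_SESSION_TOKEN     = creds$session_token
    )
    .cred_state$expiry <- creds$expiry
  }
  invisible(NULL)
}
```

Calling `ensure_credentials()` from inside the main session keeps that session fresh, but it does not by itself solve the propagation problem above: external tools or worker processes launched earlier keep the environment they started with.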
-
Help
Description
I would like to use the AWS cloud storage functionality in {targets} but I only have temporary AWS credentials which I can generate on-demand with a simple API request. The problem is that these temporary credentials are valid for just one hour, so if my targets pipeline takes more than an hour, targets would not be able to access the S3 bucket once the initial credentials expire.
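To make the setup concrete, the on-demand fetch I have in mind looks roughly like the sketch below; the endpoint URL, authentication, and JSON field names are invented placeholders for my internal API, not a real service.

```r
# Sketch: fetch temporary credentials from a hypothetical endpoint and
# export them so S3-aware tools ({targets}/paws, GDAL, etc.) can see them.
library(httr)
library(jsonlite)

set_aws_env <- function(creds) {
  # creds: a list with STS-style fields (field names are assumptions)
  Sys.setenv(
    AWS_ACCESS_KEY_ID     = creds$AccessKeyId,
    AWS_SECRET_ACCESS_KEY = creds$SecretAccessKey,
    AWS_SESSION_TOKEN     = creds$SessionToken
  )
  invisible(creds)
}

fetch_temp_credentials <- function(url = "https://sts.example.org/credentials") {
  resp <- httr::GET(url)  # placeholder request; the real auth will differ
  httr::stop_for_status(resp)
  creds <- jsonlite::fromJSON(
    httr::content(resp, as = "text", encoding = "UTF-8")
  )
  set_aws_env(creds)
}
```

Running `fetch_temp_credentials()` once before `tar_make()` works, but the open question is how to re-run it inside a pipeline that outlives the one-hour window.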
Any thoughts on possible approaches/solutions?