feat: buckets for so-called legacy Airflow pipelines #112
This creates one bucket per team for Airflow pipelines, the so-called legacy pipelines that ingest into S3. It is part of the project to move our Airflow instance out of GOV.UK PaaS, which is shutting down, essentially a lift and shift into Data Workspace proper.
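For illustration only, a minimal sketch of the per-team bucket approach; the `teams` variable, bucket naming, and tags are assumptions rather than the actual module in this PR:

```hcl
# Hypothetical sketch: one S3 bucket per team, driven by a list of team names.
variable "teams" {
  type    = list(string)
  default = ["team-a", "team-b"]
}

resource "aws_s3_bucket" "airflow_legacy" {
  for_each = toset(var.teams)

  # Bucket name is an assumption; the real naming convention may differ.
  bucket = "airflow-legacy-${each.key}"

  tags = {
    team = each.key
  }
}
```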
At the moment, getting credentials for the bucket to the team's code is slightly faffy: they are exposed via Terraform's "output" and then copy/pasted into Secrets Manager. It's the easiest way for now. Using roles would be "better", but it would require a bit of a code change in data-flow, so that's left as further work so we don't get blocked on it and can still migrate pipelines.
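A rough sketch of that interim credentials flow, assuming an IAM user and access key per team surfaced through `output` (resource and output names here are hypothetical, not necessarily what the PR uses):

```hcl
# Hypothetical sketch: an IAM user and access key per team, exposed as outputs
# so the values can be copy/pasted into Secrets Manager by hand.
resource "aws_iam_user" "airflow_legacy" {
  for_each = toset(var.teams)
  name     = "airflow-legacy-${each.key}"
}

resource "aws_iam_access_key" "airflow_legacy" {
  for_each = aws_iam_user.airflow_legacy
  user     = each.value.name
}

output "airflow_legacy_access_key_ids" {
  value = { for team, key in aws_iam_access_key.airflow_legacy : team => key.id }
}

output "airflow_legacy_secret_access_keys" {
  value     = { for team, key in aws_iam_access_key.airflow_legacy : team => key.secret }
  sensitive = true
}
```

After an apply, something like `terraform output airflow_legacy_secret_access_keys` would print the secrets to paste into Secrets Manager, until the role-based approach lands in data-flow.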