diff --git a/CHANGELOG.md b/CHANGELOG.md index 30384d0a..fde6e7e4 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -5,8 +5,12 @@ Headline template: X.Y.Z (YYYY-MM-DD) --> -## 3.5.0 (unreleased) +## 3.5.0 (2023-01-12) +- Add new helper class `safir.gcs.SignedURLService` to generate signed URLs to Google Cloud Storage objects using workload identity. + To use this class, depend on `safir[gcs]`. +- Add the `safir.testing.gcs` module, which can be used to mock the Google Cloud Storage API for testing. + To use this module, depend on `safir[gcs]`. - Add new helper class `safir.pydantic.CamelCaseModel`, which is identical to `pydantic.BaseModel` except with configuration added to accept camel-case keys using the `safir.pydantic.to_camel_case` alias generator and overrides of `dict` and `json` to export in camel-case by default. ## 3.4.0 (2022-11-29) diff --git a/Makefile b/Makefile index 0ef0f753..771f2d55 100644 --- a/Makefile +++ b/Makefile @@ -1,6 +1,6 @@ .PHONY: init init: pip install --upgrade pip tox tox-docker pre-commit - pip install --upgrade -e ".[arq,db,dev,kubernetes]" + pip install --upgrade -e ".[arq,db,dev,gcs,kubernetes]" pre-commit install rm -rf .tox diff --git a/docs/api.rst b/docs/api.rst index e8b44377..f7db8229 100644 --- a/docs/api.rst +++ b/docs/api.rst @@ -28,6 +28,9 @@ API reference .. automodapi:: safir.dependencies.logger :include-all-objects: +.. automodapi:: safir.gcs + :include-all-objects: + .. automodapi:: safir.kubernetes :include-all-objects: @@ -49,5 +52,8 @@ API reference .. automodapi:: safir.pydantic :include-all-objects: +.. automodapi:: safir.testing.gcs + :include-all-objects: + .. 
automodapi:: safir.testing.kubernetes :include-all-objects: diff --git a/docs/documenteer.toml b/docs/documenteer.toml index 2957f211..a6f68f50 100644 --- a/docs/documenteer.toml +++ b/docs/documenteer.toml @@ -16,6 +16,8 @@ nitpick_ignore_regex = [ ['py:.*', 'starlette.*'], ] nitpick_ignore = [ + ['py:class', 'unittest.mock.Base'], + ['py:class', 'unittest.mock.CallableMixin'], ["py:obj", "JobMetadata.id"], ] diff --git a/docs/user-guide/arq.rst b/docs/user-guide/arq.rst index c4a20409..2fbde2be 100644 --- a/docs/user-guide/arq.rst +++ b/docs/user-guide/arq.rst @@ -50,7 +50,6 @@ If your app uses a configuration system like ``pydantic.BaseSettings``, this exa class Config(BaseSettings): - arq_queue_url: RedisDsn = Field( "redis://localhost:6379/1", env="APP_ARQ_QUEUE_URL" ) diff --git a/docs/user-guide/gcs.rst b/docs/user-guide/gcs.rst new file mode 100644 index 00000000..35be9e96 --- /dev/null +++ b/docs/user-guide/gcs.rst @@ -0,0 +1,113 @@ +################################## +Using the Google Cloud Storage API +################################## + +Safir-based applications are encouraged to use the `google-cloud-storage `__ Python module. +It provides both a sync and async API and works well with `workload identity `__. + +Google Cloud Storage support in Safir is optional. +To use it, depend on ``safir[gcs]``. + +Generating signed URLs +====================== + +The preferred way to generate signed URLs for Google Cloud Storage objects is to use workload identity for the running pod, assign it a Kubernetes service account bound to a Google Cloud service account, and set appropriate permissions on that Google Cloud service account. + +The credentials provided by workload identity cannot be used to sign URLs directly. +Instead, one first has to get impersonation credentials for the same service account, and then use those to sign the URL. +`safir.gcs.SignedURLService` automates this process. 
+ +To use this class, the workload identity of the running pod must have ``roles/iam.serviceAccountTokenCreator`` for a Google service account, and that service account must have appropriate GCS permissions for the object for which one wants to create a signed URL. +Then, do the following: + +.. code-block:: python + + from datetime import timedelta + + from safir.gcs import SignedURLService + + + url_service = SignedURLService("service-account") + url = url_service.signed_url("s3://bucket/path/to/file", "application/fits") + +The argument to the constructor is the name of the Google Cloud service account that will be used to sign the URLs. +This should be the one for which the workload identity has impersonation permissions. +(Generally, this should be the same service account to which the workload identity is bound.) + +Optionally, you can specify the lifetime of the signed URLs as a second argument, which should be a `datetime.timedelta`. +If not given, the default is one hour. + +The path to the Google Cloud Storage object for which to create a signed URL must be an S3 URL. +The second argument to `~safir.gcs.SignedURLService.signed_url` is the MIME type of the underlying object, which will be encoded in the signed URL. + +Testing with mock Google Cloud Storage +====================================== + +The `safir.testing.gcs` module provides a limited, mock Google Cloud Storage (GCS) API suitable for testing. +By default, this mock provides just enough functionality to allow retrieving a bucket, retrieving a blob from the bucket, and creating a signed URL for the blob. +If a path to a tree of files is given, it can also mock some other blob attributes and methods based on the underlying files. + +Testing signed URLs +------------------- + +Applications that want to run tests with the mock GCS API should define a fixture (in ``conftest.py``) as follows: + +.. 
code-block:: python + + from datetime import timedelta + from typing import Iterator + + import pytest + + from safir.testing.gcs import MockStorageClient, patch_google_storage + + + @pytest.fixture + def mock_gcs() -> Iterator[MockStorageClient]: + yield from patch_google_storage( + expected_expiration=timedelta(hours=1), bucket_name="some-bucket" + ) + +The ``expected_expiration`` argument is optional and tells the mock object what expiration the application is expected to request for its signed URLs. +If this option is given and the application, when tested, requests a signed URL with a different expiration, the mock will raise an assertion failure. + +The ``bucket_name`` argument is optional. +If given, an attempt by the tested application to request a bucket of any other name will raise an assertion failure. + +When this fixture is in use, the tested application can use Google Cloud Storage as normal, as long as it only makes the method calls supported by the mock object. +Some parameters to the method requesting a signed URL will be checked for expected values. +The returned signed URL will always be :samp:`https://example.com/{name}`, where the last component will be the requested blob name. +This can then be checked via assertions in tests. + +To ensure that the mocking is done correctly, be sure not to import ``Client``, ``Credentials``, or similar symbols from ``google.cloud.storage`` or ``google.auth`` directly into a module. +Instead, use: + +.. code-block:: python + + from google.cloud import storage + +and then use, for example, ``storage.Client``. + +Testing with a tree of files +---------------------------- + +To mock additional blob attributes and methods, point the test fixture at a tree of files with the ``path`` parameter. + +.. 
code-block:: python + :emphasize-lines: 1, 7 + + from pathlib import Path + + + @pytest.fixture + def mock_gcs() -> Iterator[MockStorageClient]: + yield from patch_google_storage( + path=Path(__file__).parent / "data" / "files", + expected_expiration=timedelta(hours=1), + bucket_name="some-bucket", + ) + +The resulting blobs will then correspond to the files on disk and will support the additional attributes ``size``, ``updated``, and ``etag``, and the additional methods ``download_as_bytes``, ``exists``, ``open``, and ``reload`` (which does nothing). +The Etag value of the blob will be the string version of its inode number. + +Mock signed URLs will continue to work exactly the same as when a path is not provided. diff --git a/docs/user-guide/index.rst b/docs/user-guide/index.rst index 402fe5bc..578391bf 100644 --- a/docs/user-guide/index.rst +++ b/docs/user-guide/index.rst @@ -23,3 +23,4 @@ User guide ivoa kubernetes pydantic + gcs diff --git a/pyproject.toml b/pyproject.toml index 7b94ae88..05116cbb 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -34,6 +34,9 @@ dependencies = [ dynamic = ["version"] [project.optional-dependencies] +arq = [ + "arq>=0.23" +] db = [ "asyncpg", "sqlalchemy[asyncio]", @@ -52,12 +55,13 @@ dev = [ # documentation "documenteer[guide]>=0.7.0b2", ] +gcs = [ + "google-auth", + "google-cloud-storage" +] kubernetes = [ "kubernetes_asyncio" ] -arq = [ - "arq>=0.23" -] [[project.authors]] name = "Association of Universities for Research in Astronomy, Inc. 
(AURA)" diff --git a/src/safir/gcs.py b/src/safir/gcs.py new file mode 100644 index 00000000..9ee37bbd --- /dev/null +++ b/src/safir/gcs.py @@ -0,0 +1,100 @@ +"""Utilities for interacting with Google Cloud Storage.""" + +from __future__ import annotations + +from datetime import timedelta +from typing import Optional +from urllib.parse import urlparse + +import google.auth +from google.auth import impersonated_credentials +from google.cloud import storage + +__all__ = ["SignedURLService"] + + +class SignedURLService: + """Generate signed URLs for Google Cloud Storage blobs. + + Uses default credentials plus credential impersonation to generate signed + URLs for Google Cloud Storage blobs. This is the correct approach when + running as a Kubernetes pod using workload identity. + + Parameters + ---------- + service_account + The service account to use to sign the URLs. The workload identity + must have access to generate service account tokens for that service + account. + lifetime + Lifetime of the generated signed URLs. + + Notes + ----- + The workload identity (or other default credentials) under which the + caller is running must have ``roles/iam.serviceAccountTokenCreator`` on + the service account given in the ``service_account`` parameter. This is + how a workload identity can retrieve a key that can be used to create a + signed URL. + + See `gcs_signedurl `__ for + additional details on how this works. + """ + + def __init__( + self, service_account: str, lifetime: timedelta = timedelta(hours=1) + ) -> None: + self._lifetime = lifetime + self._service_account = service_account + self._gcs = storage.Client() + self._credentials, _ = google.auth.default() + + def signed_url(self, uri: str, mime_type: Optional[str]) -> str: + """Generate signed URL for a given storage object. + + Parameters + ---------- + uri + URI for the storage object. This must start with ``s3://`` and + use the S3 URI syntax to specify bucket and blob of a Google + Cloud Storage object. 
+ mime_type + MIME type of the object, for encoding in the signed URL. + + Returns + ------- + str + New signed URL, which will be valid for the lifetime configured on + this object. + + Raises + ------ + ValueError + The ``uri`` parameter is not an S3 URI. + + Notes + ----- + This is inefficient, since it gets new signing credentials each time + it generates a signed URL. Doing better will require figuring out the + lifetime and refreshing the credentials when the lifetime has expired. + """ + parsed_uri = urlparse(uri) + if parsed_uri.scheme != "s3": + raise ValueError(f"URI {uri} is not an S3 URI") + bucket = self._gcs.bucket(parsed_uri.netloc) + blob = bucket.blob(parsed_uri.path[1:]) + signing_credentials = impersonated_credentials.Credentials( + source_credentials=self._credentials, + target_principal=self._service_account, + target_scopes=( + "https://www.googleapis.com/auth/devstorage.read_only", + ), + lifetime=2, + ) + return blob.generate_signed_url( + version="v4", + expiration=self._lifetime, + method="GET", + response_type=mime_type, + credentials=signing_credentials, + ) diff --git a/src/safir/testing/gcs.py b/src/safir/testing/gcs.py new file mode 100644 index 00000000..b747fe04 --- /dev/null +++ b/src/safir/testing/gcs.py @@ -0,0 +1,338 @@ +"""Mock Google Cloud Storage API for testing.""" + +from __future__ import annotations + +from datetime import datetime, timedelta, timezone +from io import BufferedReader +from pathlib import Path +from typing import Any, Iterator, Optional +from unittest.mock import Mock, patch + +from google.cloud import storage + +__all__ = [ + "MockBlob", + "MockBucket", + "MockStorageClient", + "patch_google_storage", +] + + +class MockBlob(Mock): + """Mock version of ``google.cloud.storage.blob.Blob``. + + Parameters + ---------- + name + Name of the blob. + expected_expiration + The expiration that should be requested in a call to + ``generate_signed_url`` on an underlying blob.
A non-matching call + will produce an assertion failure. + """ + + def __init__( + self, name: str, expected_expiration: Optional[timedelta] = None + ) -> None: + super().__init__(spec=storage.blob.Blob) + self.name = name + self._expected_expiration = expected_expiration + + def generate_signed_url( + self, + *, + version: str, + expiration: timedelta, + method: str, + response_type: Optional[str] = None, + credentials: Optional[Any] = None, + ) -> str: + """Generate a mock signed URL for testing. + + Parameters + ---------- + version + Must be ``v4``. + expiration + Must match the ``expected_expiration`` argument to the + constructor if it was given. + method + Must be ``GET``. + response_type + May be anything and is ignored. + credentials + May be anything and is ignored. + + Returns + ------- + str + Always returns :samp:`https://example.com/{name}` where *name* is + the name of the blob. + """ + assert version == "v4" + if self._expected_expiration: + assert expiration == self._expected_expiration + assert method == "GET" + return f"https://example.com/{self.name}" + + +class MockFileBlob(MockBlob): + """Mock version of ``google.cloud.storage.blob.Blob`` for a file. + + Parameters + ---------- + name + Name of the blob. + path + Path to the file for this blob. + expected_expiration + The expiration that should be requested in a call to + ``generate_signed_url`` on an underlying blob. A non-matching call + will produce an assertion failure. + + Attributes + ---------- + size : `int` + Size of the underlying file. + updated : `datetime.datetime` + When the underlying file was last updated. + etag : `str` + Etag value for the file (taken from its inode number). 
+ """ + + def __init__( + self, + name: str, + path: Path, + expected_expiration: Optional[timedelta] = None, + ) -> None: + super().__init__(name, expected_expiration) + self._path = path + self._exists = path.exists() + if self._exists: + self.size = self._path.stat().st_size + mtime = self._path.stat().st_mtime + self.updated = datetime.fromtimestamp(mtime, tz=timezone.utc) + self.etag = str(self._path.stat().st_ino) + + def download_as_bytes(self) -> bytes: + """Get contents of the blob. + + Returns + ------- + bytes + Contents of the underlying file. + """ + return self._path.read_bytes() + + def exists(self) -> bool: + """Whether the underlying file exists. + + Returns + ------- + bool + `True` if it does, `False` otherwise. + """ + return self._exists + + def open(self, mode: str) -> BufferedReader: + """Open the file. + + Parameters + ---------- + mode + Mode with which to open it (must be ``rb`` or an assertion failure + is raised). + + Returns + ------- + BufferedReader + Stream representing the file. + + Raises + ------ + AssertionError + Unexpected mode argument. + """ + assert mode == "rb" + return self._path.open("rb") + + def reload(self) -> None: + """Reload the metadata for the file. + + This does nothing in the mock. + """ + pass + + +class MockBucket(Mock): + """Mock version of ``google.cloud.storage.bucket.Bucket``. + + Parameters + ---------- + bucket_name + Name of the bucket. + expected_expiration + The expiration that should be requested in a call to + ``generate_signed_url`` on an underlying blob. A non-matching call + will produce an assertion failure. + path + Root of the file path for blobs, if given. If not given, a simpler + mock blob will be used that only supports ``generate_signed_url``.
+ """ + + def __init__( + self, + bucket_name: str, + expected_expiration: Optional[timedelta] = None, + path: Optional[Path] = None, + ) -> None: + super().__init__(spec=storage.bucket.Bucket) + self._expected_expiration = expected_expiration + self._path = path + + def blob(self, blob_name: str) -> MockBlob: + """Retrieve a mock blob. + + Parameters + ---------- + blob_name + The name of the blob, used later to form its signed URL. + + Returns + ------- + MockBlob + The mock blob. + """ + if self._path: + return MockFileBlob( + blob_name, self._path / blob_name, self._expected_expiration + ) + else: + return MockBlob(blob_name, self._expected_expiration) + + +class MockStorageClient(Mock): + """Mock version of ``google.cloud.storage.Client``. + + Only supports `bucket`, and the resulting object only supports the + ``blob`` method. The resulting blob only supports the + ``generate_signed_url`` method, unless ``path`` is given, in which case + the file-backed blobs support some additional attributes and methods. + + Parameters + ---------- + expected_expiration + The expiration that should be requested in a call to + ``generate_signed_url`` on an underlying blob. A non-matching call + will produce an assertion failure. + path + Root of the file path for blobs, if given. If not given, a simpler + mock blob will be used that only supports ``generate_signed_url``. + bucket_name + If set, all requests for a bucket with a name other than the one + provided will produce assertion failures. + """ + + def __init__( + self, + expected_expiration: Optional[timedelta] = None, + path: Optional[Path] = None, + bucket_name: Optional[str] = None, + ) -> None: + super().__init__(spec=storage.Client) + self._bucket_name = bucket_name + self._expected_expiration = expected_expiration + self._path = path + + def bucket(self, bucket_name: str) -> MockBucket: + """Retrieve a mock bucket. + + Parameters + ---------- + bucket_name + Name of the bucket. If a bucket name was given to the + constructor, this name will be checked against that one and a + mismatch will cause an assertion failure.
+ + Returns + ------- + MockBucket + The mock bucket. + """ + if self._bucket_name: + assert bucket_name == self._bucket_name + return MockBucket(bucket_name, self._expected_expiration, self._path) + + +def patch_google_storage( + *, + expected_expiration: Optional[timedelta] = None, + path: Optional[Path] = None, + bucket_name: Optional[str] = None, +) -> Iterator[MockStorageClient]: + """Replace the Google Cloud Storage API with a mock class. + + This function will replace the ``google.cloud.storage.Client`` API with a + mock object. It only supports bucket requests, the buckets only support + blob requests, and the blobs only support requests for signed URLs. The + value of the signed URL will be :samp:`https://example.com/{blob}` where + *blob* is the name of the blob. + + Parameters + ---------- + expected_expiration + The expiration that should be requested in a call to + ``generate_signed_url`` on an underlying blob. A non-matching call + will produce an assertion failure. + path + Root of the file path for blobs, if given. If not given, a simpler + mock blob will be used that only supports ``generate_signed_url``. + bucket_name + If set, all requests for a bucket with a name other than the one + provided will produce assertion failures. + + Yields + ------ + MockStorageClient + The mock Google Cloud Storage API client (although this is rarely + needed by the caller). + + Notes + ----- + This function also mocks out ``google.auth.default`` and the impersonated + credentials structure so that this mock can be used with applications that + use workload identity. + + To use this mock successfully, you must not import ``Client`` (or + ``Credentials``) directly into the local namespace, or it will not be + correctly patched. Instead, use: + + .. code-block:: python + + from google.cloud import storage + + and then use ``storage.Client`` and so forth. Do the same with + ``google.auth.impersonated_credentials.Credentials``.
+ + Examples + -------- + Normally this should be called from a fixture in ``tests/conftest.py`` + such as the following: + + .. code-block:: python + + from datetime import timedelta + from typing import Iterator + + import pytest + + from safir.testing.gcs import MockStorageClient, patch_google_storage + + + @pytest.fixture + def mock_gcs() -> Iterator[MockStorageClient]: + yield from patch_google_storage( + expected_expiration=timedelta(hours=1), + bucket_name="some-bucket", + ) + """ + mock_gcs = MockStorageClient(expected_expiration, path, bucket_name) + with patch("google.auth.impersonated_credentials.Credentials"): + with patch("google.auth.default", return_value=(None, None)): + with patch("google.cloud.storage.Client", return_value=mock_gcs): + yield mock_gcs diff --git a/tests/conftest.py b/tests/conftest.py new file mode 100644 index 00000000..5bc51da1 --- /dev/null +++ b/tests/conftest.py @@ -0,0 +1,17 @@ +"""Test fixtures.""" + +from __future__ import annotations + +from datetime import timedelta +from typing import Iterator + +import pytest + +from safir.testing.gcs import MockStorageClient, patch_google_storage + + +@pytest.fixture +def mock_gcs() -> Iterator[MockStorageClient]: + yield from patch_google_storage( + expected_expiration=timedelta(hours=1), bucket_name="some-bucket" + ) diff --git a/tests/gcs_test.py b/tests/gcs_test.py new file mode 100644 index 00000000..22efb2d4 --- /dev/null +++ b/tests/gcs_test.py @@ -0,0 +1,25 @@ +"""Tests for Google Cloud Storage support code.""" + +from __future__ import annotations + +from datetime import timedelta + +import pytest + +from safir.gcs import SignedURLService +from safir.testing.gcs import MockStorageClient + + +def test_signed_url(mock_gcs: MockStorageClient) -> None: + url_service = SignedURLService("service-account", timedelta(hours=1)) + url = url_service.signed_url("s3://some-bucket/path/to/blob", "text/plain") + assert url == "https://example.com/path/to/blob" + + # Test that the lifetime is passed down to the mock, which will reject it + # if it's
not an hour. + url_service = SignedURLService("foo", timedelta(minutes=30)) + with pytest.raises(AssertionError): + url_service.signed_url("s3://some-bucket/blob", "text/plain") + + # Test that lifetime defaults to one hour. + url_service = SignedURLService("foo") + url = url_service.signed_url("s3://some-bucket/blob", "text/plain") + assert url == "https://example.com/blob" diff --git a/tests/testing/conftest.py b/tests/testing/conftest.py index f4905e42..247bb1d9 100644 --- a/tests/testing/conftest.py +++ b/tests/testing/conftest.py @@ -6,13 +6,30 @@ from __future__ import annotations +from datetime import timedelta +from pathlib import Path from typing import Iterator import pytest +from safir.testing.gcs import MockStorageClient, patch_google_storage from safir.testing.kubernetes import MockKubernetesApi, patch_kubernetes @pytest.fixture def mock_kubernetes() -> Iterator[MockKubernetesApi]: yield from patch_kubernetes() + + +@pytest.fixture +def mock_gcs_file() -> Iterator[MockStorageClient]: + yield from patch_google_storage( + path=Path(__file__).parent, + expected_expiration=timedelta(hours=1), + bucket_name="some-bucket", + ) + + +@pytest.fixture +def mock_gcs_minimal() -> Iterator[MockStorageClient]: + yield from patch_google_storage() diff --git a/tests/testing/gcs_test.py b/tests/testing/gcs_test.py new file mode 100644 index 00000000..4ab4c083 --- /dev/null +++ b/tests/testing/gcs_test.py @@ -0,0 +1,117 @@ +"""Tests for the Google Cloud Storage support infrastructure. + +These are just basic sanity checks that the mocking is working correctly and +the basic calls work.
+""" + +from __future__ import annotations + +from datetime import datetime, timedelta, timezone +from pathlib import Path + +import google.auth +import pytest +from google.auth import impersonated_credentials +from google.cloud import storage + +from safir.testing.gcs import MockStorageClient + + +def test_mock(mock_gcs: MockStorageClient) -> None: + client = storage.Client() + bucket = client.bucket("some-bucket") + blob = bucket.blob("something") + credentials, _ = google.auth.default() + signing_credentials = impersonated_credentials.Credentials( + source_credentials=credentials, + target_principal="some-service-account", + target_scopes=["https://www.googleapis.com/auth/devstorage.read_only"], + lifetime=2, + ) + signed_url = blob.generate_signed_url( + version="v4", + expiration=timedelta(hours=1), + method="GET", + response_type="application/fits", + credentials=signing_credentials, + ) + assert signed_url == "https://example.com/something" + + # The wrong expiration produces an assertion. + with pytest.raises(AssertionError): + blob.generate_signed_url( + version="v4", + expiration=timedelta(hours=2), + method="GET", + response_type="application/fits", + credentials=signing_credentials, + ) + + # The wrong bucket produces an assertion. + with pytest.raises(AssertionError): + bucket = client.bucket("wrong-bucket") + + +def test_mock_files(mock_gcs_file: MockStorageClient) -> None: + this_file = Path(__file__) + client = storage.Client() + bucket = client.bucket("some-bucket") + blob = bucket.blob(this_file.name) + + # Test that signed URLs still work.
+ credentials, _ = google.auth.default() + signing_credentials = impersonated_credentials.Credentials( + source_credentials=credentials, + target_principal="some-service-account", + target_scopes=["https://www.googleapis.com/auth/devstorage.read_only"], + lifetime=2, + ) + signed_url = blob.generate_signed_url( + version="v4", + expiration=timedelta(hours=1), + method="GET", + response_type="application/fits", + credentials=signing_credentials, + ) + assert signed_url == f"https://example.com/{this_file.name}" + + # Test the file-specific methods. + assert blob.exists() + assert blob.size == this_file.stat().st_size + assert blob.updated == datetime.fromtimestamp( + this_file.stat().st_mtime, tz=timezone.utc + ) + assert blob.etag == str(this_file.stat().st_ino) + assert blob.download_as_bytes() == this_file.read_bytes() + with blob.open("rb") as f: + contents = f.read() + assert contents == this_file.read_bytes() + + # Test an invalid open mode. + with pytest.raises(AssertionError): + blob.open("wb") + + # Test a nonexistent file. + blob = bucket.blob("does-not-exist") + assert not blob.exists() + + +def test_mock_minimal(mock_gcs_minimal: MockStorageClient) -> None: + """Minimal configuration, which doesn't check lifetime or bucket name.""" + client = storage.Client() + bucket = client.bucket("some-bucket") + + # It doesn't matter what bucket name we choose, since we didn't request + # verification. + bucket = client.bucket("other-bucket") + + # It doesn't matter what expiration we specify on signed URLs.
+ blob = bucket.blob("a-file") + signed_url = blob.generate_signed_url( + version="v4", expiration=timedelta(minutes=1), method="GET" + ) + assert signed_url == "https://example.com/a-file" + signed_url = blob.generate_signed_url( + version="v4", expiration=timedelta(hours=1), method="GET" + ) + assert signed_url == "https://example.com/a-file" diff --git a/tox.ini b/tox.ini index 2fd5908c..31c2f821 100644 --- a/tox.ini +++ b/tox.ini @@ -25,10 +25,11 @@ healthcheck_start_period = 1 [testenv] description = Run pytest against {envname}. extras = + arq db dev + gcs kubernetes - arq [testenv:py] description = Run pytest with PostgreSQL via Docker.
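A note for reviewers: before `safir.gcs.SignedURLService.signed_url` touches any GCS API, it splits the ``s3://`` URI into a bucket and a blob name with ``urllib.parse``. A minimal, standalone sketch of that parsing step follows; the ``parse_s3_uri`` helper name is hypothetical and not part of this change, but the logic mirrors the code in ``src/safir/gcs.py``.

```python
from __future__ import annotations

from urllib.parse import urlparse


def parse_s3_uri(uri: str) -> tuple[str, str]:
    """Split an S3-style URI into (bucket, blob name).

    Mirrors the parsing in SignedURLService.signed_url: the netloc is the
    bucket, and the path minus its leading slash is the blob name.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"URI {uri} is not an S3 URI")
    return parsed.netloc, parsed.path[1:]


bucket, blob = parse_s3_uri("s3://some-bucket/path/to/blob")
# bucket == "some-bucket", blob == "path/to/blob"
```

This is why the mock's signed URL, :samp:`https://example.com/{name}`, ends with the full blob path rather than just the final path component.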