
chore(components): Fix bug due to protobuf library being upgraded by pinning protobuf version

Signed-off-by: Googler <[email protected]>
PiperOrigin-RevId: 652610202
Googler committed Jul 15, 2024
1 parent 4930bee commit 9cb5913
Showing 2 changed files with 7 additions and 1 deletion.
1 change: 1 addition & 0 deletions components/google-cloud/RELEASE.md
@@ -1,5 +1,6 @@
## Upcoming release
* Updated the Starry Net pipeline's template gallery description, and added dataprep_nan_threshold and dataprep_zero_threshold args to the Starry Net pipeline.
* Fix a bug in Starry Net's upload decomposition plot step, caused by a protobuf upgrade, by pinning the protobuf library to 3.20.*.
* Add support for running tasks on a `PersistentResource` (see [CustomJobSpec](https://cloud.google.com/vertex-ai/docs/reference/rest/v1beta1/CustomJobSpec)) via the `persistent_resource_id` parameter on `v1.custom_job.CustomTrainingJobOp` and `v1.custom_job.create_custom_training_job_from_component`.
* Bump image for Structured Data pipelines.

@@ -16,7 +16,12 @@
from kfp import dsl


-@dsl.component(packages_to_install=['google-cloud-aiplatform[tensorboard]'])
+@dsl.component(
+    packages_to_install=[
+        'google-cloud-aiplatform[tensorboard]',
+        'protobuf==3.20.*',
+    ]
+)
def upload_decomposition_plots(
project: str,
location: str,
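The pin `protobuf==3.20.*` holds the component's runtime on the 3.20 release series, so pip will not resolve to the 4.x line whose breaking changes caused the bug. As an illustrative sketch (not code from this repo; `matches_pin` is a hypothetical helper), here is how an `==X.Y.*` wildcard specifier constrains versions:

```python
# Illustrative only: mimic how a pip-style "==X.Y.*" wildcard pin
# accepts any patch release in one series while rejecting other series.

def matches_pin(version: str, pin: str) -> bool:
    """Check a version string against an '==X.Y.*' wildcard pin."""
    assert pin.startswith("==") and pin.endswith(".*"), "expect '==X.Y.*'"
    prefix = pin[2:-2]                      # e.g. "3.20"
    n = len(prefix.split("."))              # number of pinned components
    parts = version.split(".")
    return ".".join(parts[:n]) == prefix    # compare leading components only

print(matches_pin("3.20.3", "==3.20.*"))  # True: within the 3.20 series
print(matches_pin("4.25.1", "==3.20.*"))  # False: 4.x is excluded
```

In a real resolver this matching follows PEP 440, but the effect on the component is the same: any 3.20 patch release is acceptable, and the incompatible 4.x runtime is never installed.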
