Rc vs 1365 hatcher testing parquet #8845

Status: Draft. Wants to merge 42 commits into base: ah_var_store

Commits (42)
d40a485 | Funcotator Update for Datasource Release V1.8 (#8512) | jamesemery | Oct 11, 2023
423d106 | Fixed Funcotator VCF output renderer to correctly preserve B37 contig… | jamesemery | Oct 11, 2023
2900e01 | Fix for events not in minimal representation (#8567) | meganshand | Nov 3, 2023
683eaa8 | Ultima.flow annotations.fix (#8442) | dror27 | Nov 13, 2023
e6e4dea | Removes unnecessary and buggy validation check (#8580) | ilyasoifer | Nov 13, 2023
1dc7ee4 | Update our HTSJDK dependency to 4.0.2 (#8584) | droazen | Nov 14, 2023
7a08754 | Update picard to 3.1.1 (#8585) | lbergelson | Nov 15, 2023
fa3dfed | Add option to AnalyzeSaturationMutagenesis to keep disjoint mates (#8… | odcambc | Nov 26, 2023
0da6409 | New/Updated Flow Based Read tools (#8579) | dror27 | Nov 28, 2023
e37b344 | GroundTruthScorer doc update (#8597) | dror27 | Dec 6, 2023
bf24519 | Add a native GATK implementation for 2bit references, and remove the … | droazen | Dec 8, 2023
e2c5fab | Update dependencies to address security vulnerabilities, and add a se… | droazen | Dec 8, 2023
3b8b5bf | Update http-nio and wire its new settings (#8611) | lbergelson | Dec 9, 2023
5839cbd | PrintFileDiagnostics for cram, crai and bai. (#8577) | cmnbroad | Dec 9, 2023
2ad4a3e | Allow GenomicsDBImport to connect to az:// files without interference… | lbergelson | Dec 9, 2023
e29cbc3 | Disable line-by-line codecov comments (#8613) | droazen | Dec 11, 2023
0b18579 | Support for custom ploidy regions in HaplotypeCaller (#8609) | lbergelson | Dec 11, 2023
85d13d4 | Update the GATK base image to a newer LTS ubuntu release (#8610) | droazen | Dec 12, 2023
75f5104 | build_docker_remote: add ability to specify the RELEASE arg to the cl… | droazen | Dec 13, 2023
23c8071 | Update to htsjdk 4.1.0 (#8620) | lbergelson | Dec 13, 2023
70ee553 | Fix the Spark version in the GATK jar manifest, and used the right co… | droazen | Dec 13, 2023
8317d8b | Update http-nio to 1.1.0 which implements Path.resolve() methods (#8626) | lbergelson | Dec 13, 2023
fd873e9 | Fix GT header in PostprocessGermlineCNVCalls's --output-genotyped-int… | jmarshall | Dec 14, 2023
39cfbba | Output the new image name at the end of a successful cloud docker bui… | droazen | Dec 14, 2023
b68fadc | Reduce SVConcordance memory footprint (#8623) | mwalker174 | Dec 14, 2023
e796d20 | Rewrite complex SV functional annotation in SVAnnotate (#8516) | epiercehoffman | Jan 23, 2024
2d50cf8 | Improvements to Mutect2's Permutect training data mode (#8663) | davidbenjamin | Jan 26, 2024
dd73036 | normal artifact lod is now defined without the extra minus sign (#8668) | davidbenjamin | Jan 30, 2024
bbc028b | Parameterize the logging frequency for ProgressLogger in GatherVcfsCl… | gbggrant | Feb 7, 2024
cfd4d87 | Handle CTX_INV subtype in SVAnnotate (#8693) | epiercehoffman | Feb 15, 2024
e8f71e4 | manually merging build.gradle | koncheto-broad | Mar 5, 2024
04a5d60 | updating from master | koncheto-broad | Mar 5, 2024
9367d93 | updating from master | koncheto-broad | Mar 5, 2024
fe5c81c | Updating org.apache.commons package change to get the branch building | koncheto-broad | Mar 5, 2024
4501d90 | Messy, but functionally converting a local gvcf to a parquet file output | koncheto-broad | Mar 12, 2024
f92ef54 | Cleaning it up a tad | koncheto-broad | Mar 12, 2024
dc68776 | non-compatible tweaks to ingestion to write our parquet files | koncheto-broad | Mar 19, 2024
bd9b766 | small update | koncheto-broad | Mar 20, 2024
b1b7577 | finally getting the parquet copying part sorted out | koncheto-broad | Mar 20, 2024
65c0164 | suffering a file name clash and overriding the same file. Changing t… | koncheto-broad | Mar 20, 2024
7a9d105 | suffering a file name clash and overriding the same file. Changing t… | koncheto-broad | Mar 20, 2024
e852dcf | add reference and header parquet writers | RoriCremer | Jun 7, 2024
Files changed

1 change: 1 addition & 0 deletions .dockstore.yml
@@ -183,6 +183,7 @@ workflows:
- master
- ah_var_store
- rsa_vs_1245
- hatcher_testing_parquet
tags:
- /.*/
- name: GvsPrepareRangesCallset
4 changes: 1 addition & 3 deletions Dockerfile
@@ -1,4 +1,4 @@
ARG BASE_DOCKER=broadinstitute/gatk:gatkbase-3.1.0
ARG BASE_DOCKER=broadinstitute/gatk:gatkbase-3.2.0

# stage 1 for constructing the GATK zip
FROM ${BASE_DOCKER} AS gradleBuild
@@ -93,8 +93,6 @@ RUN conda env create -n gatk -f /gatk/gatkcondaenv.yml && \
echo "source activate gatk" >> /gatk/gatkenv.rc && \
echo "source /gatk/gatk-completion.sh" >> /gatk/gatkenv.rc && \
conda clean -afy && \
find /opt/miniconda/ -follow -type f -name '*.a' -delete && \
find /opt/miniconda/ -follow -type f -name '*.pyc' -delete && \
rm -rf /root/.cache/pip

CMD ["bash", "--init-file", "/gatk/gatkenv.rc"]
86 changes: 54 additions & 32 deletions build.gradle
@@ -3,7 +3,7 @@
buildscript {
repositories {
mavenCentral()
}
}
}

plugins {
@@ -16,6 +16,7 @@ plugins {
id "com.github.johnrengelman.shadow" version "8.1.1" //used to build the shadow and sparkJars
id "com.github.ben-manes.versions" version "0.12.0" //used for identifying dependencies that need updating
id 'com.palantir.git-version' version '0.5.1' //version helper
id 'org.sonatype.gradle.plugins.scan' version '2.6.1' // scans for security vulnerabilities in our dependencies
}


@@ -56,20 +57,20 @@ repositories {
mavenLocal()
}

final htsjdkVersion = System.getProperty('htsjdk.version','3.0.5')
final picardVersion = System.getProperty('picard.version','3.1.0')
final htsjdkVersion = System.getProperty('htsjdk.version','4.1.0')
final picardVersion = System.getProperty('picard.version','3.1.1')
final barclayVersion = System.getProperty('barclay.version','5.0.0')
final sparkVersion = System.getProperty('spark.version', '3.3.1')
final hadoopVersion = System.getProperty('hadoop.version', '3.3.1')
final disqVersion = System.getProperty('disq.version','0.3.6')
final genomicsdbVersion = System.getProperty('genomicsdb.version','1.5.0')
final bigQueryVersion = System.getProperty('bigQuery.version', '2.31.0')
final bigQueryStorageVersion = System.getProperty('bigQueryStorage.version', '2.41.0')
final guavaVersion = System.getProperty('guava.version', '32.1.2-jre')
final sparkVersion = System.getProperty('spark.version', '3.5.0')
final hadoopVersion = System.getProperty('hadoop.version', '3.3.6')
final disqVersion = System.getProperty('disq.version','0.3.8')
final genomicsdbVersion = System.getProperty('genomicsdb.version','1.5.1')
final bigQueryVersion = System.getProperty('bigQuery.version', '2.35.0')
final bigQueryStorageVersion = System.getProperty('bigQueryStorage.version', '2.47.0')
final guavaVersion = System.getProperty('guava.version', '32.1.3-jre')
final log4j2Version = System.getProperty('log4j2Version', '2.17.1')
final testNGVersion = '7.0.0'

final googleCloudNioDependency = 'com.google.cloud:google-cloud-nio:0.127.0'
final googleCloudNioDependency = 'com.google.cloud:google-cloud-nio:0.127.8'

final baseJarName = 'gatk'
final secondaryBaseJarName = 'hellbender'
@@ -109,7 +110,7 @@ def resolveLargeResourceStubFiles(largeResourcesFolder, buildPrerequisitesMessag
} catch (IOException e) {
throw new GradleException(
"An IOException occurred while attempting to execute the command $gitLFSExecCommand."
+ " git-lfs is required to build GATK but may not be installed. $buildPrerequisitesMessage", e)
+ " git-lfs is required to build GATK but may not be installed. $buildPrerequisitesMessage", e)
}
}

@@ -187,8 +188,10 @@ configurations.all {
}

tasks.withType(JavaCompile) {
options.compilerArgs = ['-proc:none', '-Xlint:all', '-Werror', '-Xdiags:verbose']
options.encoding = 'UTF-8'
// Changing this right now just for getting around deprecation
// options.compilerArgs = ['-proc:none', '-Xlint:all', '-Werror', '-Xdiags:verbose']
options.compilerArgs = ['-proc:none', '-Xlint:all', '-Xdiags:verbose']
options.encoding = 'UTF-8'
}

sourceSets {
Expand Down Expand Up @@ -267,12 +270,12 @@ dependencies {
// are routed to log4j
implementation 'org.apache.logging.log4j:log4j-jcl:' + log4j2Version

implementation 'org.apache.commons:commons-lang3:3.5'
implementation 'org.apache.commons:commons-math3:3.5'
implementation 'org.apache.commons:commons-lang3:3.14.0'
implementation 'org.apache.commons:commons-math3:3.6.1'
implementation 'org.hipparchus:hipparchus-stat:2.0'
implementation 'org.apache.commons:commons-collections4:4.1'
implementation 'org.apache.commons:commons-vfs2:2.0'
implementation 'org.apache.commons:commons-configuration2:2.4'
implementation 'org.apache.commons:commons-collections4:4.4'
implementation 'org.apache.commons:commons-vfs2:2.9.0'
implementation 'org.apache.commons:commons-configuration2:2.9.0'
constraints {
implementation('org.apache.commons:commons-text') {
version {
@@ -287,7 +290,7 @@ dependencies {
implementation 'commons-io:commons-io:2.5'
implementation 'org.reflections:reflections:0.9.10'

implementation 'it.unimi.dsi:fastutil:7.0.6'
implementation 'it.unimi.dsi:fastutil:7.0.13'

implementation 'org.broadinstitute:hdf5-java-bindings:1.1.0-hdf5_2.11.0'
implementation 'org.broadinstitute:gatk-native-bindings:1.0.0'
@@ -297,8 +300,8 @@ dependencies {
exclude group: 'org.apache.commons'
}

//there is no mllib_2.12.15:3.3.0, so stay use 2.12:3.3.0
implementation ('org.apache.spark:spark-mllib_2.12:3.3.0') {
// TODO: migrate to mllib_2.12.15?
implementation ('org.apache.spark:spark-mllib_2.12:' + sparkVersion) {
// JUL is used by Google Dataflow as the backend logger, so exclude jul-to-slf4j to avoid a loop
exclude module: 'jul-to-slf4j'
exclude module: 'javax.servlet'
@@ -344,15 +347,22 @@ dependencies {
implementation 'org.broadinstitute:gatk-bwamem-jni:1.0.4'
implementation 'org.broadinstitute:gatk-fermilite-jni:1.2.0'

implementation 'org.broadinstitute:http-nio:0.1.0-rc1'
implementation 'org.broadinstitute:http-nio:1.1.0'

// Required for COSMIC Funcotator data source:
implementation 'org.xerial:sqlite-jdbc:3.36.0.3'
implementation 'org.xerial:sqlite-jdbc:3.44.1.0'

// natural sort
implementation('net.grey-panther:natural-comparator:1.1')
implementation('com.fasterxml.jackson.module:jackson-module-scala_2.12:2.9.8')

// parquet writing
implementation('org.apache.parquet:parquet-common:1.13.1')
implementation('org.apache.parquet:parquet-encoding:1.13.1')
implementation('org.apache.parquet:parquet-column:1.13.1')
implementation('org.apache.parquet:parquet-hadoop:1.13.1')
implementation 'org.apache.parquet:parquet-avro:1.13.1'

testUtilsImplementation sourceSets.main.output
testUtilsImplementation 'org.testng:testng:' + testNGVersion
testUtilsImplementation 'org.apache.hadoop:hadoop-minicluster:' + hadoopVersion
@@ -426,20 +436,20 @@ final runtimeAddOpens = [
'java.base/jdk.internal.module=ALL-UNNAMED',
'java.base/java.lang.module=ALL-UNNAMED',
'java.security.jgss/sun.security.krb5=ALL-UNNAMED'
]
]

final testAddOpens = [
'java.prefs/java.util.prefs=ALL-UNNAMED' // required for jacoco tasks
]

run {
// transform the list of runtime configuration --add-opens args into command line argument format
final runtimeJVMArgs = runtimeAddOpens.stream()
.flatMap(openSpec -> ['--add-opens', openSpec].stream())
.toList()
// add in any other required args
runtimeJVMArgs.add('-Dio.netty.tryReflectionSetAccessible=true')
jvmArgs = runtimeJVMArgs
// transform the list of runtime configuration --add-opens args into command line argument format
final runtimeJVMArgs = runtimeAddOpens.stream()
.flatMap(openSpec -> ['--add-opens', openSpec].stream())
.toList()
// add in any other required args
runtimeJVMArgs.add('-Dio.netty.tryReflectionSetAccessible=true')
jvmArgs = runtimeJVMArgs
}

test {
@@ -981,6 +991,18 @@ task gatkValidateGeneratedWdl(dependsOn: [gatkWDLGen, shadowJar]) {
}
}

// scan-gradle-plugin security vulnerability scan
ossIndexAudit {
allConfigurations = false // if true includes the dependencies in all resolvable configurations. By default is false, meaning only 'compileClasspath', 'runtimeClasspath', 'releaseCompileClasspath' and 'releaseRuntimeClasspath' are considered
useCache = true // true by default
outputFormat = 'DEFAULT' // Optional, other values are: 'DEPENDENCY_GRAPH' prints dependency graph showing direct/transitive dependencies, 'JSON_CYCLONE_DX_1_4' prints a CycloneDX 1.4 SBOM in JSON format.
showAll = false // if true prints all dependencies. By default is false, meaning only dependencies with vulnerabilities will be printed.
printBanner = true // if true will print ASCII text banner. By default is true.

// ossIndexAudit can be configured to exclude vulnerabilities from matching
// excludeVulnerabilityIds = ['39d74cc8-457a-4e57-89ef-a258420138c5'] // list containing ids of vulnerabilities to be ignored
// excludeCoordinates = ['commons-fileupload:commons-fileupload:1.3'] // list containing coordinate of components which if vulnerable should be ignored
}

/**
*This specifies what artifacts will be built and uploaded when performing a maven upload.
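The new org.apache.parquet dependencies above support this branch's GVCF-to-Parquet output work (see the "converting a local gvcf to a parquet file output" and "add reference and header parquet writers" commits): parquet-hadoop supplies the output files and compression codecs, while parquet-avro layers an Avro record interface on top. As a rough illustration only, the sketch below writes a few records through AvroParquetWriter; the VariantRow schema, field names, and output path are hypothetical placeholders, not the writers actually added in this PR.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;
import org.apache.parquet.hadoop.util.HadoopOutputFile;

public class ParquetVariantSketch {
    public static void main(final String[] args) throws Exception {
        // Hypothetical schema: a few columns one might extract per variant row.
        final Schema schema = SchemaBuilder.record("VariantRow").fields()
                .requiredLong("location")
                .requiredString("sample")
                .optionalString("state")
                .endRecord();

        final Path outputPath = new Path("variants.parquet");
        try (ParquetWriter<GenericRecord> writer = AvroParquetWriter
                .<GenericRecord>builder(HadoopOutputFile.fromPath(outputPath, new Configuration()))
                .withSchema(schema)
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .build()) {
            // Build and write a single example record.
            final GenericRecord row = new GenericData.Record(schema);
            row.put("location", 1000000123456L);
            row.put("sample", "SAMPLE_1");
            row.put("state", "v");
            writer.write(row);
        }
    }
}
```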
68 changes: 53 additions & 15 deletions build_docker_remote.sh
@@ -1,12 +1,26 @@
#!/usr/bin/env bash
#
# Build (and optionally push) a GATK docker image to GCR using Google Cloud Build. Images are built in the cloud rather than locally. Pushing to dockerhub is not supported by this script.
# Build (and optionally push) a GATK docker image to GCR using Google Cloud Build. Images are built
# in the cloud rather than locally. Pushing to dockerhub is not supported by this script.
#
# If you are pushing an image to our release repositories, be sure that you've followed
# the setup instructions here:
# https://github.com/broadinstitute/gatk/wiki/How-to-release-GATK4#setup_docker
# and here:
# https://github.com/broadinstitute/gatk/wiki/How-to-release-GATK4#setup_gcloud
# By default the images are pushed to the following GCR repository:
#
# us.gcr.io/broad-dsde-methods/broad-gatk-snapshots/gatk-remote-builds
#
# with a name like "${YOUR_USERNAME}-${GITHUB_TAG}-${GIT_HASH_FOR_TAG}"
#
# This script should not be used to push to our release repositories. After
# you've built and staged an image, run the scripts/docker/release_prebuilt_docker_image.sh
# script to push to the release repositories.
#
# Usage: build_docker_remote.sh -e <GITHUB_TAG> -d <STAGING_DIRECTORY> [-str]
#
# where <GITHUB_TAG> is the github tag (or hash when -s is used) to use in building the docker image (e.g. 4.2.6.1)
# and <STAGING_DIRECTORY> is a directory in which to clone the repo and stage the build (DO NOT SPECIFY YOUR WORKING DIRECTORY)
# Optional arguments:
# -s The GITHUB_TAG (-e parameter) is actually a github hash, not tag.
# -r Build this image with the release flag set to true, causing the version number to not end with SNAPSHOT
# -t <IMAGE_TAG> The tag to assign image once it is finished constructing. NOTE: currently this MUST be on either GCR or the Google Artifact Registry.
#

# Have script stop if there is an error
@@ -20,24 +34,35 @@ STAGING_CLONE_DIR=${PROJECT}_staging_temp
#################################################
# Parsing arguments
#################################################
while getopts "e:sd:t:" option; do
while getopts "e:sd:t:r" option; do
case "$option" in
e) GITHUB_TAG="$OPTARG" ;;
s) IS_HASH=true ;;
d) STAGING_DIR="$OPTARG" ;;
t) DOCKER_IMAGE_TAG="$OPTARG" ;;
r) RELEASE=true ;;
esac
done

if [ -z "$GITHUB_TAG" ]; then
printf "Option -e requires an argument.\n \
Usage: %s: -e <GITHUB_TAG> [-sdt] \n \
where <GITHUB_TAG> is the github tag (or hash when -s is used) to use in building the docker image\n \
(e.g. bash build_docker_remote.sh -e 4.2.6.1 )\n \
function usage() {
MESSAGE=$1
printf "%s\n \
Usage: build_docker_remote.sh -e <GITHUB_TAG> -d <STAGING_DIRECTORY> [-str] \n \
where <GITHUB_TAG> is the github tag (or hash when -s is used) to use in building the docker image (e.g. 4.2.6.1)\n \
and <STAGING_DIRECTORY> is a directory in which to clone the repo and stage the build (DO NOT SPECIFY YOUR WORKING DIRECTORY)\n \
Optional arguments: \n \
-s \t The GITHUB_TAG (-e parameter) is actually a github hash, not tag. \n \
-d <STAGING_DIR> \t staging directory to grab code from repo and build the docker image. If unspecified, then use whatever is in current dir (do not go to the repo). NEVER SPECIFY YOUR WORKING DIR \n \
-t <IMAGE_TAG>\t The tag to assign image once it is finished constructing. NOTE: currently this MUST be on either GCR or the Google Artifact Registry. \n" $0
-r \t Build this image with the release flag set to true, causing the version number to not end with SNAPSHOT \n \
-t <IMAGE_TAG>\t The tag to assign image once it is finished constructing. NOTE: currently this MUST be on either GCR or the Google Artifact Registry. \n" "$MESSAGE"
}

if [ -z "$GITHUB_TAG" ]; then
usage "Option -e (github tag) requires an argument."
exit 1
fi

if [ -z "$STAGING_DIR" ]; then
usage "Option -d (staging directory) requires an argument."
exit 1
fi

Expand All @@ -49,6 +74,7 @@ echo "Other options (Blank is false)"
echo "---------------"
echo "This is a git hash: ${IS_HASH}"
echo "Staging directory: ${STAGING_DIR}"
echo "Release: ${RELEASE}"

ORIGINAL_WORKING_DIRECTORY=$(pwd)

@@ -81,14 +107,26 @@ if [ -z "$DOCKER_IMAGE_TAG" ]; then
DOCKER_IMAGE_TAG=${GCR_REPO}:$(whoami)-${GITHUB_TAG}-${GIT_HASH_FOR_TAG}
fi

echo "steps:" >> cloudbuild.yaml
echo "- name: 'gcr.io/cloud-builders/docker'" >> cloudbuild.yaml
if [ -n "$RELEASE" ]; then
echo " args: [ 'build', '-t', '${DOCKER_IMAGE_TAG}', '--build-arg', 'RELEASE=true', '.' ]" >> cloudbuild.yaml
else
echo " args: [ 'build', '-t', '${DOCKER_IMAGE_TAG}', '.' ]" >> cloudbuild.yaml
fi
echo "- name: 'gcr.io/cloud-builders/docker'" >> cloudbuild.yaml
echo " args: [ 'push', '${DOCKER_IMAGE_TAG}' ]" >> cloudbuild.yaml

echo "Building image with the tag ${DOCKER_IMAGE_TAG}..."

SUBMIT_COMMAND="gcloud builds submit --tag ${DOCKER_IMAGE_TAG} --timeout=24h --machine-type n1_highcpu_8"
SUBMIT_COMMAND="gcloud builds submit --config cloudbuild.yaml --timeout=24h --machine-type n1_highcpu_32"
echo "running the following gcloud command: ${SUBMIT_COMMAND}"
## We need to override the default .gcloudignore to preserve the .git directory which we need in order to download LFS files in the remote build.
echo -n "" >> .gcloudignore
${SUBMIT_COMMAND}

echo "Image successfully built and pushed to ${DOCKER_IMAGE_TAG}"

cd ${ORIGINAL_WORKING_DIRECTORY}
if [ -n "$STAGING_DIR" ] ; then
rm -Rf ${STAGING_DIR}/${STAGING_CLONE_DIR}
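For reference, the echo lines added above write a two-step Cloud Build config to cloudbuild.yaml, which the gcloud builds submit command then consumes instead of the old --tag flag. With -r supplied it would look roughly like the following; the image tag shown is only an example of the documented "${YOUR_USERNAME}-${GITHUB_TAG}-${GIT_HASH_FOR_TAG}" pattern.

```yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'build', '-t', 'us.gcr.io/broad-dsde-methods/broad-gatk-snapshots/gatk-remote-builds:someuser-4.2.6.1-abc1234', '--build-arg', 'RELEASE=true', '.' ]
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'push', 'us.gcr.io/broad-dsde-methods/broad-gatk-snapshots/gatk-remote-builds:someuser-4.2.6.1-abc1234' ]
```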
4 changes: 4 additions & 0 deletions codecov.yml
@@ -31,3 +31,7 @@ comment:
branches: null
behavior: default
after_n_builds: 8 # Wait until all 8 of the test suite jacoco test builds have been uploaded before writing a comment to the pr (no more incomplete coverage report emails)

# Disable line-by-line codecov comments in the github diff page
github_checks:
annotations: false
19 changes: 12 additions & 7 deletions scripts/docker/README.md
@@ -21,7 +21,7 @@ This repo contains the scripts for creating and pushing two docker images:
- gatkbase -- a basic docker image that we do not expect to change very often. The GATK4 docker image uses this one (``FROM``)
- gatk4 -- the official docker image for GATK4. The instructions in this document pertain to this image, unless otherwise stated.

``scripts/docker/gatkbase/build_docker_base.sh`` is a script to create the gatkbase docker image.
``scripts/docker/gatkbase/build_docker_base_cloud.sh`` and ``scripts/docker/gatkbase/build_docker_base_locally.sh`` are scripts to create the gatkbase docker image.
``build_docker.sh`` is a script to create the full gatk4 docker image.
``build_docker_remote.sh`` is a script to create the full gatk4 docker image using google cloud build remotely. This is useful if you can't build the docker image locally (for example if you have an M1 Macbook) NOTE: this requires the user first specify their project with the command `gcloud config set project <PROJECT>` to a project that has access to google cloud build.

@@ -148,19 +148,24 @@ exit
This is a base image that does not require any files in the gatk repo (except the build script and Dockerfile, obviously). GATK docker images are dependent on this one.

**IMPORTANT**
- The gatkbase build script should be run from the ``scripts/docker/gatkbase`` directory.
- If you want to create a new version, you must modify the ``build_docker_base.sh`` script directly. Any changes should be committed to the repo.
- The gatkbase build scripts should be run from the ``scripts/docker/gatkbase`` directory.

#### Create gatkbase docker image and push it to docker hub
#### Build the gatkbase docker image on your local machine:

```bash
build_docker_base.sh -p
build_docker_base_locally.sh <docker_image_version>
```

#### Create gatkbase docker image and do not push it to docker hub
#### Build the gatkbase docker image remotely using Google Cloud Build:

```bash
build_docker_base_cloud.sh <docker_image_version>
```

#### Release a pre-built gatkbase image to the official repositories, after testing it:

```bash
build_docker_base.sh -p
release_prebuilt_base_image.sh <prebuilt_image> <version_number_for_release>
```

