This sample project builds a custom Docker image with a conda environment that is passed to the runai GPU distribution layer.
The Dockerfile builds on top of an existing PyTorch Docker image and uses the dependencies from the environment file to create a toy environment inside the custom image.
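As a rough sketch, such a Dockerfile could look like the following; the base-image tag, the environment file name (`environment.yml`), and the environment name (`toy`) are assumptions and not necessarily what this project uses:

```Dockerfile
# Sketch only: the base-image tag and file names are placeholders.
# Assumes the base image ships with conda, as the official PyTorch images do.
FROM pytorch/pytorch:latest

# Copy the conda environment specification into the image
COPY environment.yml /tmp/environment.yml

# Create the "toy" environment from the listed dependencies
RUN conda env create -f /tmp/environment.yml
```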
The image can be built by calling:

```sh
sh build_docker.sh
```
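The build script itself is typically a thin wrapper around `docker build`; a hedged sketch, with the image name and registry as placeholders rather than the project's real values:

```sh
#!/bin/bash
# Hypothetical sketch of build_docker.sh; IMAGE is a placeholder.
IMAGE=my-registry/toy-image:latest

docker build -t "$IMAGE" .
# Push so the cluster can pull the image (only needed with a remote registry)
docker push "$IMAGE"
```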
A job can be submitted via `runai submit`. The complete call is saved inside the `submit.sh` bash script; to submit, call:
```sh
sh submit.sh
```
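A hedged sketch of what `submit.sh` might contain; the job name, image name, and GPU count are placeholders, and the actual flags used in this project may differ:

```sh
#!/bin/bash
# Hypothetical sketch of submit.sh; job name, image, and GPU count are placeholders.
runai submit toy-job \
  --image my-registry/toy-image:latest \
  --gpu 1 \
  --command -- sh run.sh
```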
The submit shell script also passes the `run.sh` shell script via the `--command` flag. This script activates conda and the toy environment, and then executes the `toy.py` Python script.
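A minimal sketch of such a `run.sh`, assuming conda is installed under `/opt/conda` (as in the official PyTorch images) and the environment is named `toy`:

```sh
#!/bin/bash
# Hypothetical sketch of run.sh; the conda install path and environment name are assumptions.
source /opt/conda/etc/profile.d/conda.sh  # make "conda activate" available in a non-interactive shell
conda activate toy                        # activate the toy environment
python toy.py                             # run the toy script
```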